Archive for the ‘Philosophy’ Category

A long while back, I blogged about things I just don’t get, including on that list the awful specter of identity politics. As I was finishing my undergraduate education some decades ago, the favored term was “political correctness.” That impulse now looks positively tame in comparison to what occurs regularly in the public sphere. It’s no longer merely about adopting what consensus would have one believe is a correct political outlook. Now it’s a broad referendum centered on the issue of identity, construed through the lens of ethnicity, sexual orientation, gender identification, lifestyle, religion, nationality, political orientation, etc.

One frequent charge levied against offenders is cultural appropriation, which is the adoption of an attribute or attributes of a culture by someone belonging to a different culture. Here, the term “culture” is a stand-in for any feature of one’s identity. Thus, wearing a Halloween costume from another culture, say, a bandido, is not merely in poor taste but is understood to be offensive if one is not authentically Mexican. Those who are infected with the meme are often called social justice warriors (SJW), and policing (of others, natch) is especially vehement on campus. For example, I’ve read of menu items at the school cafeteria being criticized for not being authentic enough. Really? The won ton soup offends Chinese students?

In an opinion-editorial in the NY Times entitled “Will the Left Survive the Millennials?” Lionel Shriver described being sanctioned for suggesting that fiction writers not be too concerned about creating characters from backgrounds different from their own. He contextualizes the motivation of SJWs this way: (more…)

Anthropologists, pundits, armchair cultural critics (like me), and others sometimes offer an aspect or characteristic, usually singular, that separates the human species from other animals. (Note: humans are animals, not the crowning creation of god in his own image, the dogma of major religions.) Typical singular aspects include tool use (very early on, fire), language, agriculture, self-awareness (consciousness), and intelligence, that last including especially the ability to conceptualize time and thus remember and plan ahead. The most interesting candidate suggested to me is our ability to kill from a distance. Without going into a list of things we don’t think we share with other species but surprisingly do, it interests me that no other species possesses the ability to kill from a distance (someone will undoubtedly prove me wrong on this).

Two phrases spring to mind: nature is red in tooth and claw (Tennyson) and human life is nasty, brutish, and short (Hobbes). Both encapsulate what it means to have to kill to eat, which is hardly unique to animals. All sorts of plants, insects, and microorganisms embed themselves in hosts, sometimes killing the host and themselves. Symbiotic relationships also exist. The instance that interests me, though, is the act of killing in the animal kingdom that requires putting one’s own body at risk in life-or-death attack. Examples falling short of killing abound, such as intimidation to establish hierarchy, but to eat, an animal must kill its prey.

Having watched my share of historical fiction (pre-1800, say, but especially sword-and-sandal and medieval epics) on the TeeVee and at the cinema, the dramatic appeal of warring armies slamming into each other never seems to get old. Fighting is hand-to-hand or sword-to-sword, which amount to the same thing. Archers’ arrows, projectiles launched from catapults and trebuchets, thrown knives, spears, and axes, and pouring boiling oil over parapets are killing from a relatively short distance, but the action eventually ends up being very close. The warrior code in fighting cultures honors the willingness to put oneself in harm’s way, to risk one’s own body. Leaders often exhibit mutual respect and may even share some intimacy. War may not be directly about eating, since humans are not cannibals under most circumstances; rather, it’s usually about control of resources, so secondarily about eating by amassing power. Those historical dramas often depict victors celebrating by enjoying lavish feasts.

Modern examples of warfare and killing from a distance make raining down death from above a bureaucratic action undertaken with little or no personal risk. Artillery, carpet bombing from 20,000 feet, drone strikes (controlled from the comfort of some computer lab in the Utah desert), and nuclear bombs are the obvious examples. No honorable warrior code attaches to such killing. Indeed, the chain of command separates the execution of kill orders from moral responsibility — probably a necessary disconnect when large numbers of casualties (collateral damage, if one prefers the euphemism) can be expected. Only war criminals, either high on killing or banally impervious to empathy and compassion, would dispatch hundreds of thousands at a time.

If killing from a distance is in most cases about proximity or lack thereof, one further example is worth mentioning: killing across time. While most don’t really conceptualize the space-time continuum as interconnected, the prospect of choices made today manifesting in megadeath in the foreseeable future is precisely the sort of bureaucratized killing from a distance that should be recognized and forestalled. Yet despite our supposed intellectual superiority over other species, we cannot avoid waging war, real and rhetorical, to control resources and narratives that enable us to eat. Eating the future would be akin to consuming seed corn, but that metaphor is not apt. Better perhaps to say that we’re killing the host. We’re embedded in the world, as indeed is everything we know to be alive, and rely upon the profundity of the biosphere for survival. Although the frequent charge is that humanity is a parasite or has become a cancer on the world, that tired assessment, while more accurate than not, is a little on the nose. A more charitable view is that, as a species, humanity, as the apex predator, has expanded its habitat to include the entire biosphere, killing to eat, and is slowly consuming and transforming it into a place uninhabitable by us, just as a yeast culture consumes its medium and grows to fill the space before dying all at once. So the irony or Pyrrhic victory is that while we may fatten ourselves (well, some of us) in the short term, we have also created conditions leading to our own doom. Compared to other species whose time on Earth lasted tens of millions of years, human life on Earth turns out to be exactly what Hobbes said: nasty, brutish, and short.

I discovered “The Joe Rogan Experience” on YouTube recently and have been sampling from among the nearly 900 pod- or webcasts posted there. I’m hooked. Rogan is an impressive fellow. He clearly enjoys the life of the mind but, unlike many who are absorbed solely in ideas, has not ignored the life of the body. Over time, he’s also developed expertise in multiple endeavors and can participate knowledgeably in discussion on many topics. Webcasts are basically long, free-form, one-on-one conversations. This lack of structure gives the webcast ample time to explore topics in depth or simply meander. Guests are accomplished or distinguished in some way and usually have fame and wealth to match, which often affects content (i.e., Fitzgerald’s observation: “The rich are different from you and me”). One notable bar to entry is having a strong media presence.

Among the recurring themes, Rogan trots out his techno optimism, which is only a step short of techno utopianism. His optimism is based on two interrelated developments in recent history: widespread diffusion of information over networks and rapid advances in medical devices that can be expected to accelerate, to enhance human capabilities, and soon to transform us into supermen, bypassing evolutionary biology. He extols these views somewhat regularly to his guests, but alas, none of the guests I’ve watched seem to be able to fathom the ideas satisfactorily enough to take up the discussion. (The same is true of Rogan’s assertion that money is just information, which is reductive and inaccurate.) They comment or joke briefly and move on to something more comfortable or accessible. Although I don’t share Rogan’s optimism, I would totally engage in discussion of his flirtation with Transhumanism (a term he doesn’t use). That’s why I’m blogging here about Rogan, in addition to my lacking enough conventional distinction and fame to score an invite to be a guest on his webcast. Plus, he openly disdains bloggers, many of whom moderate comments (I don’t) or otherwise channel discussion to control content. Oh, well.


Continuing from my previous post, Brian Phillips has an article, writing for MTV News, entitled “Shirtless Trump Saves Drowning Kitten: Facebook’s fake-news problem and the rise of the postmodern right.” (Funny title, that.) I navigated to the article via Alan Jacobs’s post at Text Patterns (on my blogroll). Let me consider each in turn.

After chuckling that Phillips is directing his analysis to the wrong audience, an admittedly elitist response on my part, I must further admit that the article is awfully well-written and nails the blithe attitude accompanying epistemological destruction carried out, perhaps unwittingly but too well-established now to ignore, by developers of social media as distinguished from traditional news media. Which would be considered more mainstream today is up for debate. Maybe Phillips has the right audience after all. He certainly gets the importance of controlling the narrative:

Confusion is an authoritarian tool; life under a strongman means not simply being lied to but being beset by contradiction and uncertainty until the line between truth and falsehood blurs and a kind of exhaustion settles over questions of fact. Politically speaking, precision is freedom. It’s telling, in that regard, that Trump supporters, the voters most furiously suspicious of journalism, also proved to be the most receptive audience for fictions that looked journalism-like. Authoritarianism doesn’t really want to convince its supporters that their fantasies are true, because truth claims are subject to verification, and thus to the possible discrediting of authority. Authoritarianism wants to convince its supporters that nothing is true, that the whole machinery of truth is an intolerable imposition on their psyches, and thus that they might as well give free rein to their fantasies.

But Phillips is too clever by half, burying the issue in scholarly style that speaks successfully only to a narrow class of academics and intellectuals, much like the language and memes employed by the alt-right are said to be dog whistles perceptible only to rabid, mouth-breathing bigots. Both charges are probably unfair reductions, though with kernels of truth. Here’s some of Phillips’ overripe language:

Often the battleground for this idea [virtue and respect] was the integrity of language itself. The conservative idea, at that time [20 years ago], was that liberalism had gone insane for political correctness and continental theory, and that the way to resist the encroachment of Derrida was through fortifying summaries of Emerson … What had really happened was that the left had become sensitized to the ways in which conventional moral language tended to shore up existing privilege and power, and had embarked on a critique of this tendency that the right interpreted, with some justification, as an attack on the very concept of meaning.

More plainly, Phillips’ suggestion is that the radical right learned the lessons of Postmodernism (PoMo) even better than did the avant-garde left, the latter having outwitted themselves by giving the right subtle tools used later to outmaneuver everyone. Like other mildly irritating analyses I have read, it’s a statement of inversion: an idea bringing into existence its antithesis that unironically proves and undermines the original, though with a dose of Schadenfreude. This was (partially) the subject of a 4-part blog I wrote called “Dissolving Reality” back in Aug. and Sept. 2015. (Maybe half a dozen read the series; almost no one commented.)

So what does Alan Jacobs add to the discussion? He exhibits his own scholarly flourishes. Indeed, I admire the writing but find myself distracted by the writerly nature, which ejects readers from the flow of ideas to contemplate the writing itself. For instance, this:

It turns out that the children of the ruling classes learned their lessons well, so when they inherited positions in their fathers’ law firms they had some extra, and very useful, weapons in their rhetorical armory.

In precisely the same way, when, somewhat later, academic leftists preached that race and gender were the determinative categories of social analysis, members of the future alt-right were slouching in the back rows of their classrooms, baseball caps pulled down over their eyes, making no external motions but in their dark little hearts twitching with fervent agreement.

Terrific capture of the classroom culture in which teachers are steeped. Drawing identity politics more manifestly into the mix is a fairly obvious extrapolation over Phillips and may reflect the results of the presidential election, where pundits, wheeling around to reinterpret results that should not have so surprised them, now suggest Republican victories are a repudiation of leftist moral instruction. The depth of Phillips’ and Jacobs’ remarks is not so typical of most pundits, however, and their follow-up analysis at some point becomes just more PoMo flagellation. Here, Jacobs is even more clearly having some fun:

No longer did we have to fear being brought before the bar of Rational Evidence, that hanging judge of the Enlightenment who had sent so many believers to the gallows! You have your constructs and we have our constructs, and who’s to say which are better, right? O brave new world that hath such a sociology of knowledge in it!

This goes back to the heart of the issue, our epistemological crisis, but I dispute that race and gender are the determinative categories of social analysis, no matter how fashionable they may be in the academy. A simpler and more obvious big picture controls: it’s about life and death. My previous post was about geopolitics, where death is rained down upon foreign peoples and justifying rhetoric is spread domestically. Motivations may be complex and varied, but the destruction of people and truth affects everyone, albeit unevenly, without regard to race, gender, religion, nationality, etc. All are caught in the dragnet.

Moreover, since the advent of Western civilization, intellectuals have been sensitive to the sociology of knowledge. It’s a foundation of philosophy. That it’s grown sclerotic long precedes PoMo theory. In fact, gradual breaking apart and dismantling of meaning is visible across all expressive genres, not just literature. In painting, it was Impressionism, Cubism, Dada and Surrealism, and Abstract Expressionism. In architecture, it was Art Deco, the International Style, Modernism, Brutalism, and Deconstructivism. In music, it was the Post-Romantic, the Second Viennese School, Modernism, Serialism, and Minimalism. In scientific paradigms, it was electromagnetism, relativity, quantum mechanics, the Nuclear Era, and semiconductors. The most essential characteristics in each case are increasingly dogmatic abstraction and drilling down to minutiae that betray meaningful essences. Factoring in economic and political perversions, we arrive at our current epistemological phase where truth and consequences matter little (though death and destruction still do) so long as deceits, projections, and distractions hold minds in thrall. In effect, gravity is turned off and historical narratives levitate until reality finally, inevitably comes crashing down in a monstrous Jenga pile, as it does periodically.

In the meantime, I suppose Phillips and Jacobs can issue more gaseous noise into the fog bank the information environment has become. They can’t get much traction (nor can I) considering how most of the affluent West thinks at the level of a TV sitcom. In addition, steps being considered to rein in the worst excesses of fake news would have corporations and traditional news media appointed as watchers and censors. Beyond any free speech objections, which are significant, expecting culprits to police themselves only awards them greater power to dominate, much like bailouts rewarded the banks. More fog, more lies, more levitation.

Back in the day, I studied jazz improvisation. Like many endeavors, it takes dedication and continuous effort to develop the ear and learn to function effectively within the constraints of the genre. Most are familiar with the simplest form: the 12-bar blues. Whether more attuned to rhythm, harmony, lyrics, or structure doesn’t much matter; all elements work together to define the blues. As a novice improviser, structure is easy to grasp and lyrics don’t factor in (I’m an instrumentalist), but harmony and rhythm, simple though they may be to understand, are formidable when one is making up a solo on the spot. That’s improvisation. In class one day, after two passes through the chord changes, the instructor asked me how I thought I had done, and I blurted out that I was just trying to fill up the time. Other students heaved a huge sigh of recognition and relief: I had put my thumb on our shared anxiety. None of us were skilled enough yet to be fluent or to actually have something to say — the latter especially the mark of a skilled improviser — but were merely trying to plug the hole when our turn came.

These days, weekends feel sorta the same way. On Friday night, the next two days often feel like a yawning chasm where I plan what I know from experience will be an improvisation, filling up the available time with shifting priorities, some combination of chores, duties, obligations, and entertainments (and unavoidable bodily functions such as eating, sleeping, etc.). Often enough I go back to work with stories to tell about enviable weekend exploits, but just as often I have a nagging feeling that I’m still a novice with nothing much to say or contribute, just filling up the time with noise. And as I contemplate what years and decades may be left to me (if the world doesn’t crack up first), the question arises: what big projects would I like to accomplish before I’m done? That, too, seems an act of improvisation.

I suspect recent retirees face these dilemmas with great urgency until they relax and decide “who cares?” What is left to do, really, before one finally checks out? If careers are completed, children are raised, and most of life’s goals are accomplished, what remains besides an indulgent second childhood of light hedonism? Or more pointedly, what about one’s final years keeps them from feeling like quiet desperation or simply waiting for the Grim Reaper? What last improvisations and flourishes are worth undertaking? I have no answers to these questions. They don’t press upon me just yet with any significance, and I suffered no midlife crisis (so far) that would spur me to address the questions head on. But I can feel them gathering in the back of my mind like a shadow — especially with the specters of American-style fascism, financial and industrial collapse, and NTE looming.

Caveat: Apologies for this overlong post, which random visitors (nearly the only kind I have besides the spambots) may find rather challenging.

The puzzle of consciousness, mind, identity, self, psyche, soul, etc. is an extraordinarily fascinating subject. We use various terms, but they all revolve around a unitary property and yet come from different approaches, methodologies, and philosophies. The term mind is probably the most generic; I tend to use consciousness interchangeably and more often. Scientific American has an entire section of its website devoted to the mind, with subsections on Behavior & Society, Cognition, Mental Health, Neurological Health, and Neuroscience. (Top-level navigation offers links to these sections: The Sciences, Mind, Health, Tech, Sustainability, Education, Video, Podcasts, Blogs, and Store.) I doubt I will explore very deeply because science favors the materialist approach, which I believe misses the forest for the trees. However, the presence of this area of inquiry right at the top of the page indicates how much attention and research the mind/consciousness is currently receiving.

A guest blog at Scientific American by Adam Bear entitled “What Neuroscience Says about Free Will” makes the fashionable argument (these days) that free will doesn’t exist. The blog/article is disclaimed: “The views expressed are those of the author(s) and are not necessarily those of Scientific American.” I find that a little weaselly. Because the subject is still wide open to interpretation and debate, Scientific American should simply offer conflicting points of view without worry. Bear’s arguments rest on the mind’s ability to revise and redate experience occurring within the frame of a few milliseconds to allow for processing time, also known as the postdictive illusion (the opposite of predictive). I wrote about this topic more than four years ago here. Yet another discussion is found here. I admit to being irritated that the questions and conclusions stem from a series of assumptions, primarily that whatever free will is must occur solely in consciousness (whatever that is) as opposed to originating in the subconscious and subsequently transferring into consciousness. Admittedly, we use these two categories — consciousness and the subconscious — to account for the rather limited amount of processing that makes it all the way into awareness vs. the significant amount that remains hidden or submerged. A secondary assumption, the broader project of neuroscience in fact, is that, like free will, consciousness is housed somewhere in the brain or its categorical functions. Thus, fruitful inquiry results from seeking its root, seed, or seat as though the narrative constructed by the mind, the stream of consciousness, were on display to an inner observer or imp in what Daniel Dennett years ago called the Cartesian Theater. That time-worn conceit is the so-called ghost in the machine. (more…)

The last time I blogged about this topic, I took an historical approach, locating the problem (roughly) in time and place. In response to recent blog entries by Dave Pollard at How to Save the World, I’ve delved into the topic again. My comments at his site are the length of most of my own blog entries (3–4 paras.), whereas Dave tends to write in chapter form. I’ve condensed to my self-imposed limit.

Like culture and history, consciousness is a moving train that yields its secrets long after it has passed. Thus, assessing our current position is largely conjectural. Still, I’ll be reckless enough to offer my intuitions for consideration. Dave has been pursuing radical nonduality, a mode of thought characterized by losing one’s sense of self and becoming selfless, which diverges markedly from ego consciousness. That mental posture, described elsewhere by nameless others as participating consciousness, is believed to be what preceded the modern mind. I commented that losing oneself in intense, consuming flow behaviors is commonplace but temporary, a familiar, even transcendent place we can only visit. Its appeals are extremely seductive, however, and many people want to be there full-time, as we once were. The problem is that ego consciousness is remarkably resilient and self-reinforcing. Despite losing oneself from time to time, we can’t be liberated from the self permanently, and pathways to even temporarily getting out of one’s own head are elusive and sometimes self-destructive.

My intuition is that we are fumbling toward just such a quieting of the mind, a new dark age if you will, or what I called self-lite in my discussion with Dave. As we stagger forth, groping blindly in the dark, the transitional phase is characterized by numerous disturbances to the psyche — a crisis of consciousness wholly different from the historical one described previously. The example uppermost in my thinking is people lost down the rabbit hole of their handheld devices and desensitized to the world beyond the screen. Another is the ruined, wasted minds of (arguably) two or more generations of students done great disservice by their parents and educational institutions at all levels, a critical mass of intellectually stunted and distracted young adults by now. Yet another is those radicalized by their close identification with one or more special interest groups, also known as identity politics. A further example is the growing prevalence of confusion surrounding sexual orientation and gender identity. In each example, the individual’s ego is confused, partially suppressed, and/or under attack. Science fiction and horror genres have plenty of instructive examples of people who are no longer fully themselves, their bodies zombified or made into hosts for another entity that takes up residence, commandeering or shunting aside the authentic, original self.

Despite having identified modern ego consciousness as a crisis and feeling no small amount of empathy for those seeking radical nonduality, I find myself in the odd position of defending the modern mind precisely because transitional forms, if I have understood them properly, are so abhorrent. Put another way, while I can see the potential value and allure of extinguishing the self even semi-permanently, I will not be an early adopter. Indeed, if the modern mind took millennia to develop as one of the primary evolutionary characteristics of Homo sapiens sapiens, it seems foolish to presume that it can be uploaded into a computer, purposely discarded by an act of will, or devolved in even a few generations. Meanwhile, though the doomer in me recognizes that ego consciousness is partly responsible for bringing us to the brink of (self-)annihilation (financial, geopolitical, ecological), individuality and intelligence are still highly prized where they can be found.

In my travels and readings upon the Intertubes, which proceed in fits and starts, I stumbled across roughly the same term — The NOW! People — used in completely different contexts and with different meanings. Worth some unpacking for idle consideration.

Meaning and Usage the First: The more philosophical of the two, this refers to those who feel anxiety, isolation, estrangement, disenfranchisement, and alienation from the world in stark recognition of the self-other problem and/or mind-body dualism. They seek to lose their identity and the time-boundedness that goes with being a separate self by entering a mental state characterized by the eternal NOW, much as animals without consciousness are believed to think. Projection forward and back more than a few moments in time is foreclosed; one simply exists NOW! Seminars and YouTube videos on radical nonduality are offered by Tony Parsons, Jim Newman, Andreas Müller, and Kenneth Madden, but according to my source (unacknowledged and unlinked), they readily admit that despite study, meditation, openness, and desire to achieve this state of mind, it is not prone to being triggered. It either happens or it doesn’t. Nonetheless, some experiences and behaviors allow individuals to transcend themselves at least to some degree, such as music, dance, and sex.

Meaning and Usage the Second: The more populist and familiar of the two, this refers to people for whom NOW! is always the proper time to do whatever the hell they most urgently desire with no consideration given to those around them. The more mundane instance is someone stopping in a doorway or on an escalator to check their phone for, oh, I dunno, Facebook updates and new e-mail. A similar example is an automobile driver over whom traffic and parking controls have no effect: someone double-parked (flashers optional) in the middle of the road or in a fire lane, someone who executes a U-turn in the middle of traffic, or someone who pointlessly jumps the line in congestion just to get a few car lengths ahead only to sit in yet more traffic. The same disregard and disrespect for others is evident in those who insist on saving seats or places in line, or on the Chicago L, those who occupy seats with bags that really belong on their laps or stand blocking the doorways (typically arms extended looking assiduously at their phones), making everyone climb past them to board or alight the train. These examples are all about someone commandeering public space as personal space at the anonymous expense of anyone else unfortunate enough to be in the same location, but examples multiply quickly beyond these. Courtesy and other social lubricants be damned! I want what I want right NOW! and you can go pound sand.

Both types of NOW! behavior dissolve the thinking, planning, orchestrating, strategizing mind in favor of narrowing thought and perception to this very moment. The first gives away willfulness and desire in favor of tranquility and contentedness, whereas the second demonstrates single-minded pursuit of a single objective without thought of consequence, especially to others. Both types of NOW! People also fit within the Transhumanist paradigm, which has among its aims leaving behind worldly concerns to float freely as information processors. If I were charitable about The NOW! People, I might say they lose possession of themselves by absorption into a timeless, mindless present; if less charitable, I might say that annihilation of the self (however temporary) transforms them into automatons.

The sole appeal I can imagine to retreating from oneself to occupy the eternal moment, once one has glimpsed, sensed, or felt the bitter loneliness of selfhood, is cessation of suffering. To cross over into selflessness is to achieve liberation from want, or in the Buddhist sense, Nirvana. Having a more Romantic aesthetic, my inclination is instead to go deeper and to seek the full flower of humanity in all its varieties. That also means recognizing, acknowledging, and embracing darker aspects of human experience, and yes, no small amount of discomfort and suffering. Our psycho-spiritual capacity demands it implicitly. But it takes strong character to go toward extremes of light and dark. The NOW! People narrow their range radically and may well be the next phase of human consciousness if I read the tea leaves correctly.

I hate to sound a conspiratorial note, and you’re free to disregard what follows, but it seems worthwhile to take further notice of the rash of violence last week.

In a commentary by John Whitehead at The Rutherford Institute, blame for what Whitehead calls “America’s killing fields” is laid at the feet of a variety of entities, including numerous elected officials and taxpayer-funded institutions. The more important quote appearing right at the top is this:

We have long since passed the stage at which a government of wolves would give rise to a nation of sheep. As I point out in my book Battlefield America: The War on the American People, what we now have is a government of psychopaths that is actively breeding a nation of psychopathic killers. [links redacted]

While this may read as unsupported hyperbole to some, I rather suspect Whitehead tells a truth hidden in plain sight — one we refuse to acknowledge because it’s so unsavory. Seeing that Whitehead gave it book-length consideration, I’m inclined to grant his contention. One can certainly argue about intent, objectives, mechanisms, and techniques. Those are open to endless interpretation. I would rather concentrate on results, which speak for themselves. The fact is that in the U.S., Western Europe, and the Middle East, a growing number of people are in effect wind-up toys being radicalized and set loose. Significantly, recent perpetrators of violence are not only the disenfranchised but also police, current and former military, politicians, and pundits whose mindsets are not directed to diplomacy but instead establish “taking out the enemy” as the primary response to conflict. The enemy is also being redefined irrationally to include groups identified by race, religion, vocation, political persuasion, etc. (always has been, in fact, though the more virulent manifestations were driven underground for a time).

Childhood wind-up toys are my chosen metaphor because they’re mindless, pointless devices that are energized, typically by tightening a spring, and released for idle entertainment to move around and bump into things harmlessly until they sputter out. Maniacal mass killers “bump into” targets selected randomly via simple proximity to some venue associated with the killer’s pet peeve, so victims are typically in the wrong place at the wrong time. Uniformed police might be the exception. One might ask who or what is doing the winding of the spring. The pull quote above says it’s a government of psychopaths breeding yet more psychopaths. That is certainly true with respect to the ruling classes — what used to be the aristocracy in older cultures but now is more nearly a kleptocracy in the U.S. — and members of a monstrous security apparatus (military, civil police, intelligence services, etc.) now that the U.S. has effectively become a garrison state. Self-reinforcing structures have hardened over time, and their members perpetuate them. I’ve even heard suspicions that citizens are being “chipped,” that is, programmed in the sense of psyops to explode into mayhem with unpredictable certainty, though for what purpose I can only imagine.

The simpler explanation that makes more sense to me is that our culture is crazy-making. We no longer function well in a hypercomplex world — especially one so overloaded with information — without losing our grounding, our grip on truth, meaning, and value, and going mad. Contemporary demands on the nervous system have outstripped biological adaptation, so we respond to constant strain and stress with varying levels of dysfunction. No doubt some folks handle their difficulties better than others; it’s the ones who snap their springs who are of grave concern these days. Again, the mechanism isn’t all that important, as the example from Nice, France, demonstrates. Rather, it’s about loss of orientation that allows someone to rationalize killing a bunch of people all at once as somehow a good idea. Sadly, there is no solution so long as our collective attention is trained on the wrong things, perpetuating a network of negative feedback loops that makes us all loopy and a few of us highly dangerous. Welcome to the asylum.

According to Jean-Paul Sartre, the act of negation (a/k/a nihilation) is a necessary step in distinguishing foreground objects from background. A plethora of definitions and formal logic ensure that his philosophical formulations are of only academic interest to us nowadays, since philosophy in general has dropped out of currency in the public sphere and below awareness or concern even among most educated adults. With that in mind, I thought perhaps I should reinforce the idea of negation in my own modest (albeit insignificant) way. Negation, resistance, and dissent have familial relations, but they are undoubtedly distinct in some ways, too. However, I have no interest in offering formal treatments of terminology and so will gloss over the point and decline to offer definitions. Lump ’em all in together, I say. However, I will make a distinction between passive and active negation, which is the point of this blog post.

Although the information environment and the corporations that provide electronic access through hardware and connectivity would have us all jacked into the so-called Information Superhighway unceasingly, and many people do just that with enormous relish, I am of a different mind. I recognize that electronic media are especially potent in commanding attention and providing distraction. Stowed away or smuggled in with most messaging is a great deal of perception and opinion shaping that is worse than just unsavory; it’s damaging. So I go beyond passively not wanting handheld (thus nonstop) access to actively wanting not to be connected. Whereas others share excitement about the latest smartphone or tablet and the speed, cost, and capacity of the service provider for the data line on their devices, I don’t merely demur but insist instead, “Keep that nonsense away from me.” I must negate those prerogatives, refuse their claims on my attention, and be undisturbed in my private thoughts while away from the computer, the Internet, and the constant flow of information aimed indiscriminately at me and everyone.

Of course, I win no converts with such refusals. When I was shopping for a new phone recently, the default assumption by the sales clerk was that I wanted bells and whistles. She simply could not conceive of my desire to have a phone that is merely a phone, and the store didn’t sell one anyway. Even worse, since all phones are now by default smartphones, I had a data block put on my account to avoid inadvertently connecting to anything that would require a data line. That just blew her mind, like I was forgoing oxygen. But I’m quite clear that any vulnerability to information either tempting me or forced on me is worth avoiding and that time away from the computer and its analogues is absolutely necessary.

Personal anecdote: I was shopping at an appliance retailer (went to look at refrigerators) recently that had an embedded Apple store. At the counter displaying three models of the iPhone 6, then the latest designations, were three kids roughly 8–11 years of age (I estimate). They were unattended by parents, who must have believed that if the kids were not causing problems, they were a-okay. The kids themselves were absolutely transfixed — enthralled, really — by the screens, standing silent and motionless (very unlike most kids) with either fierce concentration or utterly empty heads as they examined the gadgets. They were so zoomed in they took no notice at all of passersby. Parents of earlier generations used the TV as a pacifier or babysitter the same way, but these kids seemed even more hollow than typical, dull-eyed TV viewers. Only a few days later, at a concert I attended, I watched a child who apparently could not pry his eyes away from the tablet he was carrying — even as he struggled to climb the stairs to his seat. The blue glare of the screen was all over his face.

Both scenes were unspeakably sad, though I might be hard-pressed to convince anyone of that assessment had I intervened. These scenes play out again and again, demonstrating that the youngest among us are the most vulnerable and least able to judge when to turn away, to disconnect. Adults fare no better. Schools and device makers alike have succeeded in selling electronics as “educational devices,” but the reality is that instead of exploring the world around us, people get sucked into a virtual world and the glossy fictions displayed on screens. They ultimately become slaves to their own devices. I mourn for their corrupted mindscapes, distorted and ruined by parents and teachers who ought to be wiser but who themselves have been co-opted and hollowed out by mass media.