Archive for the ‘Ethics’ Category

I picked up a copy of Daniel Siegel’s book Mind: A Journey to the Heart of Being Human (2017) to read and supplement my ongoing preoccupation with human consciousness. Siegel’s writing is the source of considerable frustration. Now about 90 pp. into the book (I am considering putting it aside), I have found several grammatical errors (where are book editors these days?), and he doesn’t really know how to use a comma properly or write in recognizable paragraph form. He has a bad habit of posing questions to suggest the answers he wants to give and drops constant hints of something soon to be explored, like news broadcasts that tease the next segment. He also deploys a tired, worn metaphor that readers are on a journey of discovery with him, embarked on a path, exploring a subject, etc. Yecch. (A couple of Amazon reviews also note that grayish type on parchment (cream) paper poses a legibility problem due to poor contrast even in good light — undoubtedly not really Siegel’s fault.)

Siegel’s writing is also irritatingly circular, casting and recasting the same sentences in repetitious series of assertions that have me wondering frequently, “Haven’t I already read this?” Here are a couple of examples:

When energy flows inside your body, can you sense its movement, how it changes moment by moment?

then only three sentences later

Energy, and energy-as-information, can be felt in your mental experience as it emerges moment by moment. [p. 52]

Another example:

Seeing these many facets of mind as emergent properties of energy and information flow helps link the inner and inter aspect of mind seamlessly.

then later in the same paragraph

In other words, mind seen this way could be in what seems like two places at once as inner and inter are part of one interconnected, undivided system. [p. 53]

This is definitely a bug, not a feature. I suspect the book could easily be condensed from 330 pp. to less than 200 pp. if the writing weren’t so self-indulgent. Indeed, while I recognize that a healthy dose of repetition is an integral part of narrative form (especially in music), Siegel’s relentless repetition feels like propaganda 101, where guileless insistence (on lies or merely the preferred story one seeks to plant in the public sphere) wears down the reader rather than convinces him or her. This is also marketing 101 (e.g., Coca-Cola, McDonald’s, Budweiser, etc. continuing to advertise what are by now exceedingly well-established brands).


So we’re back at it: bombing places halfway around the world for the indignity of being at war and fighting it the wrong way. While a legitimate argument exists regarding a human rights violation requiring a response, that is not AFAIK the principal concern or interpretation of events. Rather, it’s about 45 being “presidential” for having ordered missile strikes. It must have been irresistible, with all the flashy metaphorical buttons demanding to be pushed at the first opportunity. I’m disappointed that his pacifist rhetoric prior to the election was merely oppositional, seeking only to score points against Obama. Although I haven’t absorbed a great deal of the media coverage, what I’ve seen squarely refuses to let a crisis go to waste. Indeed, as geopolitics and military escapades go, we’re like moths to the flame. The most reprehensible media response was MSNBC anchor Brian Williams waxing rhapsodic about the beauty of the missiles as they lit up the air. How many screw-ups does this guy get?

Lessons learned during the 20th century that warfare is not just a messy, unfortunate affair but downright ugly, destructive, pointless, and self-defeating are unjustifiably forgotten. I guess it can’t be helped: it’s nympho-warmaking. We can’t stop ourselves; gotta have it. Consequences be damned. How many screw-ups do we get?

At least Keith Olbermann, the current king of righteous media indignation, had the good sense to put things in their proper context and condemn our actions (as I do). He also characterized the military strike as a stunt, which calls into question whether the provocation was a false flag operation. That’s what Putin is reported as saying. Personally, I cannot take a position on the matter, being at the mercy of the media and unable to gather any first-hand information. Doubts and disillusionment over what’s transpired and the endless spin cycle plague me. There will never be closure.

This past Thursday was an occasion of protest for many immigrant laborers who did not show up to work. Presumably, this action was a response to recent executive attacks on immigrants and was intended to demonstrate how businesses would suffer without immigrant labor doing jobs Americans frequently do not want. Tensions between the ownership and laboring classes have a long, tawdry history I cannot begin to summarize. As with other contextual failures, I daresay the general public believes incorrectly that such conflicts date from the 19th century, when formal sociopolitical theories like Marxism, which intersect heavily with labor economics, were published. An only slightly better understanding is that the labor movement commenced in the United Kingdom some fifty years after the Industrial Revolution began, with the Luddites, for example. I pause to remind that the most basic, enduring, and abhorrent labor relationship, extending back millennia, is slavery, which ended in the U.S. only 152 years ago but continues even today in slightly revised forms around the globe.

Thursday’s work stoppage was a faint echo of general strikes and unionism from the middle of the 20th century. Gains in wages and benefits, working conditions, and negotiating position transferred some power from owners to laborers during that period, but today, laborers must sense they are back on their heels, defending conditions fought for by their grandparents but ultimately losing considerable ground. Of course, I’m sympathetic to labor, considering I’m not in the ownership class. (It’s all about perspective.) I must also admit, however, to once quitting, after only one day, a job that was simply too, well, laborious. I had that option at the time, though it ultimately led me nearly to bankruptcy — a life lesson that continues to inform my attitudes. As I survey the scene today, however, I suspect many laborers — immigrants and native-born Americans alike — have the unenviable choice of accepting difficult, strenuous labor for low pay or being unemployed. Gradual reduction of demand for labor has two main causes: globalization and automation.


Anthropologists, pundits, armchair cultural critics (like me), and others sometimes offer an aspect or characteristic, usually singular, that separates the human species from other animals. (Note: humans are animals, not the crowning creation of god in his own image, the dogma of major religions.) Typical singular aspects include tool use (very early on, fire), language, agriculture, self-awareness (consciousness), and intelligence, the last including especially the ability to conceptualize time and thus remember and plan ahead. The most interesting candidate suggested to me is our ability to kill from a distance. Without going into a list of things we don’t think we share with other species but surprisingly do, it interests me that no other species possesses the ability to kill at a distance (someone will undoubtedly prove me wrong on this).

Two phrases spring to mind: nature is red in tooth and claw (Tennyson) and human life is nasty, brutish, and short (Hobbes). Both encapsulate what it means to have to kill to eat, which is hardly unique to animals. All sorts of plants, insects, and microorganisms embed themselves in hosts, sometimes killing the host and themselves. Symbiotic relationships also exist. The instance that interests me, though, is the act of killing in the animal kingdom that requires putting one’s own body at risk in life-or-death attack. Examples falling short of killing abound, such as intimidation to establish hierarchy, but to eat, an animal must kill its prey.

Having watched my share of historical fiction (pre-1800, say, but especially sword-and-sandal and medieval epics) on the TeeVee and at the cinema, I find that the dramatic appeal of warring armies slamming into each other never seems to get old. Fighting is hand-to-hand or sword-to-sword, which amount to the same thing. Archers’ arrows, projectiles launched from catapults and trebuchets, thrown knives, spears, and axes, and boiling oil poured over parapets are killing from a relatively short distance, but the action eventually ends up being very close. The warrior code in fighting cultures honors the willingness to put oneself in harm’s way, to risk one’s own body. Leaders often exhibit mutual respect and may even share some intimacy. War may not be directly about eating, since humans are not cannibals under most circumstances; rather, it’s usually about control of resources, so secondarily about eating by amassing power. Those historical dramas often depict victors celebrating by enjoying lavish feasts.

Modern examples of warfare and killing from a distance make raining down death from above a bureaucratic action undertaken with little or no personal risk. Artillery, carpet bombing from 20,000 feet, drone strikes (controlled from the comfort of some computer lab in the Utah desert), and nuclear bombs are the obvious examples. No honorable warrior code attaches to such killing. Indeed, the chain of command separates the execution of kill orders from moral responsibility — probably a necessary disconnect when large numbers of casualties (collateral damage, if one prefers the euphemism) can be expected. Only war criminals, either high on killing or banally impervious to empathy and compassion, would dispatch hundreds of thousands at a time.

If killing from a distance is in most cases about proximity or lack thereof, one further example is worth mentioning: killing across time. While most don’t really conceptualize the space-time continuum as interconnected, the prospect of choices made today manifesting in megadeath in the foreseeable future is precisely the sort of bureaucratized killing from a distance that should be recognized and forestalled. Yet despite our supposed intellectual superiority over other species, we cannot avoid waging war, real and rhetorical, to control resources and narratives that enable us to eat. Eating the future would be akin to consuming seed corn, but that metaphor is not apt. Better perhaps to say that we’re killing the host. We’re embedded in the world, as indeed is everything we know to be alive, and rely upon the profundity of the biosphere for survival. Although the frequent charge is that humanity is a parasite or has become a cancer on the world, that tired assessment, while more accurate than not, is a little on the nose. A more charitable view is that humanity, as the apex predator, has expanded its habitat to include the entire biosphere, killing to eat, and is slowly consuming and transforming it into a place uninhabitable by us, just as a yeast culture consumes its medium and grows to fill the space before dying all at once. So the irony or Pyrrhic victory is that while we may fatten ourselves (well, some of us) in the short term, we have also created conditions leading to our own doom. Compared to other species whose time on Earth lasted tens of millions of years, human life on Earth turns out to be exactly what Hobbes said: nasty, brutish, and short.

Once in a while, a comment sticks with me and requires additional response, typically in the form of a new post. This is one of those comments. I wasn’t glib in my initial reply, but I thought it was inadequate. When looking for something more specific about Neil Postman, I found Janet Sternberg’s presentation called Neil Postman’s Advice on How to Live the Rest of Your Life (link to PDF). The 22 recommendations that form Postman’s final lecture given to his students read like aphorisms and the supporting paragraphs are largely comical, but they nonetheless suggest ways of coping with the post-truth world. Postman developed this list before Stephen Colbert had coined the term truthiness. I am listing only the recommendations and withholding additional comment, though there is plenty to reinforce or dispute. See what you think.

  1. Do not go to live in California.
  2. Do not watch TV news shows or read any tabloid newspapers.
  3. Do not read any books by people who think of themselves as “futurists,”
    such as Alvin Toffler.
  4. Do not become a jogger. If you are one, stop immediately.
  5. If you are married, stay married.
  6. If you are a man, get married as soon as possible. If you are a woman,
    you need not be in a hurry.
  7. Establish as many regular routines as possible.
  8. Avoid multiple and simultaneous changes in your personal life.
  9. Remember: It is more likely than not that as you get older you will get
    dumber.
  10. Keep your opinions to a minimum.
  11. Carefully limit the information input you will allow.
  12. Seek significance in your work, friends, and family, where potency and
    output are still possible.
  13. Read’s Law: Do not trust any group larger than a squad, that is, about
    a dozen.
  14. With exceptions to be noted further ahead, avoid whenever possible
    reading anything written after 1900.
  15. Confine yourself, wherever possible, to music written prior to 1850.
  16. Weingartner’s Law: 95% of everything is nonsense.
  17. Truman’s Law: Under no circumstances ever vote for a Republican.
  18. Take religion more seriously than you have.
  19. Divest yourself of your belief in the magical powers of numbers.
  20. Once a year, read a book by authors like George Orwell, E.B. White, or
    Bertrand Russell.
  21. Santha Rama Rau’s Law: Patriotism is a squalid emotion.
  22. Josephson’s Law: New is rotten.

Continuing from my previous post, Brian Phillips, writing for MTV News, has an article entitled “Shirtless Trump Saves Drowning Kitten: Facebook’s fake-news problem and the rise of the postmodern right.” (Funny title, that.) I navigated to the article via Alan Jacobs’s post at Text Patterns (on my blogroll). Let me consider each in turn.

After chuckling that Phillips is directing his analysis to the wrong audience, an admittedly elitist response on my part, I must further admit that the article is awfully well-written and nails the blithe attitude accompanying the epistemological destruction carried out, perhaps unwittingly but by now too well established to ignore, by developers of social media as distinguished from traditional news media. Which of the two would be considered more mainstream today is up for debate. Maybe Phillips has the right audience after all. He certainly gets the importance of controlling the narrative:

Confusion is an authoritarian tool; life under a strongman means not simply being lied to but being beset by contradiction and uncertainty until the line between truth and falsehood blurs and a kind of exhaustion settles over questions of fact. Politically speaking, precision is freedom. It’s telling, in that regard, that Trump supporters, the voters most furiously suspicious of journalism, also proved to be the most receptive audience for fictions that looked journalism-like. Authoritarianism doesn’t really want to convince its supporters that their fantasies are true, because truth claims are subject to verification, and thus to the possible discrediting of authority. Authoritarianism wants to convince its supporters that nothing is true, that the whole machinery of truth is an intolerable imposition on their psyches, and thus that they might as well give free rein to their fantasies.

But Phillips is too clever by half, burying the issue in scholarly style that speaks successfully only to a narrow class of academics and intellectuals, much like the language and memes employed by the alt-right are said to be dog whistles perceptible only to rabid, mouth-breathing bigots. Both charges are probably unfair reductions, though with kernels of truth. Here’s some of Phillips’ overripe language:

Often the battleground for this idea [virtue and respect] was the integrity of language itself. The conservative idea, at that time [20 years ago], was that liberalism had gone insane for political correctness and continental theory, and that the way to resist the encroachment of Derrida was through fortifying summaries of Emerson … What had really happened was that the left had become sensitized to the ways in which conventional moral language tended to shore up existing privilege and power, and had embarked on a critique of this tendency that the right interpreted, with some justification, as an attack on the very concept of meaning.

More plainly, Phillips’ suggestion is that the radical right learned the lessons of Postmodernism (PoMo) even better than did the avant-garde left, the latter having outwitted themselves by giving the right subtle tools used later to outmaneuver everyone. Like other mildly irritating analyses I have read, it’s a statement of inversion: an idea bringing into existence its antithesis, which unironically both proves and undermines the original, though with a dose of Schadenfreude. This was (partially) the subject of a 4-part blog series I wrote called “Dissolving Reality” back in Aug. and Sept. 2015. (Maybe half a dozen read the series; almost no one commented.)

So what does Alan Jacobs add to the discussion? He exhibits his own scholarly flourishes. Indeed, I admire the writing but find myself distracted by the writerly nature, which ejects readers from the flow of ideas to contemplate the writing itself. For instance, this:

It turns out that the children of the ruling classes learned their lessons well, so when they inherited positions in their fathers’ law firms they had some extra, and very useful, weapons in their rhetorical armory.

In precisely the same way, when, somewhat later, academic leftists preached that race and gender were the determinative categories of social analysis, members of the future alt-right were slouching in the back rows of their classrooms, baseball caps pulled down over their eyes, making no external motions but in their dark little hearts twitching with fervent agreement.

Terrific capture of the classroom culture in which teachers are steeped. Drawing identity politics more manifestly into the mix is a fairly obvious extrapolation over Phillips and may reflect the results of the presidential election, where pundits, wheeling around to reinterpret results that should not have so surprised them, now suggest Republican victories are a repudiation of leftist moral instruction. The depth of Phillips’ and Jacobs’ remarks is not so typical of most pundits, however, and their follow-up analysis at some point becomes just more PoMo flagellation. Here, Jacobs is even more clearly having some fun:

No longer did we have to fear being brought before the bar of Rational Evidence, that hanging judge of the Enlightenment who had sent so many believers to the gallows! You have your constructs and we have our constructs, and who’s to say which are better, right? O brave new world that hath such a sociology of knowledge in it!

This goes back to the heart of the issue, our epistemological crisis, but I dispute that race and gender are the determinative categories of social analysis, no matter how fashionable they may be in the academy. A simpler and more obvious big picture controls: it’s about life and death. My previous post was about geopolitics, where death is rained down upon foreign peoples and justifying rhetoric is spread domestically. Motivations may be complex and varied, but the destruction of people and truth affects everyone, albeit unevenly, without regard to race, gender, religion, nationality, etc. All are caught in the dragnet.

Moreover, since the advent of Western civilization, intellectuals have been sensitive to the sociology of knowledge. It’s a foundation of philosophy. That it’s grown sclerotic long precedes PoMo theory. In fact, gradual breaking apart and dismantling of meaning is visible across all expressive genres, not just literature. In painting, it was Impressionism, Cubism, Dada and Surrealism, and Abstract Expressionism. In architecture, it was Art Deco, the International Style, Modernism, Brutalism, and Deconstructivism. In music, it was the Post-Romantic, the Second Viennese School, Modernism, Serialism, and Minimalism. In scientific paradigms, it was electromagnetism, relativity, quantum mechanics, the Nuclear Era, and semiconductors. The most essential characteristics in each case are increasingly dogmatic abstraction and drilling down to minutiae that betray meaningful essences. Factoring in economic and political perversions, we arrive at our current epistemological phase, where truth and consequences matter little (though death and destruction still do) so long as deceits, projections, and distractions hold minds in thrall. In effect, gravity is turned off and historical narratives levitate until reality finally, inevitably comes crashing down in a monstrous Jenga pile, as it does periodically.

In the meantime, I suppose Phillips and Jacobs can issue more gaseous noise into the fog bank the information environment has become. They can’t get much traction (nor can I) considering how most of the affluent West thinks at the level of a TV sitcom. In addition, steps being considered to rein in the worst excesses of fake news would have corporations and traditional news media appointed as watchers and censors. Beyond any free speech objections, which are significant, expecting culprits to police themselves only awards them greater power to dominate, much like bailouts rewarded the banks. More fog, more lies, more levitation.

The Internets/webs/tubes have been awfully active spinning out theories and conspiracies with respect to Democratic presidential nominee Hillary Clinton (are those modifiers even necessary?) and the shoe ready to drop if and when Julian Assange releases information in his possession reputed to spell the end of her candidacy and political career. Assange has been unaccountably coy: either he has the goods or he doesn’t. There’s no reason to tease and hype. Hillary has been the subject of intense scrutiny for 25+ years. With so much smoke billowing in her wake, one might conclude burning embers must exist. But our current political culture demonstrates that one can get away with unthinkably heinous improprieties, evasions, and crimes so long as one trudges steadfastly through all the muck. Some even make a virtue out of intransigence. Go figure.

If I were charitable, I would say that Hillary has been unfairly maligned and that her 2010 remark “Can’t we just drone this guy?” is either a fabrication or taken out of context. Maybe it was a throwaway joke, uttered in a closed meeting and forgotten by all except someone who believed it might be useful later. Who can ever know? But I’m not so charitable. No one in a position of authority can afford to be flip about targeting political irritants. Hillary impresses as someone who, underneath all the noise, would not lose any sleep over droning her detractors.

There is scarcely anything on the political landscape as divisive as when someone blows the whistle on illicit government actions and programs. For instance, some are absolutely convinced that Edward Snowden is a traitor and ought to receive a death sentence (presumably after a trial, but not necessarily). Others understand his disclosures as the act of a patriot of the highest order, motivated not by self-interest but by love of country and the sincere belief in the public’s right to know. The middle ground between these extremes is a veritable wasteland — one I happen to occupy. Julian Assange is similarly divisive, and like Snowden, he appears to believe that the truth will eventually come out and indeed must. What I can’t quite reconcile is the need for secrecy and the willingness of the general public to accept leaders who habitually operate behind such veils. Talk of transparency is usually just subterfuge. If we’re truly the good guys and our ideals are superior to those of our detractors, why not simply trust in those strengths?

rant on/

Monastic pursuit of a singular objective, away from the maddening and distracting rush of modern life, is a character attribute that receives more than its rightful share of attention. In its salutary forms, monastic pursuit is understood as admirable, visionary, iconic (or iconoclastic), and heroic. In creative endeavors, seclusion and disengagement from feedback are preconditions for finding one’s true voice and achieving one’s vision. In sports, the image of the athlete devoted to training for the big event — race, match, tournament — to the exclusion of all else is by now a tired trope. Indeed, in this Olympics season, athlete profiles — puff pieces of extraordinary predictability — typically depict competitors in isolation, absolutely no one else at the gym, in the pool, on the track, etc., as though everyone goes it alone without the support or presence of coaches or teammates. Over-specialization and -achievement are such that spectators are conditioned to expect successful individuals, champions, to bleed (quite literally) as a mark of devotion to their respective fields.

At some point, however, monastic pursuit morphs into something more recognizably maniacal. The author retreating to his cabin in the woods to write the great American novel becomes the revolutionary hermit composing his political manifesto. Healthy competition among rivals turns into decidedly unsportsmanlike conduct. (Lance Armstrong is the poster boy not just for doping but also for the sociopathy he displayed in mistreating teammates and perpetuating the lie as vehemently and as long as he did. Further examples compound quickly in sports.) Business leaders, discontented with (sometimes obscene) profitability, target others in their market sector with the intent of driving them out of business and establishing monopolies. (This contrasts markedly with the ideology of self-correcting markets many CEOs falsely espouse.) In politics, high-minded campaigns and elected politicians formed around sound policy and good governance lose out to such dirty tricks as character assassination, rigged and stolen elections, partisanship, and reflexive obstructionism of projects that enjoy popular support. In journalism, fair and balanced reporting inverts to constant harping on preferred talking points to control narratives through sheer force of repetition. You get the idea.

It’s difficult to say from where this intemperate impulse arises, but we’re undoubtedly in a phase of history where nearly every field of endeavor manifests its own version of the arms race. Some might argue that in a cost-benefit analysis, we’re all better off because we enjoy fruits not obtainable without (some folks at least) taking a scorched-earth approach, raising the bar, and driving everyone to greater heights. The willingness of some to distort and disgrace themselves hideously may be a high price to pay, especially when it’s for simple entertainment, but so long as we aren’t paying the price personally, we’re willing spectators to whatever glory and train wrecks occur. I would argue that, ultimately, we’re all paying the price. Routine competition and conflict resolution have grown so unhinged that, just to be in the game, competitors must be prepared to go all in (poker lingo) at even modest provocation. As a result, for just one example, the spirit of America’s erstwhile pastime (baseball) has been so corrupted that balanced players and fans (!) stay away and are replaced by goons. A true level playing field probably never existed. Now, however, whoever can muster the most force (financial, rhetorical, criminal) wins the trophy, and we’re each in turn encouraged to risk all in our own monastic pursuit.

rant off/

In my travels and readings upon the Intertubes, which proceed in fits and starts, I stumbled across roughly the same term — The NOW! People — used in completely different contexts and with different meanings. Worth some unpacking for idle consideration.

Meaning and Usage the First: The more philosophical of the two, this refers to those who feel anxiety, isolation, estrangement, disenfranchisement, and alienation from the world in stark recognition of the self-other problem and/or mind-body dualism. They seek to lose their identity and the time-boundedness that goes with being a separate self by entering a mental state characterized by the eternal NOW, much as animals without consciousness are believed to think. Projection forward and back more than a few moments in time is foreclosed; one simply exists NOW! Seminars and YouTube videos on radical nonduality are offered by Tony Parsons, Jim Newman, Andreas Müller, and Kenneth Madden, but according to my source (unacknowledged and unlinked), they readily admit that despite study, meditation, openness, and desire to achieve this state of mind, it is not prone to being triggered. It either happens or it doesn’t. Nonetheless, some experiences and behaviors allow individuals to transcend themselves at least to some degree, such as music, dance, and sex.

Meaning and Usage the Second: The more populist and familiar of the two, this refers to people for whom NOW! is always the proper time to do whatever the hell they most urgently desire with no consideration given to those around them. The more mundane instance is someone stopping in a doorway or on an escalator to check their phone for, oh, I dunno, Facebook updates and new e-mail. A similar example is an automobile driver over whom traffic and parking controls have no effect: someone double-parked (flashers optional) in the middle of the road or in a fire lane, someone who executes a U-turn in the middle of traffic, or someone who pointlessly jumps the line in congestion just to get a few car lengths ahead only to sit in yet more traffic. The same disregard and disrespect for others is evident in those who insist on saving seats or places in line, or on the Chicago L, those who occupy seats with bags that really belong on their laps or stand blocking the doorways (typically arms extended, looking assiduously at their phones), making everyone climb past them to board or leave the train. These examples are all about someone commandeering public space as personal space at the anonymous expense of anyone else unfortunate enough to be in the same location, but examples multiply quickly beyond these. Courtesy and other social lubricants be damned! I want what I want right NOW! and you can go pound sand.

Both types of NOW! behavior dissolve the thinking, planning, orchestrating, strategizing mind in favor of narrowing thought and perception to this very moment. The first gives up willfulness and desire in favor of tranquility and contentedness, whereas the second demonstrates single-minded pursuit of an objective without thought of consequence, especially to others. Both types of NOW! People also fit within the Transhumanist paradigm, which has among its aims leaving behind worldly concerns to float freely as information processors. If I were charitable about The NOW! People, I might say they lose possession of themselves by absorption into a timeless, mindless present; if less charitable, I might say that annihilation of the self (however temporary) transforms them into automatons.

The sole appeal I can imagine to retreating from oneself to occupy the eternal moment, once one has glimpsed, sensed, or felt the bitter loneliness of selfhood, is cessation of suffering. To cross over into selflessness is to achieve liberation from want, or in the Buddhist sense, Nirvana. Having a more Romantic aesthetic, my inclination is instead to go deeper and to seek the full flower of humanity in all its varieties. That also means recognizing, acknowledging, and embracing darker aspects of human experience, and yes, no small amount of discomfort and suffering. Our psycho-spiritual capacity demands it implicitly. But it takes strong character to go toward extremes of light and dark. The NOW! People narrow their range radically and may well be the next phase of human consciousness if I read the tea leaves correctly.

Sorry, there seems to be no end to the ink spilled over the presumptive winner of the Republican presidential nomination, Donald Trump. Everyone has a pet theory, and I’m no different. Actually, I have several competing theories, none of which excludes the others. My theory du jour is basically that Trump represents the schoolyard bully, though his sandbox is quite a lot bigger than those in grade school. His campaign came right out of the gate intimidating and bullying others in the most egregious way, so it was easy to believe for a long while that he would either undo himself or a bigger bully would come along to knock him down. Well, neither happened.

What seems to be more typical instead is that, in addition to indulgence in gladiatorial games and blood sport (i.e., the debates) that never lost their base appeal to the masses, a surprising number of supporters at all levels have fallen in behind the uberbully, happy to stand in his shadow lest his roving eye land upon them. So there are equal parts glee at witnessing others get bullied and relief that at least it’s not oneself on the receiving end. Before all is said and done, which could be years, I rather expect lots of people to seek refuge in Trump’s shadow, however temporary. The blood lust probably won’t wear off anytime soon, either. That’s who we’ve become, if indeed we were ever any other sort of people (which is arguable).

As an armchair social critic with neither audience nor influence, I can only wring my hands and offer a few pithy remarks. They amount to nothing. Likely, I’ll get sand kicked in my face (or worse), too, since I lack immunity. Further, I am not so willing to line up behind someone to save myself. I’ve had that experience before, though in small measure and less manifestly, and it was troubling to recognize in myself a failure of character. The troubling times coming will surely test all of us sorely. I can only hope that, when forced to decide, I demonstrate higher integrity than I have in the past. Others will make their own choices.