This is a continuation from part 1.

A long, tortured argument could be offered that we (in the U.S.) are governed by a narrow class of plutocrats (both now and at the founding) who not-so-secretly distrust the people and the practice of direct democracy, employing instead mechanisms found in the U.S. Constitution (such as the electoral college) to transfer power away from the people to so-called experts. I won’t indulge in a history lesson or other analysis, but it should be clear to anyone who bothers to look that typical holders of elected office (and their appointees) more nearly resemble yesteryear’s landed gentry than the proletariat. Rule by elites is thus quite familiar to us despite plenty of lofty language celebrating the common man and stories repeated ad nauseam of a few exceptional individuals (exceptional being the important modifier here) who managed to bootstrap their way into the elite from modest circumstances.

Part 1 started with deGrasse Tyson’s recommendation that experts/elites should pitch ideas at the public’s level and ended with my contention that some have lost their public by adopting style or content that fails to connect. In the field of politics, I’ve never quite understood the obsession with how things present to the public (optics) on the one hand and obvious disregard for true consent of the governed on the other. For instance, some might recall pretty serious public opposition before the fact to invasion of Afghanistan and Iraq in response to the 9/11 attacks. The Bush Administration’s propaganda campaign succeeded in buffaloing a fair percentage of the public, many of whom still believe the rank lie that Saddam Hussein had WMDs and represented enough of an existential threat to the U.S. to justify preemptive invasion. Without indulging in conspiratorial conjecture about the true motivations for invasion, the last decade plus has proven that opposition pretty well founded, though it went unheeded.


See this exchange where Neil deGrasse Tyson chides Sam Harris for failing to speak to his audience in terms it understands:

The upshot is that lay audiences simply don’t subscribe to or possess the logical, rational, abstract style of discourse favored by Harris. Thus, Harris stands accused of talking past his audience — at least somewhat — especially if his audience is understood to be the general public rather than other well-educated professionals. Subject matter is less important than style but revolves around politics, and worse, identity politics. Everyone has abundant opinions about those, whether informed by rational analysis or merely fed by emotion and personal resonance.

The lesson deGrasse Tyson delivers is both instructive and accurate yet also demands that the level of discourse be lowered to a common denominator (like the reputed 9th-grade speech adopted by the evening news) that regrettably forestalls useful discussion. For his part (briefly, at the end), Harris takes the lesson and does not resort to academic elitism, which would be obvious and easy. Kudos to both, I guess, though I struggle (being somewhat an elitist); the style-over-substance argument really goes against the grain for me. Enhancements to style obviously work, and great communicators use them and are convincing as a result. (I distinctly recall Al Gore looking too much like a rock star in An Inconvenient Truth. Maybe it backfired. I tend to think that style could not overcome other blocks to substance on that particular issue.) Slick style also allows those with nefarious agendas to hoodwink the public into believing nonsense.


The Internets/webs/tubes have been awfully active spinning out theories and conspiracies with respect to Democratic presidential nominee Hillary Clinton (are those modifiers even necessary?) and the shoe ready to drop if and when Julian Assange releases information in his possession reputed to spell the end of her candidacy and political career. Assange has been unaccountably coy: either he has the goods or he doesn’t. There’s no reason to tease and hype. Hillary has been the subject of intense scrutiny for 25+ years. With so much smoke billowing in her wake, one might conclude burning embers must exist. But our current political culture demonstrates that one can get away with unthinkably heinous improprieties, evasions, and crimes so long as one trudges steadfastly through all the muck. Some even make a virtue out of intransigence. Go figure.

If I were charitable, I would say that Hillary has been unfairly maligned and that her 2010 remark “Can’t we just drone this guy?” is either a fabrication or taken out of context. Maybe it was a throwaway joke, uttered in a closed meeting and forgotten except for someone who believed it might be useful later. Who can ever know? But I’m not so charitable. No one in a position of authority can afford to be flip about targeting political irritants. Hillary impresses as someone who, underneath all the noise, would not lose any sleep over droning her detractors.

There is scarcely anything on the political landscape as divisive as when someone blows the whistle on illicit government actions and programs. For instance, some are absolutely convinced that Edward Snowden is a traitor and ought to receive a death sentence (presumably after a trial, but not necessarily). Others understand his disclosures as the act of a patriot of the highest order, motivated not by self-interest but by love of country and the sincere belief in the public’s right to know. The middle ground between these extremes is a veritable wasteland — one I happen to occupy. Julian Assange is similarly divisive, and like Snowden, he appears to believe that the truth will eventually come out and indeed must. What I can’t quite reconcile is the need for secrecy and the willingness of the general public to accept leaders who habitually operate behind such veils. Talk of transparency is usually just subterfuge. If we’re truly the good guys and our ideals are superior to those of our detractors, why not simply trust in those strengths?

Caveat: Apologies for this overlong post, which random visitors (nearly the only kind I have besides the spambots) may find rather challenging.

The puzzle of consciousness, mind, identity, self, psyche, soul, etc. is an extraordinarily fascinating subject. We use various terms, but they all revolve around a unitary property and yet come from different approaches, methodologies, and philosophies. The term mind is probably the most generic; I tend to use consciousness interchangeably and more often. Scientific American has an entire section of its website devoted to the mind, with subsections on Behavior & Society, Cognition, Mental Health, Neurological Health, and Neuroscience. (Top-level navigation offers links to these sections: The Sciences, Mind, Health, Tech, Sustainability, Education, Video, Podcasts, Blogs, and Store.) I doubt I will explore very deeply because science favors the materialist approach, which I believe misses the forest for the trees. However, the presence of this area of inquiry right at the top of the page indicates how much attention and research the mind/consciousness is currently receiving.

A guest blog at Scientific American by Adam Bear entitled “What Neuroscience Says about Free Will” makes the fashionable argument (these days) that free will doesn’t exist. The blog/article is disclaimed: “The views expressed are those of the author(s) and are not necessarily those of Scientific American.” I find that a little weaselly. Because the subject is still wide open to interpretation and debate, Scientific American should simply offer conflicting points of view without worry. Bear’s arguments rest on the mind’s ability to revise and redate experience occurring within the frame of a few milliseconds to allow for processing time, also known as the postdictive illusion (the opposite of predictive). I wrote about this topic more than four years ago here. Yet another discussion is found here. I admit to being irritated that the questions and conclusions stem from a series of assumptions, primarily that whatever free will is must occur solely in consciousness (whatever that is) as opposed to originating in the subconscious and subsequently transferring into consciousness. Admittedly, we use these two categories — consciousness and the subconscious — to account for the rather limited amount of processing that makes it all the way into awareness vs. the significant amount that remains hidden or submerged. A secondary assumption, the broader project of neuroscience in fact, is that, like free will, consciousness is housed somewhere in the brain or its categorical functions. Thus, fruitful inquiry results from seeking its root, seed, or seat as though the narrative constructed by the mind, the stream of consciousness, were on display to an inner observer or imp in what Daniel Dennett years ago called the Cartesian Theater. That time-worn conceit is the so-called ghost in the machine.

/rant on

With a new round of presidential debates upon us (not really debates if one understands the nature of debate or indeed moderation — James Howard Kunstler called it “the gruesome spectacle of the so-called debate between Trump and Clinton in an election campaign beneath the dignity of a third-world shit-hole”), it’s worthwhile to keep in the front of one’s mind that the current style of public discourse does not aim to provide useful or actionable information with regard to either the candidates or the issues. Rather, the idea is to pummel the hapless listener, watcher, or reader into a quivering jangle of confusion by maintaining a nonstop onslaught of soundbites, assertions, accusations, grandstanding, and false narratives. Our information environment abets this style of machine-gun discourse, with innumerable feeds from InstaGoogTwitFaceTube (et cetera), all vying simultaneously for our limited attention and thereby guaranteeing that virtually nothing makes a strong impression before the next bit of BS displaces it in a rapid succession of predigested morsels having no nutritional content or value for earnest consumers of information (as opposed to mouth-breathers seeking emotional salve for their worst biases and bigotry). Many feeds are frankly indecipherable, such as when the message is brutally truncated and possessed of acronyms and hashtags, the screen is cluttered with multiple text scrolls, or panel participants talk over each other to claim more screen time (or merely raise their asshole quotient by being the most obnoxious). But no matter so long as the double barrels keep firing.

I caught Republican nominee Donald Trump’s campaign manager Kellyanne Conway being interviewed by some banal featherweight pulling punches (sorry, no link, but she’s eminently searchable). Conway proved adept at deflecting obvious contradictions and reversals (and worse) of the Trump campaign by launching so many ideological bombs that nothing the interviewer raised actually landed. Questions and conflicts just floated away, unaddressed and unanswered. Her bizarre, hyperverbal incoherence is similar to the candidate’s stammering word salad, and ironically, both give new meaning to the decades-old term “Teflon” when applied to politics. Nothing sticks because piling on more and more complete wrongness and cognitive dissonance overwhelms and bewilders anyone trying to track the discussion. Trump and Conway are hardly alone in this, of course, though their mastery is notable (but not admirable). Talking heads gathered in panel discussions on, say, The View or Real Time with Bill Maher, just about any klatch occupying news and morning-show couches, and hosts of satirical news shows (some mentioned here) exhibit the same behavior: a constant barrage of high-speed inanity (and jokes, omigod the jokes!) that discourages consideration of an idea before driving pell-mell onto the next.

Thoughtful persons might pause to wonder whether breathless, even virtuoso delivery results from or creates our abysmally short attention spans and lack of serious discussion of problems plaguing the nation. Well, why can’t it be both? All modern media is now fast media, delivering hit-and-run spectacle to overloaded nervous systems long habituated to being goosed every few moments. (Or as quoted years ago, “the average Hollywood movie has become indistinguishable from a panic attack.”) Our nervous systems can’t handle it, obviously. We have become insatiable information addicts seeking not just the next fix but a perpetual fix, yet the impatient demand for immediate gratification — Internet always at our fingertips — is never quelled. Some new bit will be added to the torrent of foolishness sooner than it can be pulled down. And so we stumble like zombies, blindly and willingly, into a surreality of our own making, heads down and faces blue from the glare of the phone/tablet/computer. Of course, the shitshow is brightly festooned with buffoon candidates holding court over masses whom neither intends to serve faithfully in office. Their special brand of insanity is repeated again and again throughout the ranks of media denizens (celebrity is a curse, much like obscene wealth, or didn’t you know that?) and is seeping into the groundwater to poison all of us.

/rant off

In what has become a predictable status quo, President Obama recently renewed our official state of emergency with respect to the so-called War on Terror. It’s far too late to declare a new normal; we’ve been in this holding pattern for 15 years now. The article linked above provides this useful context:

There are now 32 states of national emergency pending in the United States, with the oldest being a 1979 emergency declared by President Jimmy Carter to impose sanctions during the Iran hostage crisis. Most are used to impose economic sanctions — mostly as a formality, because Congress requires it under the International Emergency Economic Powers Act.

In his term in office, Obama has declared 13 new emergencies, continued 21 declared by his predecessors and revoked just two, which imposed sanctions on Liberia and Russia.

Pro forma renewal of multiple states of national emergency is comparable to the 55-year-old U.S. embargo against Cuba, due for reauthorization next month, though indications are that the embargo may finally be relaxed or deauthorized. Both are examples of miserably failed policy, but they confer a semblance of power on the executive branch. Everyone knows by now that no one relinquishes power willingly, so Obama, like chief executives before him, keeps on keeping on ad nauseam.

Considering Obama’s credential as a Constitutional scholar, nearly unique among U.S. presidents, one might expect him to weigh his options with greater circumspection and with an eye toward restoring suspended civil liberties. However, he has shown little interest in doing so (as far as I know). In combination with the election only a couple months away, the U.S. appears to be in a position similar to Germany in 1932 — ready and willing to elect a despot (take your pick …) and continue its slide into fascism. Can’t even imagine avoiding that outcome now.

The surprising number of ongoing emergencies brings to mind James Howard Kunstler and his book The Long Emergency (2005). Though I haven’t read the book (I’m a failed doomer, I suppose), my understanding is that his prediction of a looming and lingering emergency is based on two intertwined factors currently playing out in geopolitics: peak oil and global warming. (“Climate change” is now preferred over “global warming.”) Those two dire threats (and the California drought) have faded somewhat from the headlines, partially due to fatigue, replaced primarily by terrorism and economic stresses, but the dangers never went away. Melting icecaps and glaciers are probably the clearest incontrovertible indications of anthropogenic global warming, which is poised to trigger nonlinear climate change and hasten the Sixth Extinction. We don’t know when, precisely, though time is growing short. Similarly, reports on energy production and consumption are subject to considerable falsification in the public sphere, making it impossible to know just how close in time we are to a new energy crisis. That inevitability has also been the target of a disinformation campaign, but even a rudimentary understanding of scientific principles is sufficient to enable clear thinkers to penetrate the fog.

I have no plans to return to doom blogging with any vigor. One emergency stacked upon the next, ready to collapse in a cascade of woe, has defeated me, and I have zero expectation that any real, meaningful response can be formulated and executed, especially while we are distracted with terrorism and creeping fascism.

A couple of posts ago, I used the phrase “pay to play” in reference to our bought-and-paid-for system of political patronage. This is one of those open secrets we all recognize but gloss over because, frankly, in a capitalist economy, anything that can be monetized and corrupted will be. Those who are thus paid to play enjoy fairly handsome rewards for doing not very much, really. Yet the paradigm is self-reinforcing, much like the voting system, with promises of increased efficiency and effectiveness with greater levels of participation. Nothing of the sort has proven to be true; it’s simply a goad we continue to hear, some believing in the carrot quite earnestly, others holding their noses and ponying up their dollars and votes, and still others so demoralized and disgusted with the entire pointless constellation of lies and obfuscations that refusing to participate feels like the only honest response. (Periodic arguments levied my way that voting is quite important have failed to convince me that my vote matters a whit. Rather, it takes a bizarre sort of doublethink to conclude that casting my ballot is meaningful. Of late, I’ve succumbed to sustained harangues and shown up to vote, but my heart’s not in it.) I can’t distinguish so well anymore between true believers and mere manipulators except to observe that the former are more likely to be what few civic-minded voters remain and the latter are obviously candidates and their PR hacks. Journalists? Don’t get me started.

The phrase put me in mind of two other endeavors (beyond politics) where a few professionals enjoy being paid to play: sports and performing arts. Both enjoy heavy subscription among the masses early in life, as student sports and performing groups offer training and experience. The way most of us start out, in fact, we actually pay to play through classes, lessons, training, dues, and memberships that provide access to experts and put us in position to reap rewards later in life. Maybe you attended tennis camp or music camp as a kid, or you paid for a college education (defrayed perhaps by activity scholarships) majoring in athletics or theater. Lots of variations exist, and they’re not limited to youth. As an endurance athlete, I continue to pay entrance fees to race organizers for the opportunity to race on courses with support that would otherwise be unavailable without the budget provided by participants, sponsorship notwithstanding. Chicago’s popular 16-inch softball leagues are pay-to-play sports.

A second phase might be giving it away for free. As with paying to play, pure enjoyment of the endeavor works as a strong motivation and justification. This is probably more common in the community-level performing arts, where participation is just plain fun. And who knows? Exposure might lead to a big break or discovery. It’s also what motivates quite a lot of amateur athletes, especially for sports that have not gone mainstream. Olympic athletes (tertiary events) might fall roughly into this category, especially when their primary incomes are derived elsewhere. A third phase is being paid to play. If the audience or fan base is big enough, the financial rewards and fame can be considerable. However, those who enter the professional ranks don’t always demonstrate such great prowess, especially early on. More than a few blow up and flame out quickly, unable to sustain the spark that launched their careers. There’s also being paid to play but earning well short of a livable wage, which borders on giving it away for free or at least for too little. A final phase is being paid not to play. A mean interpretation of that would be that one is screwing up or blocking others’ opportunities to the point where it becomes worthwhile to pay someone to not show up or to go away. A more charitable interpretation would be that one’s employment contract includes time-off benefits that require continuous payments even when not playing.

As with my post about the differences between the Participation, Achievement, and Championship Models, I’m now content with numerous endeavors to be either pay to play, play for free, or play for too little. Participation makes it worthwhile under any payment regime, the alternative typically being sitting at home on my couch wasting my time in front of the TV. I never made it to the enviable position of being paid to play or paid not to play. Still, as an individual of some attainment and multiple areas of expertise, I admit finding it irksome to observe some truly awful people out there pulling in attention and wealth despite rather feeble efforts or abilities. The meritocracy may not be dead, but it often looks comatose.

Lessons of History

Posted: September 8, 2016 in Culture

Here’s something I wrote nine years ago that seems still on point.

Creative Destruction

The oft-repeated trope is that those ignorant of history are doomed to repeat it, to which most of us laconically reply “So what? Big deal.” We’ve taken our eye off the ball and don’t really care anymore about history, being contented with the illusory belief that our current stage of historical development can and will continue undisrupted into the middle of the century, which is probably the longest time horizon we really care about. But there are still plenty of academics and pundits studying history, drawing lessons from it, and sounding the klaxon regarding some threat or imminent transformation or collapse. Actually rousing citizens out of their satiated lethargy is undoubtedly too difficult a task just yet, but the alarm calls at least make for some interesting reading.

Three recent articles make comparisons between the current state of America and historical conditions here and abroad in an attempt to…


We all live in perceptual bubbles of varying breadth and focus. Otherwise, we would be omniscient, which none of us is or can be. Two hot topics that lie outside my perceptual bubble are geopolitical struggles in Israel and Northern Ireland. I’ve also read analyses that suggest that our current troubles and involvements in the Middle East are part of a clash of cultures going back two millennia, where the mostly Christian West won the battle back in the Middle Ages but newly gained oil wealth in the Middle East has prompted a resumption of hostilities. I have merely a passing acquaintance with geopolitics, and the complexity of the myriad interacting elements keeps me from getting a good fix on what’s proven to be a constantly shifting target. That aspect of modern history is the domain of intelligence agencies, military strategists, and diplomats. I don’t necessarily trust those professionals, though, since they operate with their own perceptual biases. (When your main tool is a bomb instead of a hammer, everything tends to look like a target instead of a nail.) But I also recognize that I’m in a really lousy position to second-guess or drive from the back seat. Plus, I have zero influence, even at the voting booth.

In the narrower arena of domestic and campaign politics, the news media (journalists) have failed in their legitimate role as the fourth estate, which function is now being performed by their cousins in entertainment media. (I’ll skip the diatribe that journalism has essentially merged with entertainment and utterly lost any claim to objectivity.) Specifically, we live in a surprisingly mature age of political satire replete with shows that deliver news in comic form far better than serious journalists do with straight faces. The model is undoubtedly The Daily Show, which has already spun off The Colbert Report, Last Week Tonight, Full Frontal, and The Nightly Show. Each of these shows features a host considerably smarter than the audience, who proceeds with rapid-fire (though scripted) takedowns of all manner of political dysfunction. Each has its own stylistic tics, but in aggregate, they arguably do a better job of investigative journalism these days than, say, 60 Minutes, Dateline, or 20/20. Better yet, since they don’t pretend to be serious journalism, they can dispense with bogus claims to objectivity and simply go ahead to indulge in righteous indignation and silly stunts, exposing corruption, stupidity, and inanity in all their shameful manifestations. Political humor has now become a form of gallows humor.

Lingua Nova 02

Posted: August 20, 2016 in Idle Nonsense, Nomenclature, Writing

From time to time, I indulge my predilection for nomenclature and neologism. Those collected below are not novel so much as repurposed.

code blue: the closing of ranks and circling of wagons undertaken by police departments in the wake of an officer (or officers) killing an unarmed citizen, usually black, accompanied by the insistence that the officer(s) by definition can do no wrong.

Christian fiction: an alternative narrative promulgated by Christian fundamentalists in a field of inquiry not based on faith and belief.

STEAM: the addition of Arts to STEM fields (Science, Technology, Engineering, Mathematics) to acknowledge the fundamental source of creativity and innovation.

nostalgia mining: the creation of entertainments that guilelessly seek to exploit the resource of fond remembrance of days past.

That’s a short list, to be sure. If you feel cheated, here’s an additional list of words sure to confound any interlocutors: espial, telic, desuetude, fubbed, girandoles, catarrh, avulse, gobbet, diaphanous, pipping, panicles, cark, cantilene, fisc, phylactery, princox, and funest. I knew only a few of them before stealing most from something I read and can’t imagine using any in speech unless I’m trying not to communicate.