Posts Tagged ‘Politics’

A long while back, I blogged about things I just don’t get, including on that list the awful specter of identity politics. As I was finishing my undergraduate education some decades ago, the favored term was “political correctness.” That impulse now looks positively tame in comparison to what occurs regularly in the public sphere. It’s no longer merely about adopting what consensus would have one believe is a correct political outlook. Now it’s a broad referendum centered on the issue of identity, construed through the lens of ethnicity, sexual orientation, gender identification, lifestyle, religion, nationality, political orientation, etc.

One frequent charge levied against offenders is cultural appropriation, which is the adoption of an attribute or attributes of a culture by someone belonging to a different culture. Here, the term “culture” is a stand-in for any feature of one’s identity. Thus, wearing a Halloween costume from another culture, say, a bandido, is not merely in poor taste but is understood to be offensive if one is not authentically Mexican. Those who are infected with the meme are often called social justice warriors (SJW), and policing (of others, natch) is especially vehement on campus. For example, I’ve read of menu items at the school cafeteria being criticized for not being authentic enough. Really? The won ton soup offends Chinese students?

In an op-ed in the NY Times entitled “Will the Left Survive the Millennials?” Lionel Shriver described being sanctioned for suggesting that fiction writers not be too concerned about creating characters from backgrounds different from their own. He contextualizes the motivation of SJWs this way: (more…)

I don’t have the patience or expertise to prepare and offer a detailed political analysis such as those I sometimes (not very often) read on other blogs. Besides, once the comments start filling up at those sites, every possible permutation is trotted out, muddying the initial or preferred interpretation with alternatives that make at least as much sense. They’re interesting brainstorming sessions, but I have to wonder what is accomplished.

My own back-of-the-envelope analysis is much simpler and probably no closer to (or farther from) being correct, what with everything being open to dispute. So the new POTUS was born in 1946, which puts the bulk of his boyhood in the 1950s, overlapping with the Eisenhower Administration. That period has lots of attributes, but the most significant (IMO), which would impact an adolescent, was the U.S. economy launching into the stratosphere, largely on the back of the manufacturing sector (e.g., automobiles, airplanes, TVs, etc.), and creating the American middle class. The interstate highway system also dates from that decade. Secondarily, there was a strong but misplaced sense of American moral leadership (one might also say authority or superiority), since we took (too much) credit for winning WWII.

However, it wasn’t great for everyone. Racism, misogyny, and other forms of bigotry were open and virulent. Still, if one was lucky enough to be a white, middle-class male, things were arguably about as good as they would get, which many remember rather fondly, whether through rose-colored glasses or otherwise. POTUS as a boy wasn’t middle class, but the culture around him supported a worldview that he embodies even now. He’s also never been an industrialist, but he is a real estate developer (some would say slumlord) and media figure, and his models are taken from the 1950s.

The decade of my boyhood was the 1970s, which were the Nixon, Ford, and Carter Administrations. Everyone could sense the wheels were already coming off the bus, and white male entitlement was far diminished from previous decades. The Rust Belt was already a thing. Like children from the 1950s forward, however, I spent a lot of time in front of the TV. Much of it was goofy fun such as Gilligan’s Island, The Brady Bunch, and interestingly enough, Happy Days. It was innocent stuff. What are the chances that, as a boy plopped in front of the TV, POTUS would have seen the show below (excerpted) and taken special notice considering that the character shares his surname?

Snopes confirms that this is a real episode from the TV show Trackdown. Not nearly as innocent as the shows I watched. The coincidences that the character is a con man, promises to build a wall, and claims to be the only person who can save the town are eerie, to say the least. Could that TV show be lodged in the back of POTUS’ brain, along with so many other boyhood memories, misremembered and revised the way memory tends to do?

Some have said that the great economic expansion of the 1950s and 60s was an anomaly. A constellation of conditions configured to produce an historical effect, a Golden Era by some reckonings, that cannot be repeated. We simply cannot return to an industrial or manufacturing economy that had once (arguably) made America great. And besides, the attempt would accelerate the collapse of the ecosystem, which is already in free fall. Yet that appears to be the intention of POTUS, whose early regression to childhood is a threat to us all.

So the deed is done: the winning candidate has been duly delivered and solemnly sworn in as President of the United States. As I expected, he wasted no time and repaired to the Oval Office immediately after the inauguration (before the inaugural ball!) to sign an executive order aimed at the Affordable Care Act (a/k/a Obamacare), presumably to “ease the burden” as the legislative branch gets underway repealing and replacing the ACA. My only surprise is that he didn’t have a stack of similar executive orders awaiting signature at the very first opportunity. Of course, the president had not held back in the weeks up to the inauguration from issuing intemperate statements, or for that matter, indulging in his favorite form of attack: tweet storms against his detractors (lots of those). The culmination (on the very short term at least — it’s still only the weekend) may well have been the inaugural address itself, where the president announced that American interests come first (when has that ever not been the case?), which is being interpreted by many around the globe as a declaration of preemptive war.

The convention with each new presidential administration is to focus on the first hundred days. Back in November 2016, just after the election, National Public Radio (NPR) fact-checked the outline for the first hundred days provided by the campaign at the end of October 2016. With history speeding by, it’s unclear what portion of those plans have survived. Time will tell, of course, and I don’t expect it will take long — surely nowhere near 100 days.

So what is the difference between fulfilling one’s destiny and meeting one’s fate? The latter has a rather unsavory character to it, like the implied curse of the granted genie’s wish. The former smells vaguely of success. Both have a distinctly tragic whiff of inevitability. Either way, this new president appears to be hurrying headlong to effect changes promised during his campaign. If any wisdom is to be gathered at this most unpredictable moment, perhaps it should be a line offered today by fellow blogger the South Roane Agrarian (which may have in turn been stolen from the British version of House of Cards): “Beware of old men in a hurry.”

Aside: I was going to call this post “Fools Rush In,” but I already have one with that title and the slight revision above seems more accurate, at least until the bandwagon fills up.

Addendum: Seems I was partially right. There was a stack of executive orders ready to sign. However, they’ve been meted out over the course of the week rather than dumped in the hours shortly after the inauguration. What sort of calculation is behind that is pure conjecture. I might point out, though, that attention is riveted on the new president and will never subside, so there is no need, as in television, to keep priming the pump.

I pause periodically to contemplate deep time, ancient history, and other subjects that lie beyond most human conceptual abilities. Sure, we sorta get the idea of a very long ago past out there in the recesses or on the margins, just like we get the idea of U.S. sovereign debt now approaching $20 trillion. Problem is, numbers lose coherence when they mount up too high. Scales differ widely with respect to time and currency. Thus, we can still think reasonably about human history back to roughly 6,000 years ago, but 20,000 years ago or more draws a blank. We can also think about how $1 million might have utility, but $1 billion and $1 trillion are phantoms that appear only on ledgers and contracts and in the news (typically mergers and acquisitions). If deep time or deep debt feel like they don’t exist except as conceptual categories, try wrapping your head around the deep state, which in the U.S. is understood to be a surprisingly large rogues’ gallery of plutocrats, kleptocrats, and oligarchs drawn from the military-industrial-corporate complex, the intelligence community, and Wall Street. It exists but does so far enough outside the frame of reference most of us share that it effectively functions in the shadow of daylight where it can’t be seen for all the glare. Players are plain enough to the eye as they board their private jets to attend annual meetings of the World Economic Forum in Davos-Klosters, Switzerland, or two years ago the Jackson Hole [Economic] Summit in Jackson Hole, WY, in connection with the American Principles Project, whatever that is. They also enjoy plausible deniability precisely because most of us don’t really believe self-appointed masters of the universe can or should exist.

Another example of a really bad trip down the rabbit hole, what I might call deep cynicism (and a place I rarely allow myself to go), appeared earlier this month at Gin and Tacos (on my blogroll):

The way they [conservatives] see it, half the kids coming out of public schools today are basically illiterate. To them, this is fine. We have enough competition for the kinds of jobs a college degree is supposed to qualify one for as it is. Our options are to pump a ton of money into public schools and maybe see some incremental improvement in outcomes, or we can just create a system that selects out the half-decent students for a real education and future and then warehouse the rest until they’re no longer minors and they’re ready for the prison-poverty-violence cycle [add military] to Hoover them up. Vouchers and Charter Schools are not, to the conservative mind, a better way to educate kids well. They are a cheaper way to educate them poorly. What matters is that it costs less to people like six-figure income earners and home owners. Those people can afford to send their kids to a decent school anyway. Public education, to their way of thinking, used to be about educating people just enough that they could provide blue collar or service industry labor. Now that we have too much of that, a public high school is just a waiting room for prison. So why throw money into it? They don’t think education “works” anyway; people are born Good or Bad, Talented or Useless. So it only makes sense to find the cheapest possible way to process the students who were written off before they reached middle school. If charter schools manage to save 1% of them, great. If not, well, then they’re no worse than public schools. And they’re cheaper! Did I mention that they’re cheaper?

There’s more. I provided only the main paragraph. I wish I could reveal that the author is being arch or ironic, but there is no evidence of that. I also wish I could refute him, but there is similarly no useful evidence for that. Rather, the explanation he provides is a reality check that fits the experience of wide swaths of the American public, namely, that “public high school is just a waiting room for prison” (soon and again, debtor’s prison) and that it’s designed to be just that because it’s cheaper than actually educating people. Those truly interested in being educated will take care of it themselves. Plus, there’s additional money to be made operating prisons.

Deep cynicism is a sort of radical awareness that stares balefully at the truth and refuses to blink or pretend. A psychologist might call it the reality principle; a scientist might aver that it relies unflinchingly on objective evidence; a philosopher might call it strict epistemology. To get through life, however, most of us deny abundant evidence presented to us daily in favor of dreams and fantasies that assemble into the dominant paradigm. That paradigm includes the notions that evil doesn’t really exist, that we’re basically good people who care about each other, and that our opportunities and fates are not, on the whole, established long before we begin the journey.

Continuing from my previous post, Brian Phillips, writing for MTV News, has an article entitled “Shirtless Trump Saves Drowning Kitten: Facebook’s fake-news problem and the rise of the postmodern right.” (Funny title, that.) I navigated to the article via Alan Jacobs’s post at Text Patterns (on my blogroll). Let me consider each in turn.

After chuckling that Phillips is directing his analysis to the wrong audience, an admittedly elitist response on my part, I must further admit that the article is awfully well written and nails the blithe attitude accompanying the epistemological destruction carried out, perhaps unwittingly but now too well established to ignore, by developers of social media as distinguished from traditional news media. Which would be considered more mainstream today is up for debate. Maybe Phillips has the right audience after all. He certainly gets the importance of controlling the narrative:

Confusion is an authoritarian tool; life under a strongman means not simply being lied to but being beset by contradiction and uncertainty until the line between truth and falsehood blurs and a kind of exhaustion settles over questions of fact. Politically speaking, precision is freedom. It’s telling, in that regard, that Trump supporters, the voters most furiously suspicious of journalism, also proved to be the most receptive audience for fictions that looked journalism-like. Authoritarianism doesn’t really want to convince its supporters that their fantasies are true, because truth claims are subject to verification, and thus to the possible discrediting of authority. Authoritarianism wants to convince its supporters that nothing is true, that the whole machinery of truth is an intolerable imposition on their psyches, and thus that they might as well give free rein to their fantasies.

But Phillips is too clever by half, burying the issue in a scholarly style that speaks successfully only to a narrow class of academics and intellectuals, much like the language and memes employed by the alt-right are said to be dog whistles perceptible only to rabid, mouth-breathing bigots. Both charges are probably unfair reductions, though with kernels of truth. Here’s some of Phillips’ overripe language:

Often the battleground for this idea [virtue and respect] was the integrity of language itself. The conservative idea, at that time [20 years ago], was that liberalism had gone insane for political correctness and continental theory, and that the way to resist the encroachment of Derrida was through fortifying summaries of Emerson … What had really happened was that the left had become sensitized to the ways in which conventional moral language tended to shore up existing privilege and power, and had embarked on a critique of this tendency that the right interpreted, with some justification, as an attack on the very concept of meaning.

More plainly, Phillips’ suggestion is that the radical right learned the lessons of Postmodernism (PoMo) even better than did the avant-garde left, the latter having outwitted themselves by giving the right subtle tools used later to outmaneuver everyone. Like other mildly irritating analyses I have read, it’s a statement of inversion: an idea bringing into existence its antithesis that unironically proves and undermines the original, though with a dose of Schadenfreude. This was (partially) the subject of a 4-part blog I wrote called “Dissolving Reality” back in Aug. and Sept. 2015. (Maybe half a dozen read the series; almost no one commented.)

So what does Alan Jacobs add to the discussion? He exhibits his own scholarly flourishes. Indeed, I admire the writing but find myself distracted by the writerly nature, which ejects readers from the flow of ideas to contemplate the writing itself. For instance, this:

It turns out that the children of the ruling classes learned their lessons well, so when they inherited positions in their fathers’ law firms they had some extra, and very useful, weapons in their rhetorical armory.

In precisely the same way, when, somewhat later, academic leftists preached that race and gender were the determinative categories of social analysis, members of the future alt-right were slouching in the back rows of their classrooms, baseball caps pulled down over their eyes, making no external motions but in their dark little hearts twitching with fervent agreement.

Terrific capture of the classroom culture in which teachers are steeped. Drawing identity politics more manifestly into the mix is a fairly obvious extrapolation over Phillips and may reflect the results of the presidential election, where pundits, wheeling around to reinterpret results that should not have so surprised them, now suggest Republican victories are a repudiation of leftist moral instruction. The depth of Phillips’ and Jacobs’ remarks is not so typical of most pundits, however, and their follow-up analysis at some point becomes just more PoMo flagellation. Here, Jacobs is even more clearly having some fun:

No longer did we have to fear being brought before the bar of Rational Evidence, that hanging judge of the Enlightenment who had sent so many believers to the gallows! You have your constructs and we have our constructs, and who’s to say which are better, right? O brave new world that hath such a sociology of knowledge in it!

This goes back to the heart of the issue, our epistemological crisis, but I dispute that race and gender are the determinative categories of social analysis, no matter how fashionable they may be in the academy. A simpler and more obvious big picture controls: it’s about life and death. My previous post was about geopolitics, where death is rained down upon foreign peoples and justifying rhetoric is spread domestically. Motivations may be complex and varied, but the destruction of people and truth affects everyone, albeit unevenly, without regard to race, gender, religion, nationality, etc. All are caught in the dragnet.

Moreover, with the advent of Western civilization, intellectuals have always been sensitive to the sociology of knowledge. It’s a foundation of philosophy. That it’s grown sclerotic long precedes PoMo theory. In fact, gradual breaking apart and dismantling of meaning is visible across all expressive genres, not just literature. In painting, it was Impressionism, Cubism, Dada and Surrealism, and Abstract Expressionism. In architecture, it was Art Deco, the International Style, Modernism, Brutalism, and Deconstructivism. In music, it was the Post-Romantic, the Second Viennese School, Modernism, Serialism, and Minimalism. In scientific paradigms, it was electromagnetism, relativity, quantum mechanics, the Nuclear Era, and semiconductors. The most essential characteristics in each case are increasingly dogmatic abstraction and drilling down to minutia that betray meaningful essences. Factoring in economic and political perversions, we arrive at our current epistemological phase where truth and consequences matter little (though death and destruction still do) so long as deceits, projections, and distractions hold minds in thrall. In effect, gravity is turned off and historical narratives levitate until reality finally, inevitably comes crashing down in a monstrous Jenga pile, as it does periodically.

In the meantime, I suppose Phillips and Jacobs can issue more gaseous noise into the fog bank the information environment has become. They can’t get much traction (nor can I) considering how most of the affluent West thinks at the level of a TV sitcom. In addition, steps being considered to rein in the worst excesses of fake news would have corporations and traditional news media appointed as watchers and censors. Beyond any free speech objections, which are significant, expecting culprits to police themselves only awards them greater power to dominate, much like bailouts rewarded the banks. More fog, more lies, more levitation.

I watched John Pilger’s excellent documentary film The War You Don’t See (2010), which deals with perpetual and immoral wars, obfuscations of the governments prosecuting them, and the journalistic media’s failure to question effectively the lies and justifications that got us into war and keep us there. The documentary reminded me of The Fog of War (2003), Robert McNamara’s rueful rethinking of his activities as Secretary of Defense during the Kennedy and Johnson administrations (thus, the Vietnam War). Seems that lessons a normal, sane person might draw from experience at war fail to find their way into the minds of decision makers, who must somehow believe themselves to be masters of the universe with immense power at their disposal but are really just war criminals overseeing genocides. One telling detail from Pilger’s film is that civilian deaths (euphemistically renamed collateral damage in the Vietnam era) as a percentage of all deaths (including combatants) have increased from 10% (WWI) to 50% (WWII) to 70% (Vietnam) to 90% (Afghanistan and Iraq). That’s one of the reasons why I call them war criminals: we’re depopulating the theaters of war in which we operate.

After viewing the Pilger film, the person sitting next to me asked, “How do you know what he’s saying is true?” More fog. I’m ill-equipped to handle such a direct epistemological challenge; it felt to me like a non sequitur. Ultimately, I was relieved to hear that the question was mere devil’s advocacy, but it’s related to the epistemological crisis I’ve blogged about before. Since the date of that blog post, the crisis has only worsened, which is what I expect as legitimate authority is undermined, expertise erodes, and the public sphere devolves into gamification and gotchas (or a series of ongoing cons). If late-stage capitalism has become a nest of corruption, the same is true — with unexpected rapidity — of the computer era and the Information Superhighway (a term no one uses anymore). One early expectation was that enhanced (24/7/365) access to information would yield impressive educational gains, as though the only thing missing were more information, but human nature being what it is, the first valuable innovations resulted from commercializing erotica and porn. Later debate and hand-wringing over the inaccuracy of Wikipedia and the slanted results of Google searches disappeared as everyone simply got used to not being able to trust those sources all that much, just as everyone got used to forfeiting their privacy online.

Today, everything coughed up in our media-saturated information environment is understood either with a grain of salt (make that a mountain of skepticism) and held in abeyance until solid confirmation can be had (which often never comes) or simply run with because, well, what the hell? Journalists, the well-trained ones possessing integrity anyway, used to be in the first camp, but market forces and the near instantaneity of (faulty, spun) information, given how the Internet has lowered the bar to publication, have pushed journalists into the second camp. As Pilger notes, they have become echo chambers and amplifiers of the utterances of press agents of warmongering governments. Sure, fact checking still occurs, when it’s easy (such as on the campaign trail), but with war reporting in particular, which poses significant hurdles to information gathering, too many reporters simply repeat what they’re told or believe the staging they’re shown.

The U.S. election has come and gone. Our long national nightmare is finally over; another one is set to begin after a brief hiatus. (I’m not talking about Decision 2020, though that spectre has already reared its ugly head.) Although many were completely surprised by the result of the presidential race in particular, having placed their trust in polls, statistical models, and punditry to project a winner (who then lost), my previous post should indicate that I’m not too surprised. Michael Moore did much better taking the temperature of the room (more accurately, the nation) than all the other pundits, and even if the result had differed, the underlying sentiments remain. It’s fair to say, I think, that people voted with their guts more than their heads, meaning they again voted their fears, hates, and above all, for change (read: revolution). No matter that the change in store for us will very likely be destructive and against self-interest. Truth is, it would have had to end in destruction with any of the candidates on the ballot.

Given the result, my mind wandered to Hillary Clinton’s book It Takes a Village, probably because we, the citizens of the United States of America, have effectively elected the village idiot to the nation’s highest office. Slicing and dicing the voting tallies between the popular vote, electoral votes, and states and counties carried will no doubt be done to death. Paths to victory and defeat will be offered with the handsome benefit of hindsight. Little of that matters, really, when one considers lessons never learned despite ample opportunity. For me, the most basic lesson is that for any nation of people, leaders must serve the interests of the widest constituency, not those of a narrow class of oligarchs and plutocrats. Donald Trump addressed the people far more successfully than did Hillary Clinton (with her polished political doubletalk) and appealed directly to their interests, however base and misguided.

My previous post called Barstool Wisdom contained this apt quote from The Brothers Karamazov by Dostoevsky:

The more stupid one is, the closer one is to reality. The more stupid one is, the clearer one is. Stupidity is brief and artless, while intelligence squirms and hides itself.

We have already seen that our president-elect has a knack for stating obvious truths no one else dares utter aloud. His clarity in that regard, though coarse, contrasts completely with Hillary’s squirmy evasions. Indeed, her high-handed approach to governance, more comfortable in the shadows, bears a remarkable resemblance to that of Richard Nixon, who also failed to convince the public that he was not a crook. My suspicion is that as Donald Trump gets better acquainted with statecraft, he will also learn obfuscation and secrecy. Some small measure of that is probably good, actually, though Americans are pining for greater transparency, one of the contemporary buzzwords thrown around recklessly by those with no real interest in it. My greater worry is that through sheer stupidity and bullheadedness, other obvious truths, such as commission of war crimes and limits of various sorts (ecological, energetic, financial, and psychological), will go unheeded. No amount of barstool wisdom can overcome those.

Predictions are fool’s errands. Useful ones, anyway. The future branches in so many possible directions that truly reliable predictions are banal: the sun will rise in the east; death and taxes are certain. (NTE is arguably another term for megadeath, but I gotta reinforce that prediction to keep my doomer bona fides.) Now, only a few days prior to the general election, I find myself anxious that the presidential race is still too close to call. More than a few pundits say that Donald Trump could actually win. At the same time, a Hillary Clinton win gives me no added comfort, really. Moreover, potential squabbles over the outcome threaten to turn the streets into riot zones. I had rather expected such disruptions during or after the two nominating conventions, but they settled on their presumptive nominees without drama.

Polls are often predictive, of course, and despite their acknowledged margins of error, they typically forecast results with enough confidence that many voters don’t bother to vote, safe in the assumption that predicted results (an obvious oxymoron) make moot the need to actually cast one’s vote. (The West Coast must experience this phenomenon more egregiously than the East Coast, except perhaps for California’s rather large population and voting power. Has Hawaii ever mattered?) For that reason alone, I’d like to see a blackout on polling in the weeks leading up to an election (2–3 ought to do), including election day. This would allow us to avoid repeating the experience of the Chicago Daily Tribune publishing the headline “Dewey Defeats Truman” back in 1948.

Analysis of voting patterns and results also dissuades voters from considering anything other than a strategic vote for someone able to actually win, as opposed to supporting worthy candidates polling far enough behind that they don’t stand a chance. This reinforces a two-party system no one really likes, one that keeps delivering supremely lousy candidates. Jesse Ventura, having defied the polls and been elected to office as an independent, has been straightforward about his disdain for the notion that voting outside the two main parties is tantamount to throwing away one’s vote. A related meme is that by voting for independent Ralph Nader in 2000, the Democratic vote was effectively split, handing the win (extraordinarily close and contestable though it was) to George Bush. My thinking aligns with Jesse Ventura’s, not with those who view votes for Ralph Nader as betrayals.

If the presidential race is still too close for comfort, Michael Moore offers a thoughtful explanation how Trump could win:

This excerpt from Moore’s new film TrumpLand has been taken out of context by many pro-Trump ideologues. I admit the first time I saw it I was unsure whether Moore supports Trump. Additional remarks elsewhere indicate that he does not. The spooky thing is that as emotional appeals go, it’s clear that Trump connects with the people powerfully. But Moore is right about another thing: to vote for Trump is really a giant “fuck you” to the establishment, which won’t end well.

This is a continuation from part 1.

A long, tortured argument could be offered that we (in the U.S.) are governed by a narrow class of plutocrats (both now and at the founding) who not-so-secretly distrust the people and the practice of direct democracy, employing instead mechanisms found in the U.S. Constitution (such as the electoral college) to transfer power away from the people to so-called experts. I won’t indulge in a history lesson or other analysis, but it should be clear to anyone who bothers to look that typical holders of elected office (and their appointees) more nearly resemble yesteryear’s landed gentry than the proletariat. Rule by elites is thus quite familiar to us despite plenty of lofty language celebrating the common man and stories repeated ad nauseam of a few exceptional individuals (exceptional being the important modifier here) who managed to bootstrap their way into the elite from modest circumstances.

Part 1 started with deGrasse Tyson’s recommendation that experts/elites should pitch ideas at the public’s level and ended with my contention that some have lost their public by adopting style or content that fails to connect. In the field of politics, I’ve never quite understood the obsession with how things present to the public (optics) on the one hand and obvious disregard for true consent of the governed on the other. For instance, some might recall pretty serious public opposition before the fact to invasion of Afghanistan and Iraq in response to the 9/11 attacks. The Bush Administration’s propaganda campaign succeeded in buffaloing a fair percentage of the public, many of whom still believe the rank lie that Saddam Hussein had WMDs and represented enough of an existential threat to the U.S. to justify preemptive invasion. Without indulging in conspiratorial conjecture about the true motivations for invasion, the last decade plus has proven that opposition pretty well founded, though it went unheeded.

The Internets/webs/tubes have been awfully active spinning out theories and conspiracies with respect to Democratic presidential nominee Hillary Clinton (are those modifiers even necessary?) and the shoe ready to drop if and when Julian Assange releases information in his possession reputed to spell the end of her candidacy and political career. Assange has been unaccountably coy: either he has the goods or he doesn’t. There’s no reason to tease and hype. Hillary has been the subject of intense scrutiny for 25+ years. With so much smoke billowing in her wake, one might conclude burning embers must exist. But our current political culture demonstrates that one can get away with unthinkably heinous improprieties, evasions, and crimes so long as one trudges steadfastly through all the muck. Some even make a virtue out of intransigence. Go figure.

If I were charitable, I would say that Hillary has been unfairly maligned and that her 2010 remark “Can’t we just drone this guy?” is either a fabrication or taken out of context. Maybe it was a throwaway joke, uttered in a closed meeting and forgotten except for someone who believed it might be useful later. Who can ever know? But I’m not so charitable. No one in a position of authority can afford to be flip about targeting political irritants. Hillary impresses as someone who, underneath all the noise, would not lose any sleep over droning her detractors.

There is scarcely anything on the political landscape as divisive as when someone blows the whistle on illicit government actions and programs. For instance, some are absolutely convinced that Edward Snowden is a traitor and ought to receive a death sentence (presumably after a trial, but not necessarily). Others understand his disclosures as the act of a patriot of the highest order, motivated not by self-interest but by love of country and the sincere belief in the public’s right to know. The middle ground between these extremes is a veritable wasteland — one I happen to occupy. Julian Assange is similarly divisive, and like Snowden, he appears to believe that the truth will eventually come out and indeed must. What I can’t quite reconcile is the need for secrecy and the willingness of the general public to accept leaders who habitually operate behind such veils. Talk of transparency is usually just subterfuge. If we’re truly the good guys and our ideals are superior to those of our detractors, why not simply trust in those strengths?