Posts Tagged ‘Recent History’

So we’re back at it: bombing places halfway around the world for the indignity of being at war and fighting it the wrong way. While a legitimate argument exists regarding a human rights violation requiring a response, that is not AFAIK the principal concern or interpretation of events. Rather, it’s about 45 being “presidential” for having ordered missile strikes. It must have been irresistible, with all the flashy metaphorical buttons demanding to be pushed at the first opportunity. I’m disappointed that his pacifist rhetoric prior to the election was merely oppositional, seeking only to score points against Obama. Although I haven’t absorbed a great deal of the media coverage, what I’ve seen squarely refuses to let a crisis go to waste. Indeed, as geopolitics and military escapades go, we’re like moths to the flame. The most reprehensible media response was MSNBC anchor Brian Williams waxing rhapsodic about the beauty of the missiles as they lit up the air. How many screw-ups does this guy get?

The lessons learned during the 20th century, namely that warfare is not just a messy, unfortunate affair but downright ugly, destructive, pointless, and self-defeating, have been unjustifiably forgotten. I guess it can’t be helped: it’s nympho-warmaking. We can’t stop ourselves; gotta have it. Consequences be damned. How many screw-ups do we get?

At least Keith Olbermann, the current king of righteous media indignation, had the good sense to put things in their proper context and condemn our actions (as I do). He also called the military strike a stunt, which raises the question of whether the provocation was a false flag operation. That’s what Putin is reported as saying. Personally, I cannot take a position on the matter, being at the mercy of the media and unable to gather any first-hand information. Doubts and disillusionment over what’s transpired and the endless spin cycle plague me. There will never be closure.

First, a few reminders:

  • The United States has been in an undeclared state of war for 15 years, the longest in U.S. history and long enough that young people today can say legitimately, “we’ve always been at war with Oceania.” The wars encompass the entirety of both terms of the Obama Administration.
  • The inciting events were attacks on U.S. soil carried out on September 11, 2001 (popularly, 9/11), which remain shrouded in controversy and conspiracy despite the official narrative assigning patsy blame to al-Qaida operating out of Afghanistan.
  • On the heels of the attacks, the Bush Administration commenced a propaganda campaign to sell invasion and regime change in those two countries and, over widespread public protest, went ahead and launched preemptive wars, ostensibly because an existential threat existed with respect to weapons of mass destruction (WMDs) possessed by Iraq in particular.
  • The propaganda campaign has since been revealed to have been cooked up and untrue, yet it buffaloed a lot of people into believing (even to this day) that Iraq was somehow responsible for 9/11.
  • Our preemptive wars succeeded quickly in toppling governments and capturing (and executing) their leaders but immediately got bogged down securing a peace that never came.
  • Even with an embarrassing mismatch of force, periodic troop surges and drawdowns, trillions of dollars wasted prosecuting the wars, and incredible, pointless loss of life (especially on the opposing sides), our objective in the Middle East (other than the oil, stupid!) has never been clear. The prospect of final withdrawal is nowhere on the horizon.

Continuous war — declared or merely waged — has been true of the U.S. my whole life, though one would be hard pressed to argue that it truly represents an immediate threat to U.S. citizens except to those unlucky enough to be deployed in war zones. Still, the monkey-on-the-back is passed from administration to administration. One might hope, based on campaign rhetoric, that the new executive (45) might recognize continuous war as the hot potato it is and dispense with it, but the proposed federal budget, with its $52 billion increase in military spending (+10% over 2016), suggests otherwise. Meanwhile, attention has been turned away from true existential threats that have been bandied about in the public sphere for at least a decade: global warming and climate change leading to Near-Term Extinction (NTE). Proximal threats, largely imagined, have absorbed all our available attention, and depending on whom one polls, our worst fears have already been realized.

The 20th and 21st centuries (so far) have been a series of “hot” wars (as distinguished from the so-called Cold War). Indeed, there has scarcely been a time when the U.S. has not been actively engaged fighting phantoms. If the Cold War was a bloodless, ideological war to stem the nonexistent spread of communism, we have since adopted and coopted the language of wartime to launch various rhetorical wars. First was LBJ’s War on Poverty, the only “war” aimed at truly helping people. Nixon got into the act with his War on Drugs, which was punitive. Reagan expanded the War on Drugs, which became the War on Crime. Clinton increased the punitive character of the War on Crime by expanding mandatory minimum sentencing, which had the side effect of establishing what some call the prison-industrial complex, inflating the incarceration rate of Americans to the point that the U.S. is now ranked second in the world behind the Seychelles (!), a ranking far, far higher than that of any other industrialized nation.

As if U.S. authoritarians hadn’t already found enough people to punish or convinced the public that threats exist on all sides, requiring constant vigilance and a massive security apparatus including military, civil police, and intelligence services comprising 16 separate agencies (that we know of), Bush coined and declared the War on Terror, aimed at punishing those foreign and domestic who dare challenge U.S. hegemony in all things. It’s not called a national security state for nuthin’, folks. I aver that the rhetorical War on Poverty has inverted and now become a War on the Poverty-Stricken. De facto debtors’ prisons have reappeared, predatory lending has become commonplace, and income inequality grows more exaggerated with every passing year, leaving behind large segments of the U.S. population as income and wealth pool in an ever-shrinking number of hands. Admittedly, the trend is global.

At some point, perhaps in the 1960s when The Establishment (or more simply, The Man) became a thing to oppose, the actual Establishment must have decided it was high time to circle the wagons and protect its privileges, essentially going to war with (against, really) the people. Now five decades on, holders of wealth and power demonstrate disdain for those outside their tiny circle, and our government can no longer be said with a straight face to be of, by, and for the people (paraphrasing the last line of Lincoln’s Gettysburg Address). Rather, the government has been hijacked and turned into something abominable. Yet the people are strangely complicit, having allowed history to creep along with social justice in marked retreat. True threats do indeed exist, though not the ones that receive the lion’s share of attention. I surmise that, as with geopolitics, the U.S. government has brought into being an enemy and conflict that bodes not well for its legitimacy. Which collapse occurs first is anyone’s guess.

This past Thursday was an occasion of protest for many immigrant laborers who did not show up to work. Presumably, this action was in response to recent executive attacks on immigrants and was intended to demonstrate how businesses would suffer without immigrant labor doing jobs Americans frequently do not want. Tensions between the ownership and laboring classes have a long, tawdry history I cannot begin to summarize. As with other contextual failures, I daresay the general public believes incorrectly that such conflicts date from the 19th century, when formal sociopolitical theories like Marxism, which intersect heavily with labor economics, were first published. An only slightly better understanding is that the labor movement commenced in the United Kingdom some fifty years after the Industrial Revolution began, with the Luddites, for example. I pause to remind readers that the most basic, enduring, and abhorrent labor relationship, extending back millennia, is slavery, which ended in the U.S. only 152 years ago but continues even today in slightly revised forms around the globe.

Thursday’s work stoppage was a faint echo of general strikes and unionism from the middle of the 20th century. Gains in wages and benefits, working conditions, and negotiating position transferred some power from owners to laborers during that period, but today, laborers must sense they are back on their heels, defending conditions fought for by their grandparents but ultimately losing considerable ground. Of course, I’m sympathetic to labor, considering I’m not in the ownership class. (It’s all about perspective.) I must also admit, however, to once quitting, after only one day, a job that was simply too, well, laborious. I had that option at the time, though it ultimately led me nearly to bankruptcy — a life lesson that continues to inform my attitudes. As I survey the scene today, however, I suspect many laborers — immigrants and native-born Americans alike — have the unenviable choice of accepting difficult, strenuous labor for low pay or being unemployed. Gradual reduction of demand for labor has two main causes: globalization and automation.


So the deed is done: the winning candidate has been duly delivered and solemnly sworn in as President of the United States. As I expected, he wasted no time and repaired to the Oval Office immediately after the inauguration (before the inaugural ball!) to sign an executive order aimed at the Affordable Care Act (a/k/a Obamacare), presumably to “ease the burden” as the legislative branch gets underway repealing and replacing the ACA. My only surprise is that he didn’t have a stack of similar executive orders awaiting signature at the very first opportunity. Of course, the president had not held back in the weeks leading up to the inauguration from issuing intemperate statements or, for that matter, from indulging in his favorite form of attack: tweet storms against his detractors (lots of those). The culmination (in the very short term at least — it’s still only the weekend) may well have been the inaugural address itself, where the president announced that American interests come first (when has that ever not been the case?), which is being interpreted by many around the globe as a declaration of preemptive war.

The convention with each new presidential administration is to focus on the first hundred days. Back in November 2016, just after the election, National Public Radio (NPR) fact-checked the outline for the first hundred days provided by the campaign at the end of October 2016. With history speeding by, it’s unclear what portion of those plans have survived. Time will tell, of course, and I don’t expect it will take long — surely nowhere near 100 days.

So what is the difference between fulfilling one’s destiny and meeting one’s fate? The latter has a rather unsavory character to it, like the implied curse of the granted genie’s wish. The former smells vaguely of success. Both have a distinctly tragic whiff of inevitability. Either way, this new president appears to be hurrying headlong to effect changes promised during his campaign. If any wisdom is to be gathered at this most unpredictable moment, perhaps it should be a line offered today by fellow blogger the South Roane Agrarian (which may have in turn been stolen from the British version of House of Cards): “Beware of old men in a hurry.”

Aside: I was going to call this post “Fools Rush In,” but I already have one with that title and the slight revision above seems more accurate, at least until the bandwagon fills up.

Addendum: Seems I was partially right. There was a stack of executive orders ready to sign. However, they’ve been meted out over the course of the week rather than dumped in the hours shortly after the inauguration. What sort of calculation is behind that is pure conjecture. I might point out, though, that attention is riveted on the new president and will never subside, so there is no need, as in television, to keep priming the pump.

I attended a fundraiser a short while back. It’s familiar territory for me, filled with gifts culled from local businesses and corporations to be resold at auction, portable kitchens and bars to feed and libate guests (breaking down their inhibitions to giving), and lots of high heels and party dresses (with ample cleavage). Men rarely strut and parade the way the women do; tuxedos are the rare exception. Secondary and tertiary activities are typical, often a DJ or live band that plays so loudly that sensible people would flee the room rather than slowly go deaf. But monstrous volume in the era of amplified everything has dulled that reflex to nothingness. Or people are by now already deaf from habitual exposure to arena-rock volume filtered down to small venues. Folks simply, stupidly tough it out, ending the night with their ears ringing and their voices hoarse from screaming over the noise just to be heard.

Beneficiaries of fundraisers usually fall into two categories that are poorly served by American institutions: those seeking quality educations (including public schools that ought to be better funded through taxation) and folks suffering from catastrophic illness or disease that is ideally meant to be covered by health insurance but in practice is not. Auctioneers do pretty well enticing people to part with their money. It’s a true skill. But then, who goes to a fundraiser determined to hold tightly to their hard-earned cash? (Don’t answer that question.) Silent auctions may accompany the live auction, but the group vibe definitely contributes to some competition to outbid the next person (a wallet- or dick-measuring exercise?). Auction items are mostly luxury items, things normal Americans wouldn’t consider buying except when associated with charitable giving. Getting something for one’s charity (bought under or over its presumed market value) also shifts some portion of the philanthropic burden to those entities donating gifts.

All this is preliminary to the most appallingly tone-deaf item offered for auction: a 4-person safari to a game preserve in South Africa to hunt and kill a wildebeest. When the auctioneer described the item, everyone in my vicinity looked at each other as if to say “what the fuck?” Certainly, humans have a long history of hunting game purely for sport (which is to say, not for food), and from the perspective of a South African safari outfitter, wild animals are a natural resource to be exploited the same way that, for instance, mining and agriculture are conducted throughout the world. But the last few years have seen a notable change of heart about killing animals, especially so-called romance animals (mostly large mammals, including whales, less so large fish), without need or admirable purpose. The outcry over an American dentist killing Cecil the Lion was an expression of that sentiment. So, too, was the killing of a gorilla at the Cincinnati Zoo after a child fell into the enclosure. (Personally, considering how few of them exist, I would privilege the life of the gorilla over the child, but that’s a minefield.) Pictures of Donald Trump’s sons standing over their trophy kills have also elicited significant disapproval. We are now acutely aware that wild animals are not an inexhaustible resource (and never were — consider the passenger pigeon).

I judged that bidding on the safari was no more or less robust than for other auction items, but I mentioned aloud that if I were to bid on it, I would probably go on the safari but would also insist on merely paintballing the poor wildebeest, a relatively harmless proxy for killing it needlessly. Admittedly, the wildebeest would experience the same existential terror as if it were being hunted to death, but at least it would live. Or it would live until the next safari came round. Hunting and killing a wildebeest or other large game has never been on my bucket list, and its appearance at auction would not suddenly inspire me to add it to the list. That is the province of a class of fools rich and insulated enough to still regard the world as their playground, with no thought of responsibility, stewardship, or consequences.

Continuing from my previous post, Brian Phillips, writing for MTV News, has an article entitled “Shirtless Trump Saves Drowning Kitten: Facebook’s fake-news problem and the rise of the postmodern right.” (Funny title, that.) I navigated to the article via Alan Jacobs’ post at Text Patterns (on my blogroll). Let me consider each in turn.

After chuckling that Phillips is directing his analysis to the wrong audience, an admittedly elitist response on my part, I must further admit that the article is awfully well-written and nails the blithe attitude accompanying the epistemological destruction carried out, perhaps unwittingly but by now too well established to ignore, by developers of social media as distinguished from traditional news media. Which would be considered more mainstream today is up for debate. Maybe Phillips has the right audience after all. He certainly gets the importance of controlling the narrative:

Confusion is an authoritarian tool; life under a strongman means not simply being lied to but being beset by contradiction and uncertainty until the line between truth and falsehood blurs and a kind of exhaustion settles over questions of fact. Politically speaking, precision is freedom. It’s telling, in that regard, that Trump supporters, the voters most furiously suspicious of journalism, also proved to be the most receptive audience for fictions that looked journalism-like. Authoritarianism doesn’t really want to convince its supporters that their fantasies are true, because truth claims are subject to verification, and thus to the possible discrediting of authority. Authoritarianism wants to convince its supporters that nothing is true, that the whole machinery of truth is an intolerable imposition on their psyches, and thus that they might as well give free rein to their fantasies.

But Phillips is too clever by half, burying the issue in scholarly style that speaks successfully only to a narrow class of academics and intellectuals, much like the language and memes employed by the alt-right are said to be dog whistles perceptible only to rabid, mouth-breathing bigots. Both charges are probably unfair reductions, though with kernels of truth. Here’s some of Phillips’ overripe language:

Often the battleground for this idea [virtue and respect] was the integrity of language itself. The conservative idea, at that time [20 years ago], was that liberalism had gone insane for political correctness and continental theory, and that the way to resist the encroachment of Derrida was through fortifying summaries of Emerson … What had really happened was that the left had become sensitized to the ways in which conventional moral language tended to shore up existing privilege and power, and had embarked on a critique of this tendency that the right interpreted, with some justification, as an attack on the very concept of meaning.

More plainly, Phillips’ suggestion is that the radical right learned the lessons of Postmodernism (PoMo) even better than did the avant-garde left, the latter having outwitted themselves by giving the right subtle tools used later to outmaneuver everyone. Like other mildly irritating analyses I have read, it’s a statement of inversion: an idea bringing into existence its antithesis, which unironically both proves and undermines the original, though with a dose of Schadenfreude. This was (partially) the subject of a four-part blog series I wrote called “Dissolving Reality” back in Aug. and Sept. 2015. (Maybe half a dozen read the series; almost no one commented.)

So what does Alan Jacobs add to the discussion? He exhibits his own scholarly flourishes. Indeed, I admire the writing but find myself distracted by the writerly nature, which ejects readers from the flow of ideas to contemplate the writing itself. For instance, this:

It turns out that the children of the ruling classes learned their lessons well, so when they inherited positions in their fathers’ law firms they had some extra, and very useful, weapons in their rhetorical armory.

In precisely the same way, when, somewhat later, academic leftists preached that race and gender were the determinative categories of social analysis, members of the future alt-right were slouching in the back rows of their classrooms, baseball caps pulled down over their eyes, making no external motions but in their dark little hearts twitching with fervent agreement.

Terrific capture of the classroom culture in which teachers are steeped. Drawing identity politics more manifestly into the mix is a fairly obvious extrapolation over Phillips and may reflect the results of the presidential election, where pundits, wheeling around to reinterpret results that should not have so surprised them, now suggest Republican victories are a repudiation of leftist moral instruction. The depth of Phillips’ and Jacobs’ remarks is not so typical of most pundits, however, and their follow-up analysis at some point becomes just more PoMo flagellation. Here, Jacobs is even more clearly having some fun:

No longer did we have to fear being brought before the bar of Rational Evidence, that hanging judge of the Enlightenment who had sent so many believers to the gallows! You have your constructs and we have our constructs, and who’s to say which are better, right? O brave new world that hath such a sociology of knowledge in it!

This goes back to the heart of the issue, our epistemological crisis, but I dispute that race and gender are the determinative categories of social analysis, no matter how fashionable they may be in the academy. A simpler and more obvious big picture controls: it’s about life and death. My previous post was about geopolitics, where death is rained down upon foreign peoples and justifying rhetoric is spread domestically. Motivations may be complex and varied, but the destruction of people and truth affects everyone, albeit unevenly, without regard to race, gender, religion, nationality, etc. All are caught in the dragnet.

Moreover, since the advent of Western civilization, intellectuals have always been sensitive to the sociology of knowledge. It’s a foundation of philosophy. That it’s grown sclerotic long precedes PoMo theory. In fact, gradual breaking apart and dismantling of meaning is visible across all expressive genres, not just literature. In painting, it was Impressionism, Cubism, Dada and Surrealism, and Abstract Expressionism. In architecture, it was Art Deco, the International Style, Modernism, Brutalism, and Deconstructivism. In music, it was the Post-Romantic, the Second Viennese School, Modernism, Serialism, and Minimalism. In scientific paradigms, it was electromagnetism, relativity, quantum mechanics, the Nuclear Era, and semiconductors. The most essential characteristics in each case are increasingly dogmatic abstraction and drilling down to minutiae that betray meaningful essences. Factoring in economic and political perversions, we arrive at our current epistemological phase, where truth and consequences matter little (though death and destruction still do) so long as deceits, projections, and distractions hold minds in thrall. In effect, gravity is turned off and historical narratives levitate until reality finally, inevitably comes crashing down in a monstrous Jenga pile, as it does periodically.

In the meantime, I suppose Phillips and Jacobs can issue more gaseous noise into the fog bank the information environment has become. They can’t get much traction (nor can I) considering how most of the affluent West thinks at the level of a TV sitcom. In addition, steps being considered to rein in the worst excesses of fake news would have corporations and traditional news media appointed as watchers and censors. Beyond any free speech objections, which are significant, expecting culprits to police themselves only awards them greater power to dominate, much like bailouts rewarded the banks. More fog, more lies, more levitation.

The U.S. election has come and gone. Our long national nightmare is finally over; another one is set to begin after a brief hiatus. (I’m not talking about Decision 2020, though that spectre has already reared its ugly head.) Although many were completely surprised by the result of the presidential race in particular, having placed their trust in polls, statistical models, and punditry to project a winner (who then lost), my previous post should indicate that I’m not too surprised. Michael Moore did much better taking the temperature of the room (more accurately, the nation) than all the other pundits, and even if the result had differed, the underlying sentiments remain. It’s fair to say, I think, that people voted with their guts more than their heads, meaning they again voted their fears and hates and, above all, voted for revolutionary change. No matter that the change in store for us will very likely be destructive and against self-interest. Truth is, it would have ended in destruction with any of the candidates on the ballot.

Given the result, my mind wandered to Hillary Clinton’s book It Takes a Village, probably because we, the citizens of the United States of America, have effectively elected the village idiot to the nation’s highest office. Slicing and dicing the voting tallies between the popular vote, electoral votes, and states and counties carried will no doubt be done to death. Paths to victory and defeat will be offered with the handsome benefit of hindsight. Little of that matters, really, when one considers lessons never learned despite ample opportunity. For me, the most basic lesson is that for any nation of people, leaders must serve the interests of the widest constituency, not those of a narrow class of oligarchs and plutocrats. Donald Trump addressed the people far more successfully than did Hillary Clinton (with her polished political doubletalk) and appealed directly to their interests, however base and misguided.

My previous post called Barstool Wisdom contained this apt quote from The Brothers Karamazov by Dostoevsky:

The more stupid one is, the closer one is to reality. The more stupid one is, the clearer one is. Stupidity is brief and artless, while intelligence squirms and hides itself.

We have already seen that our president-elect has a knack for stating obvious truths no one else dares utter aloud. His clarity in that regard, though coarse, contrasts completely with Hillary’s squirmy evasions. Indeed, her high-handed approach to governance, more comfortable in the shadows, bears a remarkable resemblance to Richard Nixon, who also failed to convince the public that he was not a crook. My suspicion is that as Donald Trump gets better acquainted with statecraft, he will also learn obfuscation and secrecy. Some small measure of that is probably good, actually, though Americans are pining for greater transparency, one of the contemporary buzzwords thrown around recklessly by those with no real interest in it. My greater worry is that through sheer stupidity and bullheadedness, other obvious truths, such as commission of war crimes and limits of various sorts (ecological, energetic, financial, and psychological), will go unheeded. No amount of barstool wisdom can overcome those.

Predictions are fool’s errands. Useful ones, anyway. The future branches in so many possible directions that truly reliable predictions are banal ones, such as the sun rising in the east, death, and taxes. (NTE is arguably another term for megadeath, but I gotta reinforce that prediction to keep my doomer bona fides.) Now, only a few days prior to the general election, I find myself anxious that the presidential race is still too close to call. More than a few pundits say that Donald Trump could actually win. At the same time, a Hillary Clinton win gives me no added comfort, really. Moreover, potential squabbles over the outcome threaten to turn the streets into riot zones. I had rather expected such disruptions during or after the two nominating conventions, but the parties settled on their presumptive nominees without drama.

Polls are often predictive, of course, and despite their acknowledged margins of error, they typically forecast results with enough confidence that many voters don’t bother to vote, safe in the assumption that predicted results (an obvious oxymoron) make moot the need to actually cast one’s vote. (The West Coast must experience this phenomenon more egregiously than the East Coast, except perhaps for California’s rather large population and voting power. Has Hawaii ever mattered?) For that reason alone, I’d like to see a blackout on polling in the weeks leading up to an election (2–3 ought to do), including election day. This would allow us to avoid repeating the experience of the Chicago Daily Tribune publishing the headline “Dewey Defeats Truman” back in 1948.

Analysis of voting patterns and results also dissuades voters from considering anything other than a strategic vote for someone able to actually win, as opposed to supporting worthy candidates polling far enough behind that they don’t stand a chance of winning, thus reinforcing a two-party system no one really likes because it keeps delivering supremely lousy candidates. Jesse Ventura, having defied the polls and been elected to office as an independent, has been straightforward about his disdain for the notion that voting outside the two main parties is tantamount to throwing away one’s vote. A related meme is that votes for Green Party candidate Ralph Nader in 2000 effectively split the Democratic vote, handing the win (extraordinarily close and contestable though it was) to George Bush. My thinking aligns with Jesse Ventura’s, not with those who view votes for Ralph Nader as betrayals.

If the presidential race is still too close for comfort, Michael Moore offers a thoughtful explanation of how Trump could win:

This excerpt from Moore’s new film TrumpLand has been taken out of context by many pro-Trump ideologues. I admit the first time I saw it I was unsure whether Moore supports Trump. Additional remarks elsewhere indicate that he does not. The spooky thing is that as emotional appeals go, it’s clear that Trump connects with the people powerfully. But Moore is right about another thing: to vote for Trump is really a giant “fuck you” to the establishment, which won’t end well.

This is a continuation from part 1.

A long, tortured argument could be offered as to how we (in the U.S.) are governed by a narrow class of plutocrats (both now and at the founding) who not-so-secretly distrust the people and the practice of direct democracy, employing instead mechanisms found in the U.S. Constitution (such as the electoral college) to transfer power away from the people to so-called experts. I won’t indulge in a history lesson or other analysis, but it should be clear to anyone who bothers to look that typical holders of elected office (and their appointees) more nearly resemble yesteryear’s landed gentry than the proletariat. Rule by elites is thus quite familiar to us despite plenty of lofty language celebrating the common man and stories repeated ad nauseam of a few exceptional individuals (exceptional being the important modifier here) who managed to bootstrap their way into the elite from modest circumstances.

Part 1 started with deGrasse Tyson’s recommendation that experts/elites should pitch ideas at the public’s level and ended with my contention that some have lost their public by adopting style or content that fails to connect. In the field of politics, I’ve never quite understood the obsession with how things present to the public (optics) on the one hand and obvious disregard for true consent of the governed on the other. For instance, some might recall pretty serious public opposition before the fact to invasion of Afghanistan and Iraq in response to the 9/11 attacks. The Bush Administration’s propaganda campaign succeeded in buffaloing a fair percentage of the public, many of whom still believe the rank lie that Saddam Hussein had WMDs and represented enough of an existential threat to the U.S. to justify preemptive invasion. Without indulging in conspiratorial conjecture about the true motivations for invasion, the last decade plus has proven that opposition pretty well founded, though it went unheeded.


rant on/

This is the time of year when media pundits pause to look back and consider the previous year, typically compiling unasked-for “best of” lists to recap what everyone may have experienced — at least if one is absorbed by entertainment media. My interest in such nonsense is passive at best, dismissive at worst. Further, more and more lists are weighed and compiled by self-appointed and guileless fanboys and -girls, some of whom are surprisingly knowledgeable (sign of a misspent youth?) and insightful yet almost uniformly lack the longitudinal view necessary to form circumspect and expert opinions. The analogy would be to seek wisdom from a 20- or 30-something in advance of its acquisition. Sure, people can be “wise beyond their years,” which usually means free of the normal illusions of youth without yet having become a jaded, cynical curmudgeon — post-ironic hipster is still available — but a real, valuable, historical perspective takes more than just 2-3 decades to form.

For instance, whenever I bring up media theory to a youngster (from my point of reckoning), usually someone who has scarcely known the world without 24/7/365 access to all things electronic, he or she simply cannot conceive what it means to be without that tether/pacifier/security blanket smothering them. It doesn’t feel like smothering because no other information environment has ever been experienced (excepting perhaps in early childhood, but even that’s not guaranteed). Even a brief hiatus from the information blitzkrieg, a two-week vacation, say, doesn’t suffice. Rather, only someone olde enough to remember when it simply wasn’t there — at least in the personal, isolating, handheld sense — can know what it was like. I certainly remember when thought was free to wander, invent, and synthesize without pressure to incorporate a continuous stream of incoming electronic stimuli, most of which amounts to ephemera and marketing. I also remember when people weren’t constantly walled in by their screens and feeds, when life experience was more social, shared, and real rather than private, personal, and virtual. And so that’s why when I’m away from the radio, TV, computer, etc. (because I purposely and pointedly carry none of it with me), I’m less a mark than the typical media-saturated fool face-planted in a phone or tablet for the lures, lies, cons, and swindles that have become commonplace in late-stage capitalism.

Looking back in another sense, I can’t help but feel a little exasperated by the splendid reviews of the life in music led by Pierre Boulez, who died this past week. Never heard of him? Well, that just goes to show how far classical music has fallen from favor that even a titan such as he makes utterly no impression on the general public, only on specialists in a field that garners almost no attention anymore. Yet I defy anyone not to know who Kim Kardashian is. Here’s the bigger problem: despite being about as favorably disposed toward classical music as it is possible to be, I have to admit that no one I know (including quite a few musicians) would be able to hum or whistle or sing a recognizable tune by Boulez. He simply doesn’t pass the whistle test. But John Williams (of Star Wars fame) certainly does. Nor indeed would anyone put on a recording of one of Boulez’s works to listen to. Not even his work as a conductor is all that compelling, either live or on disc (I’ve experienced plenty of both). As one looks back on the life of Pierre Boulez, as one is wont to do upon his passing, how can it be that such prodigious talent as he possessed could be of so little relevance?

Consider these two examples flip sides of the same coin. One enjoys widespread subscription but is base (opinions differ); the other is obscure but (arguably) refined. Put differently, one is pedestrian, the other admirable. Or over a lifetime, one is placebo (or worse), the other fulfilling. Looking back upon my own efforts and experiences in life, I would much rather be overlooked or forgotten than be petty and (in)famous. Yet mass media conspires to make us all into nodes on a network with goals decidedly other than human respectability or fulfillment. So let me repeat the challenge question of this blog: are you climbing or descending?

rant off/