Posts Tagged ‘Rants’

Continuing from my previous post, Brian Phillips, writing for MTV News, has an article entitled “Shirtless Trump Saves Drowning Kitten: Facebook’s fake-news problem and the rise of the postmodern right.” (Funny title, that.) I navigated to the article via Alan Jacobs’ post at Text Patterns (on my blogroll). Let me consider each in turn.

After chuckling that Phillips is directing his analysis to the wrong audience, an admittedly elitist response on my part, I must further admit that the article is awfully well-written and nails the blithe attitude accompanying epistemological destruction carried out, perhaps unwittingly but too well-established now to ignore, by developers of social media as distinguished from traditional news media. Which would be considered more mainstream today is up for debate. Maybe Phillips has the right audience after all. He certainly gets the importance of controlling the narrative:

Confusion is an authoritarian tool; life under a strongman means not simply being lied to but being beset by contradiction and uncertainty until the line between truth and falsehood blurs and a kind of exhaustion settles over questions of fact. Politically speaking, precision is freedom. It’s telling, in that regard, that Trump supporters, the voters most furiously suspicious of journalism, also proved to be the most receptive audience for fictions that looked journalism-like. Authoritarianism doesn’t really want to convince its supporters that their fantasies are true, because truth claims are subject to verification, and thus to the possible discrediting of authority. Authoritarianism wants to convince its supporters that nothing is true, that the whole machinery of truth is an intolerable imposition on their psyches, and thus that they might as well give free rein to their fantasies.

But Phillips is too clever by half, burying the issue in scholarly style that speaks successfully only to a narrow class of academics and intellectuals, much like the language and memes employed by the alt-right are said to be dog whistles perceptible only to rabid, mouth-breathing bigots. Both charges are probably unfair reductions, though with kernels of truth. Here’s some of Phillips’ overripe language:

Often the battleground for this idea [virtue and respect] was the integrity of language itself. The conservative idea, at that time [20 years ago], was that liberalism had gone insane for political correctness and continental theory, and that the way to resist the encroachment of Derrida was through fortifying summaries of Emerson … What had really happened was that the left had become sensitized to the ways in which conventional moral language tended to shore up existing privilege and power, and had embarked on a critique of this tendency that the right interpreted, with some justification, as an attack on the very concept of meaning.

More plainly, Phillips’ suggestion is that the radical right learned the lessons of Postmodernism (PoMo) even better than did the avant-garde left, the latter having outwitted themselves by giving the right subtle tools used later to outmaneuver everyone. Like other mildly irritating analyses I have read, it’s a statement of inversion: an idea bringing into existence its antithesis, which ironically both proves and undermines the original, though with a dose of Schadenfreude. This was (partially) the subject of a four-part blog series I wrote called “Dissolving Reality” back in Aug. and Sept. 2015. (Maybe half a dozen read the series; almost no one commented.)

So what does Alan Jacobs add to the discussion? He exhibits his own scholarly flourishes. Indeed, I admire the writing but find myself distracted by the writerly nature, which ejects readers from the flow of ideas to contemplate the writing itself. For instance, this:

It turns out that the children of the ruling classes learned their lessons well, so when they inherited positions in their fathers’ law firms they had some extra, and very useful, weapons in their rhetorical armory.

In precisely the same way, when, somewhat later, academic leftists preached that race and gender were the determinative categories of social analysis, members of the future alt-right were slouching in the back rows of their classrooms, baseball caps pulled down over their eyes, making no external motions but in their dark little hearts twitching with fervent agreement.

Terrific capture of the classroom culture in which teachers are steeped. Drawing identity politics more manifestly into the mix is a fairly obvious extrapolation over Phillips and may reflect the results of the presidential election, where pundits, wheeling around to reinterpret results that should not have so surprised them, now suggest Republican victories are a repudiation of leftist moral instruction. The depth of Phillips’ and Jacobs’ remarks is not so typical of most pundits, however, and their follow-up analysis at some point becomes just more PoMo flagellation. Here, Jacobs is even more clearly having some fun:

No longer did we have to fear being brought before the bar of Rational Evidence, that hanging judge of the Enlightenment who had sent so many believers to the gallows! You have your constructs and we have our constructs, and who’s to say which are better, right? O brave new world that hath such a sociology of knowledge in it!

This goes back to the heart of the issue, our epistemological crisis, but I dispute that race and gender are the determinative categories of social analysis, no matter how fashionable they may be in the academy. A simpler and more obvious big picture controls: it’s about life and death. My previous post was about geopolitics, where death is rained down upon foreign peoples and justifying rhetoric is spread domestically. Motivations may be complex and varied, but the destruction of people and truth affects everyone, albeit unevenly, without regard to race, gender, religion, nationality, etc. All are caught in the dragnet.

Moreover, since the advent of Western civilization, intellectuals have always been sensitive to the sociology of knowledge. It’s a foundation of philosophy. That it’s grown sclerotic long precedes PoMo theory. In fact, the gradual breaking apart and dismantling of meaning is visible across all expressive genres, not just literature. In painting, it was Impressionism, Cubism, Dada and Surrealism, and Abstract Expressionism. In architecture, it was Art Deco, the International Style, Modernism, Brutalism, and Deconstructivism. In music, it was the Post-Romantic, the Second Viennese School, Modernism, Serialism, and Minimalism. In scientific paradigms, it was electromagnetism, relativity, quantum mechanics, the Nuclear Era, and semiconductors. The most essential characteristics in each case are increasingly dogmatic abstraction and drilling down to minutiae that betray meaningful essences. Factoring in economic and political perversions, we arrive at our current epistemological phase, where truth and consequences matter little (though death and destruction still do) so long as deceits, projections, and distractions hold minds in thrall. In effect, gravity is turned off and historical narratives levitate until reality finally, inevitably comes crashing down in a monstrous Jenga pile, as it does periodically.

In the meantime, I suppose Phillips and Jacobs can issue more gaseous noise into the fog bank the information environment has become. They can’t get much traction (nor can I) considering how most of the affluent West thinks at the level of a TV sitcom. In addition, steps being considered to rein in the worst excesses of fake news would have corporations and traditional news media appointed as watchers and censors. Beyond any free speech objections, which are significant, expecting culprits to police themselves only awards them greater power to dominate, much like bailouts rewarded the banks. More fog, more lies, more levitation.

/rant on

With a new round of presidential debates upon us (not really debates if one understands the nature of debate or indeed moderation — James Howard Kunstler called it “the gruesome spectacle of the so-called debate between Trump and Clinton in an election campaign beneath the dignity of a third-world shit-hole”), it’s worthwhile to keep in the front of one’s mind that the current style of public discourse does not aim to provide useful or actionable information with regard to either the candidates or the issues. Rather, the idea is to pummel the hapless listener, watcher, or reader into a quivering jangle of confusion by maintaining a nonstop onslaught of soundbites, assertions, accusations, grandstanding, and false narratives. Our information environment abets this style of machine-gun discourse, with innumerable feeds from InstaGoogTwitFaceTube (et cetera), all vying simultaneously for our limited attention and thereby guaranteeing that virtually nothing makes a strong impression before the next bit of BS displaces it in a rapid succession of predigested morsels having no nutritional content or value for earnest consumers of information (as opposed to mouth-breathers seeking emotional salve for their worst biases and bigotry). Many feeds are frankly indecipherable, such as when the message is brutally truncated and possessed of acronyms and hashtags, the screen is cluttered with multiple text scrolls, or panel participants talk over each other to claim more screen time (or merely raise their asshole quotient by being the most obnoxious). But no matter so long as the double barrels keep firing.

I caught Republican nominee Donald Trump’s campaign manager Kellyanne Conway being interviewed by some banal featherweight pulling punches (sorry, no link, but she’s eminently searchable). Conway proved adept at deflecting obvious contradictions and reversals (and worse) of the Trump campaign by launching so many ideological bombs that nothing the interviewer raised actually landed. Questions and conflicts just floated away, unaddressed and unanswered. Her bizarre, hyperverbal incoherence is similar to the candidate’s stammering word salad, and ironically, both give new meaning to the decades-old term “Teflon” when applied to politics. Nothing sticks because piling on more and more complete wrongness and cognitive dissonance overwhelms and bewilders anyone trying to track the discussion. Trump and Conway are hardly alone in this, of course, though their mastery is notable (but not admirable). Talking heads gathered in panel discussions on, say, The View or Real Time with Bill Maher, just about any klatch occupying news and morning-show couches, and hosts of satirical news shows (some mentioned here) exhibit the same behavior: a constant barrage of high-speed inanity (and jokes, omigod the jokes!) that discourages consideration of an idea before driving pell-mell onto the next.

Thoughtful persons might pause to wonder whether breathless, even virtuoso delivery results from or creates our abysmally short attention spans and lack of serious discussion of problems plaguing the nation. Well, why can’t it be both? Modern media is all now fast media, delivering hit-and-run spectacle to overloaded nervous systems long habituated to being goosed every few moments. (Or as quoted years ago, “the average Hollywood movie has become indistinguishable from a panic attack.”) Our nervous systems can’t handle it, obviously. We have become insatiable information addicts seeking not just the next fix but a perpetual fix, yet the impatient demand for immediate gratification — Internet always at our fingertips — is never quelled. Some new bit will be added to the torrent of foolishness sooner than it can be pulled down. And so we stumble like zombies, blindly and willingly, into a surreality of our own making, heads down and faces blue from the glare of the phone/tablet/computer. Of course, the shitshow is brightly festooned with buffoon candidates holding court over the masses neither candidate intends to serve faithfully in office. Their special brand of insanity is repeated again and again throughout the ranks of media denizens (celebrity is a curse, much like obscene wealth, or didn’t you know that?) and is seeping into the ground water to poison all of us.

/rant off

/rant on

Monastic pursuit of a singular objective, away from the maddening and distracting rush of modern life, is a character attribute that receives more than its rightful share of attention. In its salutary forms, monastic pursuit is understood as admirable, visionary, iconic (or iconoclastic), and heroic. In creative endeavors, seclusion and disengagement from feedback are preconditions for finding one’s true voice and achieving one’s vision. In sports, the image of the athlete devoted to training for the big event — race, match, tournament — to the exclusion of all else is by now a tired trope. Indeed, in this Olympics season, athlete profiles — puff pieces of extraordinary predictability — typically depict competitors in isolation, absolutely no one else at the gym, in the pool, on the track, etc., as though everyone goes it alone without the support or presence of coaches or teammates. Over-specialization and -achievement are such that spectators are conditioned to expect successful individuals, champions, to bleed (quite literally) as a mark of devotion to their respective fields.

At some point, however, monastic pursuit morphs into something more recognizably maniacal. The author retreating to his cabin in the woods to write the great American novel becomes the revolutionary hermit composing his political manifesto. Healthy competition among rivals turns into decidedly unsportsmanlike conduct. (Lance Armstrong is the poster boy not just for doping but also for the sociopathy he displayed mistreating teammates and perpetuating the lie as vehemently and as long as he did. Further examples compound quickly in sports.) Business leaders, discontented with (sometimes obscene) profitability, target others in their market sector with the intent of driving them out of business and establishing monopolies. (This contrasts markedly with the ideology of self-correcting markets many CEOs falsely espouse.) In politics, high-minded campaigns and elected politicians formed around sound policy and good governance lose out to such dirty tricks as character assassination, rigged and stolen elections, partisanship, and reflexive obstructionism of projects that enjoy popular support. In journalism, fair and balanced reporting inverts to constant harping on preferred talking points to control narratives through sheer force of repetition. You get the idea.

It’s difficult to say from where this intemperate impulse arises, but we’re undoubtedly in a phase of history where nearly every field of endeavor manifests its own version of the arms race. Some might argue that in a cost-benefit analysis, we’re all better off because we enjoy fruits not obtainable without (some folks at least) taking a scorched-earth approach, raising the bar, and driving everyone to greater heights. The willingness of some to distort and disgrace themselves hideously may be a high price to pay, especially when it’s for simple entertainment, but so long as we aren’t paying the price personally, we’re willing spectators to whatever glory and train wrecks occur. I would argue that, ultimately, we’re all paying the price. Routine competition and conflict resolution have grown so unhinged that, just to be in the game, competitors must be prepared to go all in (poker lingo) at even modest provocation. As a result, for just one example, the spirit of America’s erstwhile pastime (baseball) has been so corrupted that balanced players and fans (!) stay away and are replaced by goons. A true level playing field probably never existed. Now, however, whoever can muster the most force (financial, rhetorical, criminal) wins the trophy, and we’re each in turn encouraged to risk all in our own monastic pursuit.

/rant off

Over at Gin and Tacos, the blogger has an interesting take on perverse incentives that function to inflate grades (and undermine learning), partly by encouraging teachers to give higher grades than deserved at the first hint of pushback from consumers (students, parents, or administrators). The blog post is more specifically about Why Johnny Can’t Write and references a churlish article in Salon. All well and good. The blog author provides consistently good analysis as a college professor intimate with the rigors of higher education and the often unprepared students deposited in his classroom. However, this comment got my attention in particular. The commentator is obviously a troll, and I generally don’t feed trolls, so I made only one modest comment in the comments section. Because almost no one reads The Spiral Staircase, certainly no one from the estimable Gin and Tacos crowd, I’ll indulge myself, not the troll, by examining briefly the main contention, which is that quality of writing, including correct grammar, doesn’t matter most of the time.

Here’s most of the comment (no link to the commentator’s blog, sorry):

1. Who gives a flying fuck about where the commas go? About 99.999999999999999% of the time, it makes NO DIFFERENCE WHATSOEVER in terms of understanding somebody’s point if they’ve used a comma splice. Is it a technical error? Sure. Just like my unclear pronoun reference in the last sentence. Did you understand what I meant? Unless you were actively trying not to, yes, you did.

2. There’s are hundreds of well-conducted peer-reviewed studies by those of us who actually specialize in writing pedagogy documenting the pointlessness of teaching grammar skills *unless students give a fuck about what they’re writing.* We’ve known this since the early 1980s. So when the guy from the high school English department in the article says they can’t teach grammar because students think it’s boring, he’s unwittingly almost making the right argument. It’s not that it’s boring–it’s that it’s irrelevant until the students have something they want to say. THEN we can talk about how to say it well.

Point one is that people manage to get their points across adequately without proper punctuation, and point two is that teaching grammar is accordingly a pedagogical dead end. Together, they assert that structure, rules, syntax, and accuracy make no difference so long as communication occurs. Whether one takes the hyperbole “99.999999999999999% of the time” as the equivalent of all the time, almost all the time, most of the time, etc. is not of much interest to me. Red herring served by a troll.

/rant on

As I’ve written before, communication divides neatly into receptive and expressive categories: what we can understand when communicated to us and what we can in turn communicate effectively to others. The first category is the larger of the two and is greatly enhanced by concerted study of the second. Thus, reading comprehension isn’t merely a matter of looking up words in the dictionary but learning how it’s customary and correct to express oneself within the rules and guidelines of Standard American English (SAE). As others’ writing and communication become more complex, competent reception is more nearly an act of deciphering. Being able to parse sentences, grasp paragraph structure, and follow the logical thread (assuming those elements are handled well) is essential. That’s what being literate means, as opposed to being semi-literate — the fate of lots of adults who never bothered to study.

To state flatly that “good enough” is good enough is to accept two unnecessary limitations: (1) that only a portion of someone’s full, intended message is received and (2) that one’s own message is expressed with no better than adolescent sophistication, if that. Because humans are not mind readers, some loss of fidelity between communicated intent and receipt is acknowledged. To limit oneself further to lazy and unsophisticated usage is, by way of analogy, to reduce the full color spectrum to black and white. Further, the suggestion that students can learn to express themselves properly once they have something to say misses the whole point of education, which is to prepare them with adult skills in advance of need.

As I understand it, modern American schools have shifted their English curricula away from the structural, prescriptive approach toward licentious usage just to get something onto the page, or in a classroom discussion, just to choke something out of students between the hemming ums, ers, uhs, ya knows, and I dunnos. I’d like to say that I’m astonished that researchers could provide cover for not bothering to learn, relieving both teachers and students of the arduous work needed to develop competence, if not mastery. That devaluation tracks directly from teachers and administrators to students and parents, the latter of whom would rather argue for their grades than earn them. Pity the fools who grub for grades without actually learning and are left holding worthless diplomas and degrees — certificates of nonachievement. In all likelihood, they simply won’t understand their own incompetence because they’ve been told all their lives what special flowers they are.

/rant off

/rant on

This is the time of year when media pundits pause to look back and consider the previous year, typically compiling unasked-for “best of” lists to recap what everyone may have experienced — at least if one is absorbed by entertainment media. My interest in such nonsense is passive at best, dismissive at worst. Further, more and more lists are weighed and compiled by self-appointed and guileless fanboys and -girls, some of whom are surprisingly knowledgeable (sign of a misspent youth?) and insightful yet almost uniformly lack the longitudinal view necessary to form circumspect and expert opinions. The analogy would be to seek wisdom from a 20- or 30-something in advance of its acquisition. Sure, people can be “wise beyond their years,” which usually means free of the normal illusions of youth without yet having become a jaded, cynical curmudgeon — post-ironic hipster is still available — but a real, valuable, historical perspective takes more than just 2-3 decades to form.

For instance, whenever I bring up media theory to a youngster (from my point of reckoning), usually someone who has scarcely known the world without 24/7/365 access to all things electronic, he or she simply cannot conceive what it means to be without that smothering tether/pacifier/security blanket. It doesn’t feel like smothering because no other information environment has ever been experienced (excepting perhaps in early childhood, but even that’s not guaranteed). Even a brief hiatus from the information blitzkrieg, a two-week vacation, say, doesn’t suffice. Rather, only someone old enough to remember when it simply wasn’t there — at least in the personal, isolating, handheld sense — can know what it was like. I certainly remember when thought was free to wander, invent, and synthesize without pressure to incorporate a continuous stream of incoming electronic stimuli, most of which amounts to ephemera and marketing. I also remember when people weren’t constantly walled in by their screens and feeds, when life experience was more social, shared, and real rather than private, personal, and virtual. And so that’s why, when I’m away from the radio, TV, computer, etc. (because I purposely and pointedly carry none of it with me), I’m less a mark for the lures, lies, cons, and swindles that have become commonplace in late-stage capitalism than the typical media-saturated fool face-planted in a phone or tablet.

Looking back in another sense, I can’t help but feel a little exasperated by the splendid reviews of the life in music led by Pierre Boulez, who died this past week. Never heard of him? Well, that just goes to show how far classical music has fallen from favor when even a titan such as he makes utterly no impression on the general public, only on specialists in a field that garners almost no attention anymore. Yet I defy anyone not to know who Kim Kardashian is. Here’s the bigger problem: despite being about as favorably disposed toward classical music as it is possible to be, I have to admit that no one I know (including quite a few musicians) would be able to hum or whistle or sing a recognizable tune by Boulez. He simply doesn’t pass the whistle test. But John Williams (of Star Wars fame) certainly does. Nor indeed would anyone put on a recording of one of Boulez’s works to listen to. Not even his work as a conductor is all that compelling, either live or on disc (I’ve experienced plenty of both). As one looks back on the life of Pierre Boulez, as one is wont to do upon his passing, how can it be that such prodigious talent as he possessed could be of so little relevance?

Consider these two examples flip sides of the same coin. One enjoys widespread subscription but is base (opinions differ); the other is obscure but (arguably) refined. Put differently, one is pedestrian, the other admirable. Or over a lifetime, one is placebo (or worse), the other fulfilling. Looking back upon my own efforts and experiences in life, I would much rather be overlooked or forgotten than be petty and (in)famous. Yet mass media conspires to make us all into nodes on a network with goals decidedly other than human respectability or fulfillment. So let me repeat the challenge question of this blog: are you climbing or descending?

/rant off

A little more content lite (even though my complaint is unavoidable). Saw on Motherboard a report on a first-person, Web-based shopping game about Black Friday zombie mall shoppers. You can play here. It’s pure kitsch but does reinforce the deplorable behaviors of sale-crazed shoppers swarming over each other to get at goodies (especially cheap electronics), sometimes coming to blows. Videos of 2015 Black Friday brawls appeared almost immediately.

We apparently learn nothing year-over-year as we reenact our ritual feeding frenzy, lasting all the way through New Year’s Eve. (I never go out on Black Friday.) I might have guessed that big box retailers face diminishing returns with store displays torn apart, disgruntled shoppers, traumatized employees, and the additional cost of rent-a-cops to herd the masses and maintain order (which obviously doesn’t work in many instances). Yet my e-mail inbox keeps loading up with promotions and advertisements, even a day later. The video game in particular reminds me of Joe Bageant’s great line: “We have embraced the machinery of our undoing as recreation.”

The video below came to my attention recently, which shows a respectable celebrity, violinist/conductor Itzhak Perlman, being dicked around in an interview he probably undertook in good faith. My commentary follows.

Publicized pranks and gotchas are by no means rare. Some are good-natured and quite funny, but one convention of the prank is to unmask it pretty quickly. In the aftermath, the target typically either laughs it off, leaves without comment, or less often, storms out in disgust. Andy Kaufman as “Tony Clifton” was probably among the first to sustain a prank well past the point of discomfort, never unmasking himself. Others have since gotten in on the antics, though the results are probably no worse in dickishness (dickery?) than Kaufman’s.

Fake interviews by comedians posing as news people are familiar to viewers of The Daily Show and its spinoff The Colbert Report (its run now completed). Zach Galifianakis does the same schtick in Between Two Ferns. It always surprises me when targets fall into the trap, exposing themselves as clueless ideologues willing to be hoisted with their own petards. However, Colbert in particular balanced his arch Republican stage persona with an unmistakable respect for his interview subject, which was at times inspired. Correspondents from The Daily Show are frequently pretty funny, but they almost never convey any respect for the subjects of the interview. Nick Canellakis (shown above) apparently has a whole series of interviews with classical musicians where he feigns idiocy and insult. Whereas some interview subjects are media savvy enough to get the joke and play along, I find this attempt at humor tasteless and unbearable.

Further afield, New Media Rockstars features a burgeoning list of media hosts who typically operate cheaply over the Web via YouTube, supported by an array of social media. At least one, Screen Junkies (the only one I watch), has recently blown up into an entire suite of shows. I won’t accuse them all of being talentless hacks or dicking people around for pointless yuks, but I often pause to wonder what makes the shows worth producing beyond the hosts’ embarrassingly encyclopedic knowledge of comics, cartoons, TV shows, movies, etc. They’re fanboys (and girls) who have leveraged their misspent youth and eternal adolescence to gush and gripe about their passions. Admittedly, this may not be so different from sports fanatics (especially human statisticians), opera geeks, and nerds of other stripes.

Throwaway media may have unintentionally smuggled in tasteless shenanigans such as those by Nick Canellakis. Various comedians (unnamed) have similarly offered humorless discomfort as entertainment. Reality TV shows explored this area a while back, which I called trainwreck television. Cheaply produced video served over the Web has unleashed a barrage of dreck in all these categories. Some shows may eventually find their footing and become worthwhile. In the meantime, I anticipate seeing plenty more self-anointed media hosts dicking around celebrities and audiences alike.

I received an e-mail with the usual ranting about some travesty by an anonymous Internet troll. These are always forwarded to me by a family member. I can’t decide whether this rant (grammatical and punctuation errors uncorrected) is more nearly economic or social. We should have a word like socioeconomic to cover both. Oh, wait … My counter-rant follows.

This is why  people are supporting TRUMP! From a Florida ER doctor:

I live and work in a state overrun with Illegal’s. They make more money having kids than we earn working full-time.

Today I had a 25-year old with 8 kids – that’s right 8, all Illegal Anchor Babies and she had the nicest nails, cell phone, hand bag, clothing, etc. She makes about $1,500 monthly for each; you do the math.

I used to say, “We are the dumbest nation on earth,” Now I must say and sadly admit: WE are the dumbest people on earth (that includes ME) for we Elected the Idiot Ideologues who have passed the Bills that allow this. Sorry, but we need a Revolution,

If the Illegal Immigrant is over 65, they can apply for SSI and Medicaid and get more than a woman on Social Security, who worked from 1944 until 2004. She is only getting $791 per month because she was born in 1924 and there’s a ‘catch 22’ (notch) for her. It is interesting that the Federal Government provides a single refugee with a monthly allowance of $1,890. Each can also obtain an additional $580 in Social Assistance, for a total of $2,470 a month. This compares to a single pensioner, who after contributing to the growth and development of America for 40 to 50 years, can only receive a monthly maximum of $1,012 in Old Age Pension and Guaranteed Income Supplement. Maybe our Pensioners should apply as Refugees!

Consider sending this to all your American friends, so we can all be ticked off and maybe get the Refugees cut back to $1,012 and the Pensioners up to $2,470. Then we can enjoy some of the money we were forced to submit to the Government over the last 40 or 50 or 60 years.

PLEASE SEND THIS TO EVERY AMERICAN TAXPAYER YOU  KNOW! We need a real change that will be healthy for America!

No way was that penned by a Florida ER doc. Educated, licensed professionals (doctors, lawyers, engineers, CPAs) do not speak or write incoherent screed and straight-up lies like that — at least until they become presidential candidates. The mention of Florida should invalidate that bogus appeal to authority all by itself, considering what sorts of craziness come out of that state. It’s far more likely that it was written by some anonymous Tea Party supporter, typically a cranky old white person who can feel him- or herself being overwhelmed by an unstoppable demographic wave (just like the rest of us).


Updates to my blogroll are infrequent. I only add blogs that present interesting ideas (with which I don't always agree) and/or admirable writing. Deletions are typically the result of a change of focus at the linked blog, or regrettably, the result of a blogger becoming abusive or self-absorbed. This time, it's the latter. So alas, another one bites the dust. Dropping off my blogroll — no loss since almost no one reads my blog — is On an Overgrown Path (no link), which is about classical music.

My indignation isn’t about disagreements (we’ve had a few); it’s about inviting discussion in bad faith. I’m very interested in contributing to discussion and don’t mind moderated comments to contend with trolls. However, my comments drive at ideas, not authors, and I’m scarcely a troll. Here’s the disingenuously titled blog post, “Let’s Start a Conversation about Concert Hall Sound,” where the blogger declined to publish my comment, handily blocking conversation. So for maybe the second time in the nearly 10-year history of this blog, I am reproducing the entirety of another’s blog post (minus the profusion of links, since that blogger tends to create link mazes, defying readers to actually explore) followed by my unpublished comment, and then I’ll expound and perhaps rant a bit. Apologies for the uncharacteristic length. (more…)

“Any sufficiently advanced technology is indistinguishable from magic.” –Arthur C. Clarke

/rant on

Jon Evans at TechCrunch has an idiot opinion article titled "Technology Is Magic, Just Ask The Washington Post" that has gotten under my skin. His risible assertion that the WaPo editorial board uses magical thinking misframes the issue of whether police and other security agencies ought to have backdoor or golden-key access to end-users' communications carried over electronic networks. He marshals a few experts in the field of encryption and information security (shortened to "infosec" — my, how hep) who insist that even if such a thing (security that is porous to select people or agencies only) were possible, that demand is incompatible with the whole idea of security and indeed privacy. The whole business strikes me as a straw man argument. Here is Evans' final paragraph:

If you don’t understand how technology works — especially a technical subgenre as complex and dense as encryption and information security — then don’t write about it. Don’t even have an opinion about what is and isn’t possible; just accept that you don’t know. But if you must opine, then please, at least don’t pretend technology is magic. That attitude isn’t just wrong, it’s actually dangerous.

Evans is pushing on a string, making the issue seem as though agencies that simply want what they want believe in turn that those things come into existence by the snap of one's fingers, or magically. But in reality, beyond the hyperbole, absolutely no one believes that science and technology are magic. Rather, software and human-engineered tools are plainly understood as mechanisms we design and fabricate through our own effort even if we don't understand the complexity of the mechanism under the hood. Further, everyone beyond the age of 5 or 6 loses faith in magical entities such as the Tooth Fairy, unicorns, Fairy Godmothers, etc. at about the same time that Santa Claus is revealed to be a cruel hoax. A sizable segment of the population for whom the Reality Principle takes firm root goes on to lose faith in progress, humanity, religion, and god (which version becomes irrelevant at that point). Ironically, the typically unchallenged thinking that technology delivers, among other things, knowledge, productivity, leisure, and other wholly salutary effects — the very thinking a writer for TechCrunch might exhibit — falls under the same category.

Who are these magical creatures who believe their smartphones, laptops, TVs, vehicles, etc. are themselves magical simply because their now routine operations lie beyond the typical end-user's technical knowledge? And who besides Arthur C. Clarke and a few ideologues is prone to calling out the bogus substitution of magic for mechanism? No one, really. Jon Evans does no one any favors by raising this argument — presumably just to puncture it.

If one were to observe how people actually use the technology now available in, say, handheld devices with 24/7/365 connection to the Internet (so long as the batteries hold out, anyway), it’s not the device that seems magical but the feeling of being connected, knowledgeable, and at the center of activity, with a constant barrage of information (noise, mostly) barreling at them and defying them to turn attention away lest something important be missed. People are so dialed into their devices, they often lose touch with reality, much like the politicians who no longer relate to or empathize with voters, preferring to live in their heads with all the chatter, noise, news, and entertainment fed to them like an endorphin drip. Who cares how the mechanism delivers, so long as supply is maintained? Similarly, who cares how vaporware delivers unjust access? Just give us what we want! Evans would do better to argue against the unjust desire for circumvention of security rather than its presumed magical mechanism. But I guess that idea wouldn’t occur to a technophiliac.

/rant off