Archive for the ‘Writing’ Category

Most of us are familiar with a grandpa, uncle, or father who eventually turns into a cranky old man during late middle age or in his dotage. (Why is it a mostly male phenomenon?) In the last three decades, Clint Eastwood typecast himself as a cranky old man, building on lone-wolf characters (mostly cops, criminals, and cowboys) established earlier in his career. In real life, these guys spout talking points absorbed from mainstream media and narrative managers, or if they are truly lazy and/or can’t articulate anything coherently on their own, merely forward agitprop via e-mail like the chain letters of yore. They also demonstrate remarkably forgivable racism, sexism, and bigotry, such as Eastwood’s rather enjoyable and ultimately redeemed character in the film Gran Torino. If interaction with such a fellow is limited to Thanksgiving gatherings once per year, crankiness can be tolerated fairly easily. If interactions are ongoing, then a typical reaction is simply to delete e-mail messages unread, or in the case of unavoidable face-to-face interaction, to chalk it up: Well, that’s just Grandpa Joe or Uncle Bill or Dad. Let him rant; he’s basically harmless now that he’s so old he creaks.

Except that not all of them are so harmless. Only a handful of the so-called Greatest Generation (I tire of the term but it’s solidly established) remain in positions of influence. However, lots of Boomers still wield considerable power despite their advancing age, looming retirement (and death), and basic out-of-touchness with a culture that has left them behind. Nor are their rants and bluster necessarily wrong. See, for instance, this rant by Tom Engelhardt, which begins with these two paragraphs:

Let me rant for a moment. I don’t do it often, maybe ever. I’m not Donald Trump. Though I’m only two years older than him, I don’t even know how to tweet and that tells you everything you really need to know about Tom Engelhardt in a world clearly passing me by. Still, after years in which America’s streets were essentially empty, they’ve suddenly filled, day after day, with youthful protesters, bringing back a version of a moment I remember from my youth and that’s a hopeful (if also, given Covid-19, a scary) thing, even if I’m an old man in isolation in this never-ending pandemic moment of ours.

In such isolation, no wonder I have the urge to rant. Our present American world, after all, was both deeply unimaginable — before 2016, no one could have conjured up President Donald Trump as anything but a joke — and yet in some sense, all too imaginable …

If my own father (who doesn’t read this blog) could articulate ideas as well as Engelhardt, maybe I would stop deleting unread the idiocy he forwards via e-mail. Admittedly, I could well be following in my father’s footsteps, as the tag rants on this blog indicates, but at least I write my own screed. I’m far less accomplished at it than, say, Engelhardt, Andy Rooney (in his day), Ralph Nader, or Dave Barry, but then, I’m only a curmudgeon-in-training, not having fully aged (or elevated?) yet to cranky old manhood.

As the fall presidential election draws near (assuming that it goes forward), the choice in the limited U.S. two-party system is between two cranky old men, neither of whom is remotely capable of guiding the country through this rough patch at the doomer-anticipated end of human history. Oh, and BTW, echoing Engelhardt’s remark above, 45 has been a joke all of my life — a dark parody of success — and remains so despite occupying the Oval Office. Their primary opponent until only a couple of months ago was Bernie Sanders, himself a cranky old man but far more endearing at it. This is what passes for the best leadership on offer?

Many Americans are ready to move on to someone younger and more vibrant, able to articulate a vision for something, well, different from the past. Let’s skip right on past candidates (names withheld) who parrot the same worn-out ideas as our fathers and grandfathers. Indeed, a meme emerged recently to the effect that the Greatest Generation saved us from various early 20th-century scourges (e.g., Nazis and Reds) only for the Boomers to proceed in their turn to mess up the planet so badly nothing will survive new scourges already appearing. It may not be fair to hang such labels uniformly around the necks of either generation (or subsequent ones); each possesses unique characteristics and opportunities (some achieved, others squandered) borne out of their particular moment in history. But this much is clear: whatever happens with the election and whichever generational cohort assumes power, the future is gonna be remarkably different.

This is an infrequent feature of this blog: additions to and deletions from my blogroll. Other bloggers attract my attention for various reasons, mostly the quality of writing and ideas (interrelated), but over time, some start to repel me. This update has several in both categories.

At Wit’s End, Three-Pound Brain, and Bracing Views were all added a while back. The first two post new material only infrequently, but the quality is very high (IMO). The last is far more active and solicits commentary openly. Subject matter at these blogs varies widely, and only the third could be accused of being an outrage engine. It’s a worthwhile read nonetheless if political dysfunction doesn’t ignite in you a firestorm of rage and indignation.

Dropping Creative Destruction, Gin & Tacos, and Pharyngula. The first has been dead for a long time; nothing there to see anymore besides the backblog. I thought it might eventually revive, but alas, no. Updates to the second have dropped significantly as authorial attention shifted to podcasting. The commentariat there was especially worthwhile, but with so few new posts, the disappearance of whimsical history lessons, and an irritating focus on horse-race politics, the blog has lost my recommendation. The third used to be a fun read, especially for being well argued. The tone shifted at some point toward smug, woke fellating of an in-group, by definition excluding everyone else. Like another unmentioned blog dropped from my blogroll some years ago, the author behaves like an omniscient bully: being absolutely correct about everything all the time. The lack of humility or tolerance for ambiguity — or even the very human admission once in a while “I dunno …” — is exhausting.

Final admission: traffic to and from this blog is chronically low, so no blogger cares about being added to or removed from my blogroll. No illusions about that on my part. However, respectable curation is a value worth maintaining through periodic updates.

A year ago, I wrote about charges of cultural appropriation being levied upon fiction writers, as though fiction can now only be some watered-down memoir lest some author have the temerity to conjure a character based on someone other than him- or herself. Specifically, I linked to an opinion piece by Lionel Shriver in the NY Times describing having been sanctioned for writing characters based on ideas, identities, and backgrounds other than her own. Shriver has a new article in Prospect Magazine that provides an update, perhaps too soon to survey the scene accurately since the target is still moving, but nonetheless curious with respect to the relatively recent appearance of call-out culture and outrage engines. In her article, Shriver notes that offense and umbrage are now given equal footing with bodily harm and emotional scarring:

Time was that children were taught to turn aside tormentors with the cry, “Sticks and stones may break my bones, but words will never hurt me!” While you can indeed feel injured because Bobby called you fat, the law has traditionally maintained a sharp distinction between bodily and emotional harm. Even libel law requires a demonstration of palpable damage to reputation, which might impact your livelihood, rather than mere testimony that a passage in a book made you cry.

She also points out that an imagined “right not to be offended” is now frequently invoked, even though there is no possibility of avoiding offense if one is actually conscious in the world. For just one rather mundane example, the extraordinary genocidal violence of 20th-century history, once machines and mechanisms (now called WMDs) were applied to warfare (and dare I say it: statecraft), ought to be highly offensive to any humanitarian. That history cannot be erased, though I suppose it can be denied, revised, buried, and/or lost to living memory. Students or others who insist they be excused from being triggered by knowledge of awful events are proverbial ostriches burying their heads in the sand.

As variations of this behavior multiply and gain social approval, the Thought Police are busily mustering against all offense — real, perceived, or wholly imagined — and waging a broad-spectrum sanitation campaign. Shriver believes this could well spell the end of fiction as publishers morph into censors and authors self-censor in an attempt to pass through the SJW gauntlet. Here’s my counter-argument:

rant on/

I feel mightily offended — OFFENDED I say! — at the arrant stupidity of SJWs whose heads are full of straw (and strawmen), who are so clearly confused about what is even possible within the dictates and strictures of, well, reality, and accordingly retreated into cocoons of ideation from which others are scourged for failure to adhere to some bizarre, muddleheaded notion of equity. How dare you compel me to think prescribed thoughts emanating from your thought bubble, you damn bullies? I have my own thoughts and feelings deserving of support, maybe even more than yours considering your obvious naïveté about how the world works. Why aren’t you laboring to promote mine but instead clamoring to infect everyone with yours? Why is my writing so resoundingly ignored while you prance upon the stage demanding my attention? You are an affront to my values and sensibilities and can stuff your false piety and pretend virtue where the sun don’t shine. Go ahead and be offended; this is meant to offend. If it’s gonna be you or me who’s transgressed precisely because all sides of issues can’t be satisfied simultaneously, then on this issue, I vote for you to be in the hot seat.

rant off/

Be forewarned: this is long and self-indulgent. Kinda threw everything and the kitchen sink at it.

In the August 2017 issue of Harper’s Magazine, Walter Kirn’s “Easy Chair” column called “Apocalypse Always” revealed his brief, boyhood fascination with dystopian fiction. This genre has been around for a very long time, to which the Cassandra myth attests. Kirn’s column is more concerned with “high mid-twentieth-century dystopian fiction,” which in his view is now classic and canonical, an entire generation of Baby Boomers having been educated in such patterned thought. A new wave of dystopian fiction appeared in the 1990s and yet another more recently in the form of Young Adult novels (and films) that arguably serve better as triumphal coming-of-age stories albeit under dystopian circumstances. Kirn observes a perennial theme present in the genre: the twin disappearances of freedom and information:

In the classic dystopias, which concern themselves with the lack of freedom and not with surplus freedom run amok (the current and unforeseen predicament of many), society is superbly well organized, resembling a kind of hive or factory. People are sorted, classified, and ranked, their individuality suppressed through goon squads, potent narcotics, or breeding programs. Quite often, they wear uniforms, and express themselves, or fail to, in ritual utterance and gestures.

Whether Americans in 2018 resemble hollowed-out zombies suffering under either boot-heel or soft-serve oppression is a good question. Some would argue just that in homage to classic dystopias. Kirn suggests briefly that we might instead suffer from runaway anarchy, where too much freedom and licentiousness have led instead to a chaotic and disorganized society populated by citizens who can neither govern nor restrain themselves.

Disappearance of information might be understood in at least three familiar aspects of narrative framing: what happened to get us to this point (past as exposition, sometimes only hinted at), what the hell is going on (present as conflict and action), and how it gets fixed (future as resolution and denouement). Strict control over information exercised by classic dystopian despots doesn’t track to conditions under which we now find ourselves, where more disorganized, fraudulent, and degraded information than ever is available alongside small caches of wisdom and understanding buried somewhere in the heap, discoverable only with the benefit of critical thinking flatly lost on at least a couple of generations of miseducated graduates. However, a coherent narrative of who and what we are and what realistic prospects the future may hold has not emerged since the stifling version of the 1950s nuclear family and middle-class consumer contentment. Kirn makes this comparison directly, where classic dystopian fiction

focus[es] on bureaucracy, coercion, propaganda, and depersonalization, overstates both the prowess of the hierarchs and the submissiveness of the masses, whom it still thinks of as the masses. It does not contemplate Trump-style charlatanism at the top, or a narcissistic populace that prizes attention over privacy. The threats to individualism are paramount; the scourge of surplus individualism, with everyone playing his own dunce king and slurping up resources until he bursts, goes unexplored.

Kirn’s further observations are worth a look. Go read for yourself.

I was never much drawn to genre fiction and frankly can’t remember reading Orwell’s 1984 in middle school. It wasn’t until much later that I read Huxley’s Brave New World (reviewed here) and reread 1984. The latest in the dystopian pantheon to darken my brow is Nevil Shute’s On the Beach. None of these are pleasure reading, exactly. However, being landmarks of the genre, I have at times felt compelled to acquaint myself with them. I never saw the 1959 movie On the Beach with Gregory Peck, but I did see the updated 2000 remake with Armand Assante. It rather destroyed me — forcing me for the first time to contemplate near-term human extinction — but was mostly forgotten. Going back recently to read the original 1957 novel, I had a broad recollection of the scenario but little of the detail.


My previous entry on this topic is found here. The quintessential question asked with regard to education (often levied against educators) is “Why can’t Johnnie read?” I believe we now have several answers.

Why Bother With Basics?

A resurrected method of teaching readin’ and writin’ (from the 1930s as it happens) is “freewriting.” The idea is that students who experience writer’s block should dispense with basic elements such as spelling, punctuation, grammar, organization, and style to simply get something on the page, coming back later to revise and correct. I can appreciate the thinking, namely, that students so paralyzed by an inability to produce finished work extemporaneously should focus first on vomiting (er, blasting) something onto the page. Whether those who use freewriting actually go back to edit (as I do) is unclear, but beginning with the proper rudiments is not a high hurdle.

Why Bother Learning Math?

At Michigan State University, the algebra requirement has been dropped from its general education requirements. Considering that algebra is a basic part of most high school curricula, jettisoning algebra from the university core curriculum is astonishing. Again, it’s not a terribly high bar to clear, but for someone granted a degree from an institution of higher learning to fail to do so is remarkable. Though the rationalization offered at the link above is fairly sophisticated, it sounds more like Michigan State is just giving up asking its students to bother learning. The California State University system has adopted a similar approach. Wayne State University also dropped its math requirement and upped the ante by recommending a new diversity requirement (all the buzz with social justice warriors).

Why Bother Learning Music?

The Harvard Crimson reports changes to the music curriculum, lowering required courses for the music concentration from 13 to 10. Notably, most of the quotes in the article are from students relieved to have fewer requirements to satisfy. The sole professor quoted makes a bland, meaningless statement about flexibility. So if you want a Harvard degree with a music concentration, the bar has been lowered. But this isn’t educational limbo, where the difficulty is increased as the bar goes down; it’s a change from higher education to not-so-high-anymore education. Not learning very much about music has never been a barrier to success, BTW. Lots of successful musicians don’t even read music.

Why Bother Learning History?

According to some conservatives, U.S. history curricula, in particular this course offered by The College Board, teach what’s bad about America and undermine American exceptionalism. In 2015, the Oklahoma House Common Education Committee voted 11-4 for emergency House Bill 1380 (authored by Rep. Dan Fisher) “prohibiting the expenditure of funds on the Advanced Placement United States History course.” This naked attempt to sanitize U.S. history and substitute preferred (patriotic) narratives is hardly a new phenomenon in education.

Takeaway

So why can’t Johnnie read, write, know, understand, or think? Simply put, because we’re not bothering to teach him to read, write, know, understand, or think. Johnnie has instead become a consumer of educational services and a political football. Has lowering standards ever been a solution to the struggle of getting a worthwhile education? Passing students through just to be rid of them (while collecting tuition) has only produced a mass of miseducated graduates. Similarly, does a certificate, diploma, or granted degree mean anything as a marker of achievement if students can’t be bothered to learn time-honored elements of a core curriculum? The real shocker, of course, is massaging the curriculum itself (U.S. history in this instance) to produce citizens ignorant of their own past and compliant with the jingoism of the present.

I picked up a copy of Daniel Siegel’s book Mind: A Journey to the Heart of Being Human (2017) to read and supplement my ongoing preoccupation with human consciousness. Siegel’s writing is the source of considerable frustration. Now about 90 pp. into the book (I am considering putting it aside), he has committed several grammatical errors (where are book editors these days?), doesn’t really know how to use a comma properly, and doesn’t write in recognizable paragraph form. He has a bad habit of posing questions to suggest the answers he wants to give and drops constant hints of something soon to be explored like news broadcasts that tease the next segment. He also deploys a tired, worn metaphor that readers are on a journey of discovery with him, embarked on a path, exploring a subject, etc. Yecch. (A couple Amazon reviews also note that grayish type on parchment (cream) paper poses a legibility problem due to poor contrast even in good light — undoubtedly not really Siegel’s fault.)

Siegel’s writing is also irritatingly circular, casting and recasting the same sentences in repetitious series of assertions that have me wondering frequently, “Haven’t I already read this?” Here are a couple examples:

When energy flows inside your body, can you sense its movement, how it changes moment by moment?

then only three sentences later

Energy, and energy-as-information, can be felt in your mental experience as it emerges moment by moment. [p. 52]

Another example:

Seeing these many facets of mind as emergent properties of energy and information flow helps link the inner and inter aspect of mind seamlessly.

then later in the same paragraph

In other words, mind seen this way could be in what seems like two places at once as inner and inter are part of one interconnected, undivided system. [p. 53]

This is definitely a bug, not a feature. I suspect the book could easily be condensed from 330 pp. to fewer than 200 pp. if the writing weren’t so self-indulgent. Indeed, while I recognize a healthy dose of repetition is an integral part of narrative form (especially in music), Siegel’s relentless repetition feels like propaganda 101, where guileless insistence (of lies or merely the preferred story one seeks to plant in the public sphere) wears down the reader rather than convinces him or her. This is also marketing 101 (e.g., Coca-Cola, McDonald’s, Budweiser, etc. continuing to advertise what are by now exceedingly well-established brands).


Nick Carr has an interesting blog post (late getting to it as usual) highlighting a problem with our current information environment. In short, the constant information feed to which many of us subscribe and read on smartphones, which I’ve frequently called a fire hose pointed indiscriminately at everyone, has become the new normal. And when it’s absent, people feel anxiety:

The near-universal compulsion of the present day is, as we all know and as behavioral studies prove, the incessant checking of the smartphone. As Begley notes, with a little poetic hyperbole, we all “feel compelled to check our phones before we get out of bed in the morning and constantly throughout the day, because FOMO — the fear of missing out — fills us with so much anxiety that it feels like fire ants swarming every neuron in our brain.” With its perpetually updating, tightly personalized messaging, networking, searching, and shopping apps, the smartphone creates the anxiety that it salves. It’s a machine almost perfectly designed to turn its owner into a compulsive … from a commercial standpoint, the smartphone is to compulsion what the cigarette pack was to addiction.

I’ve written about this phenomenon plenty of times (see here for instance) and recommended that wise folks might adopt a practiced media ecology by regularly turning one’s attention away from the feed (e.g., no mobile media). Obviously, that’s easier for some of us than others. Although my innate curiosity (shared by almost everyone, I might add) prompts me to gather quite a lot of information in the course of the day/week, I’ve learned to be restrictive and highly judgmental about what sources I read, printed text being far superior in most respects to audio or video. No social media at all, very little mainstream media, and very limited “fast media” of the type that rushes to publication before enough is known. Rather, periodicals (monthly or quarterly) and books, which have longer paths to publication, tend to be more thoughtful and reliable. If I could never again be exposed to noise (er, newsbits) with, say, the word “Kardashian,” that would be an improvement.

Also, being aware that the basic economic structure underlying media from the advent of radio and television is to provide content for free (interesting, entertaining, and hyperpalatable perhaps, but simultaneously pointless ephemera) in order to capture the attention of a large audience and then load up the channel with advertisements at regular intervals, I now use ad blockers and streaming media to avoid being swayed by the manufactured desire that flows from advertising. If a site won’t display its content without disabling the ad blocker, which is becoming more commonplace, then I don’t give it my attention. I can’t avoid all advertising, much like I can’t avoid my consumer behaviors being tracked and aggregated by retailers (and others), but I do better than most. For instance, I never saw any Super Bowl commercials this year, which have become a major part of the spectacle. Sure, I’m missing out, but I have no anxiety about it. I prefer to avoid colonization of my mind by advertisers in exchange for cheap titillation.

In the political news media, Rachel Maddow has caught on that it’s advantageous to ignore a good portion of the messages flung at the masses like so much monkey shit. A further suggestion is that because of the pathological narcissism of the new U.S. president, denial of the rapt attention he craves by reinforcing only the most reasonable conduct of the office might be worth a try. Such an experiment would be like the apocryphal story of students conditioning their professor to lecture with his/her back to the class by using positive/negative reinforcement, paying attention and being quiet only when his/her back was to them. Considering how much attention is trained on the Oval Office and its utterances, I doubt such an approach would be feasible even if it were only journalists attempting to channel behavior, but it’s a curious thought experiment.

All of this is to say that there are alternatives to being harried and harassed by insatiable desire for more information at all times. There is no actual peril to boredom, though we behave as though an idle mind is either wasteful or fearsome. Perhaps we aren’t well adapted — cognitively or culturally — to the deluge of information pressing on us in modern life, which could explain (partially) this age of anxiety when our safety, security, and material comforts are as good as they’ve ever been. I have other thoughts about what’s really missing in modern life, which I’ll save for another post.

Continuing from my previous post, Brian Phillips has an article, writing for MTV News, entitled “Shirtless Trump Saves Drowning Kitten: Facebook’s fake-news problem and the rise of the postmodern right.” (Funny title, that.) I navigated to the article via Alan Jacobs’s post at Text Patterns (on my blogroll). Let me consider each in turn.

After chuckling that Phillips is directing his analysis to the wrong audience, an admittedly elitist response on my part, I must further admit that the article is awfully well-written and nails the blithe attitude accompanying epistemological destruction carried out, perhaps unwittingly but too well-established now to ignore, by developers of social media as distinguished from traditional news media. Which would be considered more mainstream today is up for debate. Maybe Phillips has the right audience after all. He certainly gets the importance of controlling the narrative:

Confusion is an authoritarian tool; life under a strongman means not simply being lied to but being beset by contradiction and uncertainty until the line between truth and falsehood blurs and a kind of exhaustion settles over questions of fact. Politically speaking, precision is freedom. It’s telling, in that regard, that Trump supporters, the voters most furiously suspicious of journalism, also proved to be the most receptive audience for fictions that looked journalism-like. Authoritarianism doesn’t really want to convince its supporters that their fantasies are true, because truth claims are subject to verification, and thus to the possible discrediting of authority. Authoritarianism wants to convince its supporters that nothing is true, that the whole machinery of truth is an intolerable imposition on their psyches, and thus that they might as well give free rein to their fantasies.

But Phillips is too clever by half, burying the issue in scholarly style that speaks successfully only to a narrow class of academics and intellectuals, much like the language and memes employed by the alt-right are said to be dog whistles perceptible only to rabid, mouth-breathing bigots. Both charges are probably unfair reductions, though with kernels of truth. Here’s some of Phillips’ overripe language:

Often the battleground for this idea [virtue and respect] was the integrity of language itself. The conservative idea, at that time [20 years ago], was that liberalism had gone insane for political correctness and continental theory, and that the way to resist the encroachment of Derrida was through fortifying summaries of Emerson … What had really happened was that the left had become sensitized to the ways in which conventional moral language tended to shore up existing privilege and power, and had embarked on a critique of this tendency that the right interpreted, with some justification, as an attack on the very concept of meaning.

More plainly, Phillips’ suggestion is that the radical right learned the lessons of Postmodernism (PoMo) even better than did the avant-garde left, the latter having outwitted themselves by giving the right subtle tools used later to outmaneuver everyone. Like other mildly irritating analyses I have read, it’s a statement of inversion: an idea bringing into existence its antithesis that unironically proves and undermines the original, though with a dose of Schadenfreude. This was (partially) the subject of a four-part blog series I wrote called “Dissolving Reality” back in Aug. and Sept. 2015. (Maybe half a dozen read the series; almost no one commented.)

So what does Alan Jacobs add to the discussion? He exhibits his own scholarly flourishes. Indeed, I admire the writing but find myself distracted by the writerly nature, which ejects readers from the flow of ideas to contemplate the writing itself. For instance, this:

It turns out that the children of the ruling classes learned their lessons well, so when they inherited positions in their fathers’ law firms they had some extra, and very useful, weapons in their rhetorical armory.

In precisely the same way, when, somewhat later, academic leftists preached that race and gender were the determinative categories of social analysis, members of the future alt-right were slouching in the back rows of their classrooms, baseball caps pulled down over their eyes, making no external motions but in their dark little hearts twitching with fervent agreement.

Terrific capture of the classroom culture in which teachers are steeped. Drawing identity politics more manifestly into the mix is a fairly obvious extrapolation over Phillips and may reflect the results of the presidential election, where pundits, wheeling around to reinterpret results that should not have so surprised them, now suggest Republican victories are a repudiation of leftist moral instruction. The depth of Phillips’ and Jacobs’ remarks is not so typical of most pundits, however, and their follow-up analysis at some point becomes just more PoMo flagellation. Here, Jacobs is even more clearly having some fun:

No longer did we have to fear being brought before the bar of Rational Evidence, that hanging judge of the Enlightenment who had sent so many believers to the gallows! You have your constructs and we have our constructs, and who’s to say which are better, right? O brave new world that hath such a sociology of knowledge in it!

This goes back to the heart of the issue, our epistemological crisis, but I dispute that race and gender are the determinative categories of social analysis, no matter how fashionable they may be in the academy. A simpler and more obvious big picture controls: it’s about life and death. My previous post was about geopolitics, where death is rained down upon foreign peoples and justifying rhetoric is spread domestically. Motivations may be complex and varied, but the destruction of people and truth affects everyone, albeit unevenly, without regard to race, gender, religion, nationality, etc. All are caught in the dragnet.

Moreover, with the advent of Western civilization, intellectuals have always been sensitive to the sociology of knowledge. It’s a foundation of philosophy. That it’s grown sclerotic long precedes PoMo theory. In fact, gradual breaking apart and dismantling of meaning is visible across all expressive genres, not just literature. In painting, it was Impressionism, Cubism, Dada and Surrealism, and Abstract Expressionism. In architecture, it was Art Deco, the International Style, Modernism, Brutalism, and Deconstructivism. In music, it was the Post-Romantic, the Second Viennese School, Modernism, Serialism, and Minimalism. In scientific paradigms, it was electromagnetism, relativity, quantum mechanics, the Nuclear Era, and semiconductors. The most essential characteristics in each case are increasingly dogmatic abstraction and drilling down to minutia that betray meaningful essences. Factoring in economic and political perversions, we arrive at our current epistemological phase where truth and consequences matter little (though death and destruction still do) so long as deceits, projections, and distractions hold minds in thrall. In effect, gravity is turned off and historical narratives levitate until reality finally, inevitably comes crashing down in a monstrous Jenga pile, as it does periodically.

In the meantime, I suppose Phillips and Jacobs can issue more gaseous noise into the fog bank the information environment has become. They can’t get much traction (nor can I) considering how most of the affluent West thinks at the level of a TV sitcom. In addition, steps being considered to rein in the worst excesses of fake news would have corporations and traditional news media appointed as watchers and censors. Beyond any free speech objections, which are significant, expecting culprits to police themselves only awards them greater power to dominate, much like bailouts rewarded the banks. More fog, more lies, more levitation.

Lingua Nova 02

Posted: August 20, 2016 in Idle Nonsense, Nomenclature, Writing

From time to time, I indulge my predilection for nomenclature and neologism. Those collected below are not novel so much as repurposed.

code blue: the closing of ranks and circling of wagons undertaken by police departments in the wake of an officer (or officers) killing an unarmed citizen, usually black, accompanied by the insistence that the officer(s) by definition can do no wrong.

Christian fiction: an alternative narrative promulgated by Christian fundamentalists in a field of inquiry not based on faith and belief.

STEAM: the addition of Arts to STEM fields (Science, Technology, Engineering, Mathematics) to acknowledge the fundamental source of creativity and innovation.

nostalgia mining: the creation of entertainments that guilelessly seek to exploit the resource of fond remembrance of days past.

That’s a short list, to be sure. If you feel cheated, here’s an additional list of words sure to confound any interlocutors: espial, telic, desuetude, fubbed, girandoles, catarrh, avulse, gobbet, diaphanous, pipping, panicles, cark, cantilene, fisc, phylactery, princox, and funest. I knew only a few of them before stealing most from something I read and can’t imagine using any in speech unless I’m trying not to communicate.

Over at Gin and Tacos, the blogger has an interesting take on perverse incentives that function to inflate grades (and undermine learning), partly by encouraging teachers to give higher grades than deserved at the first hint of pushback from consumers: students, parents, or administrators. The blog post is more specifically about Why Johnny Can’t Write and references a churlish article in Salon. All well and good. The blog author provides consistently good analysis as a college professor intimate with the rigors of higher education and the often unprepared students deposited in his classroom. However, one comment got my attention in particular. The commentator is obviously a troll, and I generally don’t feed trolls, so I made only one modest remark in the comments section. Because almost no one reads The Spiral Staircase, certainly no one from the estimable Gin and Tacos crowd, I’ll indulge myself, not the troll, by examining briefly the main contention, which is that quality of writing, including correct grammar, doesn’t matter most of the time.

Here’s most of the comment (no link to the commentator’s blog, sorry):

1. Who gives a flying fuck about where the commas go? About 99.999999999999999% of the time, it makes NO DIFFERENCE WHATSOEVER in terms of understanding somebody’s point if they’ve used a comma splice. Is it a technical error? Sure. Just like my unclear pronoun reference in the last sentence. Did you understand what I meant? Unless you were actively trying not to, yes, you did.

2. There’s are hundreds of well-conducted peer-reviewed studies by those of us who actually specialize in writing pedagogy documenting the pointlessness of teaching grammar skills *unless students give a fuck about what they’re writing.* We’ve known this since the early 1980s. So when the guy from the high school English department in the article says they can’t teach grammar because students think it’s boring, he’s unwittingly almost making the right argument. It’s not that it’s boring–it’s that it’s irrelevant until the students have something they want to say. THEN we can talk about how to say it well.

Point one is that people manage to get their points across adequately without proper punctuation, and point two is that teaching grammar is accordingly a pedagogical dead end. Together, they assert that structure, rules, syntax, and accuracy make no difference so long as communication occurs. Whether one takes the hyperbole “99.999999999999999% of the time” as the equivalent of all the time, almost all the time, most of the time, etc. is not of much interest to me. Red herring served by a troll.

/rant on

As I’ve written before, communication divides neatly into receptive and expressive categories: what we can understand when communicated to us and what we can in turn communicate effectively to others. The first category is the larger of the two and is greatly enhanced by concerted study of the second. Thus, reading comprehension isn’t merely a matter of looking up words in the dictionary but of learning how it’s customary and correct to express oneself within the rules and guidelines of Standard American English (SAE). As others’ writing and communication become more complex, competent reception is more nearly an act of deciphering. Being able to parse sentences, grasp paragraph structure, and follow the logical thread (assuming those elements are handled well) is essential. That’s what being literate means, as opposed to being semi-literate — the fate of lots of adults who never bothered to study.

To state flatly that “good enough” is good enough is to accept two unnecessary limitations: (1) that only a portion of someone’s full, intended message is received and (2) that one’s own message is expressed with no better than adolescent sophistication, if that. Because humans are not mind readers, loss of fidelity between communicated intent and receipt is acknowledged. Further limiting oneself to lazy and unsophisticated usage is, by way of analogy, to reduce the full color spectrum to black and white. Even further, the suggestion that students can learn to express themselves properly once they have something to say misses the whole point of education, which is to prepare them with adult skills in advance of need.

As I understand it, modern American schools have shifted their English curricula away from the structural, prescriptive approach toward licentious usage just to get something onto the page, or in a classroom discussion, just to choke something out of students between the hemming ums, ers, uhs, ya knows, and I dunnos. I’d like to say that I’m astonished that researchers could provide cover for not bothering to learn, relieving both teachers and students of the arduous work needed to develop competence, if not mastery. That devaluation tracks directly from teachers and administrators to students and parents, the latter of whom would rather argue for grades than see them earned. Pity the fools who grub for grades without actually learning and are left holding worthless diplomas and degrees — certificates of nonachievement. In all likelihood, they simply won’t understand their own incompetence because they’ve been told all their lives what special flowers they are.

/rant off