Archive for the ‘Education’ Category

A long while back, I blogged about things I just don’t get, including on that list the awful specter of identity politics. As I was finishing my undergraduate education some decades ago, the favored term was “political correctness.” That impulse now looks positively tame in comparison to what occurs regularly in the public sphere. It’s no longer merely about adopting what consensus would have one believe is a correct political outlook. Now it’s a broad referendum centered on the issue of identity, construed through the lens of ethnicity, sexual orientation, gender identification, lifestyle, religion, nationality, political orientation, etc.

One frequent charge leveled against offenders is cultural appropriation, which is the adoption of an attribute or attributes of a culture by someone belonging to a different culture. Here, the term “culture” is a stand-in for any feature of one’s identity. Thus, wearing a Halloween costume from another culture, say, a bandido, is not merely in poor taste but is understood to be offensive if one is not authentically Mexican. Those who are infected with the meme are often called social justice warriors (SJWs), and policing (of others, natch) is especially vehement on campus. For example, I’ve read of menu items at the school cafeteria being criticized for not being authentic enough. Really? The won ton soup offends Chinese students?

In an op-ed in the NY Times entitled “Will the Left Survive the Millennials?” Lionel Shriver described being sanctioned for suggesting that fiction writers not be too concerned about creating characters from backgrounds different from their own. She contextualizes the motivation of SJWs this way: (more…)

I pause periodically to contemplate deep time, ancient history, and other subjects that lie beyond most human conceptual abilities. Sure, we sorta get the idea of a very long ago past out there in the recesses or on the margins, just like we get the idea of U.S. sovereign debt now approaching $20 trillion. Problem is, numbers lose coherence when they mount up too high. Scales differ widely with respect to time and currency. Thus, we can still think reasonably about human history back to roughly 6,000 years ago, but 20,000 years ago or more draws a blank. We can also think about how $1 million might have utility, but $1 billion and $1 trillion are phantoms that appear only on ledgers and contracts and in the news (typically mergers and acquisitions).

If deep time or deep debt feels like it doesn’t exist except as a conceptual category, try wrapping your head around the deep state, which in the U.S. is understood to be a surprisingly large rogues’ gallery of plutocrats, kleptocrats, and oligarchs drawn from the military-industrial-corporate complex, the intelligence community, and Wall Street. It exists but does so far enough outside the frame of reference most of us share that it effectively functions in the shadow of daylight, where it can’t be seen for all the glare. Players are plain enough to the eye as they board their private jets to attend annual meetings of the World Economic Forum in Davos-Klosters, Switzerland, or, two years ago, the Jackson Hole [Economic] Summit in Jackson Hole, WY, in connection with the American Principles Project, whatever that is. They also enjoy plausible deniability precisely because most of us don’t really believe self-appointed masters of the universe can or should exist.
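As an aside, a quick worked example shows how fast these magnitudes escape intuition. The seconds analogy below is my own illustration, not anything from the sources above: counting one dollar per second, a million dollars runs out in under two weeks, a billion takes about 32 years, and a trillion takes nearly 32,000 years, which lies beyond even the 20,000-year horizon where intuition already fails.

```python
# Back-of-envelope sketch of the seconds analogy (illustrative only).
# Counting $1 per second, how long until a million, billion, or trillion?
SECONDS_PER_DAY = 86_400
DAYS_PER_YEAR = 365.25

for label, n in [("million", 10**6), ("billion", 10**9), ("trillion", 10**12)]:
    days = n / SECONDS_PER_DAY
    print(f"$1 {label}: {days:,.0f} days, or {days / DAYS_PER_YEAR:,.2f} years")

# Output:
# $1 million: 12 days, or 0.03 years
# $1 billion: 11,574 days, or 31.69 years
# $1 trillion: 11,574,074 days, or 31,688.09 years
```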

Another example of a really bad trip down the rabbit hole, what I might call deep cynicism (and a place I rarely allow myself to go), appeared earlier this month at Gin and Tacos (on my blogroll):

The way they [conservatives] see it, half the kids coming out of public schools today are basically illiterate. To them, this is fine. We have enough competition for the kinds of jobs a college degree is supposed to qualify one for as it is. Our options are to pump a ton of money into public schools and maybe see some incremental improvement in outcomes, or we can just create a system that selects out the half-decent students for a real education and future and then warehouse the rest until they’re no longer minors and they’re ready for the prison-poverty-violence cycle [add military] to Hoover them up. Vouchers and Charter Schools are not, to the conservative mind, a better way to educate kids well. They are a cheaper way to educate them poorly. What matters is that it costs less to people like six-figure income earners and home owners. Those people can afford to send their kids to a decent school anyway. Public education, to their way of thinking, used to be about educating people just enough that they could provide blue collar or service industry labor. Now that we have too much of that, a public high school is just a waiting room for prison. So why throw money into it? They don’t think education “works” anyway; people are born Good or Bad, Talented or Useless. So it only makes sense to find the cheapest possible way to process the students who were written off before they reached middle school. If charter schools manage to save 1% of them, great. If not, well, then they’re no worse than public schools. And they’re cheaper! Did I mention that they’re cheaper?

There’s more. I provided only the main paragraph. I wish I could reveal that the author is being arch or ironic, but there is no evidence of that. I also wish I could refute him, but there is similarly no useful evidence for that. Rather, the explanation he provides is a reality check that fits the experience of wide swaths of the American public, namely, that “public high school is just a waiting room for prison” (soon and again, debtors’ prison) and that it’s designed to be just that because it’s cheaper than actually educating people. Those truly interested in being educated will take care of it themselves. Plus, there’s additional money to be made operating prisons.

Deep cynicism is a sort of radical awareness that stares balefully at the truth and refuses to blink or pretend. A psychologist might call it the reality principle; a scientist might aver that it relies unflinchingly on objective evidence; a philosopher might call it strict epistemology. To get through life, however, most of us deny abundant evidence presented to us daily in favor of dreams and fantasies that assemble into the dominant paradigm. That paradigm includes the notions that evil doesn’t really exist, that we’re basically good people who care about each other, and that our opportunities and fates are not, on the whole, established long before we begin the journey.

I discovered “The Joe Rogan Experience” on YouTube recently and have been sampling from among the nearly 900 pod- or webcasts posted there. I’m hooked. Rogan is an impressive fellow. He clearly enjoys the life of the mind but, unlike many who are absorbed solely in ideas, has not ignored the life of the body. Over time, he’s also developed expertise in multiple endeavors and can participate knowledgeably in discussion on many topics. Webcasts are basically long, free-form, one-on-one conversations. This lack of structure gives the webcast ample time to explore topics in depth or simply meander. Guests are accomplished or distinguished in some way and usually have fame and wealth to match, which often affects content (cf. Fitzgerald’s observation: “The rich are different from you and me”). One notable requirement for entry appears to be a strong media presence.

Among the recurring themes, Rogan trots out his techno-optimism, which is only a step short of techno-utopianism. His optimism is based on two interrelated developments in recent history: widespread diffusion of information over networks and rapid advances in medical devices that can be expected to accelerate, to enhance human capabilities, and soon to transform us into supermen, bypassing evolutionary biology. He extols these views somewhat regularly to his guests, but alas, none of the guests I’ve watched seems able to fathom the ideas well enough to take up the discussion. (The same is true of Rogan’s assertion that money is just information, which is reductive and inaccurate.) They comment or joke briefly and move on to something more comfortable or accessible. Although I don’t share Rogan’s optimism, I would totally engage in discussion of his flirtation with Transhumanism (a term he doesn’t use). That’s why I’m blogging about Rogan here; besides, I lack the conventional distinction and fame needed to score an invite as a guest on his webcast. Plus, he openly disdains bloggers, many of whom moderate comments (I don’t) or otherwise channel discussion to control content. Oh, well.

(more…)

Once in a while, a comment sticks with me and requires additional response, typically in the form of a new post. This is one of those comments. I wasn’t glib in my initial reply, but I thought it was inadequate. When looking for something more specific about Neil Postman, I found Janet Sternberg’s presentation called Neil Postman’s Advice on How to Live the Rest of Your Life (link to PDF). The 22 recommendations that form Postman’s final lecture given to his students read like aphorisms, and the supporting paragraphs are largely comical, but they nonetheless suggest ways of coping with the post-truth world. Postman developed this list before Stephen Colbert coined the term truthiness. I am listing only the recommendations and withholding additional comment, though there is plenty to reinforce or dispute. See what you think.

  1. Do not go to live in California.
  2. Do not watch TV news shows or read any tabloid newspapers.
  3. Do not read any books by people who think of themselves as “futurists,”
    such as Alvin Toffler.
  4. Do not become a jogger. If you are one, stop immediately.
  5. If you are married, stay married.
  6. If you are a man, get married as soon as possible. If you are a woman,
    you need not be in a hurry.
  7. Establish as many regular routines as possible.
  8. Avoid multiple and simultaneous changes in your personal life.
  9. Remember: It is more likely than not that as you get older you will get
    dumber.
  10. Keep your opinions to a minimum.
  11. Carefully limit the information input you will allow.
  12. Seek significance in your work, friends, and family, where potency and
    output are still possible.
  13. Read’s Law: Do not trust any group larger than a squad, that is, about
    a dozen.
  14. With exceptions to be noted further ahead, avoid whenever possible
    reading anything written after 1900.
  15. Confine yourself, wherever possible, to music written prior to 1850.
  16. Weingartner’s Law: 95% of everything is nonsense.
  17. Truman’s Law: Under no circumstances ever vote for a Republican.
  18. Take religion more seriously than you have.
  19. Divest yourself of your belief in the magical powers of numbers.
  20. Once a year, read a book by authors like George Orwell, E.B. White, or
    Bertrand Russell.
  21. Santha Rama Rau’s Law: Patriotism is a squalid emotion.
  22. Josephson’s Law: New is rotten.

Continuing from my previous post, Brian Phillips, writing for MTV News, has an article entitled “Shirtless Trump Saves Drowning Kitten: Facebook’s fake-news problem and the rise of the postmodern right.” (Funny title, that.) I navigated to the article via Alan Jacobs’ post at Text Patterns (on my blogroll). Let me consider each in turn.

After chuckling that Phillips is directing his analysis to the wrong audience, an admittedly elitist response on my part, I must further admit that the article is awfully well-written and nails the blithe attitude accompanying the epistemological destruction carried out by developers of social media (as distinguished from traditional news media), perhaps unwittingly, but too well established now to ignore. Which of the two counts as more mainstream today is up for debate. Maybe Phillips has the right audience after all. He certainly gets the importance of controlling the narrative:

Confusion is an authoritarian tool; life under a strongman means not simply being lied to but being beset by contradiction and uncertainty until the line between truth and falsehood blurs and a kind of exhaustion settles over questions of fact. Politically speaking, precision is freedom. It’s telling, in that regard, that Trump supporters, the voters most furiously suspicious of journalism, also proved to be the most receptive audience for fictions that looked journalism-like. Authoritarianism doesn’t really want to convince its supporters that their fantasies are true, because truth claims are subject to verification, and thus to the possible discrediting of authority. Authoritarianism wants to convince its supporters that nothing is true, that the whole machinery of truth is an intolerable imposition on their psyches, and thus that they might as well give free rein to their fantasies.

But Phillips is too clever by half, burying the issue in scholarly style that speaks successfully only to a narrow class of academics and intellectuals, much like the language and memes employed by the alt-right are said to be dog whistles perceptible only to rabid, mouth-breathing bigots. Both charges are probably unfair reductions, though with kernels of truth. Here’s some of Phillips’ overripe language:

Often the battleground for this idea [virtue and respect] was the integrity of language itself. The conservative idea, at that time [20 years ago], was that liberalism had gone insane for political correctness and continental theory, and that the way to resist the encroachment of Derrida was through fortifying summaries of Emerson … What had really happened was that the left had become sensitized to the ways in which conventional moral language tended to shore up existing privilege and power, and had embarked on a critique of this tendency that the right interpreted, with some justification, as an attack on the very concept of meaning.

More plainly, Phillips’ suggestion is that the radical right learned the lessons of Postmodernism (PoMo) even better than did the avant-garde left, the latter having outwitted themselves by giving the right subtle tools used later to outmaneuver everyone. Like other mildly irritating analyses I have read, it’s a statement of inversion: an idea bringing into existence its antithesis, one that proves and undermines the original, though with a dose of Schadenfreude. This was (partially) the subject of a four-part blog series called “Dissolving Reality” that I wrote back in Aug. and Sept. 2015. (Maybe half a dozen read the series; almost no one commented.)

So what does Alan Jacobs add to the discussion? He exhibits his own scholarly flourishes. Indeed, I admire the writing but find myself distracted by the writerly nature, which ejects readers from the flow of ideas to contemplate the writing itself. For instance, this:

It turns out that the children of the ruling classes learned their lessons well, so when they inherited positions in their fathers’ law firms they had some extra, and very useful, weapons in their rhetorical armory.

In precisely the same way, when, somewhat later, academic leftists preached that race and gender were the determinative categories of social analysis, members of the future alt-right were slouching in the back rows of their classrooms, baseball caps pulled down over their eyes, making no external motions but in their dark little hearts twitching with fervent agreement.

Terrific capture of the classroom culture in which teachers are steeped. Drawing identity politics more manifestly into the mix is a fairly obvious extrapolation beyond Phillips and may reflect the results of the presidential election, where pundits, wheeling around to reinterpret results that should not have so surprised them, now suggest Republican victories are a repudiation of leftist moral instruction. The depth of Phillips’ and Jacobs’ remarks is not so typical of most pundits, however, and their follow-up analysis at some point becomes just more PoMo flagellation. Here, Jacobs is even more clearly having some fun:

No longer did we have to fear being brought before the bar of Rational Evidence, that hanging judge of the Enlightenment who had sent so many believers to the gallows! You have your constructs and we have our constructs, and who’s to say which are better, right? O brave new world that hath such a sociology of knowledge in it!

This goes back to the heart of the issue, our epistemological crisis, but I dispute that race and gender are the determinative categories of social analysis, no matter how fashionable they may be in the academy. A simpler and more obvious big picture controls: it’s about life and death. My previous post was about geopolitics, where death is rained down upon foreign peoples and justifying rhetoric is spread domestically. Motivations may be complex and varied, but the destruction of people and truth affects everyone, albeit unevenly, without regard to race, gender, religion, nationality, etc. All are caught in the dragnet.

Moreover, since the advent of Western civilization, intellectuals have been sensitive to the sociology of knowledge. It’s a foundation of philosophy. That it’s grown sclerotic long precedes PoMo theory. In fact, gradual breaking apart and dismantling of meaning is visible across all expressive genres, not just literature. In painting, it was Impressionism, Cubism, Dada and Surrealism, and Abstract Expressionism. In architecture, it was Art Deco, the International Style, Modernism, Brutalism, and Deconstructivism. In music, it was the Post-Romantic, the Second Viennese School, Modernism, Serialism, and Minimalism. In scientific paradigms, it was electromagnetism, relativity, quantum mechanics, the Nuclear Era, and semiconductors. The most essential characteristics in each case are increasingly dogmatic abstraction and drilling down to minutiae that betray meaningful essences. Factoring in economic and political perversions, we arrive at our current epistemological phase, where truth and consequences matter little (though death and destruction still do) so long as deceits, projections, and distractions hold minds in thrall. In effect, gravity is turned off and historical narratives levitate until reality finally, inevitably comes crashing down in a monstrous Jenga pile, as it does periodically.

In the meantime, I suppose Phillips and Jacobs can issue more gaseous noise into the fog bank the information environment has become. They can’t get much traction (nor can I) considering how most of the affluent West thinks at the level of a TV sitcom. In addition, steps being considered to rein in the worst excesses of fake news would have corporations and traditional news media appointed as watchers and censors. Beyond any free speech objections, which are significant, expecting culprits to police themselves only awards them greater power to dominate, much like bailouts rewarded the banks. More fog, more lies, more levitation.

I watched John Pilger’s excellent documentary film The War You Don’t See (2010), which deals with perpetual and immoral wars, obfuscations of the governments prosecuting them, and the journalistic media’s failure to question effectively the lies and justifications that got us into war and keep us there. The documentary reminded me of The Fog of War (2003), Robert McNamara’s rueful rethinking of his activities as Secretary of Defense during the Kennedy and Johnson administrations (thus, the Vietnam War). It seems that lessons a normal, sane person might draw from experience at war fail to find their way into the minds of decision makers, who must somehow believe themselves to be masters of the universe with immense power at their disposal but are really just war criminals overseeing genocides. One telling detail from Pilger’s film is that civilian deaths (euphemistically renamed collateral damage in the Vietnam era) as a percentage of all deaths (including combatants) have increased from 10% (WWI) to 50% (WWII) to 70% (Vietnam) to 90% (Afghanistan and Iraq). That’s one of the reasons why I call them war criminals: we’re depopulating the theaters of war in which we operate.

After viewing the Pilger film, the person sitting next to me asked, “How do you know what he’s saying is true?” More fog. I’m ill-equipped to handle such a direct epistemological challenge; it felt to me like a non sequitur. Ultimately, I was relieved to hear that the question was mere devil’s advocacy, but it’s related to the epistemological crisis I’ve blogged about before. Since the date of that blog post, the crisis has only worsened, which is what I expect as legitimate authority is undermined, expertise erodes, and the public sphere devolves into gamification and gotchas (or a series of ongoing cons). If late-stage capitalism has become a nest of corruption, the same is true — with unexpected rapidity — of the computer era and the Information Superhighway (a term no one uses anymore). One early expectation was that enhanced (24/7/365) access to information would yield impressive educational gains, as though the only thing missing were more information. Human nature being what it is, however, the first valuable innovations resulted from commercializing erotica and porn. Later debate and hand-wringing over the inaccuracy of Wikipedia and the slanted results of Google searches disappeared as everyone simply got used to not being able to trust those sources very much, just as everyone got used to forfeiting their privacy online.

Today, everything coughed up in our media-saturated information environment is either taken with a grain of salt (make that a mountain of skepticism) and held in abeyance until solid confirmation can be had (which often never comes) or simply run with because, well, what the hell? Journalists, the well-trained ones possessing integrity anyway, used to be in the first camp, but market forces and the near instantaneity of (faulty, spun) information, given how the Internet has lowered the bar to publication, have pushed journalists into the second camp. As Pilger notes, they have become echo chambers and amplifiers of the utterances of press agents of warmongering governments. Sure, fact checking still occurs when it’s easy (such as on the campaign trail), but with war reporting in particular, which poses significant hurdles to information gathering, too many reporters simply repeat what they’re told or believe the staging they’re shown.

In what has become a predictable status quo, President Obama recently renewed our official state of emergency with respect to the so-called War on Terror. It’s far too late to declare a new normal; we’ve been in this holding pattern for 16 years now. The article linked above provides this useful context:

There are now 32 states of national emergency pending in the United States, with the oldest being a 1979 emergency declared by President Jimmy Carter to impose sanctions during the Iran hostage crisis. Most are used to impose economic sanctions — mostly as a formality, because Congress requires it under the International Emergency Economic Powers Act.

In his term in office, Obama has declared 13 new emergencies, continued 21 declared by his predecessors and revoked just two, which imposed sanctions on Liberia and Russia.

Pro forma renewal of multiple states of national emergency is comparable to the 55-year-old U.S. embargo against Cuba, due for reauthorization next month, though indications are that the embargo may finally be relaxed or deauthorized. Both are examples of miserably failed policy, but they confer a semblance of power on the executive branch. Everyone knows by now that no one relinquishes power willingly, so Obama, like chief executives before him, keeps on keeping on ad nauseam.

Considering Obama’s credentials as a constitutional scholar, rare among U.S. presidents, one might expect him to weigh his options with greater circumspection and with an eye toward restoring suspended civil liberties. However, he has shown little interest in doing so (as far as I know). With the election only a couple of months away, the U.S. appears to be in a position similar to Germany’s in 1932 — ready and willing to elect a despot (take your pick …) and continue its slide into fascism. Can’t even imagine avoiding that outcome now.

The surprising number of ongoing emergencies brings to mind James Howard Kunstler and his book The Long Emergency (2006). Though I haven’t read the book (I’m a failed doomer, I suppose), my understanding is that his prediction of a looming and lingering emergency is based on two intertwined factors currently playing out in geopolitics: peak oil and global warming. (“Climate change” is now preferred over “global warming.”) Those two dire threats (and the California drought) have faded somewhat from the headlines, partially due to fatigue, replaced primarily by terrorism and economic stresses, but the dangers never went away. Melting ice caps and glaciers are probably the clearest incontrovertible indications of anthropogenic global warming, which is poised to trigger nonlinear climate change and hasten the Sixth Extinction. We don’t know when, precisely, though time is growing short. Similarly, reports on energy production and consumption are subject to considerable falsification in the public sphere, making it impossible to know just how close in time we are to a new energy crisis. That inevitability has also been the target of a disinformation campaign, but even a rudimentary understanding of scientific principles is sufficient to enable clear thinkers to penetrate the fog.

I have no plans to return to doom blogging with any vigor. One emergency stacked upon the next, ready to collapse in a cascade of woe, has defeated me, and I have zero expectation that any real, meaningful response can be formulated and executed, especially while we are distracted with terrorism and creeping fascism.

I’m not paying close attention to the RNC in Cleveland. Actually, I’m ignoring it completely, still hoping that it doesn’t erupt in violence before the closing curtain. Yet I can’t help but hear some relevant news, and I have read a few commentaries. Ultimately, the RNC sounds like a sad, sad nonevent put on by amateurs, with many party members staying away entirely. What’s actually transpiring is worthy of out-loud laughter at the embarrassment and indignities suffered by participants. The particular gaffe that caught my attention is the cribbing from Michelle Obama in the speech Melania Trump delivered on Monday. The speechwriter, Meredith McIver, has accepted blame for it and characterized it as an innocent mistake.

Maybe someone else has already said or written this, but I suspect the plagiarism really was innocent precisely because such borrowing is the standard in quite a lot of academe these days. Students get away with it all the time, just not on a national stage. Reworking another’s ideas is far easier than coming up with one’s own original ideas, and Melania Trump has no reputation (near as I can tell) as an original thinker. The article linked above indicates she admires Michelle Obama, so the plagiarism is, from a twisted perspective, an encomium.

The failure of Trump campaign officials to review the speech (or if they did review it, then do so effectively) is another LOL gaffe. It doesn’t rise to the level of the McCain campaign’s failure to vet Sarah Palin properly and won’t have any enduring effects, but it does reflect upon the Trump campaign’s ineptitude. My secret fear is that ineptitude is precisely why a lot of folks intend to vote for Trump: so that he can accelerate America’s self-destruction. It’s a negative view, and somewhat devil-may-care, to say “let’s make it a quick crash and get things over with already.” Or maybe it’s darkly funny only until suffering ramps up.

Something I wrote ten years ago at Creative Destruction. Probably worth an update.

Creative Destruction

We’ve all seen the reports. U.S. high schoolers rank at or near the bottom in math and science. Admittedly, that link is to a story eight years old, but I doubt rankings have changed significantly. A new study and report are due out next year. See this link.

What interests me is that we live in an era of unprecedented technological advancement. While the U.S. may still be in the vanguard, I wonder how long that can last when the source of inspiration and creativity — human knowledge and understanding — is dying at the roots in American schools. It’s a sad joke, really, that follow-the-directions instructions for setting the clock on a VCR (remember those?) proved so formidable for most end users that a time-setting function is built into more recent recording systems such as TiVo. Technical workarounds may actually enable ever-increasing levels of disability working with our…


Over at Gin and Tacos, the blogger has an interesting take on perverse incentives that function to inflate grades (and undermine learning), partly by encouraging teachers to give higher grades than deserved at the first hint of pushback from consumers (er, students, parents, or administration). The blog post is more specifically about Why Johnny Can’t Write and references a churlish article in Salon. All well and good. The blog author provides consistently good analysis as a college professor intimate with the rigors of higher education and the often unprepared students deposited in his classroom. However, this comment got my attention in particular. The commentator is obviously a troll, and I generally don’t feed trolls, so I made only one modest comment in the comments section. Because almost no one reads The Spiral Staircase, certainly no one from the estimable Gin and Tacos crowd, I’ll indulge myself, not the troll, by examining briefly the main contention, which is that quality of writing, including correct grammar, doesn’t matter most of the time.

Here’s most of the comment (no link to the commentator’s blog, sorry):

1. Who gives a flying fuck about where the commas go? About 99.999999999999999% of the time, it makes NO DIFFERENCE WHATSOEVER in terms of understanding somebody’s point if they’ve used a comma splice. Is it a technical error? Sure. Just like my unclear pronoun reference in the last sentence. Did you understand what I meant? Unless you were actively trying not to, yes, you did.

2. There’s are hundreds of well-conducted peer-reviewed studies by those of us who actually specialize in writing pedagogy documenting the pointlessness of teaching grammar skills *unless students give a fuck about what they’re writing.* We’ve known this since the early 1980s. So when the guy from the high school English department in the article says they can’t teach grammar because students think it’s boring, he’s unwittingly almost making the right argument. It’s not that it’s boring–it’s that it’s irrelevant until the students have something they want to say. THEN we can talk about how to say it well.

Point one is that people manage to get their points across adequately without proper punctuation, and point two is that teaching grammar is accordingly a pedagogical dead end. Together, they assert that structure, rules, syntax, and accuracy make no difference so long as communication occurs. Whether one takes the hyperbole “99.999999999999999% of the time” as the equivalent of all the time, almost all the time, most of the time, etc. is not of much interest to me. Red herring served by a troll.

/rant on

As I’ve written before, communication divides neatly into receptive and expressive categories: what we can understand when communicated to us and what we can in turn communicate effectively to others. The first category is the larger of the two and is greatly enhanced by concerted study of the second. Thus, reading comprehension isn’t merely a matter of looking up words in the dictionary but of learning how it’s customary and correct to express oneself within the rules and guidelines of Standard American English (SAE). As others’ writing and communication become more complex, competent reception becomes more nearly an act of deciphering. Being able to parse sentences, grasp paragraph structure, and follow the logical thread (assuming those elements are handled well) is essential. That’s what being literate means, as opposed to being semi-literate — the fate of lots of adults who never bothered to study.

To state flatly that “good enough” is good enough is to accept two unnecessary limitations: (1) that only a portion of someone’s full, intended message is received and (2) that one’s own message is expressed with no better than adolescent sophistication, if that. Because humans are not mind readers, some loss of fidelity between communicated intent and receipt is unavoidable. To limit oneself further to lazy and unsophisticated usage is, by way of analogy, to reduce the full color spectrum to black and white. Further, the suggestion that students can learn to express themselves properly once they have something to say misses the whole point of education, which is to prepare them with adult skills in advance of need.

As I understand it, modern American schools have shifted their English curricula away from the structural, prescriptive approach toward licentious usage just to get something onto the page, or in a classroom discussion, just to choke something out of students amid the hemming and hawing of ums, ers, uhs, ya knows, and I dunnos. I’d like to say that I’m astonished that researchers could provide cover for not bothering to learn, relieving both teachers and students of the arduous work needed to develop competence, if not mastery. That devaluation tracks directly from teachers and administrators to students and parents, the latter of whom would rather argue for their grades than earn them. Pity the fools who grub for grades without actually learning and are left holding worthless diplomas and degrees — certificates of nonachievement. In all likelihood, they simply won’t understand their own incompetence because they’ve been told all their lives what special flowers they are.

/rant off