Harriett Baber has a rather surprising opinion column at The Guardian in response to the question “can we choose what we believe?” I don’t normally care about professions of faith because they can’t be held to a shared standard of logic, evidence, or reason. But when an academic — a professor of philosophy in this case — says that truth doesn’t matter, I take notice:
Truth is overrated. And it’s remarkable that the very individuals who are most vocal in their opposition to religiously motivated puritanism are the most fiercely puritanical when it comes to truth. They condemn Christians for imposing constraints on sensual pleasure but are outraged that we should take pleasure in the consolations of religion instead of squarely facing what they believe to be the hard truths about the human condition.
People in any case overestimate the value of truth and underestimate the difficulty of arriving at it. There are a great many truths in which I have absolutely no interest — truths about the lifecycle of Ctenocephalides felis (the common cat flea) or the extensive body of truths about the condition of my teeth that my dentist imposes on me. I see no reason why I should bother with these truths or make a point of believing them.
She uses the form “I see no reason why …” repeatedly and goes on to say that truth should be no impediment to believing whatever a person wants, especially if believing gives comfort. The idea of customizable belief reminded me of Sheilaism as described by Dick Meyer in his book Why We Hate Us. (The idea of Sheilaism may originate in Robert Bellah’s Habits of the Heart.) Sheilaism is a sort of raw individualism that rejects authority (and truth) and creates a hodgepodge belief system not unlike assembling a meal haphazardly from a buffet. Sheilaism also demonstrates how ideas are fragmented, torn from their contexts, converted into free-floating consumables, and reconstituted as expressions of one’s personal tastes and predilections. While I have considerable patience for regular folks in this regard, an academic who exhibits such muddy, relativist thinking commits a serious abdication of academic integrity. One can only wonder what goes on in Prof. Baber’s classroom.
The New York Times has an article discussing plagiarism among students in the digital age. According to the article,
Digital technology makes copying and pasting easy, of course. But that is the least of it. The Internet may also be redefining how students — who came of age with music file-sharing, Wikipedia and Web-linking — understand the concept of authorship and the singularity of any text or image.
What this means is that students increasingly don’t even get that using sources without attribution is wrong; plagiarism simply doesn’t register as academic misconduct. Who teaches students academic values and upholds their worth? Teachers and professors, who are apparently failing miserably under the pressure of the copy/paste functions of word processors. The article also provides numerous examples of brazen plagiarism committed by students (and parents!) who do in fact know better but do it anyway. Similarly, outside of academe, books, news articles, blog posts, etc. recycle large tracts of text without proper attribution, and their authors are rarely called to task. Some are even award winners.
Aside: the notion that creative works embodied in a digital format suitable for easy reproduction are available for use and reuse has swept away the entire concept of copyright. File sharing via Napster or YouTube raised the legal issues, but for all intents and purposes, the horse has already left the barn since so few respect copyright anymore. Although not true in the legal sense, in practice, the public sphere has become the public domain.
Finally, one academic blogger expands on the NYT article linked above, extending its concerns to the principled use of evidence in academic work and beyond:
… I’m enough of a devotee of our recent view of authorship and creativity (and property) to think that the norms established around plagiarism during the 20th Century need some kind of continuing defense, just with sufficient awareness of the changes in textual production and circulation.
What really worries me is what’s happening to the larger purpose of the analytical writing which tempts some to plagiarism. The thing I’m honestly afraid of is that we’ve come to a point where the professional value of learning to build strong arguments based on and determined by a command over solid evidence is in rapid decline.
These are good statements, but the blogger goes on to ask whether teaching sound academic standards is now a disservice to students in the professional world beyond academe, where misshapen evidence, outright errors, omissions, and lies go unchecked and unpunished.
So maybe that’s the kind of writing and speaking we need to train our students to do: rhetorically effective and infinitely mutable on substance, entirely about rather than just sensibly attentive to affect and audience. At what point is it perverse to continue making buggy whips while the Ford plant churns away right next door?
As I said in my comment at the blog, I find it astonishing that an academic could even voice the question. Although I’m certain to be in the minority on this point, the answer is to me as duh! obvious as the answer to the question “should we torture?” All sorts of justifications and rationalizations exist for wriggling out from under the obvious answers, but no person of integrity entertains such a debate for longer than it takes to dismiss the question.