Posts Tagged ‘Education’

I often review my past posts when one receives a reader’s attention, sometimes adding tags and fixing typos, grammar, and broken links. One of my greatest hits (based on voting, not traffic) is Low Points in Education. It was among the first to tackle what I have since called our epistemological crisis, though I didn’t begin to use the epistemology tag until later. The crisis has since caught up with us with a vengeance, though I can’t claim to be the first to observe the problem. That dubious honor probably goes to Stephen Colbert, who coined the word truthiness in 2005. Now that alternative facts and fake news have entered the lingo as well (gaslighting has been revived), everyone has jumped on the bandwagon, questioning the truth or falsity of anything coughed up in our media-saturated information environment. But as suggested in the first item discussed in Low Points in Education, what’s so important about truth?

It would be obvious and easy yet futile to argue in favor of high-fidelity appreciation of the world, even if only within the surprisingly narrow limits of human perception, cognition, and memory (all interrelated). Numerous fields of endeavor rely upon consensus reality derived from objectivity, measurement, reason, logic, and, dare I say it, facticity. Regrettably, human cognition doesn’t adhere any too closely to those ideals except when trained to value them. Well-educated folks have better acquaintance with such habits of mind; folks with formidable native intelligence can develop true authority, too. For the masses, however, those attributes are elusive, even for those who have partied through earned college degrees. Ironically worse, perhaps, are specialists, experts, and overly analytical intellectuals who exhibit what the French call a déformation professionnelle. Politicians, pundits, and journalists are chief among the deformed and distorted. Mounting challenges to establishing truth now destabilize even mundane matters of fact, and it doesn’t help that myriad high-profile provocateurs (including the Commander in Chief, to whom I will henceforth refer only as “45”) are constantly throwing out bones for journalists to chase like so many unnourishing rubber chew toys.

Let me suggest, then, that human cognition, or more generally the mind, is an ongoing balancing act, making adjustments to stay upright and sane. Like the routine balance one keeps during locomotion, shifting weight from side to side, falling a bit only to catch oneself, the difficulty is not especially high. But with the foundation below one’s feet shaking furiously, so to speak, legs get wobbly and many end up (figuratively at least) ass over teakettle. Further, the mind is highly situational, contingent, and improvisational, and it is prone to notoriously faulty perception even before one gets to marketing, spin, and arrant lies promulgated by those intent on coopting or directing one’s thinking. Simply put, we’re not particularly inclined toward accuracy but instead operate within a wide margin of error. By the same token, we’re quite good at adapting to ever-changing circumstance.

That strength turns out to be our downfall. Indeed, rootless adjustment to changing narrative is now so grave that basic errors of attribution — which entities said and did what — make it impossible to distinguish allies from adversaries reliably. (Orwell captured this with a line from the novel 1984: “Oceania was at war with Eurasia; therefore Oceania had always been at war with Eurasia.”) Thus, on the back of a brazen propaganda campaign following 9/11, Iraq morphed from U.S. client state to rogue state demanding preemptive war. (Admittedly, the U.S. State Department had already lost control of its puppet despot, who in a foolish act of naked aggression tried to annex Kuwait, but that was a brief, earlier war quite unlike the undeclared one in which the U.S. has been mired for 16 years.) Even though Bush Administration lies have been unmasked and dispelled, many Americans continue to believe (incorrectly) that Iraq possessed WMDs and posed an existential threat to the U.S. The same type of confusion is arguably at work with respect to China, Russia, and Israel, which are mixed up in longstanding conflicts having significant U.S. involvement and provocation. Naturally, the default villain is always Them, never Us.

So we totter from moment to moment, reeling drunkenly from one breathtaking disclosure to the next, and are forced to reorient continuously in response to whatever the latest spin and spew happen to be. Some institutions retain the false sheen of respectability and authority, but for the most part, individuals are free to cherry-pick information and assemble their own truths, indulging along the way in conspiracy and muddle-headedness until at last almost no one can be reached anymore by logic and reason. This is our post-Postmodern world.

As a boy, my home included a coffee table book, title unknown, likely published circa 1960, about the origins of human life on Earth. (A more recent book of this type attracting lots of attention is Sapiens: A Brief History of Humankind (2015) by Yuval Harari, which I haven’t yet read.) It was heavily enough illustrated that my siblings and I consulted it mostly for the pictures, which can probably be excused since we were youngsters at the time. What became of the book escapes me. In the intervening decades, I made no particular study of the ancient world — ancient meaning beyond the reach of human memory systems. Thus, ancient could potentially refer to anthropological history in the tens of thousands of years, evolutionary history stretching across tens of millions of years, geological history over hundreds of millions of years, or cosmological time going back a few billion years. For the purpose of this blog post, let’s limit ancient to no more than fifty thousand years ago.

A few months ago, updates (over the 1960 publication) to the story of human history and civilization finally reached me (can’t account for the delay of several decades) via webcasts published on YouTube featuring Joe Rogan, Randall Carlson, and Graham Hancock. Rogan hosts the conversations; Carlson and Hancock are independent researchers whose investigations converge on evidence of major catastrophes that struck the ancient world during the Younger Dryas Period, erasing most but not all evidence of an antediluvian civilization. Whereas I’m a doomer, they are catastrophists. To call this subject matter fascinating is a considerable understatement. And yet, it’s neither here nor there with respect to how we conduct our day-to-day lives. Their revised history connects to religious origin stories, but such narratives have been relegated to myth and allegory for a long time already, making them more symbolic than historical.

In the tradition of Galileo, Copernicus, Newton, and Darwin, all of whom went against scientific orthodoxy of their times but were ultimately vindicated, Carlson and Hancock appear to be rogue scientists/investigators exploring deep history and struggling against the conventional story of the beginnings of civilization around 6,000 years ago in the Middle East and Egypt. John Anthony West is another who disputes the accepted narratives and timelines. West is also openly critical of “quackademics” who refuse to consider accumulating evidence but instead collude to protect their cherished ideological and professional positions. The vast body of evidence being pieced together is impressive, and I truly appreciate their collective efforts. I’m about 50 pages into Hancock’s Fingerprints of the Gods (1995), which contains copious detail not well suited to the conversational style of a webcast. His follow-up Magicians of the Gods (2015) will have to wait. Carlson’s scholarly work is published at the website Sacred Geometry International (and elsewhere, I presume).

So I have to admit that my blog, launched in 2006 as a culture blog, turned partially into a doomer blog as that narrative gained the weight of overwhelming evidence. What Carlson and Hancock in particular present is evidence of major catastrophes that struck the ancient world and are going to repeat: a different sort of doom, so to speak. Mine is ecological, financial, cultural, and finally civilizational collapse born of exhaustion, hubris, frailty, and most importantly, poor stewardship. Theirs is periodic cataclysmic disaster including volcanic eruptions and explosions, great floods (following ice ages, I believe), meteor strikes, earthquakes, tsunamis, and the like, each capable of ending civilization all at once. Indeed, those inevitable events are scattered throughout our geological history, though at unpredictable intervals often spaced tens or hundreds of thousands of years apart. For instance, the supervolcano under Yellowstone is known to blow roughly every 600,000 years, and we’re overdue. Further, the surface of the Moon indicates bombardment from meteors; the Earth’s history of the same is hidden somewhat by continuous transformation of the landscape lacking on the Moon. The number of near misses by so-called near-Earth objects in the last few decades is rather disconcerting. Any of these disasters could strike at any time, or we could wait another 10,000 years.

Carlson and Hancock warn that we must recognize the dangers, drop our petty international squabbles, and unite as a species to prepare for the inevitable. To do otherwise would be to court disaster. However, far from dismissing the prospect of doom I’ve been blogging about, they merely add another category of things likely to kill us off. They give the impression that we should turn our attention away from sudden climate change, the Sixth Extinction, and other perils to which we have contributed heavily and worry instead about death from above (the skies) and below (the Earth’s crust). It’s impossible to say which is the most worrisome prospect. As a fatalist, I surmise that there is little we can do to forestall any of these eventualities. Our fate is already sealed in one respect or another. That foreknowledge makes life precious for me and, frankly, is another reason to put aside our petty squabbles.

A long while back, I blogged about things I just don’t get, including on that list the awful specter of identity politics. As I was finishing my undergraduate education some decades ago, the favored term was “political correctness.” That impulse now looks positively tame in comparison to what occurs regularly in the public sphere. It’s no longer merely about adopting what consensus would have one believe is a correct political outlook. Now it’s a broad referendum centered on the issue of identity, construed through the lens of ethnicity, sexual orientation, gender identification, lifestyle, religion, nationality, political orientation, etc.

One frequent charge levied against offenders is cultural appropriation, which is the adoption of an attribute or attributes of a culture by someone belonging to a different culture. Here, the term “culture” is a stand-in for any feature of one’s identity. Thus, wearing a Halloween costume from another culture, say, a bandido, is not merely in poor taste but is understood to be offensive if one is not authentically Mexican. Those who are infected with the meme are often called social justice warriors (SJW), and policing (of others, natch) is especially vehement on campus. For example, I’ve read of menu items at the school cafeteria being criticized for not being authentic enough. Really? The won ton soup offends Chinese students?

In an op-ed in the NY Times entitled “Will the Left Survive the Millennials?” Lionel Shriver described being sanctioned for suggesting that fiction writers not be too concerned about creating characters from backgrounds different from their own. She contextualizes the motivation of SJWs in the piece.

I pause periodically to contemplate deep time, ancient history, and other subjects that lie beyond most human conceptual abilities. Sure, we sorta get the idea of a very long ago past out there in the recesses or on the margins, just like we get the idea of U.S. sovereign debt now approaching $20 trillion. Problem is, numbers lose coherence when they mount up too high. Scales differ widely with respect to time and currency. Thus, we can still think reasonably about human history back to roughly 6,000 years ago, but 20,000 years ago or more draws a blank. We can also think about how $1 million might have utility, but $1 billion and $1 trillion are phantoms that appear only on ledgers and contracts and in the news (typically mergers and acquisitions). If deep time or deep debt feel like they don’t exist except as conceptual categories, try wrapping your head around the deep state, which in the U.S. is understood to be a surprisingly large rogues’ gallery of plutocrats, kleptocrats, and oligarchs drawn from the military-industrial-corporate complex, the intelligence community, and Wall Street. It exists but does so far enough outside the frame of reference most of us share that it effectively functions in the shadow of daylight where it can’t be seen for all the glare. Players are plain enough to the eye as they board their private jets to attend annual meetings of the World Economic Forum in Davos-Klosters, Switzerland, or two years ago the Jackson Hole [Economic] Summit in Jackson Hole, WY, in connection with the American Principles Project, whatever that is. They also enjoy plausible deniability precisely because most of us don’t really believe self-appointed masters of the universe can or should exist.
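As an aside, the incoherence of large numbers is easy to demonstrate with a little arithmetic. Here is a back-of-the-envelope sketch; the conversion of dollar counts into seconds is my own illustration, not anything from the original post:

```python
# Convert million/billion/trillion counts into elapsed time to
# make the jumps in scale felt. Illustrative arithmetic only.
SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = SECONDS_PER_DAY * 365.25

print(1_000_000 / SECONDS_PER_DAY)           # ~11.6: a million seconds is under two weeks
print(1_000_000_000 / SECONDS_PER_YEAR)      # ~31.7: a billion seconds is half a lifetime
print(1_000_000_000_000 / SECONDS_PER_YEAR)  # ~31,689: a trillion seconds predates recorded history
```

Each step up the scale is a thousandfold, yet intuition flattens the difference, which is precisely why deep debt, like deep time, registers only as a conceptual category.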

Another example of a really bad trip down the rabbit hole, what I might call deep cynicism (and a place I rarely allow myself to go), appeared earlier this month at Gin and Tacos (on my blogroll):

The way they [conservatives] see it, half the kids coming out of public schools today are basically illiterate. To them, this is fine. We have enough competition for the kinds of jobs a college degree is supposed to qualify one for as it is. Our options are to pump a ton of money into public schools and maybe see some incremental improvement in outcomes, or we can just create a system that selects out the half-decent students for a real education and future and then warehouse the rest until they’re no longer minors and they’re ready for the prison-poverty-violence cycle [add military] to Hoover them up. Vouchers and Charter Schools are not, to the conservative mind, a better way to educate kids well. They are a cheaper way to educate them poorly. What matters is that it costs less to people like six-figure income earners and home owners. Those people can afford to send their kids to a decent school anyway. Public education, to their way of thinking, used to be about educating people just enough that they could provide blue collar or service industry labor. Now that we have too much of that, a public high school is just a waiting room for prison. So why throw money into it? They don’t think education “works” anyway; people are born Good or Bad, Talented or Useless. So it only makes sense to find the cheapest possible way to process the students who were written off before they reached middle school. If charter schools manage to save 1% of them, great. If not, well, then they’re no worse than public schools. And they’re cheaper! Did I mention that they’re cheaper?

There’s more. I provided only the main paragraph. I wish I could reveal that the author is being arch or ironic, but there is no evidence of that. I also wish I could refute him, but there is similarly no useful evidence for that. Rather, the explanation he provides is a reality check that fits the experience of wide swaths of the American public, namely, that “public high school is just a waiting room for prison” (soon and again, debtor’s prison) and that it’s designed to be just that because it’s cheaper than actually educating people. Those truly interested in being educated will take care of it themselves. Plus, there’s additional money to be made operating prisons.

Deep cynicism is a sort of radical awareness that stares balefully at the truth and refuses to blink or pretend. A psychologist might call it the reality principle; a scientist might aver that it relies unflinchingly on objective evidence; a philosopher might call it strict epistemology. To get through life, however, most of us deny abundant evidence presented to us daily in favor of dreams and fantasies that assemble into the dominant paradigm. That paradigm includes the notions that evil doesn’t really exist, that we’re basically good people who care about each other, and that our opportunities and fates are not, on the whole, established long before we begin the journey.

I attended a fundraiser a short while back. It’s familiar territory for me, filled with gifts culled from local businesses and corporations to be resold at auction, portable kitchens and bars to feed and libate guests and break down their inhibitions to giving, and lots of high heels and party dresses (with ample cleavage). Men rarely strut and parade the way the women do; tuxedos are the rare exception. Secondary and tertiary activities are typical, often a DJ or live band that plays so loudly sensible people would flee the room rather than slowly go deaf. But monstrous volume in the era of amplified everything has dulled that reflex to nothingness. Or people are by now already deaf from habitual exposure to arena-rock volume filtered down to small venues. Folks simply, stupidly tough it out, ending the night with their ears ringing and their voices hoarse from screaming over the noise just to be heard.

Beneficiaries of fundraisers usually fall into two categories that are poorly served by American institutions: those seeking quality educations (including public schools that ought to be better funded through taxation) and folks suffering from catastrophic illness or disease that is ideally meant to be covered by health insurance but in practice is not. Auctioneers do pretty well enticing people to part with their money. It’s a true skill. But then, who goes to a fundraiser determined to hold tightly to their hard-earned cash? (Don’t answer that question.) Silent auctions may accompany the live auction, but the group vibe definitely contributes to some competition to outbid the next person (a wallet- or dick-measuring exercise?). Auction items are mostly luxury items, things normal Americans wouldn’t consider buying except when associated with charitable giving. Getting something for one’s charity (bought under or over its presumed market value) also shifts some portion of the philanthropic burden to those entities donating gifts.

All this is preliminary to the most appallingly tone-deaf item offered for auction: a 4-person safari to a game preserve in South Africa to hunt and kill a wildebeest. When the auctioneer described the item, everyone in my vicinity looked at each other as if to say “what the fuck?” Certainly, humans have a long history of hunting game purely for sport (which is to say, not for food), and from the perspective of a South African safari outfitter, wild animals are a natural resource to be exploited the same way that, for instance, mining and agriculture are conducted throughout the world. But the last few years have seen a notable change of heart about killing animals, especially so-called romance animals (mostly large mammals, including whales, less so large fish), without need or admirable purpose. The outcry over an American dentist killing Cecil the Lion was an expression of that sentiment. So, too, was the killing of a gorilla at the Cincinnati Zoo after a child fell into the enclosure. (Personally, considering how few of them exist, I would privilege the life of the gorilla over the child, but that’s a minefield.) Pictures of Donald Trump’s sons standing over their trophy kills have also elicited significant disapproval. We are now acutely aware that wild animals are not an inexhaustible resource (and never were — consider the passenger pigeon).

I judged that bidding on the safari was no more or less robust than for other auction items, but I mentioned aloud that if I were to bid on it, I would probably go on the safari but would also insist on merely paintballing the poor wildebeest, a relatively harmless proxy for killing it needlessly. Admittedly, the wildebeest would experience the same existential terror as if it were being hunted to death, but at least it would live. Or it would live until the next safari came round. Hunting and killing a wildebeest or other large game has never been on my bucket list, and its appearance at auction would not suddenly inspire me to add it to the list. That is the province of a class of fools rich and insulated enough to still regard the world as their playground, with no thought of responsibility, stewardship, or consequences.

Continuing from my previous post: Brian Phillips, writing for MTV News, has an article entitled “Shirtless Trump Saves Drowning Kitten: Facebook’s fake-news problem and the rise of the postmodern right.” (Funny title, that.) I navigated to the article via Alan Jacobs’s post at Text Patterns (on my blogroll). Let me consider each in turn.

After chuckling that Phillips is directing his analysis to the wrong audience, an admittedly elitist response on my part, I must further admit that the article is awfully well written and nails the blithe attitude accompanying the epistemological destruction carried out, perhaps unwittingly but now too well established to ignore, by developers of social media as distinguished from traditional news media. Which of the two is considered more mainstream today is up for debate. Maybe Phillips has the right audience after all. He certainly gets the importance of controlling the narrative:

Confusion is an authoritarian tool; life under a strongman means not simply being lied to but being beset by contradiction and uncertainty until the line between truth and falsehood blurs and a kind of exhaustion settles over questions of fact. Politically speaking, precision is freedom. It’s telling, in that regard, that Trump supporters, the voters most furiously suspicious of journalism, also proved to be the most receptive audience for fictions that looked journalism-like. Authoritarianism doesn’t really want to convince its supporters that their fantasies are true, because truth claims are subject to verification, and thus to the possible discrediting of authority. Authoritarianism wants to convince its supporters that nothing is true, that the whole machinery of truth is an intolerable imposition on their psyches, and thus that they might as well give free rein to their fantasies.

But Phillips is too clever by half, burying the issue in a scholarly style that speaks successfully only to a narrow class of academics and intellectuals, much like the language and memes employed by the alt-right are said to be dog whistles perceptible only to rabid, mouth-breathing bigots. Both charges are probably unfair reductions, though with kernels of truth. Here’s some of Phillips’ overripe language:

Often the battleground for this idea [virtue and respect] was the integrity of language itself. The conservative idea, at that time [20 years ago], was that liberalism had gone insane for political correctness and continental theory, and that the way to resist the encroachment of Derrida was through fortifying summaries of Emerson … What had really happened was that the left had become sensitized to the ways in which conventional moral language tended to shore up existing privilege and power, and had embarked on a critique of this tendency that the right interpreted, with some justification, as an attack on the very concept of meaning.

More plainly, Phillips’ suggestion is that the radical right learned the lessons of Postmodernism (PoMo) even better than did the avant-garde left, the latter having outwitted themselves by giving the right subtle tools used later to outmaneuver everyone. Like other mildly irritating analyses I have read, it’s a statement of inversion: an idea bringing into existence its antithesis, which unironically both proves and undermines the original, though with a dose of Schadenfreude. This was (partially) the subject of a 4-part blog series I wrote called “Dissolving Reality” back in Aug. and Sept. 2015. (Maybe half a dozen read the series; almost no one commented.)

So what does Alan Jacobs add to the discussion? He exhibits his own scholarly flourishes. Indeed, I admire the writing but find myself distracted by the writerly nature, which ejects readers from the flow of ideas to contemplate the writing itself. For instance, this:

It turns out that the children of the ruling classes learned their lessons well, so when they inherited positions in their fathers’ law firms they had some extra, and very useful, weapons in their rhetorical armory.

In precisely the same way, when, somewhat later, academic leftists preached that race and gender were the determinative categories of social analysis, members of the future alt-right were slouching in the back rows of their classrooms, baseball caps pulled down over their eyes, making no external motions but in their dark little hearts twitching with fervent agreement.

Terrific capture of the classroom culture in which teachers are steeped. Drawing identity politics more manifestly into the mix is a fairly obvious extrapolation beyond Phillips and may reflect the results of the presidential election, where pundits, wheeling around to reinterpret results that should not have so surprised them, now suggest Republican victories are a repudiation of leftist moral instruction. The depth of Phillips’ and Jacobs’ remarks is not so typical of most pundits, however, and their follow-up analysis at some point becomes just more PoMo flagellation. Here, Jacobs is even more clearly having some fun:

No longer did we have to fear being brought before the bar of Rational Evidence, that hanging judge of the Enlightenment who had sent so many believers to the gallows! You have your constructs and we have our constructs, and who’s to say which are better, right? O brave new world that hath such a sociology of knowledge in it!

This goes back to the heart of the issue, our epistemological crisis, but I dispute that race and gender are the determinative categories of social analysis, no matter how fashionable they may be in the academy. A simpler and more obvious big picture controls: it’s about life and death. My previous post was about geopolitics, where death is rained down upon foreign peoples and justifying rhetoric is spread domestically. Motivations may be complex and varied, but the destruction of people and truth affects everyone, albeit unevenly, without regard to race, gender, religion, nationality, etc. All are caught in the dragnet.

Moreover, since the advent of Western civilization, intellectuals have been sensitive to the sociology of knowledge. It’s a foundation of philosophy. That it’s grown sclerotic long precedes PoMo theory. In fact, gradual breaking apart and dismantling of meaning is visible across all expressive genres, not just literature. In painting, it was Impressionism, Cubism, Dada and Surrealism, and Abstract Expressionism. In architecture, it was Art Deco, the International Style, Modernism, Brutalism, and Deconstructivism. In music, it was the Post-Romantic, the Second Viennese School, Modernism, Serialism, and Minimalism. In scientific paradigms, it was electromagnetism, relativity, quantum mechanics, the Nuclear Era, and semiconductors. The most essential characteristics in each case are increasingly dogmatic abstraction and drilling down to minutiae that betray meaningful essences. Factoring in economic and political perversions, we arrive at our current epistemological phase, where truth and consequences matter little (though death and destruction still do) so long as deceits, projections, and distractions hold minds in thrall. In effect, gravity is turned off and historical narratives levitate until reality finally, inevitably comes crashing down in a monstrous Jenga pile, as it does periodically.

In the meantime, I suppose Phillips and Jacobs can issue more gaseous noise into the fog bank the information environment has become. They can’t get much traction (nor can I) considering how most of the affluent West thinks at the level of a TV sitcom. In addition, steps being considered to rein in the worst excesses of fake news would have corporations and traditional news media appointed as watchers and censors. Beyond any free speech objections, which are significant, expecting culprits to police themselves only awards them greater power to dominate, much like bailouts rewarded the banks. More fog, more lies, more levitation.

I’m not paying close attention to the RNC in Cleveland. Actually, I’m ignoring it completely, still hoping that it doesn’t erupt in violence before the closing curtain. Yet I can’t help but hear some relevant news, and I have read a few commentaries. Ultimately, the RNC sounds like a sad, sad nonevent put on by amateurs, with many party members staying away entirely. What’s actually transpiring is worthy of out-loud laughter at the embarrassment and indignities suffered by participants. The particular gaffe that caught my attention is the cribbing from Michelle Obama in the speech delivered on Monday by Melania Trump. The speechwriter, Meredith McIver, has accepted blame for it and characterized it as an innocent mistake.

Maybe someone else has already said or written this, but I suspect innocent plagiarism is probably true precisely because that’s the standard in quite a lot of academe these days. Students get away with it all the time, just not on a national stage. Reworking another’s ideas is far easier than coming up with one’s own original ideas, and Melania Trump has no reputation (near as I can tell) as an original thinker. The article linked to above indicates she admires Michelle Obama, so the plagiarism is, from a twisted perspective, an encomium.

The failure of Trump campaign officials to review the speech (or if they did review it, then do so effectively) is another LOL gaffe. It doesn’t rise to the level of the McCain campaign’s failure to vet Sarah Palin properly and won’t have any enduring effects, but it does reflect upon the Trump campaign’s ineptitude. My secret fear is that ineptitude is precisely why a lot of folks intend to vote for Trump: so that he can accelerate America’s self-destruction. It’s a negative view, and somewhat devil-may-care, to say “let’s make it a quick crash and get things over with already.” Or maybe it’s darkly funny only until suffering ramps up.

Something I wrote ten years ago at Creative Destruction. Probably worth an update.


We’ve all seen the reports. U.S. high schoolers rank at or near the bottom in math and science. Admittedly, that link is to a story eight years old, but I doubt rankings have changed significantly. A new study and report are due out next year. See this link.

What interests me is that we live in an era of unprecedented technological advancement. While the U.S. may still be in the vanguard, I wonder how long that can last when the source of inspiration and creativity — human knowledge and understanding — is dying at the roots in American schools. It’s a sad joke, really, that follow-the-directions instructions for setting the clock on a VCR (remember those?) proved so formidable for most end users that a time-setting function is built into more recent recording systems such as TiVo. Technical workarounds may actually enable ever-increasing levels of disability working with our…


According to Jean-Paul Sartre, the act of negation (a/k/a nihilation) is a necessary step in distinguishing foreground objects from background. A plethora of definitions and formal logic ensure that his philosophical formulations are of only academic interest to us nowadays, since philosophy in general has dropped out of currency in the public sphere and below awareness or concern even among most educated adults. With that in mind, I thought perhaps I should reinforce the idea of negation in my own modest (albeit insignificant) way. Negation, resistance, and dissent have familial relations, but they are undoubtedly distinct in some ways, too. However, I have no interest in offering formal treatments of terminology and so will gloss over the point and decline to offer definitions. Lump ’em all in together, I say. I will, however, make a distinction between passive and active negation, which is the point of this blog post.

Although the information environment and the corporations that provide electronic access through hardware and connectivity would have us all jacked into the so-called Information Superhighway unceasingly, and many people do just that with enormous relish, I am of a different mind. I recognize that electronic media are especially potent in commanding attention and providing distraction. Stowed away or smuggled in with most messaging is a great deal of perception and opinion shaping that is worse than just unsavory; it’s damaging. So I go beyond passively not wanting handheld (thus nonstop) access to actively wanting not to be connected. Whereas others share excitement about the latest smartphone or tablet and the speed, cost, and capacity of the service provider for the data line on their devices, I don’t merely demur but insist instead: “keep that nonsense away from me.” I must negate those prerogatives, refuse their claims on my attention, and be undisturbed in my private thoughts while away from the computer, the Internet, and the constant flow of information aimed indiscriminately at me and everyone.

Of course, I win no converts with such refusals. When I was shopping for a new phone recently, the default assumption by the sales clerk was that I wanted bells and whistles. She simply could not conceive of my desire to have a phone that is merely a phone, and the store didn’t sell one anyway. Even worse, since all phones are now by default smartphones, I had a data block put on my account to avoid inadvertently connecting to anything that would require a data line. That just blew her mind, like I was forgoing oxygen. But I’m quite clear that any vulnerability to information either tempting me or forced on me is worth avoiding and that time away from the computer and its analogues is absolutely necessary.

Personal anecdote: I was shopping recently at an appliance retailer (went to look at refrigerators) that had an embedded Apple store. At the counter with three models of the iPhone 6, the latest designations, were three kids roughly 8 to 11 years old (I estimate). They were unattended by parents, who must have believed that if the kids were not causing problems, they were a-okay. The kids themselves were absolutely transfixed — enthralled, really — by the screens, standing silent and motionless (very unlike most kids) with either fierce concentration or utterly empty heads as they examined the gadgets. They were so zoomed in they took no notice at all of passersby. Parents of earlier generations used the TV as a pacifier or babysitter the same way, but these kids seemed even more hollow than typical, dull-eyed TV viewers. Only a few days later, at a concert I attended, I watched a child who apparently could not pry his eyes away from the tablet he was carrying — even as he struggled to climb the stairs to his seat. The blue glare of the screen was all over his face.

Both scenes were unspeakably sad, though I might be hard-pressed to convince anyone of that assessment had I intervened. These scenes play out again and again, demonstrating that the youngest among us are the most vulnerable and least able to judge when to turn away, to disconnect. Adults fare no better. Schools and device makers alike have succeeded in selling electronics as “educational devices,” but the reality is that instead of exploring the world around us, people get sucked into a virtual world and the glossy fictions displayed on screens. They ultimately become slaves to their own devices. I mourn for their corrupted mindscapes, distorted and ruined by parents and teachers who ought to be wiser but who themselves have been coopted and hollowed out by mass media.

The phrase enlightened self-interest has been used to describe and justify supposed positive results arising over time from individuals acting competitively, as opposed to cooperatively, using the best information and strategies available. One of the most enduring examples is the prisoner’s dilemma, sketched below. Several others have dominated news cycles lately.
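For readers who don’t know the game, here is a minimal sketch of the standard prisoner’s dilemma payoff structure. The specific numbers are conventional textbook values of my choosing, not anything from the original post:

```python
# Standard prisoner's dilemma payoffs as (row player, column player);
# higher is better. These particular values are conventional textbook
# numbers, chosen here only for illustration.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

# Whatever the other player does, defecting pays the row player more...
for other in ("cooperate", "defect"):
    assert PAYOFFS[("defect", other)][0] > PAYOFFS[("cooperate", other)][0]

# ...yet mutual defection leaves both players worse off than mutual
# cooperation: individually rational, collectively self-defeating.
assert PAYOFFS[("defect", "defect")][0] < PAYOFFS[("cooperate", "cooperate")][0]
```

That gap between individually rational choices and collectively self-defeating outcomes is exactly the pattern running through the three examples that follow.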

Something for Nothing

At the Univ. of Maryland, a psychology professor has been offering extra credit of either 2 or 6 points on exams if no more than 10 percent of students elect to receive the higher amount. If more than 10% go for the higher amount, no one gets anything. The final test question, which fails as a marker of student learning or achievement and doesn’t really function so well as a psychology or object lesson, either, went viral when a student tweeted out the question, perplexed by the prof’s apparent cruelty. Journalists then polled average people and found divergence (duh!) between those who believe the obvious choice is 6 pts (or reluctantly, none) and those who righteously characterize 2 pts as “the right thing to do.” It’s unclear what conclusion to draw, but the prof reports that since 2008, only one class got any extra credit by not exceeding the 10% cutoff.
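Out of curiosity, the setup is easy to simulate. A minimal sketch, assuming each student independently opts for the 6 points with some fixed probability; the class size and probabilities are my assumptions, not reported figures:

```python
import random

def extra_credit_success_rate(n_students=100, p_six=0.2, trials=10_000):
    """Fraction of simulated classes that earn any extra credit.

    Each student independently asks for 6 points with probability
    p_six, otherwise 2 points. If more than 10% of the class asks
    for 6, nobody gets anything -- the professor's stated rule.
    Class size and p_six are assumptions for illustration.
    """
    wins = 0
    for _ in range(trials):
        six_takers = sum(random.random() < p_six for _ in range(n_students))
        if six_takers <= 0.10 * n_students:
            wins += 1
    return wins / trials

print(extra_credit_success_rate(p_six=0.20))  # ~0.01: almost no class succeeds
print(extra_credit_success_rate(p_six=0.05))  # ~0.99: restraint must be near-universal
```

Even a modest taste for the larger payoff sinks nearly every class, which squares with the professor’s report of a single success since 2008.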

Roping One’s Eyeballs

This overlong opinion article found in the Religion and Ethics section of the Australian Broadcasting Corporation (ABC) website argues that advertising is a modern-day illustration of the tragedy of the commons:

Expensively trained human attention is the fuel of twenty-first century capitalism. We are allowing a single industry to slash and burn vast amounts of this productive resource in search of a quick buck.

I practice my own highly restrictive media ecology, defending against the fire hose of information and marketing aimed at me (and everyone else) constantly, machine-gun style. So in a sense, I treat my own limited time and attention as a resource not to be squandered on nonsense, but when the issue is scaled up to the level of society, the metaphor is inapt and breaks down. I assert that attention as an exploitable resource functions very differently when considering an individual vs. the masses, which have unique behavioral properties. Still, it’s an interesting idea to consider.

No One’s Bank Run

My last example is the entirely predictable bank runs in Greece, forestalled when banks closed for three weeks and placed withdrawal limits (euphemism: capital controls) on the cash deposits actually held in the vaults. Greek banks have appealed to depositors to trust them — that their deposits are guaranteed and there will be no “haircut” such as occurred in Cyprus — but appeals were met with laughter and derision. Intolerance of further risk is an entirely prudent response, and a complete and rapid flight of capital would no doubt have ensued if it weren’t disallowed.

What these three examples have in common is simple: it matters little what any individual may do, but it matters considerably what everyone does. Institutions and systems typically have enough resilience to weather a few outliers who exceed boundaries (opting for 6 pts, pushing media campaigns to the idiotic point of saturation, or withdrawing all of one’s money from a faltering bank), but when everyone acts according to enlightened self-interest, well, it’s obvious that something’s gotta give. In the examples above, no one gets extra points, no one pays heed to much of anything anymore (or perhaps more accurately, attention is debased and diluted to the point of worthlessness), and banks fail. In the professor’s example, the threshold for negative results is only 10%. Different operating environments probably vary, but the modesty of that number is instructive.

More than a few writers have interpreted the tragedy of the commons on a global level. As a power law, it probably functions better at a feudal level, where resources are predominantly local and society is centered around villages rather than megalopolises and/or nation-states. However, it’s plain to observe, if one pays any attention (good luck with that in our new age of distraction, where everyone is trained to hear only what our own culture instructs, ignoring what nature tells us), that interlocking biological systems worldwide are straining and failing under the impacts of anthropogenic climate change. Heating at the poles and deep in the oceans is where the worst effects are currently being felt, but climate chaos is certainly not limited to out-of-sight, out-of-mind locations. What’s happening in the natural world, however, is absolutely and scarily for real, unlike the bogus stress tests banks undergo to buttress consumer sentiment (euphemism: keep calm and carry on). Our failure to listen to the right messages and heed warnings properly will be our downfall, but we will have lots of company because everyone is doing it.