Posts Tagged ‘Redux’

Third version of this topic. Whereas the previous two were about competing contemporary North American ways of knowing, this one is broader in both time and space.

The May 2019 issue of Harper’s Magazine has a fascinating review of Christina Thompson’s book Sea People: The Puzzle of Polynesia (2019). Beyond the puzzle itself — how did Polynesian people migrate to, settle, and populate the far-flung islands of the Central and South Pacific? — the review hits upon one of my recurring themes on this blog, namely, that human cognition is plastic enough to permit highly divergent ways of knowing.

The review (and book?) is laden with Eurocentric detail about the “discovery” of closely related Polynesian cultures dispersed more widely (geographically) than any other culture prior to the era of mass migration. Indeed, the reviewer chides the author at one point for transforming Polynesia from a subject in its own right into an exotic object of (Western) fascination. This distorted perspective is commonplace and follows from the earlier “discovery” and colonization of North America as though it were not already populated. Cartographers even today are guilty of this Eurocentrism, relegating “empty” expanses of the Pacific Ocean to irrelevance in maps when in fact the Pacific is “the dominant feature of the planet” and contains roughly twenty-five thousand islands (at current sea level? — noting that sea level was substantially lower during the last ice age some 13,000 years ago but is due to rise substantially by the end of this century and beyond, engulfing many of the islands now lying dangerously close to sea level). Similar distortions are needed to squash the spherical (3D) surface of the globe onto planar (2D) maps (e.g., the Mercator projection, which largely ignores the Pacific Ocean in favor of continents; other projections shown here) more easily conceptualized (for Westerners) in terms of coordinate geometry using latitude and longitude (i.e., the Cartesian plane).
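
Since the Mercator projection does so much of the distorting, here is a minimal sketch (my own illustration, not drawn from the review or the book) of the standard projection formulas. The east-west scale stretches as the secant of latitude, which is why high-latitude continents balloon while the equatorial Pacific, where most of Polynesia lies, shrinks by comparison; the central_meridian_deg parameter is a hypothetical knob for re-centering the map on the Pacific rather than the Atlantic.

```python
import math

def mercator(lat_deg, lon_deg, central_meridian_deg=0.0):
    """Project latitude/longitude (degrees) onto Mercator map coordinates.

    x is longitude (in radians) measured from the chosen central meridian;
    y grows without bound toward the poles, the source of the projection's
    famous distortion. Both are in units of Earth radii on a unit sphere.
    """
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg - central_meridian_deg)
    x = lon
    y = math.log(math.tan(math.pi / 4 + lat / 2))
    return x, y

# East-west scale is exaggerated by 1 / cos(latitude): an equatorial island
# is drawn near true size, while land at 70° N is drawn roughly three times
# too wide relative to the equator.
for lat in (0, 40, 70):
    print(f"latitude {lat:2d}°: scale factor {1 / math.cos(math.radians(lat)):.2f}")
```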

The review mentions the familiar dichotomy of grouping a hammer, saw, hatchet, and log in terms of abstract categories (Western thought) vs. utility or practicality (non-Western). Exploration of how different ways of knowing manifest is, according to the review, among the more intellectually exciting parts of the book. That’s the part I’m latching onto. For instance, the review offers this:

Near the middle of Sea People, Thompson explores the ramification of Polynesia as, until contact, an oral culture with “an oral way of seeing.” While writing enables abstraction, distancing, and what we generally call objectivity, the truth of oral cultures is thoroughly subjective. Islands aren’t dots on a map seen from the sky but destinations one travels to in the water.

This is the crux of the puzzle of Polynesians fanning out across the Pacific approximately one thousand years ago. They had developed means of wayfinding in canoes and outriggers without instruments or maps roughly 500 years prior to Europeans crossing the oceans in sailing ships. Perhaps I’m reading too much into the evidence, but abstraction and objectivity as a particular way of knowing, bequeathed to Western Europe via the Enlightenment and development of the scientific method, stunted or delayed exploration of the globe precisely because explorers began with a god’s eye view of the Earth from above rather than from the surface (object vs. subject). In contrast, quoting here from the book rather than the review, Polynesians used

a system known as etak, in which they visualize a “reference island” — which is usually a real island but may also be imaginary — off to one side of the path they are following, about midway between their starting point and their destination. As the journey progresses, this island “moves” under each of the stars in the star path [situated near the horizon rather than overhead], while the canoe in which the voyagers are traveling stays still. Of course, the navigators know that it is the canoe and not the islands that are moving, but this is the way they conceptualize the voyage.
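
To make the frame of reference concrete, here is a crude toy model (mine, not Thompson's, and grossly simplified onto a flat sea) of what etak describes: hold the canoe fixed, and the bearing to a reference island off to one side sweeps steadily from ahead to abeam to astern as the voyage progresses, giving the navigator a running measure of how far along the journey has come. All coordinates below are invented for illustration.

```python
import math

def bearing_deg(from_xy, to_xy):
    """Compass-style bearing (degrees clockwise from 'north') between two points."""
    dx, dy = to_xy[0] - from_xy[0], to_xy[1] - from_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360

# Toy voyage due "north" on a flat sea, with a reference island off to starboard.
start, destination = (0.0, 0.0), (0.0, 100.0)
reference_island = (30.0, 50.0)  # roughly midway along, off to one side

for fraction in (0.0, 0.25, 0.5, 0.75, 1.0):
    canoe = (0.0, fraction * destination[1])
    # In the etak conception the canoe is stationary and the island "moves";
    # numerically that is just the island's changing bearing from the canoe.
    print(f"{fraction:.0%} of voyage: reference island bears {bearing_deg(canoe, reference_island):5.1f}°")
```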

Placing oneself at the center of the world or universe — at least for the purpose of navigation — is a conceptual pose Westerners discarded when heliocentrism gradually replaced geocentrism. (Traveling using GPS devices ironically places the traveler back at the center of the map with terrain shifting around the vehicle, but it’s a poor example of wayfinding precisely because the traveler fobs the real work off onto the device and likely possesses no real understanding of or skill at traversing the terrain beyond following mechanical instructions.) While we Westerners might congratulate ourselves for a more accurate, objective orientation to the stars, its unwitting limitations are worth noting. Recent discoveries regarding human prehistory, especially megalithic stone construction accomplished with techniques still unknown and flatly impossible with modern technology, point to the existence of other ways of knowing lost to contemporary human cultures steadily triangulating on and conforming to Western thought (through the process of globalization). Loss of diversity of ways of knowing creates yet another sort of impoverishment that can only be barely glimpsed since most of us are squarely inside the bubble. Accordingly, it’s not for nothing that some unusually sensitive critics of modernity suggest we’re entering a new Dark Age.

 


As I reread what I wrote 2.5 years ago in my first blog on this topic, I surmise that the only update needed to my initial assessment is a growing pile of events that demonstrate my thesis: our corrupted information environment is too taxing on human cognition, with the result that a small but growing segment of society gets radicalized (wound up like a spring) and relatively random individuals inevitably pop, typically in a self-annihilating gush of violence. News reports bear this out periodically, as one lone-wolf kook after another takes it upon himself (are there any examples of females doing this?) to shoot or blow up some target, typically chosen irrationally or randomly though for symbolic effect. More journalists and bloggers are taking note of this activity and evolving or resurrecting nomenclature to describe it.

The earliest example I’ve found offering nomenclature for this phenomenon is a blog with a single post from 2011 (oddly, no follow-up) describing so-called stochastic terrorism. Other terms include syntactic violence, semantic violence, and epistemic violence, but they all revolve around the same point. Whether on the sending or receiving end of communications, some individuals are particularly adept at or sensitive to dog whistles that over time activate and exacerbate tendencies toward radical ideology and violence. Wired has a brief article from a few days ago discussing stochastic terrorism as jargon, which is basically what I’m doing here. Admittedly, the last of these terms, epistemic violence (alternative: epistemological violence), ranges farther afield from the end effect I’m calling wind-up toys. For instance, this article discussing structural violence is much more academic in character than when I blogged on the same term (one of a handful of “greatest hits” for this blog that return search-engine hits with some regularity). Indeed, just about any of my themes and topics can be given a dry, academic treatment. That’s not my approach (I gather opinions differ on this account, but I insist that real academic work is fundamentally different from my armchair cultural criticism), but it’s entirely valid despite being a bit remote for most readers. One can easily get lost down the rabbit hole of analysis.

If indeed it’s mere words and rhetoric that transform otherwise normal people into criminals and mass murderers, then I suppose I can understand the distorted logic of the far Left that equates words and rhetoric themselves with violence, followed by the demand that they be provided with warnings and safe spaces lest they be triggered by what they hear, read, or learn. As I understand it, the fear is not so much that vulnerable, credulous folks will be magically turned into automatons wound up and set loose in public to enact violent agendas but instead that virulent ideas and knowledge (including many awful truths of history) might cause discomfort and psychological collapse akin to what happens when targets of hate speech and death threats are reduced, say, to quivering agoraphobia. Desire for protection from harm is thus understandable. The problem with such logic, though, is that protections immediately run afoul of free speech, a hallowed but misunderstood American institution that preempts quite a few restrictions many would have placed on the public sphere. Protections also stall learning and truth-seeking straight out of the gate. And besides, preemption of preemption doesn’t work.

In memetics, the notion of a caustic idea taking hold of an unwilling person and having its wicked way with him or her is what’s called a mind virus or meme. The viral metaphor accounts for the infectious nature of ideas as they propagate through the culture. For instance, every once in a while, a charismatic cult emerges and inducts new members, a suicide cluster appears, or suburban housewives develop wildly disproportionate phobias about Muslims or immigrants (or worse, Muslim immigrants!) poised at their doorsteps with intentions of rape and murder. Inflaming these phobias, often done by pundits and politicians, is precisely the point of semantic violence. Everyone is targeted but only a few are affected to the extreme of acting out violently. Milder but still invalid responses include the usual bigotries: nationalism, racism, sexism, and all forms of tribalism, “othering,” or xenophobia that seek to insulate oneself safely among like folks.

Extending the viral metaphor, to protect oneself from infectious ideas requires exposure, not insulation. Think of it as a healthy immune system built up gradually, typically early in life, through slow, steady exposure to harm. The alternative is hiding oneself away from germs and disease, which has the ironic result of weakening the immune system. For instance, I learned recently that peanut allergies can be overcome by gradual exposure — a desensitization process — but are exacerbated by removal of peanuts from one’s environment and/or diet. This is what folks mean when they say the answer to hate speech is yet more (free) speech. The nasty stuff can’t be dealt with properly when it’s quarantined, hidden away, suppressed, or criminalized. Maybe there are exceptions. Science fiction entertains those dangers with some regularity, where minds are shunted aside to become hosts for invaders of some sort. That might be overstating the danger somewhat, but violent eruptions may lend the idea some credence.

In an earlier blog post, I mentioned how killing from a distance is one way among many that humans differentiate themselves from other animals. The practical advantage of weaponry that distances one combatant from another should be obvious. Spears and swords extend one’s reach yet keep the fighting hand-to-hand. Projectiles (bullets, arrows, catapults, artillery, etc.) allow killing from increasingly long distances, with weapons launched into low orbit before raining down ruin being the far extreme. The latest technology is drones (and drone swarms), which remove those who wield them from danger except perhaps the psychological torment that accrues gradually to remote operators. Humans are unique among animals for having devised such clever ways of destroying each other, and in the process, themselves.

I finally got around to seeing the film Black Panther. Beyond the parade of clichés and mostly forgettable punchfest action (interchangeable with any other Marvel film), one particular remark stuck with me. When the warrior general of fictional Wakanda, a female as it happens, went into battle, she dismissed the use of guns as “primitive.” Much is made of Wakanda’s advanced technology, some of it frankly indistinguishable from magic (e.g., the panther elixir). Wakanda’s possession of weaponry not shared with the rest of the world (e.g., invisible planes) is the MacGuffin the villain seeks to control so as to exact revenge on the world and rule over it. Yet the film resorts predictably to punching and acrobatics as the principal mode of combat. Some of that strategic nonsense is attributable to visual storytelling found in both comic books and cinema. Bullets fly too fast to be seen and tracking airborne bombs never really works, either. Plus, a punch thrown by a villain or superhero arguably has some individual character to it, at least until one recognizes that punching leaves no lasting effect on anyone.

As it happens, a similar remark about “primitive” weapons (a blaster) was spat out by Obi-Wan Kenobi in one of the Star Wars prequels (dunno which one). For all the amazing technology at the disposal of those characters long ago in a galaxy far, far away, it’s curious that the weapon of choice for a Jedi knight is a light saber. Again, up close and personal (color coded, even), including actual peril, as opposed to, say, an infinity gauntlet capable of dispatching half a universe with a finger snap. Infinite power clearly drains the stakes out of conflict. Credit goes to George Lucas for recognizing the awesome visual storytelling the light saber offers. He also made blaster shots — the equivalent of flying bullets — visible to the viewer. Laser beams and other lighted projectiles had been done in cinema before Star Wars but never so well.

One of the very best lessons I took from higher education was recognizing and avoiding the intentional fallacy — in my own thinking no less than in that of others. Although the term arguably has more to do with critical theory dealing specifically with texts, I learned about it in relation to abstract fine arts, namely, painting and music. For example, the enigmatic expression of the Mona Lisa by Leonardo Da Vinci continues to spark inquiry and debate. What exactly does that smile mean? Even when words or programs are included in musical works, it’s seductively easy to conclude that the composer intends this or the work itself means that. Any given work purportedly allows audiences to peer into the mind of its creator(s) to interrogate intent. Conclusions thus drawn, however, are notoriously unreliable though commonplace.

It’s inevitable, I suppose, to read intent into artistic expression, especially when purpose feels so obvious or inevitable. Similar excavations of meaning and purpose are undertaken within other domains of activity, resulting in no end of interpretation as to surface and deep strategies. Original intent (also originalism) is a whole field of endeavor with respect to interpretation of the U.S. Constitution and imagining the framers’ intent. Geopolitics is another domain where hindsight analysis results in some wildly creative but ultimately conjectural interpretations of events. Even where authorial (and political) intent is explicitly recorded, such as with private diaries or journals, the possibility of deceptive intent by authors keeps everyone wondering. Indeed, although “fake news” is a modern coinage, a long history of deceptive publishing practice well beyond the adoption of a nom de plume attests to hidden or unknowable intent, making “true intent” a meta property.

The multi-ring circus that the modern information environment has become, especially in the wake of electronic media (e.g., YouTube channels) produced by anyone with a camera and an Internet connection, is fertile ground for those easily ensnared by the intentional fallacy. Several categories of intent projected onto content creators come up repeatedly: profit motive, control of the narrative (no small advantage if one believes this blog post), setting the record straight, correcting error, grandstanding, and trolling for negative attention. These categories are not mutually exclusive. Long ago, I pointed to the phenomenon of arguing on-line and how it typically accomplishes very little, especially as comment threads lengthen and civility breaks down. These days, comments are an Internet legacy and/or anachronism that many content creators persist in offering to give the illusion of a wider discussion but in fact roundly ignore. Most blogs and channels are actually closed conversations. Maybe a Q&A follows the main presentation when held before an audience, but video channels are more often one-way broadcasts addressing an audience but not really listening. Public square discussion is pretty rare.

Some celebrate this new era of broadcasting, noting with relish how the mainstream media is losing its former stranglehold on attention. Such enthusiasm may be transparently self-serving but nonetheless rings true. A while back, I pointed to New Media Rockstars, which traffics in nerd culture entertainment media, but the term could easily be expanded to include satirical news, comedy, and conversational webcasts (also podcasts). Although some folks are rather surprised to learn that an appetite for substantive discussion and analysis exists among the public, I surmise that the shifting media landscape and disintegrated cultural narrative have bewildered a large segment of the public. The young in particular are struggling to make sense of the world, to figure out what to be in life and how to function, and to work out an applied philosophy that eschews more purely academic philosophy.

By way of example of new media, let me point to a trio of YouTube channels I only recently discovered. Some More News parodies traditional news broadcasts by sardonically (not quite the same as satirically) calling bullshit on how news is presented. Frequent musical cues between segments make me laugh. Unlike the mainstream media, which are difficult not to regard as propaganda arms of the government, Some More News is unapologetically liberal and indulges in black humor, which doesn’t make me laugh. Its raw anger and exasperation are actually a little terrifying. The second YouTube channel is Three Arrows, a sober, thorough debunking of news and argumentation found elsewhere in the public sphere. The speaker, who doesn’t appear onscreen, springs into action especially when accusations of current-day Nazism come up. (The current level of debate has devolved to recklessly calling nearly everyone a Nazi at some stage. Zero points scored.) Historical research often puts things into proper context, such as the magnitude of the actual Holocaust compared to some garden-variety racist running his or her mouth comparatively harmlessly. The third YouTube channel is ContraPoints, which is rather fanciful and profane but remarkably erudite considering the overall tone. Labels and categories are explained for those who may not have working definitions at the ready for every phrase or ideology. Accordingly, there is plenty of jargon. The creator also appears as a variety of different characters to embody various archetypes and play devil’s advocate.

While these channels may provide abundant information, correcting error and contextualizing better than most traditional media, it would be difficult to conclude they’re really moving the conversation forward. Indeed, one might wonder why bother preparing these videos considering how time-consuming it has to be to do research, write scripts, assemble pictorial elements, etc. I won’t succumb to the intentional fallacy and suggest I know why they bother holding these nondebates. Further, unless they’re straight-up comedy, I wouldn’t say they’re entertaining exactly, either. Highly informative, perhaps, if one keeps up with the frenetic online pace and/or mines for content (e.g., studying transcripts or following links). Interestingly, within a fairly short period of time, these channels are establishing their own rhetoric, sometimes useful, other times too loose to make strong impressions. It’s not unlike the development of new stylistic gestures in music or painting. What, if anything, worthwhile emerges from the scrum will be interesting to see.

We’re trashing the planet. Everyone gets that, right? I’ve written several posts about trash, debris, and refuse littering and orbiting the planet, one of which is arguably among my greatest hits owing to the picture below showing The Boneyard outside Tucson, Arizona. That particular scene no longer exists as those planes were long ago repurposed.


I’ve since learned that boneyards are a worldwide phenomenon (see this link) falling under the term urbex. Why re-redux? Two recent newsbits attracted my attention. The first is an NPR article about Volkswagen buying back its diesel automobiles — several hundred thousand of them to the tune of over $7 billion. You remember: the ones that scandalously cheated emissions standards and ruined Volkswagen’s reputation. The article features a couple of startling pictures of automobile boneyards, though the vehicles are still well within their usable life (many of them new, I surmise) rather than retired after a reasonable term. Here’s one pic:

The other newsbit is that the Great Pacific Garbage Patch is now as much as 16 times bigger than we thought it was — and getting bigger. Lots of news sites reported on this reassessment. This link is one. In fact, there are multiple garbage patches in the Pacific Ocean, as well as in other oceanic bodies, including the Arctic Ocean where all that sea ice used to be.

Though not specifically about trashing the planet (at least with trash), the Arctic sea ice issue looms large in my mind. Given the preponderance of land mass in the Northern Hemisphere and the Arctic’s foundational role in climate stabilization, the predicted disappearance of sea ice in the Arctic (at least in the summertime) may truly be the unrecoverable climate tipping point. I’m not a scientist and rarely recite data or studies in support of my understandings. Others handle that part of the climate change story far better than I could. However, the layperson’s explanation that makes sense to me is that, like ice floating in a glass of liquid, gradual melting and disappearance of ice keeps the surrounding liquid stable just above freezing. Once the ice is fully melted, however, the surrounding liquid warms rapidly to match ambient temperature. If the temperature of Arctic seawater rises high enough to slow or disallow reformation of winter ice, that could well be the quick, ugly end to things some of us expect.
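
For what it's worth, the ice-in-a-glass analogy can be put in rough numbers. Using standard physical constants (this is a textbook illustration of latent heat, not Arctic data), melting a kilogram of ice soaks up about eighty times the energy needed to warm a kilogram of liquid water by one degree Celsius, which is why the temperature holds near freezing while ice remains and climbs quickly once it is gone.

```python
# Back-of-envelope illustration of the ice-in-a-glass analogy using standard
# physical constants (not climate data): while ice remains, incoming heat goes
# into melting it and the water's temperature barely moves; once the ice is
# gone, the same heat warms the water directly.
LATENT_HEAT_OF_FUSION = 334_000   # joules to melt 1 kg of ice at 0 °C
SPECIFIC_HEAT_OF_WATER = 4_186    # joules to warm 1 kg of water by 1 °C

equivalent_warming = LATENT_HEAT_OF_FUSION / SPECIFIC_HEAT_OF_WATER

print(f"Melting 1 kg of ice absorbs {LATENT_HEAT_OF_FUSION:,} J,")
print(f"enough to warm 1 kg of liquid water by about {equivalent_warming:.0f} °C.")
```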

Long again this time and a bit contentious. Sorry for trying your patience.

Having watched a few hundred Joe Rogan webcasts by now (previous blog on this topic here), I am pretty well acquainted with guests and ideas that cycle through periodically. This is not a criticism as I’m aware I recycle my own ideas here, which is more nearly thematic than simply repetitive. Among all the MMA folks and comedians, Rogan features people — mostly academics — who might be called thought leaders. A group of them has even been dubbed the “intellectual dark web.” I dunno who coined the phrase or established its membership, but the names might include, in no particular order, Jordan Peterson, Bret Weinstein, Eric Weinstein, Douglas Murray, Sam Harris, Jonathan Haidt, Gad Saad, Camille Paglia, Dave Rubin, Christina Hoff Sommers, and Lawrence Krauss. I doubt any of them would have been considered cool kids in high school, and it’s unclear whether they’re any cooler now that they’ve all achieved some level of Internet fame on top of other public exposure. Only a couple seem especially concerned with being thought cool now (names withheld), though the chase for clicks, views, likes, and Patreon support is fairly upfront. That they can usually sit down and have meaningful conversations without rancor (admirably facilitated by Joe Rogan up until one of his own oxen is gored, less admirably by Dave Rubin) about free speech, Postmodernism, social justice warriors, politics, or the latest meme means that the cliquishness of high school has relaxed considerably.

I’m pleased (I guess) that today’s public intellectuals have found an online medium to develop. Lots of imitators are out there putting up their own YouTube channels to proselytize their own opinions. However, I still prefer to get deeper understanding from books (and to a lesser degree, blogs and articles online), which are far better at delivering thoughtful analysis. The conversational style of the webcast is relentlessly up-to-date and entertaining enough but relies too heavily on charisma. And besides, so many of these folks are such fast talkers, often talking over each other to win imaginary debate points or just dominate the conversational space, that they frustrate and bewilder more than they communicate or convince.

Considering that the ongoing epistemological crisis I’ve been blogging about over time is central to the claims and arguments of these folks (though they never quite call it that), I want to focus on the infamous disagreement between Sam Harris and Jordan Peterson on the question of what counts as truth. This conflict immediately put me in mind of C.P. Snow’s lecture The Two Cultures, referring to the sciences and the humanities and how their advocates and adherents frequently lack sufficient knowledge and understanding of the other’s culture. As a result, they talk or argue past each other. Lawrence Krauss provided a brief update almost a decade ago (long before he was revealed to be a creep — accused of sexual misconduct and brought low like so many men over the past year). Being a theoretical physicist, Krauss predictably favors the sciences.


Fully a decade ago, I analyzed, at greater length than I usually allow myself, an article from The New Yorker that examined how media trends were pushing away from literacy (the typographic mind) toward listening and viewing (orality) as primary modes of information gathering and entertainment. The trend was already underway with the advent of radio, cinema, and television, which moved the relatively private experience of silent reading to a public or communal realm as people shared experiences around emerging media. The article took particular aim at TV. In the intervening decade, media continue to contrive new paths of distribution, moving activity back to private information environments via the smart phone and earbuds. The rise of the webcast (still called podcast by some, though that’s an anachronism), which may include a video feed or display a static image over discussion and/or lecture, and streaming services are good examples. Neither has fully displaced traditional media just yet, but the ongoing shift in financial models is a definite harbinger of relentless change.

This comes up again because, interestingly, The New Yorker included, with an article I popped open on the Web, an audio file of the very same article read by someone other than the author. The audio was 40 minutes, whereas the article may have taken me 15 to 20 minutes had I read it. For undisclosed reasons, I listened to the audio. Not at all surprisingly, I found it odd and troublesome. Firstly, though the content was nominally investigative journalism (buttressed by commentary), hearing it read to me made it feel like, well, storytime, as though it were fiction. Secondly, since my eyes weren’t occupied with reading, they sought other things to do and thus fragmented my attention.

No doubt The New Yorker is pandering to folks who would probably not be readers but might well become listeners. In doing so, it’s essentially conceding the fight, admitting that the effort to read is easily eclipsed by the effortlessness of listening. As alternative and unequal modes of transmitting the content of the article, however, it strikes me as an initiative hatched not by writers and editors capable of critical thought and addressing a similarly enabled readership but by a combination of sales and marketing personnel attempting to capture a widening demographic of listeners (read: nonreaders). Navigating to the article might be a modest extra complication, but if a link to the audio file can be tweeted out (I don’t actually know if that’s possible), then I guess the text isn’t truly necessary.

Here is part of what I wrote a decade ago:

If the waning of the typographic mind proceeds, I anticipate that the abstract reasoning and critical thinking skills that are the legacy of Enlightenment Man will be lost except to a few initiates who protect the flame. And with so many other threats cropping up before us, the prospect of a roiling mass of all-but-in-name barbarians ruled by a narrow class of oligarchs does indeed spell the total loss of democracy.

Are we getting perilously close to this dystopia? Maybe not, since it appears that many of those in high office and leadership positions labor under their own failures/inabilities to read at all critically and so execute their responsibilities with about the same credibility as hearsay. Even The New Yorker is no longer protecting the flame.

I recall Nathaniel Hawthorne’s short story The Celestial Railroad railing against the steam engine, an infernal machine that disrupts society (agrarian at that time). It’s a metaphor for industrialization. The newest infernal machine (many candidates have appeared since Hawthorne’s time only to be supplanted by the next) is undoubtedly the smart phone. Its disruption of healthy formation of identity among teenagers has already been well researched and documented. Is it ironic that as an object of our own creation, it’s coming after our minds?

The storms referenced in the earlier version of this post were civilization-ending cataclysms. The succession of North American hurricanes and earthquakes earlier this month, September 2017, was a series of natural disasters. I would say that September was unprecedented in history, but reliable weather records do not extend very far back in human history, and the geological record extending back into human prehistory would suggest that, except perhaps for their concentration within the span of a month, the latest storms are nothing out of the ordinary. Some have even theorized that hurricanes and earthquakes could be interrelated. In the wider context of weather history, this brief period of destructive activity may still be rather mild. Already in the last twenty years we’ve experienced a series of 50-, 100- and 500-year weather events that would suggest exactly what climate scientists have been saying, namely, that higher global average temperatures and more atmospheric moisture will lead to more activity in the category of superstorms. Throw drought, flood, and desertification into the mix. This (or worse, frankly) may have been the old normal when global average temperatures were several degrees warmer during periods of hothouse earth. All indications are that we’re leaving behind garden earth, the climate steady state (with a relatively narrow band of global temperature variance) enjoyed for roughly 12,000 years.
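
As an aside on terminology, a "100-year event" means an event with roughly a one percent chance of occurring in any given year, not one that arrives on schedule. A quick calculation (my own, assuming independent years and unchanging odds, which is precisely what the recent clustering calls into question) shows that even one 500-year event inside a twenty-year span would be a long shot, let alone a series of them.

```python
# Probability of seeing at least one N-year event over a 20-year span,
# assuming independent years and unchanging (stationary) odds.
def chance_of_at_least_one(annual_probability, years):
    return 1 - (1 - annual_probability) ** years

for return_period in (50, 100, 500):
    p = chance_of_at_least_one(1 / return_period, years=20)
    print(f"{return_period}-year event: {p:.0%} chance of at least one in 20 years")
```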

Our response to the latest line of hurricanes that struck the Gulf, Florida, and the Caribbean has been characterized as a little tepid considering we had the experience of Katrina from which to learn and prepare, but I’m not so sure. True, hurricanes can be seen hundreds of miles and days away, allowing folks opportunity to either batten down the hatches or flee the area, but we have never been able to handle mass exodus, typically via automobile, and the sheer destructive force of the storms overwhelms most preparations and delays response. So after Katrina, it appeared for several days that the federal government’s response was basically this: you’re on your own. That apparent response occurred again, especially in Puerto Rico, which, like New Orleans, quickly devolved into a true humanitarian crisis (one not yet over). Our finding (in a more charitable assessment on my part) is that despite foreknowledge of the event and past experience with similar events, we can’t simply swoop in and smooth things out after the storms. Even the first steps of recovery take time.

I’ve cautioned that rebuilding on the same sites, with the reasonable expectation of repeat catastrophes in a destabilized climate that will spawn superstorms reducing entire cities to garbage heaps, is a poor option. No doubt we’ll do it anyway, at least partially; it’s already well underway in Houston. I’ve also cautioned that we need to brace for a diaspora as climate refugees abandon destroyed and inundated cities and regions. It’s already underway with respect to Puerto Rico. This is a storm of an entirely different sort (a flood, actually) and can also be seen from hundreds of miles and weeks, months, years away. And like superstorms, a diaspora from the coasts, because of the overwhelming force and humanitarian crisis it represents, is not something for which we can prepare adequately. Still, we know it’s coming, like a 20- or 50-year flood.

My previous entry on this topic is found here. The quintessential question asked with regard to education (often levied against educators) is “Why can’t Johnnie read?” I believe we now have several answers.

Why Bother With Basics?

A resurrected method of teaching readin’ and writin’ (from the 1930s as it happens) is “freewriting.” The idea is that students who experience writer’s block should dispense with basic elements such as spelling, punctuation, grammar, organization, and style to simply get something on the page, coming back later to revise and correct. I can appreciate the thinking, namely, that students so paralyzed from an inability to produce finished work extemporaneously should focus first on blasting something onto the page. Whether those who use freewriting actually go back to edit (as I do) is unclear, but it’s not a high hurdle to begin with proper rudiments.

Why Bother Learning Math?

Michigan State University has dropped the algebra requirement from its general education requirements. Considering that algebra is a basic part of most high school curricula, jettisoning algebra from the university core curriculum is astonishing. Again, it’s not a terribly high bar to clear, but for someone granted a degree from an institution of higher learning to fail to do so is remarkable. Though the rationalization offered at the link above is fairly sophisticated, it sounds more like Michigan State is just giving up asking its students to bother learning. The California State University system has adopted a similar approach. Wayne State University also dropped its math requirement and upped the ante by recommending a new diversity requirement (all the buzz with social justice warriors).

Why Bother Learning Music?

The Harvard Crimson reports changes to the music curriculum, lowering required courses for the music concentration from 13 to 10. Notably, most of the quotes in the article are from students relieved to have fewer requirements to satisfy. The sole professor quoted makes a bland, meaningless statement about flexibility. So if you want a Harvard degree with a music concentration, the bar has been lowered. But this isn’t educational limbo, where the difficulty is increased as the bar goes down; it’s a change from higher education to not-so-high-anymore education. Not learning very much about music has never been an impediment to success, BTW. Lots of successful musicians don’t even read music.

Why Bother Learning History?

According to some conservatives, U.S. history curricula, in particular this course offered by The College Board, teach what’s bad about America and undermine American exceptionalism. In 2015, the Oklahoma House Common Education Committee voted 11-4 for emergency House Bill 1380 (authored by Rep. Dan Fisher) “prohibiting the expenditure of funds on the Advanced Placement United States History course.” This naked attempt to sanitize U.S. history and substitute preferred (patriotic) narratives is hardly a new phenomenon in education.

Takeaway

So why can’t Johnnie read, write, know, understand, or think? Simply put, because we’re not bothering to teach him to read, write, know, understand, or think. Johnnie has instead become a consumer of educational services and a political football. Has lowering standards ever been a solution to the struggle of getting a worthwhile education? Passing students through just to be rid of them (while collecting tuition) has only produced a mass of miseducated graduates. Similarly, does a certificate, diploma, or granted degree mean anything as a marker of achievement if students can’t be bothered to learn time-honored elements of a core curriculum? The real shocker, of course, is massaging the curriculum itself (U.S. history in this instance) to produce citizens ignorant of their own past and compliant with the jingoism of the present.

When I first wrote about this topic back in July 2007, I had only just learned of the Great Pacific Garbage Patch (and similar garbage gyres in other oceans). Though I’d like to report simply that nothing has changed, the truth is that conditions have worsened. Some commentators have contextualized the issue by observing that the Earth, the environment, the ecosphere, the biosphere, Gaia, or whatever one wishes to call the natural world has always been under assault by humans, that we’ve never truly lived in balance with nature. While that perspective may be true in a literal sense, I can’t help gnashing my teeth over the sheer scale of the assault in the modern industrial age (extending back 250+ years but really getting going once the steam engine was utilized widely). At that point, production and population curves angled steeply upwards, where they continue to point as though there were no biophysical limits to growth or to the amount and degree of destruction that can be absorbed by the biosphere. Thus, at some undetermined point, industrial scale became planetary scale and humans became terraformers.

News reports came in earlier this month that the remote and uninhabited (by humans) Henderson Island in the Pacific is now an inadvertent garbage dump, with estimates of over 17 tons of debris littering its once-pristine shores.


This despoliation is a collateral effect of human activity, not the predictable result of direct action, such as with the Alberta Tar Sands, another ecological disaster (among many, many others). In the U.S., the Environmental Protection Agency (EPA) describes its mission as protecting human health and the environment and has established a Superfund to clean up contaminated sites. Think of this as a corporate subsidy, since the principal contaminators typically inflict damage in the course of doing business and extracting profit then either move on or cease to exist. Standard Oil is one such notorious entity. Now that the EPA is in the process of being defunded (and presumably on its way to being deauthorized) by the current administration of maniacs, the ongoing death-by-a-thousand-cuts suffered by the natural world will likely need to be revised to death-by-millions-of-cuts, a heedless acceleration of the death sentence humans have set in motion. In the meantime, industry is being given a freer hand to pollute and destroy. What could possibly go wrong?

If all this weren’t enough, another development darkened my brow recently: the horrific amount of space debris from decades of missions to put men, communications and surveillance satellites, and (one would presume) weapons in orbit. (Maybe the evil brainchild of inveterate cold warriors known unironically as “Star Wars” never actually came into being, but I wouldn’t place any bets on that.) This video from the Discovery Network gives one pause, no?

Admittedly, the dots are not actual size and so would not be as dense or even visible from the point of view of the visualization, but the number of items (20,000+ pieces) is pretty astonishing. (See this link as well.) This report describes some exotic technologies being bandied about to address the problem of space junk. Of course, that’s just so that more satellites and spacecraft can be launched into orbit as private industry takes on the mantle once enjoyed exclusively by NASA and the Soviet space program. I suppose the explorer’s mindset never diminishes even as the most remote places on and now around Earth are no longer untouched but littered with human refuse.