Posts Tagged ‘Education’

Among the many complaints that cross my path in the ongoing shitshow that American culture has become is an article titled “The Tragic Decline of Music Literacy (and Quality),” authored by Jon Henschen. His authorship is a rather unexpected circumstance, since he is described as a financial advisor rather than an authority on music, technology, or culture. Henschen’s article reports on (without linking to, as I do) an analysis by Joan Serrà, a postdoctoral scholar at the Artificial Intelligence Research Institute, and his colleagues. Curiously, the analysis has been reported on and repackaged by quite a few news sites and blogs since its publication in 2012. For example, the YouTube video embedded below makes many of the same arguments and cites the so-called Millennial Whoop, a hook or gesture now ubiquitous in pop music that’s kinda sorta effective until one recognizes it too manifestly and it begins to sound trite, then irritating.

I won’t recount or summarize the arguments except to say that neither the Henschen article nor the video discusses the underlying musical issues quite the way a trained musician would. Both are primarily quantitative rather than qualitative, equating an observed decrease in variety of timbre, loudness, and pitch/harmony with worse music (less is more worse). Lyrical (or poetical) complexity has also retreated. It’s worth noting, too, that the musical subject is narrowed to recorded pop music from 1955 to 2010. There’s obviously a lot to know about pop music, but it’s not generally the subject of serious study among academic musicians. AFAIK, no accredited music school offers degrees in pop music. Berklee College of Music probably comes the closest. (How exactly does songwriting as a major differ from composition?) That standard may be relaxing.

Do quantitative arguments demonstrate degradation of pop music? Do reduced variety, range, and experimentation make pop music the equivalent of a paint-by-numbers image with a self-imposed limitation allowing only unmixed primary colors? Hard to say, especially if one (like me) has a traditional education in art music and already regards pop music as a rather severe degradation of better music traditions. Reduction of the artistic palette from the richness and variety of, say, 19th-century art music proceeded through the 20th century (i.e., musical composition is now understood by the lay public to mean songs, which is just one musical genre among many) to a highly refined hit-making formula that has been proven to work remarkably well. Musical refinements also make use of new technological tools (e.g., rhythm machines, autotune, digital soundfield processing), which is another whole discussion.

Musical quality isn’t mere quantity (where more is clearly better), however, and some manage pretty well with limited resources. Still, a sameness or blandness is evident and growing within a genre already rather narrowly restricted to drums, guitars, keyboards, and vocals. The antidote Henschen suggests (incentivizing musical literacy and participation, especially in schools) might prove salutary, but such recommendations are ubiquitous throughout modern history. The magical combination of factors that actually catalyzes creativity, as opposed to degradation, remains elusive. Despite impassioned pleas not to allow quality to disappear, nothing could be more obvious than that culture drifts according to its own whims (to anthropomorphize) rather than being steered by well-meaning designs.

More to say in part 2 to follow.


Heard a curious phrase used with some regularity lately, namely, that “we’ve Nerfed the world.” Nerf refers to the soft, foam toys popular in the 70s and beyond that made balls and projectiles essentially harmless. The implication of the phrase is that we’ve become soft and vulnerable as a result of removing the routine hazards (physical and psychological) of existence. For instance, in the early days of cell phones, I recall padded street poles (like endzone goalposts) to prevent folks with their attention fixated too intently on their phones from harming themselves when stumbling blindly down the sidewalk.

Similarly, anti-bullying sentiment has reached fever pitch such that no level of discomfort (e.g., simple name calling) can be tolerated lest the victim be scarred for life. The balancing point between preparing children for the competitive realities of the world and protecting their innocence and fragility has accordingly moved heavily in favor of the latter. Folks who never develop the resilience to suffer even modest hardships are snowflakes, and they agitate these days on college campuses (and increasingly in workplaces) to withdraw into safe spaces where their beliefs are never challenged and experiences are never challenging. The other extreme is a hostile, cruel, or at least indifferent world where no one is offered support or opportunity unless he or she falls within some special category, typically connected through family to wealth and influence. Those are the entitled.

A thermostatic response (see Neil Postman for more on this metaphor) is called for here. When things veer too far toward one extreme or the other, a correction is inevitable. Neither extreme is healthy for a functioning society, though the motivations are understandable. Either toughen people up by providing challenge, which risks brutalizing them unnecessarily, or protect them from the rigors of life or the consequences of their own choices to such a degree that they become dependent or dysfunctional. Where the proper balance lies is a question for the ages, but I daresay most would agree it’s somewhere squarely in the middle.

Jonathan Haidt and Greg Lukianoff have a new book out called The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure (2018), which is an expansion of an earlier article in The Atlantic of the same title. (Both are callbacks to Allan Bloom’s notorious The Closing of the American Mind (1987), which I’ve read twice. Similar reuse of a famous title references Robert Bork’s Slouching Toward Gomorrah (1996).) I haven’t yet read the new book and doubt I will bother, but I read the source article when it came out. I also don’t work on a college campus and can’t judge the contemporary mood compared to when I was an undergraduate, but I’m familiar with the buzzwords and intellectual fashions reported by academics and journalists. My alma mater is embroiled in these battles, largely in connection with identity politics. I’m also aware of detractors who believe the claims of Haidt and Lukianoff (and others) are essentially hysteria limited to a narrow group of progressive colleges and universities.

As with other cultural developments that lie outside my expertise, I punt when it comes to offering (too) strong opinions. However, with this particular issue, I can’t help but think the two extremes coexist. The noisy students attending highly competitive institutions of higher education lead relatively privileged lives compared to those outside the academy, whereas high school grads and dropouts not on that track (and indeed grads of less elite schools) frequently struggle getting their lives going in early adulthood. Most of us face that struggle early on, but success, despite nonsensical crowing about the “best economy ever” from the Oval Office, is difficult to achieve now as the broad socioeconomic middle is pushed toward the upper and lower margins (mostly lower). Stock market notwithstanding, economic reality is frankly indifferent to ideology.

One of the very best lessons I took from higher education was recognizing and avoiding the intentional fallacy — in my own thinking no less than in that of others. Although the term arguably has more to do with critical theory dealing specifically with texts, I learned about it in relation to abstract fine arts, namely, painting and music. For example, the enigmatic expression of the Mona Lisa by Leonardo Da Vinci continues to spark inquiry and debate. What exactly does that smile mean? Even when words or programs are included in musical works, it’s seductively easy to conclude that the composer intends this or the work itself means that. Any given work purportedly allows audiences to peer into the mind of its creator(s) to interrogate intent. Conclusions thus drawn, however, are notoriously unreliable though commonplace.

It’s inevitable, I suppose, to read intent into artistic expression, especially when purpose feels so obvious or inevitable. Similar excavations of meaning and purpose are undertaken within other domains of activity, resulting in no end of interpretation as to surface and deep strategies. Original intent (also originalism) is a whole field of endeavor with respect to interpretation of the U.S. Constitution and imagining the framers’ intent. Geopolitics is another domain where hindsight analysis results in some wildly creative but ultimately conjectural interpretations of events. Even where authorial (and political) intent is explicitly recorded, such as with private diaries or journals, the possibility of deceptive intent by authors keeps everyone wondering. Indeed, although “fake news” is modern coin, a long history of deceptive publishing practice well beyond the adoption of a nom de plume attests to hidden or unknowable intent, making “true intent” a meta-property.

The multi-ring circus that the modern information environment has become, especially in the wake of electronic media (e.g., YouTube channels) produced by anyone with a camera and an Internet connection, is fertile ground for those easily ensnared by the intentional fallacy. Several categories of intent projected onto content creators come up repeatedly: profit motive, control of the narrative (no small advantage if one believes this blog post), setting the record straight, correcting error, grandstanding, and trolling for negative attention. These categories are not mutually exclusive. Long ago, I pointed to the phenomenon of arguing on-line and how it typically accomplishes very little, especially as comment threads lengthen and civility breaks down. These days, comments are an Internet legacy and/or anachronism that many content creators persist in offering to give the illusion of a wider discussion but in fact roundly ignore. Most blogs and channels are actually closed conversations. Maybe a Q&A follows the main presentation when held before an audience, but video channels are more often one-way broadcasts addressing an audience but not really listening. Public square discussion is pretty rare.

Some celebrate this new era of broadcasting, noting with relish how the mainstream media is losing its former stranglehold on attention. Such enthusiasm may be transparently self-serving but nonetheless rings true. A while back, I pointed to New Media Rockstars, which traffics in nerd culture entertainment media, but the term could easily be expanded to include satirical news, comedy, and conversational webcasts (also podcasts). Although some folks are rather surprised to learn that an appetite for substantive discussion and analysis exists among the public, I surmise that the shifting media landscape and disintegrated cultural narrative have bewildered a large segment of the public. The young in particular are struggling to make sense of the world, to figure out what to be in life and how to function, and to work out an applied philosophy that eschews more purely academic philosophy.

By way of example of new media, let me point to a trio of YouTube channels I only recently discovered. Some More News parodies traditional news broadcasts by sardonically (not quite the same as satirically) calling bullshit on how news is presented. Frequent musical cues between segments make me laugh. Unlike the mainstream media, which are difficult not to regard as propaganda arms of the government, Some More News is unapologetically liberal and indulges in black humor, which doesn’t make me laugh. Its raw anger and exasperation are actually a little terrifying. The second YouTube channel is Three Arrows, a sober, thorough debunking of news and argumentation found elsewhere in the public sphere. The speaker, who doesn’t appear onscreen, springs into action especially when accusations of current-day Nazism come up. (The current level of debate has devolved to recklessly calling nearly everyone a Nazi at some stage. Zero points scored.) Historical research often puts things into proper context, such as the magnitude of the actual Holocaust compared to some garden-variety racist running his or her mouth comparatively harmlessly. The third YouTube channel is ContraPoints, which is rather fanciful and profane but remarkably erudite considering the overall tone. Labels and categories are explained for those who may not have working definitions at the ready for every phrase or ideology. Accordingly, there is plenty of jargon. The creator also appears as a variety of different characters to embody various archetypes and play devil’s advocate.

While these channels may provide abundant information, correcting error and contextualizing better than most traditional media, it would be difficult to conclude they’re really moving the conversation forward. Indeed, one might wonder why they bother preparing these videos considering how time-consuming it has to be to do research, write scripts, assemble pictorial elements, etc. I won’t succumb to the intentional fallacy and suggest I know why they bother holding these nondebates. Further, unless they’re straight-up comedy, I wouldn’t say they’re entertaining exactly, either. Highly informative, perhaps, if one pays close attention to the frenetic online pace and/or mines for content (e.g., studying transcripts or following links). Interestingly, within a fairly short period of time, these channels are establishing their own rhetoric, sometimes useful, other times too loose to make strong impressions. It’s not unlike the development of new stylistic gestures in music or painting. What, if anything, worthwhile emerges from the scrum will be interesting to see.

If the previous blog in this series was about how some ideas and beliefs become lodged or stuck in place (fixity bias), this one is about how other ideas are notoriously mutable (flexibility bias), especially the latest, loudest thing to turn one’s head and divert attention. What makes any particular idea (or is it the person?) prone to one bias or another (see this list) is mysterious to me, but my suspicion is that a character disposition toward openness and adherence to authoritative evidence figure prominently in the case of shifting opinion. In fact, this is one of the primary problems with reason: if evidence can be deployed in favor of an idea, those who consider themselves “reasonable” and thus rely on accumulation of evidence and argumentation to sharpen their thinking are vulnerable to the latest “finding” or study demonstrating sumpinorutha. It’s the intellectual’s version of “New! Improved!”

Sam Harris exploits rationalism to argue against the existence of free will, saying that if sufficient evidence can be brought to bear, a disciplined thinker is compelled to subscribe to the conclusions of reasoned argument. Choice and personal agency (free will) are removed. I find that an odd way to frame the issue. Limitless examples of lack of choice are nonequivalent to the destruction of free will. For example, one can’t decide not to believe in gravity and fly up into the air more than a few inches. One can’t decide that time is an illusion (as theoretical physicists now instruct) and decide not to age. One can’t decide that pooping is too disgusting and just hold it all in (as some children attempt). Counter-evidence doesn’t even need to be argued because almost no one pretends to believe such nonsense. (Twisting one’s mind around to believe in the nonexistence of time, free will, or the self seems to be the special province of hyper-analytical thinkers.) Yet other types of belief/denial — many of them conspiracy theories — are indeed choices: religion, flat Earth, evolution, the Holocaust, the moon landings, 9/11 truth, who really killed JFK, etc. Lots of evidence has been mustered on different sides (multiple facets, actually) of each of these issues, and while rationalists may be compelled by a preponderance of evidence in favor of one view, others are free to fly in the face of that evidence for reasons of their own or adopt by default the dominant narrative and not worry or bother so much.

The public struggles to grasp truthful information, as reported in a Pew Research Center study called “Distinguishing Between Factual and Opinion Statements in the News.” Here’s the snapshot:

The main portion of the study, which measured the public’s ability to distinguish between five factual statements and five opinion statements, found that a majority of Americans correctly identified at least three of the five statements in each set. But this result is only a little better than random guesses. Far fewer Americans got all five correct, and roughly a quarter got most or all wrong.
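To see why “at least three of the five” barely beats chance, note that a pure guesser classifying each statement as fact or opinion by coin flip gets at least three of five right half the time. Here is a minimal sketch of that arithmetic (my own illustration, not part of the Pew study):

```python
# Probability that random fact-or-opinion guesses get at least 3 of 5 statements right,
# assuming an even 50/50 coin flip per statement (binomial with n = 5, p = 0.5).
from math import comb

p_at_least_3 = sum(comb(5, k) for k in range(3, 6)) / 2**5
print(p_at_least_3)  # 0.5 -- so "a majority got at least three right" is only a little better than chance
```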

Indiscriminate adoption by many Americans of a faulty viewpoint, or more pointedly, of the propaganda and “fake news” on offer throughout the information environment, carries the implication that disciplined thinkers are less confused about truth or facts, taking instead a rational approach as the basis for belief. However, I suggest that reason suffers its own frailties not easily recognized or acknowledged. In short, we’re all confused, though perhaps not hopelessly so. For several years now, I’ve sensed the outline of a much larger epistemological crisis where quintessential Enlightenment values have come under open attack. The irony is that the wicked stepchild of science and reason — runaway technology — is at least partially responsible for this epochal conflict. It’s too big an idea to grok fully or describe in a paragraph or two, so I’ll simply point to it and move on.

My own vulnerability to flexibility bias manifests specifically in response to appeals to authority. Although well educated, a lifelong autodidact, and an independent thinker, I’m careful not to succumb to the hubris of believing I’ve got it all figgered. Indeed, it’s often said that as one gains expertise and experience in the world, the certainty of youth yields to caution precisely because the mountain of knowledge and understanding one lacks looms larger even as one accumulates wisdom. Bodies of thought become multifaceted and all arguments must be entertained. When an expert, researcher, or academic proposes something outside my wheelhouse, I’m a sitting duck: I latch onto the latest, greatest utterance as the best truth yet available. I don’t fall for it nearly so readily with journalists, but I do recognize that some put in the effort and gain specialized knowledge and context well outside the bounds of normal life, such as war reporters. Various perverse incentives deeply embedded in the institutional model of journalism, especially those related to funding, make it nearly impossible to maintain one’s integrity without becoming a pariah, so only a handful have kept my attention. John Pilger, Chris Hedges, and Matt Taibbi figure prominently.

By way of example, one of the topics that has been of supreme interest to me, though its historical remove renders it rather toothless now, is the cataclysm(s) that occurred at the conclusion of the last ice age roughly 12,000 years ago. At least three hypotheses (of which I’m aware) have been proposed to explain why glacial ice disappeared suddenly over the course of a few weeks, unleashing the Biblical Flood: Earth crust displacement, asteroidal impact(s), and coronal mass ejection(s). As with most hypotheses, the evidence is both physical and conjectural, but a sizable body of evidence and argumentation for each is available. As I became familiar with each, my head turned and I became a believer, sorta. Rather than “last one is the rotten egg,” however, the most recent hypothesis typically displaces the previous one. No doubt another will appear to turn my head and disorient me further. With some topics, especially politics, new information piling on top of old is truly dizzying. And as with many topics I’ve written about, I simply lack the expertise to referee competing claims, so whatever beliefs I eventually adopt are permanently provisional.

Finally, my vulnerability to authoritative appeal also extends to the calm, unflappable tone and complex constructions of speakers such as Sam Harris, Steven Pinker, and Charles Murray. Their manner of speaking is sometimes described pejoratively as “academese,” though only Pinker has a teaching position. Murray in particular relies heavily on psychometrics, which may not be outright lying with statistics but allows him to rationalize (literally) extraordinarily taboo subjects. In contrast, it’s easy to disregard pundits and press agents foaming and fulminating over their pet narratives. Yet I also recognize that with academese, I’m being soothed more by style than by substance, a triumph of form over function. In truth, this communication style is an appeal to emotion masquerading as an appeal to authority. I still prefer it, just as I prefer a steady, explanatory style of journalism over the snarky, reinterpretive style of disquisition practiced by many popular media figures. What communicates most effectively to me and (ironically) pushes my emotional buttons also weakens my ability to discriminate and think properly.

Yet still more to come in part 5.

Language acquisition in early childhood is aided by heavy doses of repetition and the memorable structure of nursery rhymes, songs, and stories that are repeated ad nauseam to eager children. Please, again! Again, again … Early in life, everything is novel, so repetition and fixity are positive attributes rather than causes for boredom. The music of one’s adolescence is also the subject of endless repetition, typically through recordings (radio and Internet play, mp3s played over headphones or earbuds, dances and dance clubs, etc.). Indeed, most of us have mental archives of songs heard over and over to the point that the standard version becomes canonical: that’s just the way the song goes. When someone covers a Beatles song, it’s recognizably the same song, yet it’s not the same and may even sound wrong somehow. (Is there any acceptable version of Love Shack besides that of the B-52’s?) Variations of familiar folk tales and folk songs, or different phrasing in The Lord’s Prayer, imprinted in memory through sheer repetition, also possess discomfiting differences, sometimes offensive enough to cause real conflict. (Not your Abrahamic deity, mine!)

Performing musicians traverse warhorses many times in rehearsal and public performance so that, after an undetermined point, how one performs a piece just becomes how it goes, admitting few alternatives. Casual joke-tellers may improvise over an outline, but as I understand it, the pros hone and craft material over time until very little is left to chance. Anyone who has listened to old comedy recordings of Bill Cosby, Steve Martin, Richard Pryor, and others has probably learned the jokes (and timing and intonation) by heart — again through repetition. It’s strangely comforting to be able to go back to the very same performance again and again. Personally, I have a rather large catalogue of classical music recordings in my head. I continue to seek out new renditions, but often the first version I learned becomes the default version, the way something goes. Dislodging that version from its definitive status is nearly impossible, especially when it’s the very first recording of a work (like a Beatles song). This is also why live performance often pales in comparison with the studio recording.

So it goes with a wide variety of phenomena: what is first established as how something goes easily becomes canonical, dogmatic, and unquestioned. For instance, the origin of the universe in the big bang is one story of creation to which many still hold, while various religious creation myths hold sway with others. News that the big bang has been dislodged from its privileged position goes over just about as well as dismissing someone’s religion. Talking someone out of a fixed belief is hardly worth the effort because some portion of one’s identity is anchored to such beliefs. Thus, to question a cherished belief is to impeach a person’s very self.

Political correctness is the doctrine that certain ideas and positions have been worked out effectively and need (or allow) no further consideration. Just subscribe and get with the program. Don’t bother doing the mental work or examining the issue oneself; things have already been decided. In science, steady evidentiary work to break down a fixed understanding is often thankless, or thanks arrives posthumously. This is the main takeaway of Thomas Kuhn’s The Structure of Scientific Revolutions: paradigms are changed as much through attrition as through rational inquiry and accumulation of evidence.

One of the unanticipated effects of the Information and Communications Age is the tsunami of information to which people have ready access. Shaping that information into a cultural narrative (not unlike a creation myth) is either passive (one accepts the frequently shifting dominant paradigm without compunction) or active (one investigates for oneself as an attribute of the examined life, which for wizened folks never really arrives at a destination, since it’s the journey that’s the point). What’s a principled rationalist to do in the face of a surfeit of alternatives available for or even demanding consideration? Indeed, with so many self-appointed authorities vying for control over cultural narratives like the editing wars on Wikipedia, how can one avoid the dizzying disorientation of gaslighting and mendacity so characteristic of the modern information environment?

Still more to come in part 4.

Fully a decade ago, I analyzed, at greater length than I usually allow myself, an article from The New Yorker that examined how media trends were pushing away from literacy (the typographic mind) toward listening and viewing (orality) as primary modes of information gathering and entertainment. The trend was already underway with the advent of radio, cinema, and television, which moved the relatively private experience of silent reading into a public or communal realm as people shared experiences around emerging media. The article took particular aim at TV. In the intervening decade, media continue to contrive new paths of distribution, moving activity back to private information environments via the smart phone and earbuds. The rise of the webcast (still called podcast by some, though that’s an anachronism), which may include a video feed or display a static image over discussion and/or lecture, and of streaming services are good examples. Neither has fully displaced traditional media just yet, but the ongoing shift in financial models is a definite harbinger of relentless change.

This comes up again because, interestingly, The New Yorker included with an article I popped open on the Web an audio file of the very same article read by someone other than the author. The audio ran 40 minutes, whereas the article might have taken me 15 to 20 minutes to read. For undisclosed reasons, I listened to the audio. Not at all surprisingly, I found it odd and troublesome. Firstly, though the content was nominally investigative journalism (buttressed by commentary), hearing it read to me made it feel like, well, storytime, as though it were fiction. Secondly, since my eyes weren’t occupied with reading, they sought other things to do and thus fragmented my attention.

No doubt The New Yorker is pandering to folks who would probably not be readers but might well become listeners. In doing so, it’s essentially conceding the fight, admitting that the effort to read is easily eclipsed by the effortlessness of listening. Offering the audio as an alternative and unequal mode of transmitting the article’s content, however, strikes me as an initiative hatched not by writers and editors capable of critical thought and addressing a similarly enabled readership but by a combination of sales and marketing personnel attempting to capture a widening demographic of listeners (read: nonreaders). Navigating to the article might be a modest extra complication, but if a link to the audio file can be tweeted out (I don’t actually know if that’s possible), then I guess the text isn’t truly necessary.

Here’s part of what I wrote a decade ago:

If the waning of the typographic mind proceeds, I anticipate that the abstract reasoning and critical thinking skills that are the legacy of Enlightenment Man will be lost except to a few initiates who protect the flame. And with so many other threats cropping up before us, the prospect of a roiling mass of all-but-in-name barbarians ruled by a narrow class of oligarchs does indeed spell the total loss of democracy.

Are we getting perilously close to this dystopia? Maybe not, since it appears that many of those in high office and leadership positions labor under their own failures/inabilities to read at all critically and so execute their responsibilities with about the same credibility as hearsay. Even The New Yorker is no longer protecting the flame.

I recall Nathaniel Hawthorne’s short story The Celestial Railroad railing against the steam engine, an infernal machine that disrupts society (agrarian at that time). It’s a metaphor for industrialization. The newest infernal machine (many candidates have appeared since Hawthorne’s time only to be supplanted by the next) is undoubtedly the smart phone. Its disruption of healthy identity formation among teenagers has already been well researched and documented. Is it ironic that as an object of our own creation, it’s coming after our minds?

I saw something a short while back that tweaked my BS meter into the red: the learning pyramid. According to research done by The NTL Institute for Applied Behavioral Science in the 1960s (… behaviorists, ugh) and reported here (among other places), there are widely disparate rates of knowledge retention across different passive and active teaching methods:

[Image: the learning pyramid]

Let me state first something quite obvious: learning and retention (memory) aren’t the same things. If one seeks sheer retention of information as a proxy for learning, that’s a gross misunderstanding of both cognition and learning. For example, someone who has managed to memorize, let’s say, baseball statistics going back to the 1950s or Bible verses, may have accomplished an impressive mental task not at all aligned with normal cognitive function (the leaky bucket analogy is accurate), but neither example qualifies someone as learned the way most understand the term. Information (especially raw data) is neither knowledge, understanding, nor wisdom. They’re related, sure, but not the same (blogged about this before here). Increasing levels of organization and possession are required to reach each threshold.

The passive/active (participatory) labels are also misleading. To participate actively, one must have something to contribute, to be in possession of knowledge/skill already. To teach something effectively, one must have greater expertise than one’s students. Undoubtedly, teaching others solidifies one’s understanding and expertise, and further learning is often a byproduct, but one certainly can’t begin learning a new subject area by teaching it. Information (input) needs to come from elsewhere, which understandably has a lower retention rate until it’s been gone over repeatedly and formed the cognitive grooves that represent acquisition and learning. This is also the difference between reception and expression in communications. One’s expressive vocabulary (the words one can actually deploy in speech and writing) is a subset of one’s receptive vocabulary (the words one can understand readily upon hearing or reading). The expressive vocabulary is predicated on prior exposure that imbues less common words with power, specificity, and nuance. While it’s possible to learn new words quickly (in small quantities), it’s not generally possible to skip repetition that enables memorization and learning. Anyone studying vocabulary lists for the SAT/ACT (as opposed to a spelling bee) knows this intuitively.

Lastly, where exactly is most prospective knowledge and skill located, inside the self or outside? One doesn’t simply peel back layers of the self to reveal knowledge. Rather, one goes out into the world and seeks it (or doesn’t, sadly), disturbing it from its natural resting place. The great repositories of knowledge are books and other people (especially those who write books — whoda thunk?). So confronting knowledge, depending on the subject matter, occurs more efficiently one-on-one (an individual reading a book) or in groups (25 or so students in a classroom headed by 1 teacher). The inefficient 1:1 ratio between student and teacher (a/k/a tutoring) is obviously available only to those who place a high enough value on learning to hire a tutor. However, that’s not how education (primary through postgraduate) is undertaken in most cases. And just imagine the silliness of gathering a classroom of students to teach just for one person to learn with 90% retention, as the learning pyramid would suggest.

Another modest surprise (to me at least) offered by Anthony Giddens (from The Consequences of Modernity) follows a discussion of reflexivity (what I call recursion when discussing consciousness), which is the dynamic of information and/or knowledge feeding back to influence later behavior and information/knowledge. His handy example is the populace knowing divorce rates, which has an obvious influence on those about to get married (who may then decide to defer or abjure entirely). The surprise is this:

The discourse of sociology and the concepts, theories, and findings of the other social sciences continually “circulate in and out” of what it is that they are about. In so doing they reflexively restructure their subject matter, which itself has learned to think sociologically … Much that is problematic in the position of the professional sociologist, as the purveyor of expert knowledge about social life, derives from the fact that she or he is at most one step ahead of enlightened lay practitioners of the discipline. [p. 43]

I suppose “enlightened lay practitioners” are not the same as the general public, which I hold in rather low esteem as knowing (much less understanding) anything of value. Just consider electoral politics. Still, the idea that an expert in an academic field admits he is barely ahead of wannabes (like me) seems awfully damning. Whereas curious types will wade in just about anywhere, and in some cases, amateurs will indulge themselves enthusiastically in endeavors also practiced by experts (sports and music are the two principal examples that spring to mind), the distance (in both knowledge and skill) between experts and laypersons is typically quite far. I suspect those with high intellect and/or genetic gifts often bridge that gap, but then they join the ranks of the experts, so the exception leads nowhere.

I revisit my old blog posts when I see some reader activity in the WordPress backstage, and I was curious to recall a long quote from Iain McGilchrist summarizing arguments put forth by Anthony Giddens in his book Modernity and Self-identity (1991). Giddens had presaged recent cultural developments, namely, the radicalization of nativists, supremacists, Social Justice Warriors (SJWs), and others distorted by absorption in identity politics. So I traipsed off to the Chicago Public Library (CPL) and sought out the book to read. Regrettably, CPL didn’t have a copy, so I settled on a slightly earlier book, The Consequences of Modernity (1990), which is based on a series of lectures delivered at Stanford University in 1988.

Straight away, the introduction provides a passage that goes to the heart of matters with which I’ve been preoccupied:

Today, in the late twentieth century, it is argued by many, we stand at the opening of a new era … which is taking us beyond modernity itself. A dazzling variety of terms has been suggested to refer to this transition, a few of which refer positively to the emergence of a new type of social system (such as the “information society” or the “consumer society”) but most of which suggest rather that a preceding state of affairs is drawing to a close … Some of the debates about these matters concentrate mainly upon institutional transformations, particularly those which propose that we are moving from a system based upon the manufacture of material goods to one concerned more centrally with information. More commonly, however, those controversies are focused largely upon issues of philosophy and epistemology. This is the characteristic outlook, for example, of the author who has been primarily responsible for popularising the notion of post-modernity, Jean-François Lyotard. As he represents it, post-modernity refers to a shift away from attempts to ground epistemology and from faith in humanly engineered progress. The condition of post-modernity is distinguished by an evaporating of the “grand narrative” — the overarching “story line” by means of which we are placed in history as beings having a definite past and a predictable future. The post-modern outlook sees a plurality of heterogeneous claims to knowledge, in which science does not have a privileged place. [pp. 1–2, emphasis added]

That’s a lot to unpack all at once, but the fascinating thing is that notions now manifesting darkly in the marketplace of ideas were already in the air in the late 1980s. Significantly, this was several years still before the Internet brought the so-called Information Highway to computer users, before the cell phone and smart phone were developed, and before social media displaced traditional media (TV was only 30–40 years old but had previously transformed our information environment) as the principal way people gather news. I suspect that Giddens has more recent work that accounts for the catalyzing effect of the digital era (including mobile media) on culture, but for the moment, I’m interested in the book in hand.

Regular readers of this blog (I know of one or two) already know my armchair social criticism directed at our developing epistemological crisis (challenges to authority and expertise, psychotic knowledge, fake news, alternative facts, dissolving reality, and science denial) as well as the Transhumanist fantasy of becoming pure thought (once we evolve beyond our bodies). Until that’s accomplished with imagined technology, we increasingly live in our heads, in the abstract, disoriented and adrift on a bewildering sea of competing narratives. Moreover, I’ve stated repeatedly that highly mutable story (or narrative) underlies human cognition and consciousness, making most of us easy marks for charismatic thought leaders and storytellers. Giddens was there nearly 30 years ago with these same ideas, though his terms differ.

Giddens dispels the idea of post-modernity and insists that, from a sociological perspective, the current period is better described as high modernism. This reminds me of Oswald Spengler and my abandoned book blogging of The Decline of the West. It’s unimportant to me who got it more correct but note that the term Postmodernism has been adopted widely despite its inaccuracy (at least according to Giddens). As I get further into the book, I’ll have plenty more to say.

Here’s a familiar inspirational phrase from The Bible: the truth shall set you free (John 8:32). Indeed, most of us take it as, um, well, gospel that knowledge and understanding are unqualified goods. However, the information age has turned out to be a mixed blessing. Any clear-eyed view of the way the world works and its long, tawdry history carries with it an inevitable awareness of injustice, inequity, suffering, and at the extreme end, some truly horrific episodes of groups victimizing each other. Some of the earliest bits of recorded history, as distinguished from oral history, are financial — keeping count (or keeping accounts). Today differs not so much in character as in the variety of counts being kept and the sophistication of information gathering.

The Bureau of Labor Statistics, a part of the U.S. Department of Labor, is one information clearinghouse that slices and dices available data according to a variety of demographic characteristics. The fundamental truth behind such assessments, regardless of the politics involved, is that when comparisons are made between unlike groups, say, between men and women or young and old, one should expect to find differences and indeed be rather surprised if comparisons revealed none. So the question of gender equality in the workplace, or its implied inverse, gender inequality in the workplace, is a form of begging the question, meaning that if one seeks differences, one shall most certainly find them. But those differences are not prima facie evidence of injustice in the sense of the popular meme that women are disadvantaged or otherwise discriminated against in the workplace. Indeed, the raw data can be interpreted according to any number of agendas, thus the phrase “lying with statistics,” and most of us lack the sophistication to contextualize statistics properly, which is to say, free of the emotional bias that plagues modern politics, and more specifically, identity politics.

The fellow who probably ran hardest up against this difficulty is Charles Murray in the aftermath of publication of his book The Bell Curve (1994), which deals with how intelligence manifests differently across demographic groups yet functions as the primary predictor of social outcomes. Murray is particularly well qualified to interpret data and statistics dispassionately, and in true seek-and-find fashion, differences between groups did appear. It is unclear how much his resulting prescriptions for social programs are born of data vs. ideology, but most of us are completely at sea wading through the issues without specialized academic training to make sense of the evidence.

More recently, another fellow caught in the crosshairs on issues of difference is James Damore, who was fired from his job at Google after writing what is being called an anti-diversity manifesto (but might be better termed an internal memo) that was leaked and then went viral. The document can be found here. I have not dug deeply into the details, but my impression is that Damore attempted a fairly academic unpacking of the issue of gender differences in the workplace as they conflicted with institutional policy only to face a hard-set ideology that is more RightThink than truth. In Damore’s case, the truth did set him free — free from employment. Even the NY Times recognizes that the Thought Police sprang into action yet again to demand that its pet illusions about society be supported rather than dispelled. These witch hunts and shaming rituals (vigilante justice carried out in the court of public opinion) are occurring with remarkable regularity.

In a day and age where so much information (too much information, as it turns out) is available to us to guide our thinking, one might hope for careful, rational analysis and critical thinking. However, trends point to the reverse: a return to tribalism, xenophobia, scapegoating, and victimization. There is also a victimization Olympics at work, with identity groups vying for imaginary medals awarded to whoever’s got it worst. I’m no Pollyanna when it comes to the notion that all men are brothers and, shucks, can’t we all just get along? That’s not our nature. But the marked indifference of the natural world to our suffering as it besets us with drought, fire, floods, earthquakes, tsunamis, hurricanes, tornadoes, and the like (and this was just the last week!) might seem like the perfect opportunity to find within ourselves a little grace and recognize our common struggles in the world rather than add to them.