Archive for October, 2017

Here’s another interesting tidbit from Anthony Giddens’ book The Consequences of Modernity, which is the subject of a series of book blogs I’ve been writing. In his discussion of disembedding mechanisms, he introduces the idea of civil inattention (from Goffman, actually). This has partly to do with presence or absence (including inattention) in both public and private settings where face-to-face contact used to be the only option, but modern technologies have opened up the possibility of faceless interactions over distance, such as with the telegraph and telephone. More recently, the face has been reintroduced with videoconferencing, but nonverbal cues such as body language are largely missing; the fullness of communication remains attenuated. All manner of virtual presence and telepresence are in fact cheap facsimiles of true presence and of the social cohesion and trust enabled by what Giddens calls facework commitments. Of course, we delude ourselves that interconnectivity mediated by electronics is a reasonable substitute for presence and attention, which fellow blogger The South Roane Agrarian bemoans with this post.

Giddens’ meaning is more specific than this, though. The inattention of which Giddens writes is not the casual distraction of others with which we are all increasingly familiar. Rather, Giddens takes note of social behaviors embedded in deep culture having to do with signalling trust.

Two people approach and pass one another on a city sidewalk. What could be more trivial and uninteresting? … Yet something is going on here which links apparently minor aspects of bodily management to some of the most pervasive features of modernity. The “inattention” displayed is not indifference. Rather it is a carefully monitored demonstration of what might be called polite estrangement. As the two people approach one another, each rapidly scans the face of the other, looking away as they pass … The glance accords recognition of the other as an agent and as a potential acquaintance. Holding the gaze of the other only briefly, then looking ahead as each passes the other couples such an attitude with an implicit reassurance of lack of hostile intent. [p. 81]

It’s a remarkably subtle interaction: making eye contact to confirm awareness of another but then averting one’s eyes to establish that copresence poses no particular threat in either direction. Staring too fixedly at another communicates something else entirely, maybe fear or threat or disapprobation. By denying eye contact — by keeping one’s eyes buried in a handheld device, for instance — the opportunity to establish a modicum of trust between strangers is missed. Intent (or lack thereof) remains a mystery. In practice, such modern-day inattention is mere distraction, not a sign of malevolence, but the ingrained social cue is obviated and otherwise banal happenstances become sources of irritation, discomfort, and/or unease, as with someone who doesn’t shake hands or perform other types of greeting properly.

I wrote before about my irritation with others face-planted in their phones. It is not a matter of outright offense but rather a quiet sense of affront at failure to adopt accepted social behaviors (as I once did). Giddens puts it this way:

Tact and rituals of politeness are mutual protective devices, which strangers or acquaintances knowingly use (mostly on the level of practical consciousness) as a kind of implicit social contract. Differential power, particularly where it is very marked, can breach or skew norms …. [pp. 82–83]

That those social behaviors have adapted to omnipresent mobile media, everyone pacified or hypnotized within their individual bubbles, is certainly not a salutary development. It is, however, a clear consequence of modernity.

The scandal surrounding Harvey Weinstein and all the people he harassed, bullied, assaulted, molested, and raped has provided occasion for many who had dealings with him to revisit their experiences and wonder what might have been (or not been) had things gone differently, had they acted otherwise in response to his oafish predations. I judge it’s nearly impossible for those outside the Hollywood scene to understand fully the stakes involved (and thus the distorted psychology), but on the other hand, nearly everyone has experience with power imbalances that enable some to get away with exploiting and victimizing others. And because American culture responds to tragedies like a bunch of rubberneckers, the witch hunt has likely only just begun. There’s a better than average chance that, as with icebergs, the significantly larger portion of the problem lies hidden below the surface, as yet undisclosed. Clamor won’t alter much in the end; the dynamics are too ingrained. Still, expect accusations to fly all over the industry, including victim blaming. My strong suspicion is that some folks dodged (actively or passively) becoming victims and paid a price in terms of career success, whereas others fell prey or simply went along (and then stayed largely silent until now) and got some extra consideration out of it. Either way, it undermines one’s sense of self-worth, messing with one’s head for years afterwards. Sometimes there’s no escaping awful circumstance.

Life is messy, right? We all have episodes from our past that we wish we could undo. Hindsight makes the optimal path far more clear than in the moment. Fortunately, I have no crimes among my regrets, but with certain losses, I certainly wish I had known then what I know now (a logical fallacy). Strange that the news cycle has me revisiting my own critical turning points in sympathy with others undoubtedly doing the same.

As I generalize this thought process, I can’t help but wonder as well what might have been had we not, say, (1) split the atom and immediately weaponized the technology, (2) succumbed to various Red Scares scattered around 20th- and 21st-century calendars but instead developed a progressive society worthy of the promise our institutions once embodied, (3) plunged forward out of avarice and shortsightedness by plundering the Earth, and (4) failed to reverse course once the logical conclusion to our aggregate effects on the biosphere was recognized. No utopia would have arisen had we dodged these bullets, of course, but the affairs of men would have been marginally improved, and we might even have survived the 21st century. Such thinking is purely hypothetical and invites a fatalist like me to wonder whether — given our frailty, weakness, and corruption (the human condition being akin to original sin) — we don’t already inhabit the best of all possible worlds.

Isn’t that a horrible thought? A world full of suffering and hardship, serial rapists and murderers, incompetent and venal political leaders, and torture and genocides is the best we can do? We can’t avoid our own worst tendencies? Over long spans of time, cataclysmic earthquakes, volcanic eruptions, superstorms, and meteor strikes already make life on Earth rather precarious, considering that over 99% of all species that once existed are now gone. On balance, we have some remarkable accomplishments, though often purchased with sizeable trade-offs (e.g., slave labor, patriarchal suppression). Still, into the dustbin of history is where we are headed, sooner rather than later, having enjoyed only a brief moment in the sun.

I saw something a short while back that tweaked my BS meter into the red: the learning pyramid. According to research done by The NTL Institute for Applied Behavioral Science in the 1960s (… behaviorists, ugh) and reported here (among other places), there are widely disparate rates of knowledge retention across different passive and active teaching methods:

[Image: the learning pyramid]

Let me first state something quite obvious: learning and retention (memory) aren’t the same thing. If one seeks sheer retention of information as a proxy for learning, that’s a gross misunderstanding of both cognition and learning. For example, someone who has managed to memorize, let’s say, baseball statistics going back to the 1950s or Bible verses may have accomplished an impressive mental task not at all aligned with normal cognitive function (the leaky bucket analogy discussed at the link above is accurate), but neither example qualifies someone as learned the way most understand the term. Information (especially raw data) is not knowledge, understanding, or wisdom. They’re related, sure, but not the same (blogged about this before here). Increasing levels of organization and possession are required to reach each threshold.

The passive/active (participatory) labels are also misleading. To participate actively, one must have something to contribute, to be in possession of knowledge/skill already. To teach something effectively, one must have greater expertise than one’s students. Undoubtedly, teaching others solidifies one’s understanding and expertise, and further learning is often a byproduct, but one certainly can’t begin learning a new subject area by teaching it. Information (input) needs to come from elsewhere, which understandably has a lower retention rate until it’s been gone over repeatedly and formed the cognitive grooves that represent acquisition and learning. This is also the difference between reception and expression in communications. One’s expressive vocabulary (the words one can actually deploy in speech and writing) is a subset of one’s receptive vocabulary (the words one can understand readily upon hearing or reading). The expressive vocabulary is predicated on prior exposure that imbues less common words with power, specificity, and nuance. While it’s possible to learn new words quickly (in small quantities), it’s not generally possible to skip repetition that enables memorization and learning. Anyone studying vocabulary lists for the SAT/ACT (as opposed to a spelling bee) knows this intuitively.

Lastly, where exactly is most prospective knowledge and skill located, inside the self or outside? One doesn’t simply peel back layers of the self to reveal knowledge. Rather, one goes out into the world and seeks it (or doesn’t, sadly), disturbing it from its natural resting place. The great repositories of knowledge are books and other people (especially those who write books — whoda thunk?). So confronting knowledge, depending on the subject matter, occurs more efficiently one-on-one (an individual reading a book) or in groups (25 or so students in a classroom headed by 1 teacher). The inefficiency of a 1:1 ratio between student and teacher (a/k/a tutoring) is obviously available to those who place a high enough value on learning to hire a tutor. However, that’s not how education (primary through postgraduate) is undertaken in most cases. And just imagine the silliness of gathering a classroom of students just for one person (the teacher) to learn with 90% retention, as the learning pyramid would suggest.

Another modest surprise (to me at least) offered by Anthony Giddens (from The Consequences of Modernity) follows a discussion of reflexivity (what I call recursion when discussing consciousness), which is the dynamic of information and/or knowledge feeding back to influence later behavior and information/knowledge. His handy example is a populace that knows the divorce rate, which has an obvious influence on those about to get married (who may then decide to defer or abjure marriage entirely). The surprise is this:

The discourse of sociology and the concepts, theories, and findings of the other social sciences continually “circulate in and out” of what it is that they are about. In so doing they reflexively restructure their subject matter, which itself has learned to think sociologically … Much that is problematic in the position of the professional sociologist, as the purveyor of expert knowledge about social life, derives from the fact that she or he is at most one step ahead of enlightened lay practitioners of the discipline. [p. 43]

I suppose “enlightened lay practitioners” are not the same as the general public, which I hold in rather low esteem when it comes to knowing (much less understanding) anything of value. Just consider electoral politics. Still, the idea that an expert in an academic field admits he is barely ahead of wannabes (like me) seems awfully damning. Whereas curious types will wade in just about anywhere, and in some cases, amateurs will indulge themselves enthusiastically in endeavors also practiced by experts (sports and music are the two principal examples that spring to mind), the gap (in both knowledge and skill) between experts and laypersons is typically quite wide. I suspect those with high intellect and/or genetic gifts often bridge that gap, but then they join the ranks of the experts, so the exception leads nowhere.

I’m a little gobsmacked that, in the aftermath of someone finally calling out the open secret of the Hollywood casting couch (don’t know, don’t care how this news cycle started) and netting Harvey Weinstein in the process, so many well-known actors have added their “Me, too!” to the growing scandal. Where were all these sheep before now? As with Bill Cosby and Bill Clinton, what good does it do to allow a serial abuser to continue unchallenged until years, decades later a critical mass finally boils over? I have no special knowledge or expertise in this area, so what follows is the equivalent of a thought experiment.

Though the outlines of the power imbalance between a Hollywood executive and an actor seeking a role (or other industry worker seeking employment) are pretty clear, creating a rich opportunity for the possessor of such power to act like a creep or a criminal, the specific details are still a little shrouded — at least in my limited consumption of the scandal press. How much of Weinstein’s behavior veers over the line from poor taste to criminality is a difficult question precisely because lots of pictorial evidence exists showing relatively powerless people playing along. It’s a very old dynamic, and its quasi-transactional nature should be obvious.

In my idealized, principled view, if one has been transgressed, the proper response is not to slink away or hold one’s tongue until enough others are similarly transgressed to spring into action. The powerless are duty bound to assert their own power — the truth — much like a whistleblower feels compelled to disclose corruptions of government and corporate sectors. Admittedly, that’s likely to compound the initial transgression and come at some personal cost, great or small. But for some of us (a small percentage, I reckon), living with ourselves in silent assent presents an even worse option. By way of analogy, if one were molested by a sketchy uncle and said nothing, I can understand just wanting to move on. But if one said nothing yet knew the sketchy uncle had more kids lined up in the extended family to transgress, then stepping up to protect the younger and weaker would be an absolute must.

In the past few decades, clergy of the Catholic Church sexually abused many young people and deployed an institutional conspiracy to hide the behaviors and protect the transgressors. Exposure should have broken trust bonds between the church and the faithful and invalidated the institution as an abject failure. Didn’t quite work out that way. Similar scandals and corruption across a huge swath of institutions (e.g., corporate, governmental, military, educational, entertainment, and sports entities) have been appearing in public view regularly, yet as a culture, we tolerate more creeps and criminals than we shame or prosecute. (TomDispatch.com is one of the sites that regularly reports these corruptions with respect to American empire; I can scarcely bear to read it sometimes.) I suspect part of that is a legitimate desire for continuity, to avoid burning down the house with everyone in it. That places just about everyone squarely within the “Me, too!” collective. Maybe I shouldn’t be so gobsmacked after all.

Caveat: This thought experiment definitely comes from a male perspective. I recognize that females view these issues quite differently, typically in consideration of far greater vulnerability than males experience (excepting the young boys in the Catholic Church example).

Reading further into Anthony Giddens’ book The Consequences of Modernity, I got a fuller (though still incomplete) sense of what is meant by his terms disembedding mechanisms, expert systems, and symbolic tokens, all of which disrupt time and space as formerly understood in traditional societies that enjoyed the benefit of centuries of continuity. I’ve been aware of analyses regarding, for instance, the sociology of money and the widespread effects of the introduction and adoption of mechanical clocks and timepieces. While most understand these developments superficially as unalloyed progress, Giddens argues that they do in fact reorder our experience in the world away from an organic, immediate orientation toward an intellectualized adherence to distant, abstract, self-reinforcing (reflexive) mechanisms.

But those matters are not really what this blog post is about. Rather, this passage sparked my interest:

… when the claims of reason replaced those of tradition, they appeared to offer a sense of certitude greater than that provided by preexisting dogma. But this idea only appears persuasive so long as we do not see that the reflexivity of modernity actually subverts reason, at any rate where reason is understood as the gaining of certain knowledge … We are abroad in a world which is thoroughly constituted through reflexively applied knowledge, but where at the same time we can never be sure that any given element of that knowledge will not be revised. [p. 39]

Put another way, science and reason are axiomatically open to examination, challenge, and revision and often undergo disruptive change. That’s what is meant by Karl Popper’s phrase “all science rests upon shifting sand” and informs the central thesis of Thomas Kuhn’s well-known book The Structure of Scientific Revolutions. It’s not the narrow details that shift so much (hard sciences lead pretty reliably to applied engineering) as the overarching narrative, e.g., the story of the Earth, the cosmos, and ourselves as revealed through scientific inquiry and close examination. Historically, the absolute certainty of the medieval church, while not especially accurate in either details or narrative, yielded considerable authority to post-Enlightenment science and reason, which themselves continue to shift periodically.

Some of those paradigm shifts are so boggling and beyond the ken of the average thinker (including many college-educated folks) that our epistemology is now in crisis. Even the hard facts — like the age and shape of the Earth or its orbital relationship to other solar bodies — are hotly contested by some and blithely misunderstood by others. One doesn’t have to get bogged down in the vagaries of relativity, nuclear power and weapons, or quantum theory to lose the thread of what it means to live in the 21st century. Softer sciences such as psychology, anthropology, economics, and even history now deliver new discoveries and (re-)interpretations of facts so rapidly, like the dizzying pace of technological change, that philosophical systems are unmoored and struggling for legitimacy. For instance, earlier this year, a human fossil was found in Morocco that upended our previous knowledge of human evolution (redating the first appearance of biologically modern humans about 100,000 years earlier). More popularly, dieticians still disagree on what sorts of foods are healthy for most of us (though we can probably all agree that excess sugar is bad). Other recent developments include the misguided insistence among some neurobiologists and theorists that consciousness, free will, and the self do not exist (I’ll have a new post regarding that topic as time allows) and outright attacks on religion not just for being in error but for being the source of evil.

I have a hard time imagining other developments in 21st-century intellectual thought that would shake the foundations of our cosmology any more furiously than what we’re now experiencing. Even the dawning realization that we’ve essentially killed ourselves (with delayed effect) by gradually though consistently laying waste to our own habitat is more of an “oops” than the mind-blowing moment of waking up from The Matrix to discover the unreality of everything once believed. Of course, for fervent believers especially, the true facts (best as we can know them, since knowledge is forever provisional) are largely irrelevant in light of desire (what one wants to believe), and that’s true for people on both sides of the schism between church and science/reason.

As Shakespeare wrote in Hamlet, “There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy.” So it’s probably wrong to introduce a false dualism, though it has plenty of historical precedent. I’ll suggest instead that there are more facets and worldviews at play in the world than the two that have been warring in the West for the last 600 years.