What is more tantalizing and enticing than a secret? OK, probably sex appeal, but never mind that for now. Secrets confer status on the keeper and bring those on whom the secret is bestowed into an intimate (nonsexual, for you dirty thinkers) relationship with the secret sharer. I remember the sense of relief and quiet exhilaration when the Santa Claus story was finally admitted by my parents to be untrue. I had already ceased to really believe in it/him but wasn’t yet secure enough as a 6- or 7-year-old (or whenever it was) to assert it without my parents’ confirmation. And it was a secret I withheld from my younger siblings, perhaps my first instruction on when lying was acceptable, even looked upon approvingly. Similarly, I remember how it felt to be told about sex for the first time by older kids (now you can go there, you cretins) and thus realize that my parents (and everyone else’s) had done the dirty — multiple times even for families with more than one kid. I was the possessor of secret knowledge, and everyone figured out quickly that it was best to be discreet about it. It may have been the first open secret. Powerful stuff, as we were to learn later in our hormone-addled adolescence. In early adulthood, I also began to assert my atheism, which isn’t really a secret but still took time to root fully. From my mature perspective, others who believe in one sky-god or another look like the kids who at a tender age still believe in Santa Claus and the Easter Bunny. I don’t go out of my way to dispel anyone’s faith.

Even as adults, those of us who enjoy secret knowledge feel a bit of exhilaration. We know what goes on (a little or a lot) behind the scenes, behind the curtain, in the backrooms and dark places. It may also mean that we know how the proverbial sausage is made, which is far less special. National security clearance, operating at many levels of access, may be the most obvious example, or maybe it’s just being a bug on the wall in the dugout or locker room during a pro sports contest. Being within the circle of intimates is intoxicating, though the circumstances that get one into the circle may be rather mundane, and those on the outside may look oddly pathetic.

The psychology behind secret knowledge functions prominently with conspiracy theories. Whether the subject is political assassinations, Bigfoot or the Loch Ness Monster, the moon landings, Area 51 and alien abduction, chemtrails/contrails, or 9/11, one’s personal belief and pet theory inescapably confer special status, especially as unacknowledged or unaccepted truth. Often, as others seek to set the record straight, one digs in to defend cherished beliefs. It’s an elixir, a dangerous cycle that traps people in contrafactual cliques. So we have flat Earthers, birthers, 9/11 truthers, creationists, climate change deniers, etc. (I count myself among one of those groups, BTW. Figure it out for yourself.) The range of interpretations floated in the political realm with respect to the machinations of the two parties and the White House boggles my mind with possibilities. However, I’m squarely outside those circles and feel no compulsion to decide what I believe when someone asserts secret knowledge from inside the circle. I float comfortably above the fray. Similarly, with so much fake news pressing for my attention, I consciously hold quite a lot of it in abeyance until time sorts it out for me.

Here’s the last interesting bit I am lifting from Anthony Giddens’ The Consequences of Modernity. Then I will be done with this particular book-blogging project. As part of Giddens’ discussion of the risk profile of modernity, he characterizes risk as either objective or perceived and further divides it into seven categories:

  1. globalization of risk (intensity)
  2. globalization of risk (frequency)
  3. environmental risk
  4. institutionalized risk
  5. knowledge gaps and uncertainty
  6. collective or shared risk
  7. limitations of expertise

Some overlap exists, and I will not distinguish them further. The first two are of primary significance today for obvious reasons. Although the specter of doomsday resulting from a nuclear exchange has been present since the 1950s, Giddens (writing in 1988) provides this snapshot of today’s issues:

The sheer number of serious risks in respect of socialised nature is quite daunting: radiation from major accidents at nuclear power-stations or from nuclear waste; chemical pollution of the seas sufficient to destroy the phytoplankton that renews much of the oxygen in the atmosphere; a “greenhouse effect” deriving from atmospheric pollutants which attack the ozone layer, melting part of the ice caps and flooding vast areas; the destruction of large areas of rain forest which are a basic source of renewable oxygen; and the exhaustion of millions of acres of topsoil as a result of widespread use of artificial fertilisers. [p. 127]

As I often point out, these dangers were known 30–40 years ago (in truth, much longer), but they have only worsened with time through political inaction and/or social inertia. After I began to investigate and better understand the issues roughly a decade ago, I came to the conclusion that the window of opportunity to address these risks and their delayed effects had already closed. In short, we’re doomed and living on borrowed time as the inevitable consequences of our actions slowly but steadily manifest in the world.

So here’s the really interesting part. The modern worldview bestows confidence borne out of expanding mastery of the built environment, where risk is managed and reduced through expert systems. Mechanical and engineering knowledge figure prominently and support a cause-and-effect mentality that has grown ubiquitous in the computing era, with its push-button inputs and outputs. However, the high modern outlook is marred by overconfidence in our competence to avoid disaster, often of our own making. Consider the abject failure of 20th-century institutions to handle geopolitical conflict without devolving into world war and multiple genocides. Or witness periodic crashes of financial markets, two major nuclear accidents, and numerous space shuttles and rockets destroyed. Though all entail risk, high-profile failures showcase our overconfidence. Right now, engineers (software and hardware) are confident they can deliver safe self-driving vehicles yet are blithely ignoring (says me, maybe not) major ethical dilemmas regarding liability and technological unemployment. Those are apparently problems for someone else to solve.

Since the start of the Industrial Revolution, we’ve barrelled headlong into one sort of risk after another, some recognized at the time, others only apparent after the fact. Nuclear weapons are the best example, but many others exist. The one I raise frequently is the live social experiment undertaken with each new communications technology (radio, cinema, telephone, television, computer, social networks) that upsets and destabilizes social dynamics. The current ruckus fomented by the radical left (especially in the academy but now infecting other environments) regarding silencing of free speech (thus, thought policing) is arguably one concomitant.

According to Giddens, the character of modern risk contrasts with that of the premodern. The scale of risk prior to the 17th century was contained, and expectation of social continuity was strong. Risk was also transmuted through magical thinking (superstition, religion, ignorance, wishfulness) into providential fortuna or mere bad luck, which led to feelings of relative security rather than despair. Modern risk has now grown so widespread, consequential, and soul-destroying, situated at considerable remove and leading to feelings of helplessness and hopelessness, that those not numbed by the litany of potential worries afflicting daily life (existential angst or ontological insecurity) often develop depression and other psychological compulsions and disturbances. Most of us, if aware of globalized risk, set it aside so that we can function and move forward in life. Giddens says that this conjures up anew a sense of fortuna, that our fate is no longer within our control. This

relieves the individual of the burden of engagement with an existential situation which might otherwise be chronically disturbing. Fate, a feeling that things will take their own course anyway, thus reappears at the core of a world which is supposedly taking rational control of its own affairs. Moreover, this surely exacts a price on the level of the unconscious, since it essentially presumes the repression of anxiety. The sense of dread which is the antithesis of basic trust is likely to infuse unconscious sentiments about the uncertainties faced by humanity as a whole. [p. 133]

In effect, the nature of risk has come full circle (completed a revolution, thus, revolutionized risk) from fate to confidence in expert control and back to fate. Of course, a flexibility of perspective is typical as the situation demands — it’s not all or nothing — but the overarching character is clear. Giddens also provides this quote by Susan Sontag that captures what he calls the low-probability, high-consequence character of modern risk:

A permanent modern scenario: apocalypse looms — and it doesn’t occur. And still it looms … Apocalypse is now a long-running serial: not ‘Apocalypse Now,’ but ‘Apocalypse from now on.’ [p. 134]

The witch hunt aimed at sexual predators continues to amaze as it crashes the lives of more and more people. I knew once the floodgates were opened that many of the high and mighty would be brought low. It was probably overdue, but no one can be truly surprised by the goings on giving rise to this purge. Interestingly, the media have gone into the archives and found ample evidence of jokes, hush money, accusations, and lawsuits to demonstrate that this particular open secret was a well-known pattern. Some have offered the simplest of explanations: power corrupts (another open secret). No one really wants to hear that time-honored truth or admit that they, too, are entirely corruptible.

One of the accused has openly admitted that the accusations against him are true, which is almost a breath of fresh air amid all the denials and obfuscations but for the subject matter of the admission. And because it’s a witch hunt, those accused are vulnerable to the mob demanding immediate public shaming and then piling on. No investigation or legal proceeding is necessary (though that may be coming, too). The court of public opinion effects immediate destruction of life and livelihood. Frankly, it’s hard to be sympathetic toward the accused, but I cling to noble sentiment when it comes to application of the law. We should tread lightly to avoid the smears of false accusation and not be swept into moral panic.

Ran Prieur weighed in with this paragraph (no link to his blog, sorry; it’s quite searchable until it gets pushed down and off the page):

I like Louis CK’s apology because he understands that the core issue is power … We imagine these people are bad because they crossed the line between consent and coercion. But when almost the entire world is under authoritarian culture, where it’s normal for some people to tell other people what to do, where it’s normal for us to do what we’re told even if we don’t feel like it, then the line between consent and coercion is crossed so often that it basically doesn’t exist.

Once a culture has crossed the line into normalization of hierarchy, it’s a constant temptation to cross the next line, between using a position of power for the good of the whole, and using it selfishly. And once that line has been crossed, it’s tempting for selfish use of power to veer into sex acts.

I like to think, in a few thousand years, human culture will be so much improved that one person having any power over another will be a scandal.

It’s a slightly fuller explanation of the power dynamic, just as Louis CK offered his own explanation. The big difference is that no one wants to hear it from an admitted sexual predator. Thus, Louis CK is over. Similarly, no one can watch The Cosby Show in innocence anymore. Remains to be seen if any of the fallen will ever rise to career prominence again. Yet Prieur’s final statement confounds me completely. He gets the power dynamic but then plainly doesn’t get it at all. Power and authority are not optional in human society. Except for a few rare, isolated instances of radical egalitarianism, they are entirely consistent with human nature. While we might struggle to diminish the more awful manifestations, so long as there are societies, there will be power imbalances and the exploitation and predation (sexual and otherwise) that have been with us since our prehistory.

Remember: we’re mammals, meaning we compete with each other for sexual access. Moreover, we can be triggered easily enough, not unlike dogs responding when a bitch goes into heat. Sure, humans have executive mental function that allows us to overcome animal impulses some of the time, but that’s not a reliable antidote to sexual misconduct ranging from clumsy come-ons to forcible rape. This is not to excuse anyone who acts up. Rather, it’s a reminder that we all have to figure out how to maneuver in the world effectively, which frankly includes protecting ourselves from predators. The young, sexually naïve, and powerless will always be prime targets. Maybe we’re not quite barbarians anymore, raping and pillaging with wanton disregard for our victims, but neither are we far removed from that characterization, as recent accounts demonstrate.

Commentary on the previous post poses a challenging question: having perceived that civilization is set on a collision course with reality, what is being done to address that existential problem? More pointedly, what are you doing? Most rubes seem to believe that we can technofix the problem, alter course and set off in a better, even utopian direction filled with electronic gadgetry (e.g., the Internet of things), death-defying medical technologies (as though that goal were even remotely desirable), and an endless supply of entertainments and ephemera curated by media shilling happy visions of the future (in high contrast with actual deprivation and suffering). Realists may appreciate that our charted course can’t be altered anymore considering the size and inertia of the leviathan industrial civilization has become. Figuratively, we’re aboard the RMS Titanic, full steam ahead, killer iceberg(s) looming in the darkness. The only option is to see our current path through to its conclusion. Maybe there’s a middle ground between those extremes, where a hard reset foils our fantasies but at least allows (some of) us to continue living on the surface of Planet Earth.

Problem is, the gargantuan, soul-destroying realization of near-term extinction has the potential to radicalize even well-balanced people, and the question “what are you doing?” is tantamount to an accusation that you’re not doing enough because, after all, nothing will ever be enough. We’ve been taught repeatedly to eat right, brush our teeth, get some exercise, and be humble. Yet those simple requisites for a happy, healthy life are frequently ignored. How likely is it that we will then heed the dire message that everything we know will soon be swept away?

The mythological character Cassandra, who prophesied doom, was cursed to never be believed, as was Chicken Little. The fabulous Boy Who Cried Wolf (from Aesop’s Fables) was cursed with bad timing. Sandwich-board prophets, typically hirsute Jesus freaks with some version of the message “Doom is nigh!” inscribed on the boards, are a cliché almost always now understood as set-ups for some sort of joke.

It’s an especially sick joke when the unheeded message proves to be true. If one is truly radicalized, then self-immolation on the sidewalk in front of the White House may be one measure of commitment, but the irony is that no one takes such behavior seriously except as an indication of how unhinged the prophet of doom has gotten (suggesting a different sort of commitment). Yet that’s where we’ve arrived in the 21st century. Left/right, blue/red factions have abandoned the centrist middle ground and moved conspicuously toward the radical fringes in what’s being called extreme social fragmentation. On some analyses, the rising blood tide of terrorists and mass murderers is an example of an inchoate protest against the very nature of existence, a complete ontological rejection. When the ostensible purpose of, say, the Las Vegas shooter was to take out as many people as possible, rejecting other potential sites as not promising enough for high body counts, it may not register in the public mind as a cry in the wilderness, an extreme statement that modern life is no longer worth living, but the action speaks for itself even in the absence of a formal manifesto articulating a collapsed philosophy.

In such a light, the sandwich-board prophet, by eschewing violence and hysteria, may actually be performing a modest ministerial service. Wake up and recognize that all living things must eventually die, that our time is short. Cherish what you have, be among those you love and who love you, and brace yourself.

rant on/

Four years ago, the Daily Mail published an article with the scary title “HALF the world’s wild animals have disappeared in 40 years” [all caps in original just to grab your eyeballs]. This came as no surprise to anyone who’s been paying attention. I blogged on this very topic in my review of Vaclav Smil’s book Harvesting the Biosphere, which observed at the end a 50% decrease in wild mammal populations in the last hundred years. The estimated numbers vary according to which animal population and what time frame are under consideration. For instance, in 2003, CNN reported that only 10% of big ocean fish remain compared to 47 years prior. Predictions indicate that the oceans could be without any fish by midcentury. All this is old news, but it’s difficult to tell what we humans are doing about it other than worsening already horrific trends. The latest disappearing act is flying insects, whose numbers have decreased by 75% in the last 25 years according to this article in The Guardian. The article says, um, scientists are shocked. I don’t know why; these articles and indicators of impending ecological collapse have been appearing regularly for decades. Similar Malthusian prophecies are far older. Remember colony collapse disorder? Are they surprised it’s happening now, as opposed to the end of the 21st century, safely after nearly everyone now alive is long dead? C’mon, pay attention!

Just a couple days ago, the World Meteorological Organization issued a press release indicating that greenhouse gases have surged to a new post-ice age record. Says the press release rather dryly, “The abrupt changes in the atmosphere witnessed in the past 70 years are without precedent.” You don’t say. Even more astoundingly, the popular online news site Engadget had this idiotic headline: “Scientists can’t explain a ‘worrying’ rise in methane levels” (sourcing Professor Euan Nisbet of Royal Holloway University of London). Um, what’s to explain? We’ve been burning the shit out of planetary resources, temperatures are rising, and methane formerly sequestered in frozen tundra and below polar sea floors is seeping out. As I said, old news. How far up his or her ass has any reputable scientist’s head got to be to make such an outta-touch pronouncement? My answer to my own question: suffocation. Engadget made up that dude just for the quote, right? Nope.

Not to draw too direct a connection between these two issues (wildlife disappearances and greenhouse gases — hey, I said pay attention!) because, ya know, reckless conjecture and unproven conclusions (the future hasn’t happened yet, duh, it’s the future, forever telescoping away from us), but a changing ecosystem means evolutionary niches that used to support nature’s profundity are no longer doing so reliably. Plus, we just plain ate a large percentage of the animals or drove them to extinction, fully or nearly (for now). As these articles routinely and tenderly suggest, trends are “worrying” for humans. After all, how are we gonna put seafood on our plates when all the fish have been displaced by plastic?

rant off/

Here’s another interesting tidbit from Anthony Giddens’ book The Consequences of Modernity, which is the subject of a series of book blogs I’ve been writing. In his discussion of disembedding mechanisms, he introduces the idea of civil inattention (from Goffman, actually). This has partly to do with presence or absence (including inattention) in both public and private settings where face-to-face contact used to be the only option but modern technologies have opened up the possibility of faceless interactions over distance, such as with the telegraph and telephone. More recently, the face has been reintroduced with videoconferencing, but nonverbal cues such as body language are largely missing; the fullness of communication remains attenuated. All manner of virtual or telepresence are in fact cheap facsimiles of true presence and the social cohesion and trust enabled by what Giddens calls facework commitments. Of course, we delude ourselves that interconnectivity mediated by electronics is a reasonable substitute for presence and attention, which fellow blogger The South Roane Agrarian bemoans with this post.

Giddens’ meaning is more specific than this, though. The inattention of which Giddens writes is not the casual distraction of others with which we are all increasingly familiar. Rather, Giddens takes note of social behaviors embedded in deep culture having to do with signalling trust.

Two people approach and pass one another on a city sidewalk. What could be more trivial and uninteresting? … Yet something is going on here which links apparently minor aspects of bodily management to some of the most pervasive features of modernity. The “inattention” displayed is not indifference. Rather it is a carefully monitored demonstration of what might be called polite estrangement. As the two people approach one another, each rapidly scans the face of the other, looking away as they pass … The glance accords recognition of the other as an agent and as a potential acquaintance. Holding the gaze of the other only briefly, then looking ahead as each passes the other couples such an attitude with an implicit reassurance of lack of hostile intent. [p. 81]

It’s a remarkably subtle interaction: making eye contact to confirm awareness of another but then averting one’s eyes to establish that copresence poses no particular threat in either direction. Staring too fixedly at another communicates something quite else, maybe fear or threat or disapprobation. By denying eye contact — by keeping one’s eyes buried in a handheld device, for instance — the opportunity to establish a modicum of trust between strangers is missed. Intent (or lack thereof) is a mystery. In practice, such modern-day inattention is mere distraction, not a sign of malevolence, but the ingrained social cue is obviated and otherwise banal happenstances become sources of irritation, discomfort, and/or unease, as with someone who doesn’t shake hands or perform other types of greeting properly.

I wrote before about my irritation with others face-planted in their phones. It is not a matter of outright offense but rather a quiet sense of affront at failure to adopt accepted social behaviors (as I once did). Giddens puts it this way:

Tact and rituals of politeness are mutual protective devices, which strangers or acquaintances knowingly use (mostly on the level of practical consciousness) as a kind of implicit social contact. Differential power, particularly where it is very marked, can breach or skew norms …. [pp. 82–83]

That those social behaviors have adapted to omnipresent mobile media, everyone pacified or hypnotized within their individual bubbles, is certainly not a salutary development. It is, however, a clear consequence of modernity.

The scandal surrounding Harvey Weinstein and all the people he harassed, bullied, assaulted, molested, and raped has provided occasion for many who had dealings with him to revisit their experiences and wonder what might have been (or not been) had things gone differently, had they acted otherwise in response to his oafish predations. I judge it’s nearly impossible for those outside the Hollywood scene to understand fully the stakes involved (and thus the distorted psychology), but on the other hand, nearly everyone has experience with power imbalances that enable some to get away with exploiting and victimizing others. And because American culture responds to tragedies like a bunch of rubberneckers, the witch hunt has likely only just begun. There’s a better than average chance that, as with icebergs, the significantly larger portion of the problem lies hidden below the surface, as yet undisclosed. Clamor won’t alter much in the end; the dynamics are too ingrained. Still, expect accusations to fly all over the industry, including victim blaming. My strong suspicion is that some folks dodged (actively or passively) becoming victims and paid a price in terms of career success, whereas others fell prey or simply went along (and then stayed largely silent until now) and got some extra consideration out of it. Either way, it undermines one’s sense of self-worth, messing with one’s head for years afterwards. Sometimes there’s no escaping awful circumstance.

Life is messy, right? We all have episodes from our past that we wish we could undo. Hindsight makes the optimal path far more clear than in the moment. Fortunately, I have no crimes among my regrets, but with certain losses, I certainly wish I had known then what I know now (a logical fallacy). Strange that the news cycle has me revisiting my own critical turning points in sympathy with others undoubtedly doing the same.

As I generalize this thought process, I can’t help but to wonder as well what might have been had we not, say, (1) split the atom and immediately weaponized the technology, (2) succumbed to various Red Scares scattered around 20th- and 21st-century calendars but instead developed a progressive society worthy of the promise our institutions once embodied, (3) plunged forward out of avarice and shortsightedness by plundering the Earth, and (4) failed to reverse course once the logical conclusion to our aggregate effects on the biosphere was recognized. No utopia would have arisen had we dodged these bullets, of course, but the affairs of men would have been marginally improved, and we might even have survived the 21st century. Such thinking is purely hypothetical and invites a fatalist like me to wonder whether — given our frailty, weakness, and corruption (the human condition being akin to original sin) — we don’t already inhabit the best of all possible worlds.

Isn’t that a horrible thought? A world full of suffering and hardship, serial rapists and murderers, incompetent and venal political leaders, and torture and genocides is the best we can do? We can’t avoid our own worst tendencies? Over long spans of time, cataclysmic earthquakes, volcanic eruptions, superstorms, and meteor strikes already make life on Earth rather precarious, considering that over 99% of all species that once existed are now gone. On balance, we have some remarkable accomplishments, though often purchased with sizeable trade-offs (e.g., slave labor, patriarchal suppression). Still, into the dustbin of history is where we are headed rather sooner than later, having enjoyed only a brief moment in the sun.

I saw something a short while back that tweaked my BS meter into the red: the learning pyramid. According to research done by The NTL Institute for Applied Behavioral Science in the 1960s (… behaviorists, ugh) and reported here (among other places), there are widely disparate rates of knowledge retention across different passive and active teaching methods:

[image: the learning pyramid, showing purported retention rates for various passive and active teaching methods]

Let me state first something quite obvious: learning and retention (memory) aren’t the same things. If one seeks sheer retention of information as a proxy for learning, that’s a gross misunderstanding of both cognition and learning. For example, someone who has managed to memorize, let’s say, baseball statistics going back to the 1950s or Bible verses, may have accomplished an impressive mental task not at all aligned with normal cognitive function (the leaky bucket analogy is accurate), but neither example qualifies someone as learned the way most understand the term. Information (especially raw data) is neither knowledge, understanding, nor wisdom. They’re related, sure, but not the same (blogged about this before here). Increasing levels of organization and possession are required to reach each threshold.

The passive/active (participatory) labels are also misleading. To participate actively, one must have something to contribute, to be in possession of knowledge/skill already. To teach something effectively, one must have greater expertise than one’s students. Undoubtedly, teaching others solidifies one’s understanding and expertise, and further learning is often a byproduct, but one certainly can’t begin learning a new subject area by teaching it. Information (input) needs to come from elsewhere, which understandably has a lower retention rate until it’s been gone over repeatedly and formed the cognitive grooves that represent acquisition and learning. This is also the difference between reception and expression in communications. One’s expressive vocabulary (the words one can actually deploy in speech and writing) is a subset of one’s receptive vocabulary (the words one can understand readily upon hearing or reading). The expressive vocabulary is predicated on prior exposure that imbues less common words with power, specificity, and nuance. While it’s possible to learn new words quickly (in small quantities), it’s not generally possible to skip repetition that enables memorization and learning. Anyone studying vocabulary lists for the SAT/ACT (as opposed to a spelling bee) knows this intuitively.

Lastly, where exactly is most prospective knowledge and skill located, inside the self or outside? One doesn’t simply peel back layers of the self to reveal knowledge. Rather, one goes out into the world and seeks it (or doesn’t, sadly), disturbing it from its natural resting place. The great repositories of knowledge are books and other people (especially those who write books — whoda thunk?). So confronting knowledge, depending on the subject matter, occurs more efficiently one-on-one (an individual reading a book) or in groups (25 or so students in a classroom headed by 1 teacher). The inefficiency of a 1:1 ratio between student and teacher (a/k/a tutoring) is obviously available to those who place a high enough value on learning to hire a tutor. However, that’s not how education (primary through postgraduate) is undertaken in most cases. And just imagine the silliness of gathering a classroom of students to teach just for one person to learn with 90% retention, as the learning pyramid would suggest.

Another modest surprise (to me at least) offered by Anthony Giddens (from The Consequences of Modernity) follows a discussion of reflexivity (what I call recursion when discussing consciousness), which is the dynamic of information and/or knowledge feeding back to influence later behavior and information/knowledge. His handy example is the populace knowing divorce rates, which has an obvious influence on those about to get married (who may then decide to defer or abjure marriage entirely). The surprise is this:

The discourse of sociology and the concepts, theories, and findings of the other social sciences continually “circulate in and out” of what it is that they are about. In so doing they reflexively restructure their subject matter, which itself has learned to think sociologically … Much that is problematic in the position of the professional sociologist, as the purveyor of expert knowledge about social life, derives from the fact that she or he is at most one step ahead of enlightened lay practitioners of the discipline. [p. 43]

I suppose “enlightened lay practitioners” are not the same as the general public, which I hold in rather low esteem, doubting it knows (much less understands) anything of value. Just consider electoral politics. Still, the idea that an expert in an academic field admits he is barely ahead of wannabes (like me) seems awfully damning. Whereas curious types will wade in just about anywhere, and in some cases, amateurs will indulge themselves enthusiastically in endeavors also practiced by experts (sports and music are the two principal examples that spring to mind), the gap (in both knowledge and skill) between experts and laypersons is typically quite wide. I suspect those with high intellect and/or genetic gifts often bridge that gap, but then they join the ranks of the experts, so the exception leads nowhere.

I’m a little gobsmacked that, in the aftermath of someone finally calling out the open secret of the Hollywood casting couch (don’t know, don’t care how this news cycle started) and netting Harvey Weinstein in the process, so many well-known actors have added their “Me, too!” to the growing scandal. Where were all these sheep before now? As with Bill Cosby and Bill Clinton, what good does it do to allow a serial abuser to continue unchallenged until, years or decades later, a critical mass finally boils over? I have no special knowledge or expertise in this area, so what follows is the equivalent of a thought experiment.

Though the outlines of the power imbalance between a Hollywood executive and an actor seeking a role (or other industry worker seeking employment) are pretty clear, creating a rich opportunity for the possessor of such power to act like a creep or a criminal, the specific details remain a little shrouded — at least in my limited consumption of the scandal press. How much of Weinstein’s behavior veers over the line from poor taste to criminality is a difficult question precisely because lots of pictorial evidence exists showing relatively powerless people playing along. It’s a very old dynamic, and its quasi-transactional nature should be obvious.

In my idealized, principled view, if one has been transgressed, the proper response is not to slink away or hold one’s tongue until enough others are similarly transgressed to spring into action. The powerless are duty bound to assert their own power — the truth — much like a whistleblower feels compelled to disclose corruptions of government and corporate sectors. Admittedly, that’s likely to compound the initial transgression and come at some personal cost, great or small. But for some of us (a small percentage, I reckon), living with ourselves in silent assent presents an even worse option. By way of analogy, if one were molested by a sketchy uncle and said nothing, I can understand just wanting to move on. But if one said nothing yet knew the sketchy uncle had more kids lined up in the extended family to transgress, then stepping up to protect the younger and weaker would be an absolute must.

In the past few decades, clergy of the Catholic Church sexually abused many young people and deployed an institutional conspiracy to hide the behaviors and protect the transgressors. Exposure should have broken trust bonds between the church and the faithful and invalidated the institution as an abject failure. Didn’t quite work out that way. Similar scandals and corruption across a huge swath of institutions (e.g., corporate, governmental, military, educational, entertainment, and sports entities) have been appearing in public view regularly, yet as a culture, we tolerate more creeps and criminals than we shame or prosecute. (TomDispatch.com is one of the sites that regularly reports these corruptions with respect to American empire; I can scarcely bear to read it sometimes.) I suspect part of that is a legitimate desire for continuity, to avoid burning down the house with everyone in it. That places just about everyone squarely within the “Me, too!” collective. Maybe I shouldn’t be so gobsmacked after all.

Caveat: This thought experiment definitely comes from a male perspective. I recognize that females view these issues quite differently, typically in consideration of far greater vulnerability than males experience (excepting the young boys in the Catholic Church example).