Archive for the ‘Culture’ Category

Another modest surprise (to me at least) offered by Anthony Giddens (from The Consequences of Modernity) follows a discussion of reflexivity (what I call recursion when discussing consciousness), which is the dynamic of information and/or knowledge feeding back to influence later behavior and information/knowledge. His handy example is the populace knowing divorce rates, which has an obvious influence on those about to get married (who may decide to defer or abjure marriage entirely). The surprise is this:

The discourse of sociology and the concepts, theories, and findings of the other social sciences continually “circulate in and out” of what it is that they are about. In so doing they reflexively restructure their subject matter, which itself has learned to think sociologically … Much that is problematic in the position of the professional sociologist, as the purveyor of expert knowledge about social life, derives from the fact that she or he is at most one step ahead of enlightened lay practitioners of the discipline. [p. 43]

I suppose “enlightened lay practitioners” are not the same as the general public, which I hold in rather low esteem as knowing (much less understanding) anything of value. Just consider electoral politics. Still, the idea that an expert in an academic field admits he is barely ahead of wannabes (like me) seems awfully damning. Whereas curious types will wade in just about anywhere, and in some cases, amateurs will indulge themselves enthusiastically in endeavors also practiced by experts (sports and music are the two principal examples that spring to mind), the distance (in both knowledge and skill) between experts and laypersons is typically quite wide. I suspect those with high intellect and/or genetic gifts often bridge that gap, but then they join the ranks of the experts, so the exception leads nowhere.

I’m a little gobsmacked that, in the aftermath of someone finally calling out the open secret of the Hollywood casting couch (don’t know, don’t care how this news cycle started) and netting Harvey Weinstein in the process, so many well-known actors have added their “Me, too!” to the growing scandal. Where were all these sheep before now? As with Bill Cosby and Bill Clinton, what good does it do to allow a serial abuser to continue unchallenged until years, decades later a critical mass finally boils over? I have no special knowledge or expertise in this area, so what follows is the equivalent of a thought experiment.

Though the outlines of the power imbalance between a Hollywood executive and an actor seeking a role (or other industry worker seeking employment) are pretty clear, creating a rich opportunity for the possessor of such power to act like a creep or a criminal, the specific details are still a little shrouded — at least in my limited consumption of the scandal press. How much of Weinstein’s behavior veers over the line from poor taste to criminality is a difficult question precisely because lots of pictorial evidence exists showing relatively powerless people playing along. It’s a very old dynamic, and its quasi-transactional nature should be obvious.

In my idealized, principled view, if one has been transgressed, the proper response is not to slink away or hold one’s tongue until enough others are similarly transgressed to spring into action. The powerless are duty bound to assert their own power — the truth — much like a whistleblower feels compelled to disclose corruptions of government and corporate sectors. Admittedly, that’s likely to compound the initial transgression and come at some personal cost, great or small. But for some of us (a small percentage, I reckon), living with ourselves in silent assent presents an even worse option. By way of analogy, if one were molested by a sketchy uncle and said nothing, I can understand just wanting to move on. But if one said nothing yet knew the sketchy uncle had more kids lined up in the extended family to transgress, then stepping up to protect the younger and weaker would be an absolute must.

In the past few decades, clergy of the Catholic Church sexually abused many young people and deployed an institutional conspiracy to hide the behaviors and protect the transgressors. Exposure should have broken trust bonds between the church and the faithful and invalidated the institution as an abject failure. Didn’t quite work out that way. Similar scandals and corruption across a huge swath of institutions (e.g., corporate, governmental, military, educational, entertainment, and sports entities) have been appearing in public view regularly, yet as a culture, we tolerate more creeps and criminals than we shame or prosecute. (TomDispatch.com is one of the sites that regularly reports these corruptions with respect to American empire; I can scarcely bear to read it sometimes.) I suspect part of that is a legitimate desire for continuity, to avoid burning down the house with everyone in it. That places just about everyone squarely within the “Me, too!” collective. Maybe I shouldn’t be so gobsmacked after all.

Caveat: This thought experiment definitely comes from a male perspective. I recognize that females view these issues quite differently, typically in consideration of far greater vulnerability than males experience (excepting the young boys in the Catholic Church example).

Reading further into Anthony Giddens’ book The Consequences of Modernity, I got a fuller (though still incomplete) sense of what is meant by his terms disembedding mechanisms, expert systems, and symbolic tokens, all of which disrupt time and space as formerly understood in traditional societies that enjoyed the benefit of centuries of continuity. I’ve been aware of analyses regarding, for instance, the sociology of money and the widespread effects of the introduction and adoption of mechanical clocks and timepieces. While most understand these developments superficially as unalloyed progress, Giddens argues that they do in fact reorder our experience in the world away from an organic, immediate orientation toward an intellectualized adherence to distant, abstract, self-reinforcing (reflexive) mechanisms.

But those matters are not really what this blog post is about. Rather, this passage sparked my interest:

… when the claims of reason replaced those of tradition, they appeared to offer a sense of certitude greater than that provided by preexisting dogma. But this idea only appears persuasive so long as we do not see that the reflexivity of modernity actually subverts reason, at any rate where reason is understood as the gaining of certain knowledge … We are abroad in a world which is thoroughly constituted through reflexively applied knowledge, but where at the same time we can never be sure that any given element of that knowledge will not be revised. [p. 39]

Put another way, science and reason are axiomatically open to examination, challenge, and revision and often undergo disruptive change. That’s what is meant by Karl Popper’s phrase “all science rests upon shifting sand” and informs the central thesis of Thomas Kuhn’s well-known book The Structure of Scientific Revolutions. It’s not the narrow details that shift so much (hard sciences lead pretty reliably to applied engineering) as the overarching narrative, e.g., the story of the Earth, the cosmos, and ourselves as revealed through scientific inquiry and close examination. Historically, the absolute certainty of the medieval church, while not especially accurate in either details or narrative, yielded considerable authority to post-Enlightenment science and reason, which themselves continue to shift periodically.

Some of those paradigm shifts are so boggling and beyond the ken of the average thinker (including many college-educated folks) that our epistemology is now in crisis. Even the hard facts — like the age and shape of the Earth or its orbital relationship to other solar bodies — are hotly contested by some and blithely misunderstood by others. One doesn’t have to get bogged down in the vagaries of relativity, nuclear power and weapons, or quantum theory to lose the thread of what it means to live in the 21st century. Softer sciences such as psychology, anthropology, economics, and even history now deliver new discoveries and (re-)interpretations of facts so rapidly, like the dizzying pace of technological change, that philosophical systems are unmoored and struggling for legitimacy. For instance, earlier this year, a human fossil was found in Morocco that upended our previous knowledge of human evolution (redating the first appearance of biologically modern humans about 100,000 years earlier). More popularly, dieticians still disagree on what sorts of foods are healthy for most of us (though we can probably all agree that excess sugar is bad). Other recent developments include the misguided insistence among some neurobiologists and theorists that consciousness, free will, and the self do not exist (I’ll have a new post regarding that topic as time allows) and outright attacks on religion not just for being in error but for being the source of evil.

I have a hard time imagining other developments in 21st-century intellectual thought that would shake the foundations of our cosmology any more furiously than what we’re now experiencing. Even the dawning realization that we’ve essentially killed ourselves (with delayed effect) by gradually though consistently laying waste to our own habitat is more of an “oops” than the mind-blowing moment of waking up from The Matrix to discover the unreality of everything once believed. Of course, for fervent believers especially, the true facts (best as we can know them, since knowledge is forever provisional) are largely irrelevant in light of desire (what one wants to believe), and that’s true for people on both sides of the schism between church and science/reason.

As Shakespeare wrote in Hamlet, “There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy.” So it’s probably wrong to introduce a false dualism, though it has plenty of historical precedent. I’ll suggest instead that there are more facets and worldviews at play in the world than the two that have been warring in the West for the last 600 years.

The storms referenced in the earlier version of this post were civilization-ending cataclysms. The succession of North American hurricanes and earthquakes earlier this month of September 2017 were natural disasters. I would say that September was unprecedented, but reliable weather records do not extend very far back in human history, and the geological record reaching into prehistory suggests that, except perhaps for their concentration within the span of a month, the latest storms are nothing out of the ordinary. Some have even theorized that hurricanes and earthquakes could be interrelated. In the wider context of weather history, this brief period of destructive activity may still be rather mild. Already in the last twenty years we’ve experienced a series of 50-, 100-, and 500-year weather events that suggest exactly what climate scientists have been saying, namely, that higher global average temperatures and more atmospheric moisture will lead to more activity in the category of superstorms. Throw drought, flood, and desertification into the mix. This (or worse, frankly) may have been the old normal when global average temperatures were several degrees warmer during periods of hothouse earth. All indications are that we’re leaving behind garden earth, the climate steady state (with a relatively narrow band of global temperature variance) enjoyed for roughly 12,000 years.

Our response to the latest line of hurricanes that struck the Gulf, Florida, and the Caribbean has been characterized as a little tepid considering we had the experience of Katrina from which to learn and prepare, but I’m not so sure. True, hurricanes can be seen hundreds of miles and days away, allowing folks opportunity to either batten down the hatches or flee the area, but we have never been able to handle mass exodus, typically via automobile, and the sheer destructive force of the storms overwhelms most preparations and delays response. So after Katrina, it appeared for several days that the federal government’s response was basically this: you’re on your own. That apparent response recurred especially in Puerto Rico, which like New Orleans quickly devolved into a true humanitarian crisis (one not yet over). Our finding (in a more charitable assessment on my part) is that despite foreknowledge of the event and past experience with similar events, we can’t simply swoop in and smooth things out after the storms. Even the first steps of recovery take time.

I’ve cautioned that rebuilding on the same sites, with the reasonable expectation of repeat catastrophes in a destabilized climate that will spawn superstorms reducing entire cities to garbage heaps, is a poor option. No doubt we’ll do it anyway, at least partially; it’s already well underway in Houston. I’ve also cautioned that we need to brace for a diaspora as climate refugees abandon destroyed and inundated cities and regions. It’s already underway with respect to Puerto Rico. This is a storm of an entirely different sort (a flood, actually) and can also be seen from hundreds of miles and weeks, months, years away. And like superstorms, a diaspora from the coasts, because of the overwhelming force and humanitarian crisis it represents, is not something for which we can prepare adequately. Still, we know it’s coming, like a 20- or 50-year flood.

I revisit my old blog posts when I see some reader activity in the WordPress backstage and was curious to recall a long quote of Iain McGilchrist summarizing arguments put forth by Anthony Giddens in his book Modernity and Self-identity (1991). Giddens had presaged recent cultural developments, namely, the radicalization of nativists, supremacists, Social Justice Warriors (SJWs), and others distorted by absorption in identity politics. So I traipsed off to the Chicago Public Library (CPL) and sought out the book to read. Regrettably, CPL didn’t have a copy, so I settled on a slightly earlier book, The Consequences of Modernity (1990), which is based on a series of lectures delivered at Stanford University in 1988.

Straight away, the introduction provides a passage that goes to the heart of matters with which I’ve been preoccupied:

Today, in the late twentieth century, it is argued by many, we stand at the opening of a new era … which is taking us beyond modernity itself. A dazzling variety of terms has been suggested to refer to this transition, a few of which refer positively to the emergence of a new type of social system (such as the “information society” or the “consumer society”) but most of which suggest rather that a preceding state of affairs is drawing to a close … Some of the debates about these matters concentrate mainly upon institutional transformations, particularly those which propose that we are moving from a system based upon the manufacture of material goods to one concerned more centrally with information. More commonly, however, those controversies are focused largely upon issues of philosophy and epistemology. This is the characteristic outlook, for example, of the author who has been primarily responsible for popularising the notion of post-modernity, Jean-François Lyotard. As he represents it, post-modernity refers to a shift away from attempts to ground epistemology and from faith in humanly engineered progress. The condition of post-modernity is distinguished by an evaporating of the “grand narrative” — the overarching “story line” by means of which we are placed in history as beings having a definite past and a predictable future. The post-modern outlook sees a plurality of heterogeneous claims to knowledge, in which science does not have a privileged place. [pp. 1–2, emphasis added]

That’s a lot to unpack all at once, but the fascinating thing is that notions now manifesting darkly in the marketplace of ideas were already in the air in the late 1980s. Significantly, this was several years still before the Internet brought the so-called Information Highway to computer users, before the cell phone and smart phone were developed, and before social media displaced traditional media (TV was only 30–40 years old but had previously transformed our information environment) as the principal way people gather news. I suspect that Giddens has more recent work that accounts for the catalyzing effect of the digital era (including mobile media) on culture, but for the moment, I’m interested in the book in hand.

Regular readers of this blog (I know of one or two) already know my armchair social criticism directed at our developing epistemological crisis (challenges to authority and expertise, psychotic knowledge, fake news, alternative facts, dissolving reality, and science denial) as well as the Transhumanist fantasy of becoming pure thought (once we evolve beyond our bodies). Until that’s accomplished with imagined technology, we increasingly live in our heads, in the abstract, disoriented and adrift on a bewildering sea of competing narratives. Moreover, I’ve stated repeatedly that highly mutable stories (or narratives) underlie human cognition and consciousness, making most of us easy marks for charismatic storytellers. Giddens was there nearly 30 years ago with these same ideas, though his terms differ.

Giddens dispels the idea of post-modernity and insists that, from a sociological perspective, the current period is better described as high modernism. This reminds me of Oswald Spengler and my abandoned book blogging of The Decline of the West. It’s unimportant to me who got it more correct, but I note that the term Postmodernism has been adopted widely despite its inaccuracy (at least according to Giddens). As I get further into the book, I’ll have plenty more to say.

Some phrases have a wide range of applicability, such as the book title “______ for Dummies.” The popular Netflix show Orange is the New Black is another, claiming cachet in criminality. Let me jump on the bandwagon and observe how Transgression is the New Chic. There are two aspects to how transgression has become the new “it” thing: committing a transgression and being transgressed. Seems these days everyone is positioning themselves along one axis or another, sometimes both.

Not many of us possess the ability to transgress others without consequences. To do so basically requires fuck-you money. Celebrity also helps. With those characteristics, however, one can get away with an awful lot of mischief and make oneself look pretty damn cool in the process (if one is impressed by such foolishness). At the top of the heap is our Bully-in-Chief, who is busy testing another man-child in a reckless exercise in brinkmanship that could easily blow up in our faces (and theirs, too, which would be criminal considering the mismatch of power — like a billionaire stealing from a fast food worker). Yet the impulse to puff up one’s chest and appear unwavering in resolve or whatever other silly justification enters the minds of status seekers is awfully strong. To rational minds, it looks like insanity. Sadly, the masses do not possess rational minds and so give the game credibility.

On the flip side, claiming victimization at imagined transgressions is another fantasy league populated by the emotionally needy. Snowflakes. Or the Strawberry Generation of people prone to spoil at the slightest whiff of life’s difficulties. The so-called microaggression and the demand for safe spaces are frequent power plays used by star players, where points are scored by cowing into submission administrators too timid to call bullshit on the charade. Berkeley administrators offering counseling for students “terrorized” by a speech delivered by Ben Shapiro (whether students actually attend is beside the point) is a good example. Feigning offense works when lying to oneself, too, so the master player gets double points for transgressing him- or herself. Well played. When the cycle of blaming and bullying will subside is anyone’s guess.

Here’s a familiar inspirational phrase from The Bible: the truth shall set you free (John 8:32). Indeed, most of us take it as, um, well, gospel that knowledge and understanding are unqualified goods. However, the information age has turned out to be a mixed blessing. Any clear-eyed view of the way the world works and its long, tawdry history carries with it an inevitable awareness of injustice, inequity, suffering, and at the extreme end, some truly horrific episodes of groups victimizing each other. Some of the earliest bits of recorded history, as distinguished from oral history, are financial — keeping count (or keeping accounts). Today differs not so much in character as in the variety of counts being kept and the sophistication of information gathering.

The Bureau of Labor Statistics, a part of the U.S. Department of Labor, is one information clearinghouse that slices and dices available data according to a variety of demographic characteristics. The fundamental truth behind such assessments, regardless of the politics involved, is that when comparisons are made between unlike groups, say, between men and women or young and old, one should expect to find differences and indeed be rather surprised if comparisons revealed none. So the question of gender equality in the workplace, or its implied inverse, gender inequality in the workplace, is a form of begging the question, meaning that if one seeks differences, one shall most certainly find them. But those differences are not prima facie evidence of injustice in the sense of the popular meme that women are disadvantaged or otherwise discriminated against in the workplace. Indeed, the raw data can be interpreted according to any number of agendas, thus the phrase “lying with statistics,” and most of us lack the sophistication to contextualize statistics properly, which is to say, free of the emotional bias that plagues modern politics, and more specifically, identity politics.

The fellow who probably ran up against this difficulty the worst is Charles Murray in the aftermath of publication of his book The Bell Curve (1994), which deals with how intelligence manifests differently across demographic groups yet functions as the primary predictor of social outcomes. Murray is particularly well qualified to interpret data and statistics dispassionately, and in true seek-and-find fashion, differences between groups did appear. It is unclear how much his resulting prescriptions for social programs are borne out of data vs. ideology, but most of us are completely at sea wading through the issues without specialized academic training to make sense of the evidence.

More recently, another fellow caught in the crosshairs on issues of difference is James Damore, who was fired from his job at Google after writing what is being called an anti-diversity manifesto (but might be better termed an internal memo) that was leaked and then went viral. The document can be found here. I have not dug deeply into the details, but my impression is that Damore attempted a fairly academic unpacking of the issue of gender differences in the workplace as they conflicted with institutional policy only to face a hard-set ideology that is more RightThink than truth. In Damore’s case, the truth did set him free — free from employment. Even the NY Times recognizes that the Thought Police sprang into action yet again to demand that its pet illusions about society be supported rather than dispelled. These witch hunts and shaming rituals (vigilante justice carried out in the court of public opinion) are occurring with remarkable regularity.

In a day and age where so much information (too much information, as it turns out) is available to us to guide our thinking, one might hope for careful, rational analysis and critical thinking. However, trends point to the reverse: a return to tribalism, xenophobia, scapegoating, and victimization. There is also a victimization Olympics at work, with identity groups vying for imaginary medals awarded to whoever’s got it worst. I’m no Pollyanna when it comes to the notion that all men are brothers and, shucks, can’t we all just get along? That’s not our nature. But the marked indifference of the natural world to our suffering as it besets us with drought, fire, floods, earthquakes, tsunamis, hurricanes, tornadoes, and the like (and this was just the last week!) might seem like the perfect opportunity to find within ourselves a little grace and recognize our common struggles in the world rather than add to them.

rant on/

As the next in an as-yet unnumbered series of Storms of the Century (I predict more than a dozen at least) is poised to strike nearly the entirety of the State of Florida, we know with confidence from prior experience, recent and not so recent, that any lessons we might take regarding human habitation situated along or near coastlines vulnerable to extreme weather events, now occurring with increasing frequency and vehemence, will remain intransigently unlearned. Instead, we’ll begin rebuilding on the very same sites as soon as construction labor and resources can be mustered and deployed. Happened in New Orleans and New Jersey; is about to happen in Houston; and will certainly happen all across Florida — even the fragile Florida Keys. I mean, shit, we can’t do without The Magic Kingdom and other attractions in the central-Florida tourist mecca, now can we?

This predictable spin around the dance floor might look like a tragicomic circus waltz (e.g., The Daring Young Man on the Flying Trapeze), or even out-of-tune, lopsided calliope music from the carousel, except that positioning ourselves right back in harm’s way would be better characterized as a danse macabre. I dub it the Builder’s Waltz, which could also be the Rebuilder’s Rumba, the Catastrophe Tango, the Demolition Jive … take your pick.

Obstinate refusal to apprehend reality as it slams into us is celebrated as virtue these days. Can’t lose hope even as dark forces coalesce all around us, right? Was it always so? Still, an inkling might be dawning on some addle-brained deniers that perhaps science-informed global warming and climate change news might actually be about something with real-world impact, such as dramatic reduction of oil refinery output or a lost citrus crop. So much for illusions of business as usual continuing unhindered into the foreseeable future. Instead, our future looks more like dominoes lined up to fall — like the line of hurricanes formed in the Atlantic. Good luck hunkering down and weathering once-in-a-lifetime storms that just keep coming. And rebuilding the same things in the same places, well, just let it go, man, ’cuz it’s already gone.

rant off/

This is the inverse of a prior post called “Truth Based on Fiction.”

Telling stories about ourselves is one of the most basic of human attributes stretching across oral and recorded history. We continue today to memorialize events in short, compact tellings, frequently movies depicting real-life events. I caught two such films recently: Truth (about what came to be known as Rathergate) and Snowden (about whistle-blower Edward Snowden).

Although Dan Rather is the famous figure associated with Truth, the story focuses more on his producer Mary Mapes and the group decisions leading to airing of a controversial news report about George W. Bush’s time in the Air National Guard. The film is a dramatization, not a documentary, and so is free to present the story with its own perspective and some embellishment. Since I’m not a news junkie, my memory of the events in 2004 surrounding the controversy is not especially well informed, and I didn’t mind the potential for the movie’s version of events to color my thinking. About some controversies and conspiracies, I feel no particular demand to adopt a strong position. The actors did well enough, but I felt Robert Redford was poorly cast as Dan Rather. Redford is too famous in his own right to succeed as a character actor playing a real-life person.

Debate over the patriotism or treason of Edward Snowden’s actions continues to swirl, but the film covers the issues pretty well, from his discovery of an intelligence services surveillance dragnet (in violation of the 4th Amendment to the U.S. Constitution) to his eventual disclosure of same to a few well-respected journalists. The film’s director and joint screenwriter, Oliver Stone, has made a career out of fiction based on truth, dramatizing many signal events from the nation’s history, repackaging them as entertainment in the process. I’m wary of his interpretations of history when presented in cinematic form, less so his alternative history lessons given as documentary. Unlike Truth, however, I have clear ideas in my mind regarding Snowden the man and Snowden the movie, so, from a different standpoint, I was again unconcerned about potential bias. Joseph Gordon-Levitt does well enough as the titular character, though he doesn’t project nearly the same insight and keen intelligence as Snowden himself does. I suspect the documentary Citizenfour (which I’ve not yet seen) featuring Snowden doing his own talking is a far better telling of the same episode of history.

In contrast, I have assiduously avoided several other recent films based on actual events. United 93, World Trade Center, and Deepwater Horizon spring to mind, but there are many others. The wounds and controversies stemming from those real-life events still smart too much for me to consider exposing myself to propaganda historical fictions. Perhaps in a few decades, after living memory of such events has faded or disappeared entirely, such stories can be told effectively, though probably not accurately. A useful comparison might be any one of several films called The Alamo.

Since Jordan Peterson came to prominence last fall, he’s been maligned and misunderstood. I, too, rushed to judgment before understanding him more fully by watching many of his YouTube clips (lectures, speeches, interviews, webcasts, etc.). As the months have worn on and media continue to shove Peterson in everyone’s face (with his willing participation), I’ve grown in admiration and appreciation of his two main (intertwined) concerns: free speech and cultural Marxism. Most of the minor battles I’ve fought on these topics have come to nothing as I’m simply brushed off for not “getting it,” whatever “it” is (I get that a lot for not being a conventional thinker). Simply put, I’m powerless, thus harmless and of no concern. I have to admit, though, to being surprised at the proposals Peterson puts forward in this interview, now over one month old:

Online classes are nothing especially new. Major institutions of higher learning already offer distance-learning courses, and some institutions exist entirely online, though they tend to be degree mills with less concern over student learning than with profitability and boosting student self-esteem. Peterson’s proposal is to launch an online university for the humanities, and in tandem, to reduce the number of students flowing into today’s corrupted humanities departments where they are indoctrinated into the PoMo cult of cultural Marxism (or as Peterson calls it in the interview above, neo-Marxism). Teaching course content online seems easy enough. As pointed out, the technology for it has matured. (I continue to believe face-to-face interaction is far better.) The stated ambition to overthrow the current method of teaching the humanities, though, is nothing short of revolutionary. It’s worth observing, however, that the intent appears not to be undermining higher education (which is busy destroying itself) but to save or rescue students from the emerging cult.

Being a traditionalist, I appreciate the great books approach Peterson recommends as a starting point. Of course, this approach stems from exactly the sort of dead, white, male hierarchy over which social justice warriors (SJWs) beat their breasts. No doubt: patriarchy and oppression are replete throughout human history, and we’re clearly not yet over with it. To understand and combat it, however, one must study rather than discard history or declare it invalid as a subject of study. That also requires coming to grips with some pretty hard, brutal truths about our capacity for mayhem and cruelty — past, present, and future.

I’ve warned since the start of this blog in 2006 that the future is not shaping up well for us. It may be that struggles over identity many young people are experiencing (notably, sexual and gender dysphoria occurring at the remarkably vulnerable phase of early adulthood) are symptoms of a larger cultural transition into some other style of consciousness. Peterson clearly believes that the struggle in which he is embroiled is fighting against the return of an authoritarian style tried repeatedly in the 20th century with catastrophic results. Either way, it’s difficult to contemplate anything worthwhile emerging from brazen attempts at thought control by SJWs.