Posts Tagged ‘free speech’

As time wears on and I add years to this mostly ignored blog, I keep running across ideas expressed herein, sometimes long ago, recapitulated in remarks and comments elsewhere. Absolutely disparate people can develop the same ideas independently, so I’m not claiming that my ideas are stolen. Maybe I’m merely in touch with the Zeitgeist and express it here only then to see or hear it again someplace else. I can’t judge objectively.

The latest coincidence is the growing dread with which I wake up every day, wondering what fresh new hell awaits with the morning news. The times in which we live are both an extension of our received culture and yet unprecedented in their novelty. Not only are there many more people in existence than 100 years ago, and thus radical opinions and events occurring with extraordinary frequency, but the speed of transmission is also faster than in the past. Indeed, the rush to publication has many news organs reporting before any solid information is available. The first instance of blanket crisis coverage I remember was the Challenger Disaster in 1986. It’s unknown to me how quickly news of various U.S. political assassinations in the 1960s spread, but I suspect reporting took more time than today and imparted to those events gravity and composure. Today is more like a renewed Wild West where anything goes, which has been the preferred characterization of the Internet since its creation. We’ll see if the recent vote to remove Net Neutrality has the effect of restraining things. I suspect that particular move is more about a money grab (selling premium open access vs. basic limited access) than thought control, but I can only guess as to true motivations.

I happened to be traveling when the news broke of a mass shooting in Las Vegas. Happily, what news I got was delayed until actual news-gathering had already sorted basic fact from confabulation. Paradoxically, after the first wave of “what the hell just happened?” there formed a second wave of “here’s what happened,” and later a third wave of “what the hell really happened?” appeared as some rather creative interpretations were offered up for consideration. That third wave is by now quite familiar to everyone as the conspiracy wave, and surfing it feels inevitable because the second wave is often so starkly unbelievable. Various websites and shows such as MythBusters and Penn & Teller: Bullshit! (probably others, too) presume to settle debates. While I’m inclined to believe scientific and documentary evidence, mere argument often fails to convince me, which is troubling, to say the least.

Fending off all the mis- and disinformation, or separating signal from noise, is a full-time job if one is willing to undertake it. That used to be the mandate of the journalistic news media, at least in principle. Lots of failures on that account stack up throughout history. However, since we’re in the midst of a cultural phase dominated by competing claims to authority and the public’s retreat into ideation, the substitute worlds of extended and virtual reality become attractive alternatives to the fresh new hell we now face every morning. Tune in and check in might be what we think we’re doing, but more accurately, we tune out and check out of responsible engagement with the real world. That’s the domain of incessantly chipper morning TV shows. Moreover, we like to believe in the mythical stories we tell ourselves about ourselves, such as, for example, how privacy doesn’t matter, or that the U.S. is a free, democratic, liberal beacon of hope, or that economic value inheres in made-up currencies. It’s a battle for your attention and subscription in the marketplace of ideas. Caveat emptor.


What is more tantalizing and enticing than a secret? OK, probably sex appeal, but never mind that for now. Secrets confer status on the keeper and bring those on whom the secret is bestowed into an intimate (nonsexual, for you dirty thinkers) relationship with the secret sharer. I remember the sense of relief and quiet exhilaration when the Santa Claus story was finally admitted by my parents to be untrue. I had already ceased to really believe in it/him but wasn’t yet secure enough as a 6- or 7-year-old (or whenever it was) to assert it without my parents’ confirmation. And it was a secret I withheld from my younger siblings, perhaps my first instruction on when lying was acceptable, even looked upon approvingly. Similarly, I remember how it felt to be told about sex for the first time by older kids (now you can go there, you cretins) and thus realize that my parents (and everyone else’s) had done the dirty — multiple times even for families with more than one kid. I was the possessor of secret knowledge, and everyone figured out quickly that it was best to be discreet about it. It may have been the first open secret. Powerful stuff, as we were to learn later in our hormone-addled adolescence. In early adulthood, I also began to assert my atheism, which isn’t really a secret but still took time to root fully. From my mature perspective, others who believe in one sky-god or another look like the kids who at a tender age still believe in Santa Claus and the Easter Bunny. I don’t go out of my way to dispel anyone’s faith.

Even as adults, those of us who enjoy secret knowledge feel a bit of exhilaration. We know what goes on (a little or a lot) behind the scenes, behind the curtain, in the backrooms and dark places. It may also mean that we know how the proverbial sausage is made, which is far less special. National security clearance, operating at many levels of access, may be the most obvious example, or maybe it’s just being a bug on the wall in the dugout or locker room during a pro sports contest. Being within the circle of intimates is intoxicating, though the circumstances that get one into the circle may be rather mundane, and those on the outside may look oddly pathetic.

The psychology behind secret knowledge functions prominently with conspiracy theories. Whether the subject is political assassinations, Bigfoot or the Loch Ness Monster, the moon landings, Area 51 and alien abduction, chemtrails/contrails, or 9/11, one’s personal belief and pet theory inescapably confers special status, especially as unacknowledged or unaccepted truth. Often, as others seek to set the record straight, one digs in to defend cherished beliefs. It’s an elixir, a dangerous cycle that traps people in contrafactual cliques. So we have flat Earthers, birthers, 9/11 truthers, creationists, climate change deniers, etc. (I count myself among one of those groups, BTW. Figure it out for yourself.) The range of interpretations floated in the political realm with respect to the machinations of the two parties and the White House boggles my mind with possibilities. However, I’m squarely outside those circles and feel no compulsion to decide what I believe when someone asserts secret knowledge from inside the circle. I float comfortably above the fray. Similarly, with so much fake news pressing for my attention, I consciously hold quite a lot of it in abeyance until time sorts it out for me.

Here’s the last interesting bit I am lifting from Anthony Giddens’s The Consequences of Modernity. Then I will be done with this particular book-blogging project. As part of Giddens’s discussion of the risk profile of modernity, he characterizes risk as either objective or perceived and further divides it into seven categories:

  1. globalization of risk (intensity)
  2. globalization of risk (frequency)
  3. environmental risk
  4. institutionalized risk
  5. knowledge gaps and uncertainty
  6. collective or shared risk
  7. limitations of expertise

Some overlap exists, and I will not distinguish them further. The first two are of primary significance today for obvious reasons. Although the specter of doomsday resulting from a nuclear exchange has been present since the 1950s, Giddens (writing in 1988) provides this snapshot of today’s issues:

The sheer number of serious risks in respect of socialised nature is quite daunting: radiation from major accidents at nuclear power-stations or from nuclear waste; chemical pollution of the seas sufficient to destroy the phytoplankton that renews much of the oxygen in the atmosphere; a “greenhouse effect” deriving from atmospheric pollutants which attack the ozone layer, melting part of the ice caps and flooding vast areas; the destruction of large areas of rain forest which are a basic source of renewable oxygen; and the exhaustion of millions of acres of topsoil as a result of widespread use of artificial fertilisers. [p. 127]

As I often point out, these dangers were known 30–40 years ago (in truth, much longer), but they have only worsened with time through political inaction and/or social inertia. After I began to investigate and better understand the issues roughly a decade ago, I came to the conclusion that the window of opportunity to address these risks and their delayed effects had already closed. In short, we’re doomed and living on borrowed time as the inevitable consequences of our actions slowly but steadily manifest in the world.

So here’s the really interesting part. The modern worldview bestows confidence borne out of expanding mastery of the built environment, where risk is managed and reduced through expert systems. Mechanical and engineering knowledge figure prominently and support a cause-and-effect mentality that has grown ubiquitous in the computing era, with its push-button inputs and outputs. However, the high modern outlook is marred by overconfidence in our competence to avoid disaster, often of our own making. Consider the abject failure of 20th-century institutions to handle geopolitical conflict without devolving into world war and multiple genocides. Or witness periodic crashes of financial markets, two major nuclear accidents, and two space shuttles and numerous rockets destroyed. Though all entail risk, high-profile failures showcase our overconfidence. Right now, engineers (software and hardware) are confident they can deliver safe self-driving vehicles yet are blithely ignoring (says me, maybe not) major ethical dilemmas regarding liability and technological unemployment. Those are apparently problems for someone else to solve.

Since the start of the Industrial Revolution, we’ve barrelled headlong into one sort of risk after another, some recognized at the time, others only apparent after the fact. Nuclear weapons are the best example, but many others exist. The one I raise frequently is the live social experiment undertaken with each new communications technology (radio, cinema, telephone, television, computer, social networks) that upsets and destabilizes social dynamics. The current ruckus fomented by the radical left (especially in the academy but now infecting other environments) regarding silencing of free speech (thus, thought policing) is arguably one concomitant.

According to Giddens, the character of modern risk contrasts with that of the premodern. The scale of risk prior to the 17th century was contained, and expectation of social continuity was strong. Risk was also transmuted through magical thinking (superstition, religion, ignorance, wishfulness) into providential fortuna or mere bad luck, which led to feelings of relative security rather than despair. Modern risk has now grown so widespread, consequential, and soul-destroying, situated at such considerable remove that it breeds helplessness and hopelessness, that those not numbed by the litany of potential worries afflicting daily life (existential angst or ontological insecurity) often develop depression and other psychological compulsions and disturbances. Most of us, if aware of globalized risk, set it aside so that we can function and move forward in life. Giddens says that this conjures up anew a sense of fortuna, that our fate is no longer within our control. This

relieves the individual of the burden of engagement with an existential situation which might otherwise be chronically disturbing. Fate, a feeling that things will take their own course anyway, thus reappears at the core of a world which is supposedly taking rational control of its own affairs. Moreover, this surely exacts a price on the level of the unconscious, since it essentially presumes the repression of anxiety. The sense of dread which is the antithesis of basic trust is likely to infuse unconscious sentiments about the uncertainties faced by humanity as a whole. [p. 133]

In effect, the nature of risk has come full circle (completed a revolution, thus, revolutionized risk) from fate to confidence in expert control and back to fate. Of course, a flexibility of perspective is typical as situation demands — it’s not all or nothing — but the overarching character is clear. Giddens also provides this quote by Susan Sontag that captures what he calls the low-probability, high-consequence character of modern risk:

A permanent modern scenario: apocalypse looms — and it doesn’t occur. And still it looms … Apocalypse is now a long-running serial: not ‘Apocalypse Now,’ but ‘Apocalypse from now on.’ [p. 134]

Since Jordan Peterson came to prominence last fall, he’s been maligned and misunderstood. I, too, rushed to judgment before understanding him more fully by watching many of his YouTube clips (lectures, speeches, interviews, webcasts, etc.). As the months have worn on and media continue to shove Peterson in everyone’s face (with his willing participation), I’ve grown in admiration and appreciation of his two main (intertwined) concerns: free speech and cultural Marxism. Most of the minor battles I’ve fought on these topics have come to nothing as I’m simply brushed off for not “getting it,” whatever “it” is (I get that a lot for not being a conventional thinker). Simply put, I’m powerless, thus harmless and of no concern. I have to admit, though, to being surprised at the proposals Peterson puts forward in this interview, now over one month old:

Online classes are nothing especially new. Major institutions of higher learning already offer distance-learning courses, and some institutions exist entirely online, though they tend to be degree mills with less concern over student learning than with profitability and boosting student self-esteem. Peterson’s proposal is to launch an online university for the humanities, and in tandem, to reduce the number of students flowing into today’s corrupted humanities departments where they are indoctrinated into the PoMo cult of cultural Marxism (or as Peterson calls it in the interview above, neo-Marxism). Teaching course content online seems easy enough. As pointed out, the technology for it has matured. (I continue to believe face-to-face interaction is far better.) The stated ambition to overthrow the current method of teaching the humanities, though, is nothing short of revolutionary. It’s worth observing, however, that the intent appears not to be undermining higher education (which is busy destroying itself) but to save or rescue students from the emerging cult.

Being a traditionalist, I appreciate the great books approach Peterson recommends as a starting point. Of course, this approach stems from exactly the sort of dead, white, male hierarchy over which social justice warriors (SJWs) beat their breasts. No doubt: patriarchy and oppression are replete throughout human history, and we’re clearly not yet over with it. To understand and combat it, however, one must study rather than discard history or declare it invalid as a subject of study. That also requires coming to grips with some pretty hard, brutal truths about our capacity for mayhem and cruelty — past, present, and future.

I’ve warned since the start of this blog in 2006 that the future is not shaping up well for us. It may be that struggles over identity many young people are experiencing (notably, sexual and gender dysphoria occurring at the remarkably vulnerable phase of early adulthood) are symptoms of a larger cultural transition into some other style of consciousness. Peterson clearly believes that the struggle in which he is embroiled is fighting against the return of an authoritarian style tried repeatedly in the 20th century to catastrophic results. Either way, it’s difficult to contemplate anything worthwhile emerging from brazen attempts at thought control by SJWs.

Violent events of the past week (Charlottesville, VA; Barcelona, Spain) and political responses to them have dominated the news cycle, pushing other newsworthy items (e.g., U.S.-South Korean war games and a looming debt ceiling crisis) off the front page and into the darker recesses of everyone’s minds (those paying attention, anyway). We’re absorbed instead with culture wars run amok. I’m loath to apply the term terrorism to regular periodic eruptions of violence, both domestic and foreign. That term carries with it intent, namely, the objective to create day-to-day terror in the minds of a population so as to interfere with proper functions of society. It’s unclear to me whether recent perpetrators of violence are coherent enough to formulate sophisticated motivations or plans. The dumb, obvious way of doing things — driving into crowds of people — takes little or no planning and may just as well be the result of inchoate rage boiling over in a moment of high stress and opportunity. Of course, it needn’t be all or nothing, and considering our reflexively disproportionate responses, the term terrorism and attendant destabilization is arguably accurate even without specified intent. That’s why in the wake of 9/11 some 16 years ago, the U.S. has become a security state.

It’s beyond evident that hostilities have been simmering below the not-so-calm surface. Many of those hostilities, typically borne out of economic woes but also part of a larger clash of civilizations, take the form of identifying an “other” presumably responsible for one’s difficulties and then victimizing the “other” in order to elevate oneself. Of course, the “other” isn’t truly responsible for one’s struggles, so the violent dance doesn’t actually elevate anyone, as in “supremacy”; it just wrecks both sides (though unevenly). Such warped thinking seems to be a permanent feature of human psychology and enjoys popular acceptance when the right “other” is selected and universal condemnation when the wrong one is chosen. Those doing the choosing and those being chosen haven’t changed much over the centuries. Historical Anglo-Saxons and Teutons choose and people of color (all types) get chosen. Jews are also chosen with dispiriting regularity, which is an ironic inversion of being the Chosen People (if you believe in such things — I don’t). However, any group can succumb to this distorted power move, which is why so much ongoing, regional, internecine conflict exists.

As I’ve been saying for years, a combination of condemnation and RightThink has simultaneously freed some people from this cycle of violence but merely driven the holdouts underground. Supremacy in its various forms (nationalism, racism, antisemitism, etc.) has never truly been expunged. RightThink itself has morphed (predictably) into intolerance, which is now veering toward radicalism. Perhaps a positive outcome of this latest resurgence of supremacist ideology is that those infected with the character distortion have been emboldened to identify themselves publicly and thus can be dealt with somehow. Civil authorities and thought leaders are not very good at dealing with hate, often shutting people out of the necessary public conversation and/or seeking to legislate hate out of existence with restrictions on free speech. But it is precisely through free expression and diplomacy that we address conflict. Violence is a failure to remain civil (duh!), and war (especially the genocidal sort) is the extreme instance. It remains to be seen if the lid can be kept on this boiling pot, but considering cascade failures lined up to occur within the foreseeable future, I’m pessimistic that we can see our way past the destructive habit of shifting blame onto others who often suffer even worse than those holding the reins of power.

A long while back, I blogged about things I just don’t get, including on that list the awful specter of identity politics. As I was finishing my undergraduate education some decades ago, the favored term was “political correctness.” That impulse now looks positively tame in comparison to what occurs regularly in the public sphere. It’s no longer merely about adopting what consensus would have one believe is a correct political outlook. Now it’s a broad referendum centered on the issue of identity, construed through the lens of ethnicity, sexual orientation, gender identification, lifestyle, religion, nationality, political orientation, etc.

One frequent charge levied against offenders is cultural appropriation, which is the adoption of an attribute or attributes of a culture by someone belonging to a different culture. Here, the term “culture” is a stand-in for any feature of one’s identity. Thus, wearing a Halloween costume from another culture, say, a bandido, is not merely in poor taste but is understood to be offensive if one is not authentically Mexican. Those who are infected with the meme are often called social justice warriors (SJWs), and policing (of others, natch) is especially vehement on campus. For example, I’ve read of menu items at the school cafeteria being criticized for not being authentic enough. Really? The won ton soup offends Chinese students?

In an opinion-editorial in the NY Times entitled “Will the Left Survive the Millennials?” Lionel Shriver described being sanctioned for suggesting that fiction writers not be too concerned about creating characters from backgrounds different from one’s own. She contextualizes the motivation of SJWs this way: (more…)

The Internets/webs/tubes have been awfully active spinning out theories and conspiracies with respect to Democratic presidential nominee Hillary Clinton (are those modifiers even necessary?) and the shoe ready to drop if and when Julian Assange releases information in his possession reputed to spell the end of her candidacy and political career. Assange has been unaccountably coy: either he has the goods or he doesn’t. There’s no reason to tease and hype. Hillary has been the subject of intense scrutiny for 25+ years. With so much smoke billowing in her wake, one might conclude burning embers must exist. But our current political culture demonstrates that one can get away with unthinkably heinous improprieties, evasions, and crimes so long as one trudges steadfastly through all the muck. Some even make a virtue out of intransigence. Go figure.

If I were charitable, I would say that Hillary has been unfairly maligned and that her 2010 remark “Can’t we just drone this guy?” is either a fabrication or taken out of context. Maybe it was a throwaway joke, uttered in a closed meeting and forgotten except for someone who believed it might be useful later. Who can ever know? But I’m not so charitable. No one in a position of authority can afford to be flip about targeting political irritants. Hillary impresses as someone who, underneath all the noise, would not lose any sleep over droning her detractors.

There is scarcely anything on the political landscape as divisive as when someone blows the whistle on illicit government actions and programs. For instance, some are absolutely convinced that Edward Snowden is a traitor and ought to receive a death sentence (presumably after a trial, but not necessarily). Others understand his disclosures as the act of a patriot of the highest order, motivated not by self-interest but by love of country and the sincere belief in the public’s right to know. The middle ground between these extremes is a veritable wasteland — one I happen to occupy. Julian Assange is similarly divisive, and like Snowden, he appears to believe that the truth will eventually come out and indeed must. What I can’t quite reconcile is the need for secrecy and the willingness of the general public to accept leaders who habitually operate behind such veils. Talk of transparency is usually just subterfuge. If we’re truly the good guys and our ideals are superior to those of our detractors, why not simply trust in those strengths?

The English language has words for everything, and whenever something new comes along, we coin a new word. The latest neologism I heard is bolthole, which refers to the location one bolts to when collapse and civil unrest reach intolerable proportions. At present, New Zealand is reputed to be the location of boltholes purchased and kept by the ultrarich; it has the advantage of being located in the Southern Hemisphere, remote from the hoi polloi yet reachable by private plane or oceangoing yacht. Actually, bolthole is an older term now being repurposed, but it seems hip and current enough to be new coin.

Banned words are the inverse of neologisms, not in the normal sense that they simply fall out of use but in their use being actively discouraged. Every kid learns this early on when a parent or older sibling slips and lets an “adult” word pass his or her lips that the kid isn’t (yet) allowed to use. (“Mom, you said fuck!”) George Carlin made a whole routine out of dirty words (formerly) banned from TV. Standards have been liberalized since the 1970s, and now people routinely swear or refer to genitalia on TV and in public. Sit in a restaurant or ride public transportation (as I do), eavesdrop a little speech within easy earshot (especially private cellphone conversations), and just count the casual F-bombs.

The worst field of banned-words nonsense is political correctness, which is intertwined with identity politics. All the slurs and epithets directed at, say, racial groups ought to be disused, no doubt, but we overcompensate by renaming everyone (“____-American”) to avoid terms that have little or no derogation. Even more ridiculous, at least one egregiously insulting term has been reclaimed as a badge of honor by the very group it oppresses. It takes Orwellian doublethink to hear that term — you all know what it is — used legitimately but exclusively by those allowed to use it. (I find it wholly bizarre yet fear to wade in with my own prescriptions.) Self-disparaging language, typically in a comedic context, gets an unwholesome pass, but only if one is within the identity group. (Women disparage women, gays trade on gay stereotypes, Jews indulge in jokey anti-Semitism, etc.) We all laugh and accept it as safe, harmless, and normal. President Obama is continuously mixed up in disputes over appearances (“optics”), or what to call things — or not call them, as the case may be. For instance, his apparent refusal to call terrorism originating in the Middle East “Muslim terrorism” has been met with controversy.

I’m all for calling a thing what it is, but the term terrorism is too loosely applied to any violent act committed against (gasp!) innocent Americans. Recent events in Charleston, SC, garnered the terrorism label, though other terms would be more apt. Further, there is nothing intrinsically Muslim about violence and terrorism. Yeah, sure, Muslims have a word or doctrine — jihad — but it doesn’t mean what most think or are led to believe it means. Every religion across human history has some convenient justification for the use of force, mayhem, and nastiness to promulgate its agenda. Sometimes it’s softer and inviting, other times harder and more militant. Unlike Bill Maher, however, circumspect thinkers recognize that violence used to advance an agenda, like words used to shape narratives, is not the province of any particular hateful or hate-filled group. Literally everyone does it to some extent. Indeed, the passion with which anyone pursues an agenda is paradoxically celebrated and reviled depending on content and context, and it’s a long, slow, ugly process of sorting to arrive at some sort of RightThink®, which then becomes conventional wisdom before crossing over into political correctness.

If I were to get twisted and strained over every example of idiocy on parade, I’d be permanently distorted. Still, a few issues have crossed my path that might be worth bringing forward.

Fealty to the Flag

An Illinois teacher disrespected the American flag during a classroom lesson on free speech. Context provided in this article is pretty slim, but it would seem to me that a lesson on free speech might be precisely the opportunity to demonstrate that tolerance of discomfiting counter-opinion is preferable to the alternative: squelching it. Yet in response to complaints, the local school board voted unanimously to fire the teacher of the offending lesson. The ACLU ought to have a field day with this one, though I must admit there can be no convincing others that desecrating the flag is protected free speech. Some remember a few years ago going round and round on this issue with a proposed Constitutional amendment. Patriots stupidly insist on carving out an exception to free speech protections when it comes to the American flag, which shows quite clearly that they are immune to the concept behind the 1st Amendment, which says this:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances. [emphasis added]

Naturally, interpretations of the Bill of Rights vary widely, but it doesn’t take a Constitutional scholar to parse the absolute character of these rights. Rights are trampled all the time, of course, as the fired Illinois teacher just found out.

Fealty to the Wrong Flag

The Confederate battle flag has come back into the national spotlight following racially inspired events in Charleston, SC. (Was it ever merely a quaint, anachronistic, cultural artifact of the American South?) CNN has a useful article separating fact from fiction, yet some Southerners steadfastly defend the flag. As a private issue of astonishingly poor taste, idiocy, and free speech, individuals should be allowed to say what they want and fly their flags at will, but as a public issue for states and/or institutions that still fly the flag or emblazon it on websites, letterhead, etc., it’s undoubtedly better to give up this symbol and move on. (more…)