Posts Tagged ‘Politics’

One of the very best lessons I took from higher education was recognizing and avoiding the intentional fallacy — in my own thinking no less than in that of others. Although the term arguably has more to do with critical theory dealing specifically with texts, I learned about it in relation to abstract fine arts, namely, painting and music. For example, the enigmatic expression of the Mona Lisa by Leonardo da Vinci continues to spark inquiry and debate. What exactly does that smile mean? Even when words or programs are included in musical works, it’s seductively easy to conclude that the composer intends this or the work itself means that. Any given work purportedly allows audiences to peer into the mind of its creator(s) to interrogate intent. Conclusions thus drawn, however, are notoriously unreliable though commonplace.

It’s inevitable, I suppose, to read intent into artistic expression, especially when purpose feels so obvious or inevitable. Similar excavations of meaning and purpose are undertaken within other domains of activity, resulting in no end of interpretation as to surface and deep strategies. Original intent (also originalism) is a whole field of endeavor with respect to interpretation of the U.S. Constitution and imagining the framers’ intent. Geopolitics is another domain where hindsight analysis results in some wildly creative but ultimately conjectural interpretations of events. Even where authorial (and political) intent is explicitly recorded, such as with private diaries or journals, the possibility of deceptive intent by authors keeps everyone wondering. Indeed, although “fake news” is modern coinage, a long history of deceptive publishing practice well beyond the adoption of a nom de plume attests to hidden or unknowable intent, making “true intent” a meta property.

The multi-ring circus that the modern information environment has become, especially in the wake of electronic media (e.g., YouTube channels) produced by anyone with a camera and an Internet connection, is fertile ground for those easily ensnared by the intentional fallacy. Several categories of intent projected onto content creators come up repeatedly: profit motive, control of the narrative (no small advantage if one believes this blog post), setting the record straight, correcting error, grandstanding, and trolling for negative attention. These categories are not mutually exclusive. Long ago, I pointed to the phenomenon of arguing on-line and how it typically accomplishes very little, especially as comment threads lengthen and civility breaks down. These days, comments are an Internet legacy and/or anachronism that many content creators persist in offering to give the illusion of a wider discussion but in fact roundly ignore. Most blogs and channels are actually closed conversations. Maybe a Q&A follows the main presentation when held before an audience, but video channels are more often one-way broadcasts addressing an audience but not really listening. Public square discussion is pretty rare.

Some celebrate this new era of broadcasting, noting with relish how the mainstream media is losing its former stranglehold on attention. Such enthusiasm may be transparently self-serving but nonetheless rings true. A while back, I pointed to New Media Rockstars, which traffics in nerd culture entertainment media, but the term could easily be expanded to include satirical news, comedy, and conversational webcasts (also podcasts). Although some folks are rather surprised to learn that an appetite for substantive discussion and analysis exists among the public, I surmise that the shifting media landscape and disintegrated cultural narrative have bewildered a large segment of the public. The young in particular are struggling to make sense of the world, to figure out what to be in life and how to function, and to work out an applied philosophy that eschews more purely academic philosophy.

By way of example of new media, let me point to a trio of YouTube channels I only recently discovered. Some More News parodies traditional news broadcasts by sardonically (not quite the same as satirically) calling bullshit on how news is presented. Frequent musical cues between segments make me laugh. Unlike the mainstream media, which are difficult not to regard as propaganda arms of the government, Some More News is unapologetically liberal and indulges in black humor, which doesn’t make me laugh. Its raw anger and exasperation are actually a little terrifying. The second YouTube channel is Three Arrows, a sober, thorough debunking of news and argumentation found elsewhere in the public sphere. The speaker, who doesn’t appear onscreen, springs into action especially when accusations of current-day Nazism come up. (The current level of debate has devolved to recklessly calling nearly everyone a Nazi at some stage. Zero points scored.) Historical research often puts things into proper context, such as the magnitude of the actual Holocaust compared to some garden-variety racist running his or her mouth comparatively harmlessly. The third YouTube channel is ContraPoints, which is rather fanciful and profane but remarkably erudite considering the overall tone. Labels and categories are explained for those who may not have working definitions at the ready for every phrase or ideology. Accordingly, there is plenty of jargon. The creator also appears as a variety of different characters to embody various archetypes and play devil’s advocate.

While these channels may provide abundant information, correcting error and contextualizing better than most traditional media, it would be difficult to conclude they’re really moving the conversation forward. Indeed, one might wonder why bother preparing these videos considering how time-consuming it has to be to do research, write scripts, assemble pictorial elements, etc. I won’t succumb to the intentional fallacy and suggest I know why they bother holding these nondebates. Further, unless straight-up comedy, I wouldn’t say they’re entertaining exactly, either. Highly informative, perhaps, if one pays close attention to the frenetic online pace and/or mines for content (e.g., studying transcripts or following links). Interestingly, within a fairly short period of time, these channels are establishing their own rhetoric, sometimes useful, other times too loose to make strong impressions. It’s not unlike the development of new stylistic gestures in music or painting. What worthwhile thing, if any, emerges from the scrum will be interesting to see.


Back in the 1980s when inexpensive news programs proliferated, all wanting to emulate 60 Minutes or 20/20, I recall plenty having no problem working the public into a lather over some crime or injustice. A typical framing trick was to juxtapose two unrelated facts with the intent that the viewer leap to an unwarranted conclusion. Here’s an example I just made up: “On Tuesday, Jane went to her plastic surgeon for a standard liposuction procedure. By Friday, Jane was dead.” Well, what killed Jane? The obvious inference, by virtue of juxtaposition, is the procedure. Turns out it was an entirely unrelated traffic accident. The crap news program could legitimately claim that it never said the procedure killed Jane, yet it led the credulous public to believe so. Author Thomas Sowell resorts to that same sort of nonsense in his books: a habit of misdirection when arguing his point. I initially sought out his writing for balance, as everyone needs others capable of articulating competing ideas to avoid the echo chamber of one’s own mind (or indeed the chorus of the converted). Sowell failed to keep me as a reader.

It’s not always so easy to recognize cheap rhetorical tricks. They appear in movies all the time, but then, one is presumably there to be emotionally affected by the story, so a healthy suspension of disbelief goes a long way to enhance one’s enjoyment. Numerous fanboy sites (typically videos posted to YouTube) offer reviews and analysis that point out failures of logic, plotting, and continuity, as well as character inconsistency and embedded political messaging, but I’ve always thought that taking movies too seriously misses the point of cheap entertainment. Considering the powerful influence cinematic storytelling has over attitudes and beliefs, perhaps I’m being too cavalier about it.

When it comes to serious debate, however, I’m not nearly so charitable. The favored 5-minute news debate where 3 or 4 floating heads spew their rehearsed talking points, often talking over each other in a mad grab for air time, accomplishes nothing. Formal, long-form debates in a theater in front of an audience offer better engagement if participants can stay within proper debate rules and etiquette. Political debates during campaign season fail on that account regularly, with more spewing of rehearsed talking points mixed with gratuitous swipes at opponents. Typically, both sides claim victory in the aftermath and nothing is resolved, since that’s not really the objective. (Some opine that government, being essentially nonstop campaigning, suffers a similar fate: nothing is resolved because that’s not the true objective anymore.)

I was intrigued to learn recently of the semi-annual Munk Debates, named after their benefactors, that purport to be formal debates with time limits, moderation, and integrity. I had never heard of them before they booked Jordan Peterson alongside Michael Eric Dyson, Michelle Goldberg, and Stephen Fry. Like Donald Trump did for TV and print news, Peterson has turned into a 1-man ratings bonanza for YouTube and attracts viewers to anything in which he participates, which is quite a lot. The proposition the four debaters were provided was this: Be it resolved, what you call political correctness, I call progress … Problem is, that’s not really what was debated most of the time. Instead, Dyson diverted the debate to identity politics, specifically, racism and so-called white privilege. Goldberg mostly attacked Peterson regarding his opinions outside of the debate, Peterson defended himself against repeated personal attacks by Goldberg and Dyson, and Fry stayed relatively true to the intended topic. Lots of analysis and opinion appeared on YouTube almost immediately after the debate, so wade in if that’s what interests you. I viewed some of it. A couple videos called Dyson a grievance merchant, which seems to me accurate.

What concerns me more here are the cheap rhetorical tricks employed by Dyson — the only debater booed by the audience — that fundamentally derailed the proceedings. Dyson speaks with the fervor of a revivalist preacher, a familiar style that has been refined and coopted many times over to great effect. Whether deserved or not, it carries associations of great moral authority and momentous occasion. Unfortunately, if presented as a written transcript rather than a verbal rant, Dyson’s remarks are incoherent, unhinged, and ineffective except for their disruptive capacity. He reminded everyone of his blackness and his eloquence, the first of which needs no reminder, the second of which immediately backfired and called into question his own claim. Smart, eloquent people never tell you they’re smart and eloquent; the proof is in their behavior. Such boastful announcements tend to work against a person. Similarly, any remark that begins with “As a black/white/red/brown/blue man/woman/hybrid of _______ ethnicity/sexuality/identity …” calls in a host of associations that immediately invalidates the statement that follows as skewed and biased.

The two point-scoring bits of rhetoric Dyson levies with frequency, which probably form a comfort zone to which he instinctively retreats in all challenges, are his blackness (and by proxy his default victimhood) and historical oppression of blacks (e.g., slavery, Jim Crow laws, etc.). There are no other issues that concern him, as these two suffice to push everyone back on their heels. That’s why the debate failed to address political correctness effectively but instead revolved around identity politics. These issues are largely distinct, unless one debates the wisdom of switching out terminology cyclically, such as occurs even now with various racial epithets (directed to every race, not just blacks). That obvious tie-in, the use of euphemism and neologism to mask negative intent, was never raised. Nor were the twisted relations between free speech, hate speech, and approved speech codes (politically correct speech). Nope, the debate featured various personalities grandstanding on stage and using the opportunity to push and promote their personal brands, much like Trump has over the years. Worse, it was mostly about Michael Eric Dyson misbehaving. He never had my attention in the past; now I intend to avoid him at all costs.

From Wikipedia:

Trial by combat (also wager of battle, trial by battle or judicial duel) was a method of Germanic law to settle accusations in the absence of witnesses or a confession in which two parties in dispute fought in single combat; the winner of the fight was proclaimed to be right. In essence, it was a judicially sanctioned duel. It remained in use throughout the European Middle Ages, gradually disappearing in the course of the 16th century.

Unlike trial by ordeal in general, which is known to many cultures worldwide, trial by combat is known primarily from the customs of the Germanic peoples. It was in use among the ancient Burgundians, Ripuarian Franks, Alamans, Lombards, and Swedes. It was unknown in Anglo-Saxon law, Roman law and Irish Brehon Law and it does not figure in the traditions of Middle Eastern antiquity such as the code of Hammurabi or the Torah.

Trial by combat has profound echoes in 21st-century geopolitics and jurisprudence. Familiar phrases such as right of conquest, manifest destiny, to the winner go the spoils, might makes right, and history written by the victors attest to the enduring legacy of hindsight justification by force of arms. More broadly, within the American system, right of access to courts afforded to all citizens also admits nuisance suits and more than a few mismatched battles where deep-pocketed corporations sue individuals and small organizations, often nonprofits, into bankruptcy and submission. For instance, I recently learned of Strategic Lawsuits Against Public Participation (SLAPPs) “used to silence and harass critics by forcing them to spend money to defend these baseless suits.” They employ brute economic power in place of force of arms.

Trial by combat fell out of practice with the onset of the Enlightenment, but the broader complex of ideas survived. Interest in medieval Europe as storytelling fodder in cinema and fantasy literature (notably, the shocking trial by combat depicted in the extremely popular HBO drama Game of Thrones where the accused and accuser both designate their proxies rather than doing battle themselves) lends legitimacy to settling disputes via violence. Even the original Karate Kid (1984) has a new YouTube Red series set 30 years later. The bad-boy acolyte replaces his scorched-earth sensei and seeks revenge on the titular character for being bested decades before, the latter of whom is yanked back from quiet obscurity (and the actor who portrays him from career limbo) to fight again and re-prove his skills, which is to say, his righteousness. The set-up is surprisingly delicious to contemplate and has considerable nostalgic appeal. More importantly, it embodies the notion (no doubt scripted according to cliché) that only the pure of heart (or their proxies, students in this case) can claim ultimate victory because, well, it’s god’s will or some such and thus good guys must always win. What that really means is that whoever wins is by definition virtuous. If only reality were so reliably simple.

The certainty of various religious dogma and codes of conduct characteristic of the medieval period (e.g., chivalry) is especially seductive in modern times, considering how the public is beset by an extraordinary degree of existential and epistemological uncertainty. The naturalistic fallacy is also invoked, where the law of the jungle (only the fittest and/or strongest get to eat or indeed survive) substitutes for more civilized (i.e., enlightened and equanimous) thinking. Further, despite protestations, this complex of ideas legitimizes bullying, whether (1) in the schoolyard with the principal bully flanked by underlings picking on vulnerable weaklings who haven’t formed alliances for self-protection, (2) in the workplace, with its power players and Machiavellian manipulators, or (3) in geopolitics, with a global military power such as the U.S. dictating terms to and/or warring with smaller, weaker nations that lack the GDP, population, and will to project power globally. I daresay most Americans take comfort in having the greatest military and arsenal ever mustered on their side and accordingly being on the right side (the victorious one) of history, thus a beacon of hope to all who would conflate victory with virtue. Those who suffer at our hands must understand things quite differently. (Isn’t it more accurate that when bad guys win, rebellions and insurgencies are sparked?)

One remarkable exception deserves notice. The U.S. presidency is among the most heavily scrutinized and contentious positions (always under attack) and happens to be the Commander-in-Chief of the self-same greatest goddamn fighting force known to man. It’s no secret that the occupant of that office (45) is also widely recognized as the Bully-in-Chief. Despite having at his disposal considerable resources — military, executive staff, and otherwise — 45 has eschewed forming the political coalitions one might expect and essentially gone it alone, using the office (and his Twitter account) as a one-man bully pulpit. Hard to say what he’s trying to accomplish, really. Detractors have banded together (incompetently) to oppose him, but 45 has demonstrated unexpected tenacity, handily dominating rhetorical trials by combat through sheer bluster and hubris. On balance, he scores some pretty good hits, too. (The proposed fist fight between 45 and Joe Biden turned out to be a tease, but how entertaining would that bout have been without actually settling anything!) This pattern has left many quite dumbfounded, and I admit to being astounded as well except to observe that rank stupidity beats everything in this bizarre political rock-paper-scissors contest. How quintessentially American: nuthin’ beats stoopid.

Two shocking and vaguely humorous (dark, sardonic humor) events occurred recently in the gun debate: (1) in a speech, Marco Rubio sarcastically offered the very reform a healthy majority of the public wants — banning assault weapons — and revealed himself to be completely tin-eared with respect to the public he addresses, and (2) 45 supported some gun controls and even raised the stakes, saying that guns should be taken from people flagged as unstable and dangerous before they commit their mayhem. Rubio had already demonstrated his inability to think on his feet, being locked into scripts handed to him by … whom exactly? Certainly not the public he purportedly serves. So much for his presidential aspirations. OTOH, 45 channels populism and can switch positions quickly. Though ugly and base in many cases, populism at least expresses the will of the people, such as it can be known. His departure from reflexive Republican defense of the hallowed 2nd Amendment shouldn’t be too great a surprise; he’s made similar remarks in the past. His willingness to discard due process and confiscate guns before a crime has been committed sounds more than a little like Spielbergian precrime (via Orwell and Philip K. Dick). To even entertain this prospect in the gun debate demonstrates just how intolerable weekly mass shootings — especially school shootings by troubled youth — have become in the land of the free and home of the brave. On balance, 45 also recommended arming classroom teachers (a risible solution to the problem), so go figger.

Lodged deep in my brain is a potent archetype I don’t often see cited: the Amfortas wound. The term comes from Richard Wagner’s music drama Parsifal (synopsis found here). Let me describe the principal elements (very) briefly. Amfortas is the king of the Knights of the Holy Grail and has a seeping wound that cannot be healed except, according to prophecy, by an innocent youth, also described as a fool made wise by compassion. Such a youth, Parsifal, appears and after familiar operatic conflict does indeed fulfill the prophecy. Parsifal is essentially a retelling of the Arthurian legend. The music is some of the most transcendentally beautiful orchestral composition ever committed to paper and is very much recommended. Admittedly, it’s rather slow for today’s audiences more inclined to throwaway pop music.

Anyway, to tie together the gun debate and Parsifal, I muse that the Amfortas wound is gun violence and 45 is the titular fool who in the end heals the wound and becomes king of the Knights of the Holy Grail. The characterization is not entirely apt, of course, because it’s impossible to say that 45 is young, or compassionate, or wise, but he has oddly enough moved the needle on the gun debate. Not single-handedly, mind you, but from a seat of considerable power unlike, say, the Parkland survivors. Resolution and healing have yet to occur and will no doubt be opposed by the NRA and Marco Rubio. Maybe we’re only in Act I of the traditional 3-act structure. Other characters and plot devices from Parsifal I leave uncast. The main archetype is the Amfortas wound.

Be forewarned: this is long and self-indulgent. Kinda threw everything and the kitchen sink at it.

In the August 2017 issue of Harper’s Magazine, Walter Kirn’s “Easy Chair” column called “Apocalypse Always” revealed his brief, boyhood fascination with dystopian fiction. This genre has been around for a very long time, to which the Cassandra myth attests. Kirn’s column is more concerned with “high mid-twentieth-century dystopian fiction,” which in his view is now classic and canonical, an entire generation of Baby Boomers having been educated in such patterned thought. A new wave of dystopian fiction appeared in the 1990s and yet another more recently in the form of Young Adult novels (and films) that arguably serve better as triumphal coming-of-age stories albeit under dystopian circumstances. Kirn observes a perennial theme present in the genre: the twin disappearances of freedom and information:

In the classic dystopias, which concern themselves with the lack of freedom and not with surplus freedom run amok (the current and unforeseen predicament of many), society is superbly well organized, resembling a kind of hive or factory. People are sorted, classified, and ranked, their individuality suppressed through goon squads, potent narcotics, or breeding programs. Quite often, they wear uniforms, and express themselves, or fail to, in ritual utterance and gestures.

Whether Americans in 2018 resemble hollowed-out zombies suffering under either boot-heel or soft-serve oppression is a good question. Some would argue just that in homage to classic dystopias. Kirn suggests briefly that we might instead suffer from runaway anarchy, where too much freedom and licentiousness have led instead to a chaotic and disorganized society populated by citizens who can neither govern nor restrain themselves.

Disappearance of information might be understood in at least three familiar aspects of narrative framing: what happened to get us to this point (past as exposition, sometimes only hinted at), what the hell is going on (present as conflict and action), and how it gets fixed (future as resolution and denouement). Strict control over information exercised by classic dystopian despots doesn’t track to conditions under which we now find ourselves, where more disorganized, fraudulent, and degraded information than ever is available alongside small caches of wisdom and understanding buried somewhere in the heap and discoverable only with the benefit of critical thinking flatly lost on at least a couple generations of miseducated graduates. However, a coherent narrative of who and what we are and what realistic prospects the future may hold has not emerged since the stifling version of the 1950s nuclear family and middle class consumer contentment. Kirn makes this comparison directly, where classic dystopian fiction

focus[es] on bureaucracy, coercion, propaganda, and depersonalization, overstates both the prowess of the hierarchs and the submissiveness of the masses, whom it still thinks of as the masses. It does not contemplate Trump-style charlatanism at the top, or a narcissistic populace that prizes attention over privacy. The threats to individualism are paramount; the scourge of surplus individualism, with everyone playing his own dunce king and slurping up resources until he bursts, goes unexplored.

Kirn’s further observations are worth a look. Go read for yourself.


Adding one, revising one. The added one is The Daily Impact, written by Tom Lewis, author of a couple books warning of the collapse of industrial civilization. Lewis appears to be a news junkie, so posts are often torn from the day’s headlines. He’s a good read and not afraid to be sardonically funny. The revised one is The Compulsive Explainer, written by Hal Smith. Blogs come and go, and I had thought that The Compulsive Explainer had stopped being updated last summer, but I see that the author merely switched from WordPress to Blogger without any indication. I suspect Smith isn’t much read (if commentary is a useful measure) but probably deserves to be, not least for his expatriate perspective.

Because this entry is so slight, there is considerably more unrelated content beneath the fold.

Returning to the discomforts of my culture-critic armchair just in time for best- and worst-of lists, years in review, summaries of celebrity deaths, etc., the past year, tumultuous in many respects, was also strangely stable. Absent were major political and economic crises and calamities of which myriad harbingers and forebodings warned. Present, however, were numerous natural disasters, primary among them a series of North American hurricanes and wildfires. (They are actually part of a larger, ongoing ecocide now being accelerated by the Trump Administration’s ideology-fueled rollback of environmental protections and regulations, but that’s a different blog post.) I don’t usually make predictions, but I do live on pins and needles with the expectation that things could take a decidedly bad turn at any moment. For example, continuity of government — specifically, the executive branch — was not expected to last the year by many pundits, yet it did, and we’ve settled into a new normal of exceedingly low expectations with regard to the dignity and effectiveness of high office.

I’ve been conflicted in my desire for stability — often understood pejoratively as either the status quo or business as usual — precisely because those things represent extension and intensification of the very trends that spell our collective doom. Yet I’m in no hurry to initiate the suffering and megadeath that will accompany the cascade collapse of industrial civilization, which will undoubtedly hasten my own demise. I usually express this conflict as not knowing what to hope for: a quick end to things that leaves room for survival of some part of the biosphere (not including large primates) or playing things out to their bitter end with the hope that my natural life is preserved (as opposed to an unnatural end to all of us).

The final paragraph at this blog post by PZ Myers, author of Pharyngula seen at left on my blogroll, states the case for stability:

… I grew up in the shadow of The Bomb, where there was fear of a looming apocalypse everywhere. We thought that what was going to kill us was our dangerous technological brilliance — we were just too dang smart for our own good. We were wrong. It’s our ignorance that is going to destroy us, our contempt for the social sciences and humanities, our dismissal of the importance of history, sociology, and psychology in maintaining a healthy, stable society that people would want to live in. A complex society requires a framework of cooperation and interdependence to survive, and without people who care about how it works and monitor its functioning, it’s susceptible to parasites and exploiters and random wreckers. Ignorance and malice allow a Brexit to happen, or a Trump to get elected, or a Sulla to march on Rome to ‘save the Republic’.

So there’s the rub: we developed human institutions and governments ideally meant to function for the benefit and welfare of all people but which have gone haywire and/or been corrupted. It’s probably true that being too dang smart for our own good is responsible for corruptions and dangerous technological brilliance, while not being dang smart enough (meaning even smarter or more clever than we already are) causes our collective failure to achieve anything remotely approaching the utopian institutions we conceive. Hell, I’d be happy for competence these days, but even that low bar eludes us.

Instead, civilization teeters dangerously close to collapse on numerous fronts. The faux stability that characterizes 2017 will carry into early 2018, but who knows how much farther? Curiously, Graham Hancock’s Magicians of the Gods, which I just finished reading (no review coming from me), ends with a brief discussion of the Younger Dryas impact hypothesis and the potential for additional impacts as Earth passes periodically through a region of space, a torus in geometry, littered with debris from the breakup of a large body. It’s a different death-from-above from that feared throughout the Atomic Age but even more fearsome. If we suffer another impact (or several), it would not be self-annihilation stemming from our dim long-term view of forces we set in motion, but that hardly absolves us of anything.

I watched a documentary on Netflix called Jim & Andy (2017) that provides a glimpse behind the scenes of the making of Man on the Moon (1999) where Jim Carrey portrays Andy Kaufman. It’s a familiar story of art imitating life (or is it life imitating art?) as Carrey goes method and essentially channels Kaufman and Kaufman’s alter ego Tony Clifton. A whole gaggle of actors played earlier incarnations of themselves in Man on the Moon and appeared as themselves (without artifice) in Jim & Andy, adding another weird dimension to the goings on. Actors losing themselves in roles and undermining their sense of self is hardly novel. Regular people lose themselves in their jobs, hobbies, media hype, glare of celebrity, etc. all the time. From an only slightly broader perspective, we’re all merely actors playing roles, shifting subtly or dramatically based on context. Shakespeare observed it centuries ago. However, the documentary points to a deeper sense of unreality precisely because Kaufman’s principal shtick was to push discomfiting jokes/performances beyond the breaking point, never dropping the act to let his audience in on the joke or provide closure. It’s a manifestation of what I call the Disorientation Protocol.


Brief, uncharacteristic foray into national politics. The Senate narrowly approved a tax reform bill that’s been hawked by that shiny-suit-wearing-used-car-salesman-conman-guy over the past months as simply a big, fat tax cut. From all appearances, it won’t quite work out that way. The 479-page bill is available here (PDF link), including last-minute handwritten amendments. I don’t know how typical that is of legislative processes, but I doubt rushing or forcing a vote in the dead of night on an unfinished bill no one has had the opportunity to review leads to good results. Moreover, what does that say to schoolchildren about finishing one’s homework before turning it in?

Considering the tax reform bill is still a work in progress, it’s difficult to know with much certainty its effects if/when signed into law. However, summaries and snapshots of tax effects on typical American households have been provided to aid in the layperson’s grasp of the bill. This one from Mic Network Inc. (a multichannel news/entertainment network with which I am unfamiliar, so I won’t vouch for its reliability) states that the bill is widely unpopular and few trust the advance marketing of the bill:

Only 16% of Americans have said they think the plan is actually going to cut their taxes, less than half the number of people polled who think that their bill is going to go up, according to a Nov. 15 poll from Quinnipiac University.

Yet it seems the Republican-led effort will be successful, despite concerns that many middle class people could actually see their taxes rise, that social programs could suffer, that small businesses could be harmed and that a hoped-for economic boom may never materialize. [links removed]

When a change in tax law goes into effect, one good question is, “who gets help and who gets hurt?” For decades now, the answer has almost always been Reverse Robin Hood: take (or steal) from the poor and give to the rich. That’s why income inequality has increased to extreme levels commencing with the Reagan administration. The economic field of play has been consciously, knowingly tilted in favor of certain groups at the expense of others. Does anyone really believe that those in power are looking out for the poor and downtrodden? Sorry, that’s not the mood of the nation right now. Rather than assisting people who need help, governments at all levels have been withdrawing support and telling people, in effect, “you’re on your own, but first pay your taxes.” I propose we call the new tax bill Reverse Cowgirl, because if anything is certain about it, it’s that lots of people are gonna get fucked.

Here’s the last interesting bit I am lifting from Anthony Giddens’s The Consequences of Modernity. Then I will be done with this particular book-blogging project. As part of Giddens’s discussion of the risk profile of modernity, he characterizes risk as either objective or perceived and further divides it into seven categories:

  1. globalization of risk (intensity)
  2. globalization of risk (frequency)
  3. environmental risk
  4. institutionalized risk
  5. knowledge gaps and uncertainty
  6. collective or shared risk
  7. limitations of expertise

Some overlap exists, and I will not distinguish them further. The first two are of primary significance today for obvious reasons. Although the specter of doomsday resulting from a nuclear exchange has been present since the 1950s, Giddens (writing in 1988) provides this snapshot of today’s issues:

The sheer number of serious risks in respect of socialised nature is quite daunting: radiation from major accidents at nuclear power-stations or from nuclear waste; chemical pollution of the seas sufficient to destroy the phytoplankton that renews much of the oxygen in the atmosphere; a “greenhouse effect” deriving from atmospheric pollutants which attack the ozone layer, melting part of the ice caps and flooding vast areas; the destruction of large areas of rain forest which are a basic source of renewable oxygen; and the exhaustion of millions of acres of topsoil as a result of widespread use of artificial fertilisers. [p. 127]

As I often point out, these dangers were known 30–40 years ago (in truth, much longer), but they have only worsened with time through political inaction and/or social inertia. After I began to investigate and better understand the issues roughly a decade ago, I came to the conclusion that the window of opportunity to address these risks and their delayed effects had already closed. In short, we’re doomed and living on borrowed time as the inevitable consequences of our actions slowly but steadily manifest in the world.

So here’s the really interesting part. The modern worldview bestows confidence borne out of expanding mastery of the built environment, where risk is managed and reduced through expert systems. Mechanical and engineering knowledge figure prominently and support a cause-and-effect mentality that has grown ubiquitous in the computing era, with its push-button inputs and outputs. However, the high modern outlook is marred by overconfidence in our competence to avoid disaster, often of our own making. Consider the abject failure of 20th-century institutions to handle geopolitical conflict without devolving into world war and multiple genocides. Or witness periodic crashes of financial markets, two major nuclear accidents, and numerous space shuttles and rockets destroyed. Though all entail risk, high-profile failures showcase our overconfidence. Right now, engineers (software and hardware) are confident they can deliver safe self-driving vehicles yet are blithely ignoring (says me, maybe not) major ethical dilemmas regarding liability and technological unemployment. Those are apparently problems for someone else to solve.

Since the start of the Industrial Revolution, we’ve barrelled headlong into one sort of risk after another, some recognized at the time, others only apparent after the fact. Nuclear weapons are the best example, but many others exist. The one I raise frequently is the live social experiment undertaken with each new communications technology (radio, cinema, telephone, television, computer, social networks) that upsets and destabilizes social dynamics. The current ruckus fomented by the radical left (especially in the academy but now infecting other environments) regarding silencing of free speech (thus, thought policing) is arguably one concomitant.

According to Giddens, the character of modern risk contrasts with that of the premodern. The scale of risk prior to the 17th century was contained, and expectation of social continuity was strong. Risk was also transmuted through magical thinking (superstition, religion, ignorance, wishfulness) into providential fortuna or mere bad luck, which led to feelings of relative security rather than despair. Modern risk has now grown so widespread, consequential, and soul-destroying, and sits at such considerable remove (breeding helplessness and hopelessness), that those not numbed by the litany of potential worries afflicting daily life (existential angst or ontological insecurity) often develop depression and other psychological compulsions and disturbances. Most of us, if aware of globalized risk, set it aside so that we can function and move forward in life. Giddens says that this conjures up anew a sense of fortuna, that our fate is no longer within our control. This

relieves the individual of the burden of engagement with an existential situation which might otherwise be chronically disturbing. Fate, a feeling that things will take their own course anyway, thus reappears at the core of a world which is supposedly taking rational control of its own affairs. Moreover, this surely exacts a price on the level of the unconscious, since it essentially presumes the repression of anxiety. The sense of dread which is the antithesis of basic trust is likely to infuse unconscious sentiments about the uncertainties faced by humanity as a whole. [p. 133]

In effect, the nature of risk has come full circle (completed a revolution, thus, revolutionized risk) from fate to confidence in expert control and back to fate. Of course, a flexibility of perspective is typical as situation demands — it’s not all or nothing — but the overarching character is clear. Giddens also provides this quote by Susan Sontag that captures what he calls the low-probability, high-consequence character of modern risk:

A permanent modern scenario: apocalypse looms — and it doesn’t occur. And still it looms … Apocalypse is now a long-running serial: not ‘Apocalypse Now,’ but ‘Apocalypse from now on.’ [p. 134]