Posts Tagged ‘Pose of Objectivity’

I caught the presentation embedded below with Thomas L. Friedman and Yuval Noah Harari, nominally hosted by the New York Times. It’s a very interesting discussion but not a debate. For this now-standard format (two or more people sitting across from each other with a moderator and an audience), I’m pleased to observe that Friedman and Harari truly engaged each other’s ideas and behaved with admirable restraint when the other was speaking. Most of these talks are rude and combative, marred by constant interruptions and gotchas. Such bad behavior might succeed in debate club but makes for a frustratingly poor presentation. My further comments follow below.

With a topic as open-ended as The Future of Humanity, arguments and support are extremely conjectural and wildly divergent depending on the speaker’s perspective. Both speakers here admit their unique perspectives are informed by their professions, which boils down to biases born of methodology and, to a lesser degree perhaps, personality. Fair enough. In my estimation, Harari does a much better job adopting a pose of objectivity. Friedman comes across as both a salesman and a cheerleader for human potential.

Both speakers cite a trio of threats to human civilization and wellbeing going forward. For Harari, they’re nuclear war, climate change, and technological disruption. For Friedman, they’re the market (globalization), Mother Nature (climate change alongside population growth and loss of diversity), and Moore’s Law. Friedman argues that all three are accelerating beyond control but speaks of each metaphorically, such as when he refers to changes in market conditions (e.g., from independent to interdependent) as “climate change.” The biggest issue from my perspective — climate change — was largely passed over in favor of more tractable problems.

Climate change has been in the public sphere as the subject of considerable debate and confusion for at least a couple of decades now. I daresay it’s virtually impossible not to be aware of the horrific scenarios surrounding what is shaping up to be the end of the world as we know it (TEOTWAWKI). Yet as a global civilization, we’ve barely reacted except with rhetoric flowing in all directions and some greenwashing. Difficult to assess, but perhaps the appearance of more articles about surviving climate change (such as this one in Bloomberg Businessweek) demonstrates that more folks recognize we can no longer stem or stop climate change from rocking the world. This blog has had lots to say about the collapse of industrial civilization being part of a mass extinction event (not aimed at but triggered by and including humans), so for these two speakers to cite but then minimize the peril we face is, well, facile at the least.

Toward the end, the moderator finally spoke up and directed the conversation toward uplift (a/k/a the happy chapter), which almost immediately resulted in posturing on the optimism/pessimism continuum, with Friedman staking his position on the positive side. Curiously, Harari invalidated the question and refused to be pigeonholed on the negative side. Attempts to shoehorn discussions into familiar if inapplicable narratives or false dichotomies are commonplace. I was glad to see Harari calling bullshit on it, though others (e.g., YouTube commenters) were easily led astray.

The entire discussion is dense with ideas, most of them already quite familiar to me. I agree wholeheartedly with one of Friedman’s remarks: if something can be done, it will be done. Here, he refers to technological innovation and development. Throughout history, plenty of prohibitions against developing or deploying disruptive technologies have gone unheeded. The atomic era is the handy example (among many others), as both the weaponry and the power plants that stemmed from cracking the atom come with huge existential risks and collateral psychological effects. Yet we prance forward headlong and hurriedly, hoping to exploit profitable opportunities without concern for collateral costs. Harari’s response was to recommend caution until true cause-effect relationships can be teased out. Without saying it manifestly, Harari is citing the precautionary principle. Harari also observed that some of those effects can be displaced by hundreds or thousands of years.

Displacements resulting from the Agrarian Revolution, the Scientific Revolution, and the Industrial Revolution in particular (all significant historical “turnings” in human development) are converging on the early 21st century (the part we can see at least somewhat clearly so far). Neither speaker would come straight out and condemn humanity to the dustbin of history, but at least Harari noted that Mother Nature is quite keen on extinction (which elicited a nervous? uncomfortable? ironic? laugh from the audience) and wouldn’t care if humans were left behind. For his part, Friedman admits our destructive capacity but holds fast to our cleverness and adaptability winning out in the end. And although Harari notes that the future could bring highly divergent experiences for subsets of humanity, including the creation of enhanced humans from our reckless dabbling with genetic engineering, I believe cumulative and aggregate consequences of our behavior will deposit all of us into a grim future no sane person should wish to survive.


Continuing from part 1, which is altogether too much screed and frustration with Sam Harris, I now point to several analyses that support my contentions. First is an article in The Nation about the return of so-called scientific racism, which speaks directly about Charles Murray, Sam Harris, and Andrew Sullivan, all of whom are embroiled in the issue. Second is an article in The Baffler about constructing arguments ex post facto to conform to conclusions motivated in advance of evidence. Most of us are familiar with the constructed explanation, where in the aftermath of an event, pundits, press agents, and political insiders propose various explanatory narratives to gain control over what will eventually become the conventional understanding. The Warren Commission’s report on the assassination of JFK is one such example, and I daresay few now believe the report and the consensus it presents weren’t politically motivated and highly flawed. Both linked articles above are written by Edward Burmila, who blogs at Gin and Tacos (see blogroll). Together, they paint a dismal picture of how reason and rhetoric can be corrupted despite the sheen of scientific respectability.

Third is an even more damaging article (actually a review of the new anthology Trump and the Media) in the Los Angeles Review of Books by Nicholas Carr asking the pointed question “Can Journalism Be Saved?” Admittedly, journalism is not equivalent to reason or rationalism, but it is among several professions that employ claims of objectivity, accuracy, and authority. Thus, journalism commands attention and respect far beyond what’s accorded the typical blogger (such as me) or the watering-hole denizen perched atop a barstool. Consider this pullquote:

… the flaws in computational journalism can be remedied through a more open and honest accounting of its assumptions and limitations. C. W. Anderson, of the University of Leeds, takes a darker view. To much of the public, he argues, the pursuit of “data-driven objectivity” will always be suspect, not because of its methodological limits but because of its egghead aesthetics. Numbers and charts, he notes, have been elements of journalism for a long time, and they have always been “pitched to a more policy-focused audience.” With its ties to social science, computational journalism inevitably carries an air of ivory-tower elitism, making it anathema to those of a populist bent.

Computational journalism is contrasted with other varieties of journalism based on, say, personality, emotionalism, advocacy, or simply a mad rush to print (or pixels) to scoop the competition. This hyperrational approach has already revealed its failings, as Carr reports in his review.

What I’m driving at is that, despite frequent appeals to reason, authority, and accuracy (especially the quantitative sort), certain categories of argumentation fail to register on the average consumer of news and information. It’s not a question of whether arguments are right or wrong, precisely; it’s about what appeals most to those paying even a modest bit of attention. And the primary appeal for most (I judge) isn’t reason. Indeed, reason is swept aside handily when a better, um, reason for believing something appears. If one has done the difficult work of acquiring critical thinking and reasoning skills, it can be quite the wake-up call when others fail to behave according to reason, such as when they act against enlightened self-interest. The last presidential election was a case in point.

Circling back to something from an earlier blog post, much of human cognition is based on mere sufficiency: whatever is good enough in the moment gets nominated and then promoted to belief and/or action. Fight, flight, or freeze is one example. Considered evaluation and reason are not even factors. Snap judgments, gut feelings, emotional resonances, vibes, heuristics, and Gestalts dominate momentary decision-making, and in the absence of convincing countervailing information (if indeed one is even vulnerable to reason, which would be an unreasonable assumption), action is reinforced and suffices as belief.

Yet more in part 3 to come.

Oddly, there is no really good antonym for perfectionism. Suggestions include sloppiness, carelessness, and disregard. I’ve settled on approximation, which carries far less moral weight. I raise the contrast between perfectionism and approximation because a recent study published in Psychological Bulletin entitled “Perfectionism Is Increasing Over Time: A Meta-Analysis of Birth Cohort Differences From 1989 to 2016” makes an interesting observation. Here’s the abstract:

From the 1980s onward, neoliberal governance in the United States, Canada, and the United Kingdom has emphasized competitive individualism and people have seemingly responded, in kind, by agitating to perfect themselves and their lifestyles. In this study, the authors examine whether cultural changes have coincided with an increase in multidimensional perfectionism in college students over the last 27 years. Their analyses are based on 164 samples and 41,641 American, Canadian, and British college students, who completed the Multidimensional Perfectionism Scale (Hewitt & Flett, 1991) between 1989 and 2016 (70.92% female, M_age = 20.66). Cross-temporal meta-analysis revealed that levels of self-oriented perfectionism, socially prescribed perfectionism, and other-oriented perfectionism have linearly increased. These trends remained when controlling for gender and between-country differences in perfectionism scores. Overall, in order of magnitude of the observed increase, the findings indicate that recent generations of young people perceive that others are more demanding of them, are more demanding of others, and are more demanding of themselves.

The notion of perfection, perfectness, perfectibility, etc. has a long, tortured history in philosophy, religion, ethics, and other domains I won’t even begin to unpack. From the perspective of the above study, let’s just say that the upswing in perfectionism is about striving to achieve success, however one assesses it (education, career, relationships, lifestyle, ethics, athletics, aesthetics, etc.). The study narrows its subject group to college students (at the outset of adult life) between 1989 and 2016 and characterizes the social milieu as neoliberal, hyper-competitive, meritocratic, and pressured to succeed in a dog-eat-dog environment. How far back into childhood the study’s results (agitation) extend is a good question. If the trope about parents obsessing and competing over preschool admission is accurate (maybe just a NYC thang), then it goes all the way back to toddlers. So much for (lost) innocence purchased and perpetuated through late 20th- and early 21st-century affluence. I suspect college students are responding to awareness of two novel circumstances: (1) the likelihood that they will never achieve levels of success comparable to their own parents, especially financial (a major reversal of historical trends), and (2) the recognition that to best enjoy the fruits of life, a quiet, reflective, anonymous, ethical, average life is now quite insufficient. Regarding the second of these, we are inundated by media showing rich celebrities (no longer just glamorous actors/entertainers) balling out of control, and onlookers are enjoined to “keep up.” The putative model is out there, unattainable for most but often awarded by randomness, undercutting the whole enterprise of trying to achieve perfection.


We all live in perceptual bubbles of varying breadth and focus. Otherwise, we would be omniscient, which none of us is or can be. Two hot topics that lie outside my perceptual bubble are the geopolitical struggles in Israel and Northern Ireland. I’ve also read analyses suggesting that our current troubles and involvements in the Middle East are part of a clash of cultures going back two millennia, where the mostly Christian West won the battle back in the Middle Ages but newly gained oil wealth in the Middle East has prompted a resumption of hostilities. I have a mixture of opinions but only a passing acquaintance with geopolitics, and the complexity of the myriad interacting elements keeps me from getting a good fix on what’s proven to be a constantly shifting target. That aspect of modern history is the domain of intelligence agencies, military strategists, and diplomats. I don’t necessarily trust those professionals, though, since they operate with their own perceptual biases. (When your main tool is a bomb, everything tends to look like a target.) But I also recognize that I’m in a really lousy position to second-guess or drive from the back seat. Plus, I have zero influence, even at the voting booth.

In the narrower arena of domestic and campaign politics, the news media (journalists) have failed in their legitimate role as the fourth estate, which function is now being performed by their cousins in entertainment media. (I’ll skip the diatribe that journalism has essentially merged with entertainment and utterly lost any claim to objectivity.) Specifically, we live in a surprisingly mature age of political satire replete with shows that deliver news in comic form far better than serious journalists do with straight faces. The model is undoubtedly The Daily Show, which has already spun off The Colbert Report, Last Week Tonight, Full Frontal, and The Nightly Show. Each of these shows features a host considerably smarter than the audience, who proceeds with rapid-fire (though scripted) takedowns of all manner of political dysfunction. Each has its own stylistic tics, but in aggregate, they arguably do a better job of investigative journalism these days than, say, 60 Minutes, Dateline, or 20/20. Better yet, since they don’t pretend to be serious journalism, they can dispense with bogus claims to objectivity and simply go ahead to indulge in righteous indignation and silly stunts, exposing corruption, stupidity, and inanity in all their shameful manifestations. Political humor has now become a form of gallows humor.


We have embraced the machinery of our undoing as recreation. —Joe Bageant

The subject of industrial and civilizational collapse has wandered in and out of the front of my brain for years now; it’s never far from my attention. I’ve blogged about it numerous times, but I almost always retreat immediately afterward from the anticipated horror. As knowledge of our shared dilemma (now unavoidable fate) diffuses through the population, I am vaguely heartened to see that some few others really do get it and are sounding the alarm and sharing their ideas, analyses, and preparations for the future. So it’s no surprise when I become newly aware of someone who has essentially spent a lifetime piecing together the puzzle and in the process has become fairly well known as an activist, educator, writer, or simply a whistleblower on the subject of collapse. Michael Ruppert, the subject of the 2009 documentary film Collapse, is my most recent such discovery.


Recounting the arguments Ruppert makes in the film in favor of understanding what is happening in and to the world at this particular juncture in history is frankly unnecessary. As Ruppert says in the film, there is no longer any point to those debates. Either you are aware and convinced or you are unaware and/or in denial. Further debate solves nothing. What interests me more is the way the issue is framed by the filmmakers and how film critics have responded in print.

When Al Gore appeared in An Inconvenient Truth, it was revealed that he had been shaping his presentation on anthropogenic global warming over a period of decades and that the film was the fruition of years of effort that had matured and ripened in terms of the message, the underlying science, and the readiness of the public to listen (but not yet to act). So a good portion of the film was about Al Gore getting out the message, which is typical of politicians whose eyes always stray to the campaign angle of whatever cause they are pushing. The irony was that Al Gore’s political career was already over, yet he couldn’t resist being a central part of the story. “Look mom, no hands!”

In Collapse, Michael Ruppert is revealed to have been on a similar lifelong quest to discover the truth and get out the message. Accordingly, he occupies the center of the story not just as the lone talking head in a stark yet dramatically lit room (reminiscent of a criminal interrogation room) but as the impassioned, charismatic voice of Cassandra doomed to be ignored for his dark, unsavory prophecy — except that he’s not quite being ignored. (He points several times to power players in government and industry acting on his conclusions but refusing to be honest or validate his message.) The collapse of global civilization may not be the biggest story in the history of mankind, but if not, it’s certainly the most immediately relevant. Appallingly, the film is framed predominantly as a human story: the story of Michael Ruppert. Maybe this can be excused as synecdoche — the story of Ruppert is the story of all of us — but I suspect instead that the filmmakers find Ruppert’s story more engaging and entertaining as a documentary subject than the ongoing collapse, which is dragged onto stage mostly by Ruppert.

This myopic view also accounts in part for the inability of film critics to engage with and evaluate the content of the film beyond Ruppert himself as a quirky subject. I haven’t read all of the reviews, but those I have read follow the template of repeating some of Ruppert’s assertions for context to support adjectival blurbs such as mesmerizing, compelling, terrifying, ominous, riveting, chilling, horrifying, etc., which are probably also applicable to Ruppert himself, yet the reviewers can’t seem to overcome the myth of journalistic objectivity to say, “well duh, this stuff is so convincing and obvious it absolutely demands we add our voices to his and warn of what’s to come.” Instead, one reviewer after the next hedges behind words like possible, plausible, seemingly persuasive, etc. “Golly, I’m just a dumb film critic, hardly even a real journalist. I can’t understand anything if not filtered through a celluloid lens.” The filmmakers, too, reveal some of this same bullshit report-the-controversy attitude when an off-screen voice asks Ruppert something to the effect of “Who are you to be stating such dark conclusions? What makes you qualified to offer an opinion?” Earlier in the film, Ruppert described himself as unusually adept at critical thinking. That’s the answer he might have given. But in the heat of confrontation, his actual answer was disappointing. I wish he had replied, “What qualifications do you think I need to describe reality accurately?”

This, of course, is the crux of the documentary. Not being a movie critic, I didn’t know that the filmmakers tend to take as their subjects people who are regarded as kooks, freaks, weirdos, eccentrics, and obsessives. So from its inception, the film is another in a series of profiles of strange folk who need not be taken too seriously. This perspective is reinforced by imposing the question of credentials on the subject, and in turn, the movie critics all fall in line and agree that they, too, are unqualified to evaluate Ruppert’s statements but can only review the film for its entertainment value. Never mind that the arguments and underlying science take only modest intellectual wherewithal to recognize as truth. No, this truth, since we lack the courage to confront it with any integrity, is now being served up as spectacle: lookie at the funny tin-foil-hat-wearing conspiracy theorist going on about doom. How wonderfully charming he is for our fun and enjoyment! As it happens, Ruppert escapes being humiliated or shamed despite being lured into self-exposure and made the subtle object of derision. Withholding judgment as the cowardly critics do does not change the fact that Ruppert was set up for failure.

This is one reason among many why we have earned our fate.

This passage from E.O. Wilson’s book Consilience piqued my interest:

Without the stimulus and guidance of emotion, rational thought slows and disintegrates. The rational mind does not float above the irrational; it cannot free itself to engage in pure reason. There are pure theorems in mathematics but no pure thoughts that discover them. In the brain-in-the-vat fantasy of neurobiological theory and science fiction, the organ in its nutrient bath has been detached from the impediments of the body and liberated to explore the inner universe of the mind. But that is not what would ensue in reality. All the evidence from the brain sciences points in the opposite direction, to a waiting coffin-bound hell of the wakened dead, where the remembered and imagined world decays until chaos mercifully grants oblivion.

Wilson does a good job synthesizing consciousness studies and taking a stab at describing the physical and psychological bases of the mind. This paragraph, though, stuck out like a sore thumb from the chapter and book, in part because he calls back to the midcentury notion of disembodied brains controlling the world and in part because he waxes poetic in his dismissal of that idea. (Wish I could write so well ….) I vaguely remember reading Donovan’s Brain back in my youth. There is even a Wikipedia entry for the novel, which was made into a movie at least twice, one with Nancy Reagan. More familiar to the TV generation is the same basic story idea that found its way into a Star Trek episode called “Spock’s Brain.” The terseness of both titles is a humorous curiosity.

The larger point, I guess, is that the brain’s various routines that operate in parallel to construct consciousness are inseverable from sensory inputs from the body and from the emotional context used to create narrative. As such, it’s important to recognize that there is no divine Truth or Objectivity out there beyond our grasp or perception, even though we may sometimes hope for such things. (We adopt a posture of objectivity to reduce the distortions of emotion and irrational thinking, but that’s not the same thing.) There is a physical reality that is somewhat discontinuous with our perception of everyday “reality.” Such limitations are built into our physiology, just as we lack the eagle’s acute visual perception or the dog’s acute olfactory sense.

If the brain can’t exist without the body, how much of the body can be missing before the brain fails to function? It’s a strange question, and the answer is quite a lot. None of the extremities are essential, nor are most of the sensory parts (ears, eyes, tongue, nose) — at least separately. Few examples exist of individuals deprived of multiple senses, but Helen Keller is one obvious case. Her sightlessness and deafness (not from birth but from an illness at nineteen months) delayed her cognitive development until she broke through the symbolic barrier and developed language later in childhood. Over 70 cases of deafblindness are known, some congenital and others acquired. If an example exists of a fully developed mind later deprived of all sensory experience, it would be curious to know how long a person stays rational before the descent into madness begins.

The New Yorker has a rather long but interesting article called “Twilight of the Books” about the decline of reading and literacy in the modern world. The article is far-reaching in its attempt to summarize information from a number of sources, notably a book by Maryanne Wolf, a professor at Tufts University and director of its Center for Reading and Language Research, titled Proust and the Squid. The article begins with a litany of statistics demonstrating that reading is in decline.

I have to pause here to chide The New Yorker about its own writing, which is the flip side of reading on the literacy coin. Don’t all articles pass over at least two desks: the writer’s and the editor’s?

In January 1994, forty-nine per cent of respondents told the Pew Research Center for the People and the Press that they had read a newspaper the day before. In 2006, only forty-three per cent said so, including those who read online. Book sales, meanwhile, have stagnated. The Book Industry Study Group estimates that sales fell from 8.27 books per person in 2001 to 7.93 in 2006. According to the Department of Labor, American households spent an average of a hundred and sixty-three dollars on reading in 1995 and a hundred and twenty-six dollars in 2005. [emphasis added]

Isn’t per cent better as one word: percent? Similarly, shouldn’t a hundred and sixty-three be one hundred sixty-three? Any experienced copy editor should know that we don’t write numbers (or numerals) the way we speak them. We may say one-oh-six, but we don’t write 1o6 (as opposed to 106 — the typographical difference is difficult to see with some fonts, but it’s there). There are lots of other style errors, contractions, and generalized clumsiness, but I’ll move on.

As I read the article, I was struck by the number of times I said to myself, Duh, that’s so obvious it doesn’t bear stating! But I realized that most of the Duh! moments aren’t in fact so obvious to anyone ignorant of even entry-level media theory, which is really all I have. So I’ll reproduce a few noteworthy items with comments.