Archive for January, 2016

I remember that sinking feeling when the Deepwater Horizon oil well blew out in April 2010 and gushed oil into the Gulf of Mexico for 87 days at an estimated rate of 62,000 barrels per day (9,900 m³/d) until it was reportedly capped (but may not have been fully contained). That feeling was more intense than the disgust I felt at discovering the existence of the Great Pacific Garbage Patch (and subsequently others in the Atlantic and Indian Oceans). For reasons that make no particular sense, slo-mo ecological disasters in the oceans didn’t sicken me as much as high-speed despoliation of the Gulf. More recently, I’ve been at a loss — unable to process things, actually — at two new high-speed calamities: the contaminated tap water flowing from public waterworks in Flint, MI, and the methane leak from an underground well in the Porter Ranch neighborhood of Los Angeles, CA (no links provided, search for yourself). Whereas the first two examples turned my stomach at the mere knowledge of them, the second two are quite literally sickening people.

These examples could be part of a daily diet of stomach-churning news if I had the nerve to seek out more. Indeed, the doomer sites I frequent at intervals (no longer daily) gather them together for me. As with the examples above, many are industrial chemical spills and contamination; others are animal and plant populations dying en masse (e.g., bee colony collapse disorder); yet others are severe weather events (e.g., the California drought) on the rise due to the onset of climate change (which has yet to go nonlinear). Miserable examples keep piling up, yet business as usual continues while it can. Death tolls are difficult to assess, but at present they appear to fall on nonhuman species with greater ferocity. Some characterize this as Mother Nature doing her necessary work by gradually removing the plant and animal species on which humans, as the apex predator, depend. That means eventually removing us, too. I don’t care for such romantic anthropomorphism. Rather, I observe that we humans are doing damage to the natural world and to ourselves in perhaps the slowest slo-mo disaster of all, the most likely endpoint being near-term extinction.

As much, then, as the alarm has been sounding adequately with respect to high-speed disasters stemming from human greed, incompetence, and frailty, I find that the even worse calamity awaiting us has yet to penetrate the popular mind. Admittedly, it’s awfully hard to get one’s head around: the extinction of the human species. Those who resign themselves to speaking the truth of its inevitability are still characterized as kooks, wackos, conspiracy mongers, and worse, leaders of death cults. From my resigned side of the fence, the proper characterization appears to be the very opposite: those who actively ruin nature for profit and power are the death cult leaders, while those who prophesy doom are merely run-of-the-mill Cassandras. The ranks of the latter, BTW, seem to be growing while critical thought still exists in small, isolated oases.

After some delay, I picked up Nick Carr’s latest book The Glass Cage to read (see the link to Carr’s blog Rough Type on my blogroll at left). Carr is an exceptionally clear thinker and lays out his arguments both for and against technology very well. As with my post about Matthew Crawford’s book, I won’t get too involved blogging about The Glass Cage, which discusses deskilling among other things. However, my reading of his discussion of self-driving cars (and autopilot on airplanes) and the attendant loss of the driver’s and pilot’s skill and focus coincided with something I read elsewhere, namely, that while self-driving cars may free the driver of some attentional burdens (not really burdens upon closer inspection), they are likely to increase congestion precisely because a self-driving car no longer requires an occupant. Thus, an owner could instruct the car to drive back home from work in the morning and then to come back for pickup in the evening, handily doubling the car’s time and distance on the road. Similarly, a driver could avoid paying parking fees in pricey downtown precincts by instructing the vehicle to circle the block while the owner is out shopping or dining. These workarounds can be fully anticipated and perhaps limited, but there will undoubtedly be others not so easily foreseen.

Carr argues that technology has enabled some (e.g., those who design their own software) to profit disproportionately from their effort. This is especially true of wikis and social media sites that run on user-generated content. It’s impossible to establish whether that’s laudable innovation, a questionable workaround, or simply gaming the system. Whichever it is, redesigning workflows and information flows carries the unintended consequence of creating perverse incentives, and one can be certain that in a hustling society such as ours, many someones are going to discover ways to exploit loopholes. This is already the case with the legal system, the financial system, social media, and journalism, and it seems ubiquitous in education and sports, where cheating is only a problem if one gets caught.

Perverse incentives don’t arise solely from rapid, destabilizing technological change, though that’s frequently a trigger. What’s worse, perhaps, is when such perversity is normalized. For instance, politics now operates under a perverse funding regime that awards disproportionate influence to deep pockets while creating no incentive for participants (politicians or deep pockets) to seek reform. Similarly, pooling wealth, and with it political power, within an extremely small segment of society carries no incentive for the ultrarich to plow their riches back into society at large. A few newly philanthropic individuals don’t convince me that, in the current day and age, any high-minded idealism is at work. Rather, it’s more plausible that the work of figuring out things to do with one’s money is more interesting, to a few at least, than merely hoarding it. A better incentive, such as shame, does not yet exist. So the ultrarich are effectively circling the block, clogging things up for no better reason than that they can.

rant on/

This is the time of year when media pundits pause to look back and consider the previous year, typically compiling unasked-for “best of” lists to recap what everyone may have experienced — at least if one is absorbed by entertainment media. My interest in such nonsense is passive at best, dismissive at worst. Further, more and more lists are compiled and weighed by self-appointed and guileless fanboys and -girls, some of whom are surprisingly knowledgeable (sign of a misspent youth?) and insightful yet almost uniformly lack the longitudinal view necessary to form circumspect, expert opinions. The analogy would be to seek wisdom from a 20- or 30-something in advance of its acquisition. Sure, people can be “wise beyond their years,” which usually means free of the normal illusions of youth without yet having become a jaded, cynical curmudgeon — post-ironic hipster is still available — but a real, valuable historical perspective takes more than just two or three decades to form.

For instance, whenever I bring up media theory to a youngster (from my point of reckoning), usually someone who has scarcely known the world without 24/7/365 access to all things electronic, he or she simply cannot conceive what it means to be without that smothering tether/pacifier/security blanket. It doesn’t feel like smothering because no other information environment has ever been experienced (excepting perhaps early childhood, and even that’s not guaranteed). Even a brief hiatus from the information blitzkrieg, a two-week vacation, say, doesn’t suffice. Rather, only someone old enough to remember when it simply wasn’t there — at least in the personal, isolating, handheld sense — can know what it was like. I certainly remember when thought was free to wander, invent, and synthesize without pressure to incorporate a continuous stream of incoming electronic stimuli, most of which amounts to ephemera and marketing. I also remember when people weren’t constantly walled in by their screens and feeds, when life experience was more social, shared, and real rather than private, personal, and virtual. So when I’m away from the radio, TV, computer, etc. (because I purposely and pointedly carry none of it with me), I’m less a mark for the lures, lies, cons, and swindles that have become commonplace in late-stage capitalism than the typical media-saturated fool face-planted in a phone or tablet.

Looking back in another sense, I can’t help but feel a little exasperated by the splendid reviews of the life in music led by Pierre Boulez, who died this past week. Never heard of him? Well, that just goes to show how far classical music has fallen from favor when even a titan such as he makes no impression whatsoever on the general public, only on specialists in a field that garners almost no attention anymore. Yet I defy anyone not to know who Kim Kardashian is. Here’s the bigger problem: despite being about as favorably disposed toward classical music as it is possible to be, I have to admit that no one I know (including quite a few musicians) would be able to hum or whistle or sing a recognizable tune by Boulez. He simply doesn’t pass the whistle test. But John Williams (of Star Wars fame) certainly does. Nor indeed would anyone put on a recording of one of Boulez’s works just to listen. Not even his work as a conductor is all that compelling, either live or on disc (I’ve experienced plenty of both). As one looks back on the life of Pierre Boulez, as one is wont to do upon his passing, how can it be that such prodigious talent could be of so little relevance?

Consider these two examples flip sides of the same coin. One enjoys widespread subscription but is base (opinions differ); the other is obscure but (arguably) refined. Put differently, one is pedestrian, the other admirable. Or over a lifetime, one is placebo (or worse), the other fulfilling. Looking back upon my own efforts and experiences in life, I would much rather be overlooked or forgotten than be petty and (in)famous. Yet mass media conspires to make us all into nodes on a network with goals decidedly other than human respectability or fulfillment. So let me repeat the challenge question of this blog: are you climbing or descending?

rant off/

I chanced upon a dinner conversation of the type that tends to light me up, full of familiar assertions and brief citations that purport to make sense of the world but instead open up broad inquiries without ever resolving anything. Whereas all the hanging threads might be frustrating to others, I don’t mind that we leapt from subject to subject carelessly. Engagement with another’s intellect is what really fires me.

So in the course of the discussion, I argued (as devil’s advocate) that the discontinuity between various scales and timeframes renders subtle appreciation of the world and/or universe moot. Specifically, fine-grained knowledge that flows from hard sciences such as mathematics, biology, chemistry, and physics does not combine to form anything approaching a complete picture of reality in the mind of the average person. Soft sciences such as sociology, psychology, economics, anthropology, and history are as likely to confound and confuse as to illuminate, considering their vulnerability to interpretative flexibility. Further, the extensive conjectural and theoretical complexity of cosmology and the quantum sciences is so far out of scope for typical beer-swilling Joes as to be invisible. Even the basic distinction between the Copernican and Ptolemaic models of the solar system is losing currency, with no immediately apparent effect in the wider (American) culture of prideful ignorance.

Here’s the rub: I believe more nearly the opposite, namely, that refined understandings of the universe (developed and held in the minds of a relative few, never achieving the completeness of a unified theory, yet sufficient in our hubris to furnish a model for action in the world) are eventually embedded in the deep culture. Even so, I found it difficult to argue that point to us fish inside the fishbowl. Indeed, the fellow across the table from me, who possessed far greater scientific wherewithal than I do, could only rebut my assertions with the baldest “is not, is too” sort of negation.

I attempted an exploration of a deep-culture effect more than two years ago with this post, but I fear the whole subject was too arcane to make much of an impression. General readers simply do not go in for such analysis. Yet I still believe that the effect is present in, for example, our willingness to trash the world until it’s uninhabitable — at least for us — and our earnest desire to transcend our embodiment and be something other than human (e.g., Transhumanism), which is an expression of our deep self-loathing and self-destructive impulse (explored amply by The Compulsive Explainer — see blogroll). Like my dinner table conversation, these inquiries lead nowhere in particular but are nonetheless fascinating. Perhaps a generous reader out there can point to a better example, one more accessible and convincing than what I’ve been able to muster.