If I were to get twisted and strained over every example of idiocy on parade, I’d be permanently distorted. Still, a few issues have crossed my path that might be worth bringing forward.

Fealty to the Flag

An Illinois teacher disrespected the American flag during a classroom lesson on free speech. The context provided in the article is pretty slim, but it seems to me that a lesson on free speech might be precisely the opportunity to demonstrate that tolerating discomfiting counter-opinion is preferable to the alternative: squelching it. Yet in response to complaints, the local school board voted unanimously to fire the teacher who gave the offending lesson. The ACLU ought to have a field day with this one, though I must admit there is no convincing some people that desecrating the flag is protected free speech. Some may remember going round and round on this issue a few years ago over a proposed Constitutional amendment. Patriots stupidly insist on carving out an exception to free speech protections when it comes to the American flag, which shows quite clearly that they are immune to the concept behind the 1st Amendment, which reads:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances. [emphasis added]

Naturally, interpretations of the Bill of Rights vary widely, but it doesn’t take a Constitutional scholar to parse the absolute character of these rights. Rights are trampled all the time, of course, as the fired Illinois teacher just found out.

Fealty to the Wrong Flag

The Confederate battle flag has come back into the national spotlight following racially inspired events in Charleston, SC. (Was it ever merely a quaint, anachronistic, cultural artifact of the American South?) CNN has a useful article separating fact from fiction, yet some Southerners steadfastly defend the flag. As a private matter of astonishingly poor taste, idiocy, and free speech, individuals should be allowed to say what they want and fly their flags at will; but as a public matter for states and institutions that still fly the flag or emblazon it on websites, letterhead, etc., it’s undoubtedly better to give up this symbol and move on.

I get exasperated when I read someone insisting dogmatically upon ideological purity. No such purity exists, as we are all participants, in varying degrees, in the characteristics of global civilization. One of those characteristics is the thermodynamic cycle of energy use and consumption that gradually depletes available energy. The Second Law guarantees depletion, typically over cosmological time frames, but we are seeing it manifest over human history as EROI (energy returned on energy invested) has decreased dramatically since the start of the fossil fuel era. So playing gotcha by arguing, for instance, “You use electricity, too, right? Therefore, you have no right to tell me what I can and can’t do with electricity!” is laughably childish. Or put another way, if even an inkling of agreement exists that maybe we should conserve, forgo needless waste, and accept some discomfort and hardship, then it’s typically “you first” whenever the issue is raised in the public sphere.

In a risible article published at Townhall.com, Michelle Malkin calls the Pope a hypocrite for having added his authority to what scientists and environmentalists have been saying: we face civilization-ending dangers from having fouled our own nest, or “our common home” as the Pope calls it. As though that disrespect were not enough, Malkin also tells the Pope essentially to shut it:

If the pontiff truly believes “excessive consumption” of modern conveniences is causing evil “climate change,” will he be shutting down and returning the multi-million-dollar system Carrier generously gifted to the Vatican Museums?

If not, I suggest, with all due respect, that Pope Francis do humanity a favor and refrain from blowing any more hot air unless he’s willing to stew in his own.

The disclaimer “with all due respect” does nothing to ease the audacity of a notorious ideologue columnist picking a fight over bogus principles with the leader of the world’s largest church, who (I might add) is slowly regaining some of the respect the Catholic Church lost over the past few scandalous decades. I suspect Malkin is guilelessly earnest in the things she writes and found a handy opportunity to promote the techno-triumphalist book she researched and wrote for Mercury Ink (owned by Glenn Beck). However, I have no trouble ignoring her completely, since she clearly can’t think straight.

Plenty of other controversy followed in the wake of the latest papal encyclical, Laudato Si’. That’s to be expected, I suppose, but religious considerations and gotcha arguments aside, the Pope is well within the scope of his official concern to sound the alarm alongside the scientific community that was once synonymous with the Church before they separated. If indeed Pope Francis has concluded that we really are in the midst of both an environmental disaster and a mass extinction (again, more process than event), it’s a good thing that he’s bearing witness. Doomers like me believe it’s too little, too late, and that our fate is already sealed, but there will be lots of ministry needed when human die-offs get rolling. Don’t bother seeking any sort of grace from Michelle Malkin.

I’m not a serious cineaste, but I have offered a few reviews on The Spiral Staircase. There are many, many cineastes out there, though, and although cinema is now an old medium (roughly 100 years old), cineastes tend to be on the younger side of 35 years. Sure, lots of established film critics are decidedly older, typically acting under the aegis of major media outlets, but I’m thinking specifically of the cohort who use new, democratized media (e.g., cheap-to-produce and -distribute YouTube channels) to indulge their predilections. For example, New Media Rockstars has a list of their top 100 YouTube channels (NMR No. 1 contains links to the rest). I have heard of almost none of them, since I don’t live online like so many born after the advent of the Information/Communications Age. The one I pay particular attention to is Screen Junkies (which includes Honest Trailers, the Screen Junkies Show, and Movie Fights), and I find their tastes run toward childhood enthusiasms that mire their criticism in a state of permanent adolescence and self-mocking geekdom. The preoccupation with cartoons, comic books, action figures, superheroes, and popcorn films couldn’t be more clear. Movie Fights presumes to award points on the passion, wit, and rhetoric of the fighters rather than the quality of the films they choose to defend. However, adjudication is rarely neutral, since trump cards tend to get played when a superior film or actor is cited against an inferior one.

So I happened to catch three recent flicks that are central to the Screen Junkies canon: Captain America: The Winter Soldier, Avengers: Age of Ultron, and Transformers: Age of Extinction (links unnecessary). They all qualify as CGI festivals — films centered on hyperkinetic action rather than story or character (opinions differ, naturally). The first two originate from the MCU (acronym alert: MCU = Marvel Cinematic Universe, which is lousy with comic book superheroes) and the last is based on a Saturday-morning children’s cartoon. Watching grown men and a few women on Screen Junkies getting overexcited about content originally aimed at children gives me pause, yet I watch them to see what the fighters say, knowing full well that thoughtful remarks are infrequent.

Were I among the fighters (no chance, since I don’t have my own media fiefdom), I would likely be stumped when a question demands immediate recall (by number, as in M:I:3 for the third Mission Impossible film) of a specific entry from one of the numerous franchises, like those named above, pumping out films regularly. Similarly, my choices would not be limited, as theirs are, to films released after 1990, that year falling within the childhood of most of the fighters who appear. Nor would my analysis be so embarrassingly visual in orientation, since I understand good cinema to be more about story and character than whiz-bang effects.

Despite the visual feast fanboys adore (what mindless fun!), lazy CGI festivals suffer most from overkill, far outstripping the eye’s ability to absorb onscreen action fully or effectively. Why bother with repeat viewing of films with little payoff in the first place? CGI characters were interesting in and of themselves the first few times they appeared in movies without breaking the suspension of disbelief, but now they’re so commonplace that they feel like cheating. Worse, moviegoers are now faced with so many CGI crowds, clone and robot armies, zombie swarms, human-animal hybrids, et cetera ad nauseam, that little holds the interest of jaded viewers. Thus, because so few scenes resonate emotionally, sheer novelty substitutes (ineffectively) for meaning, not that most chases or slugfests in the movies offer much that’s truly original. The complaint is heard all the time: we’ve seen it before.

Here’s my basic problem with the three CGI-laden franchise installments I saw recently: their overt hypermilitarism. When better storytellers such as Kubrick or Coppola make films depicting the horrors of war (or other existential threats, such as the ever-popular alien invasion), their perspective is indeed that war is horrible, and obvious moral and ethical dilemmas flow from there. When hack filmmakers pile up frenzied depictions of death and destruction, typically with secondary or tertiary characters whose dispatch means and feels like nothing, and with destroyed cities eliciting no emotional response because it’s all pure visual titillation, they have no useful, responsible, or respectable commentary to offer. Even the Screen Junkies recognize that, unlike, say, Game of Thrones, none of their putative superheroes really face much more than momentary distress before saving the day in the third act, and certainly no lasting injury (a little make-up blood doesn’t convince me). Dramatic tension simply drains away, since happy resolutions are never in doubt. Now characters taking fake beatdowns are laughter-inducing, sorta like professional wrestling after the sheepish admission that they’ve been acting all along. Frankly, pretend drama with nothing at stake is a waste of effort and of the audience’s time and trust. That so many fanboys enjoy being goosed or that some films make lots of money is no justification. The latter is one reason why cinema so often fails to rise to the aspiration of art: it’s too bound up in grubbing for money.

Earlier this month, I offered a comment at Text Patterns that observed (in part) the following:

History demonstrates pretty clearly that the focus of innovation and investment has shifted considerably over the past 150 years or so from (1) mechanical contraptions and processes to (2) large infrastructure projects to (3) space exploration and now to (4) computers, communications, and networking.

I would amend (3) to include nuclear technologies on top of space exploration. Thus, various categories of innovation attract attention and glamor, then fade away. I also suggested that correlating these orientations with styles of warfare might be an interesting blog post, but I won’t perform that analysis, leaving the idea available for others to develop.

With respect to (2), it seems to me that large infrastructure projects enjoyed a period of preeminence from 1900 to 1950 or so, concluding roughly with the construction of the interstate highway system, then yielded in the popular imagination to nuclear power (and nuclear angst) and numerous NASA projects. Yet infrastructure projects have never really stopped, despite many reports of decaying and decrepit infrastructure that reduces quality of life and economic competitiveness. The development of high-speed rail, standard in Europe and Japan, is a good example of infrastructure never built in the U.S. because we have chosen instead to support automobiles and over-the-road trucking. Rosabeth Moss Kanter’s new book, Move: Putting America’s Infrastructure Back in the Lead, argues that we need to turn attention back to infrastructure posthaste.

If infrastructure projects lack the faux glamor and sexiness of more contemporary categories of innovation and investment, they have nonetheless been ongoing, though perhaps not diligently enough to avoid periodic bridge collapses, electrical grid failures, and train derailments. Surveying the scene in my hometown (Chicago), I can recall that in the last 15 years, major renovations have been done to all the interstates (they are under construction almost continuously yet remain choked with traffic), as well as two massive reconstruction projects to different stretches of Upper and Lower Wacker Drive. The Chicago Transit Authority (CTA) has also had major capital projects to update its rail lines and stations, the newest phase being Red and Purple Line modernization, including a planned flyover for the Brown Line just north of the Clark Junction. This list of projects omits regular neighborhood street and intersection repairs, redesigns, and reconstruction, as well as water, sewer, and other utility repairs, all of which lack a significant wow factor.

One project that just opened, though it’s not quite complete, is the conversion of the disused rail line known as the Bloomingdale Trail (now renamed The 606 after the stem of Chicago zip codes) into a walking/running/biking path and park system extending across several NW Chicago neighborhoods. Although the Bloomingdale Trail has been open less than a week, I ventured onto it and found that it has already been widely embraced by the public. Indeed, riding the Bloomingdale Trail on my bicycle was frustrating and tense because there was no possibility of going at a normal bike speed, plus lots of maneuvering to avoid collisions with others on the path. If I were a runner or walker on the path instead, I would undoubtedly have been irritated by bicyclists zipping by me at close quarters. Although the path is designed for mixed use, it fails to satisfy (IMO) any of those intended uses because of sheer congestion. Instant popularity has made the trail a victim of its own success, not unlike the overused Lakeshore Bike Path and Forest Preserve picnic shelters. And therein lies the problem: if you build it, they will come — and they often come in droves that overwhelm and ruin the entire experience.

This seems to be an endemic feature of modern infrastructure. We build spaces and places to accommodate transit, transportation, and events of all sorts, but then they stagger under the masses of people and vehicles that descend upon them. I have had similarly poor experiences at events such as the Chase Corporate Challenge (a 5K foot race and walk in Grant Park) and the Late Ride (a 25-mile bike ride — not race — through the city occurring well after midnight), where the events are so oversubscribed that constant jostling for position on the pavement among the other participants ruins the experience. Difficult entry and egress at sporting events at Wrigley Field, Soldier Field, and the United Center have also been cause for consternation. The annual lakefront Air and Water Show, the Taste of Chicago, and July 4 fireworks displays (before the city cancelled its fireworks) are further examples of places I avoid because they’re simply too mobbed with people.

Some of the cause is population density, even though the City of Chicago proper had higher density in the first half of the 20th century, and some of the cause is overpopulation, with events drawing people from the surrounding suburbs and exurbs. Some of the cause, too, is the abandonment of an historically rural, agrarian way of life for modern, technological social structures organized around city centers. The resulting depopulation of the countryside, relatively speaking, is a curiosity to me, which I have prophesied may give way to repopulation (which I called repatriation) when agribusiness fails to deliver enough food to grocery shelves. But until then, we’re stuck congregating in mostly the same places.

I have read a number of exhortations by gurus of one sort or another, usually plumbing the depths of self-delusion, to “imagine the absurd” as a means of unlocking one’s latent creativity blocked by hewing too closely to convention and, dare I say it, reality. Invoking absurdity seems to me redundant: we (well, some of us) already live with absurd comfort and wealth, purchased by the sweat and suffering of many, not least of which is the Earth itself (or herself, if you prefer). Slights and insults absorbed by the biosphere in the last decade may be individually insignificant, but over time and in aggregate, they constitute a proverbial death by a thousand cuts. But you don’t have to take my word for it: investigate for yourself how many plant and animal species have suffered significant die-offs. Other absurdities are piling up, too, especially in the area of politics, which is looking more than ever (how is that even possible?) like a clown circus as, for example, more candidates from both major political parties enter the 2016 presidential race. We haven’t even gotten to the loony third-party candidates yet.

These are familiar ideas to readers of this blog, and although they bear repeating, they are not really what I want to discuss. Rather, it has become increasingly clear that in an age of excess and entitlement — call it the Age of Absurdity — the awful truth can only be told through comedy, just like Michael Moore’s short-lived comic documentary TV show of the same name. Sure, there are a growing number of Cassandras like me prophesying doom, but our claim on public dialogue is thus far negligible. Further, serious documentaries exposing absurd levels of corruption, mendacity, idiocy, and cruelty are currently enjoying a golden age. But compare these to any number of TV shows and movies — all offered for comedic entertainment purposes — that now function as de facto news outlets (The Daily Show and Real Time with Bill Maher have been particularly explicit about this), and it’s easy to see that the public prefers its truth sugar-coated, even if it’s given a reverse twist such as with The Colbert Report. (I can’t watch Colbert for the same reason I can’t watch South Park or The Simpsons: they’re too accurate, too on the nose, even as jokey reflections of or on the Age of Absurdity.) The only thing one needs to do to reveal the truth inside a comedy show (not just the fake news shows) is to ignore the laugh track and turn off one’s sense of humor, treating each comedy bit earnestly, the way a child would. That’s how awful, accurate, and absurd things have become.

Take, for instance, this article in The New Yorker, which is satire on its face but quite literally tells the truth when considered soberly. The last line, “Our research is very preliminary, but it’s possible that they [denialists of all stripes] will become more receptive to facts once they are in an environment without food, water, or oxygen,” is pretty macabre but tells precisely what to expect when supplies falter.

Take, for another instance, the celebrity roasts that Comedy Central has revived. I’ve watched only a few clips, but roasters typically say egregiously insulting things that are quite literally true about the roastee, who laughs and smiles through the humiliation. Insult comedy may be embellished or exaggerated for effect, but it scarcely needs to be. To call someone a hack or comment on his/her undesired, unwarranted overexposure (commonplace now in the era of omnimedia and leaked sex tapes) takes a little comedic shaping, but there is always a sizable kernel of truth behind the jokes. That’s what makes comedy funny, frankly. This might be best exemplified when a joke is made “too soon.” The subject matter will become funny in time, after the shocking truth has worn off some, but when it’s too soon, the insult is just too much to take in good taste, and no enjoyment can be had from exposing that particular truth.

Is there a conclusion to be drawn? I dunno. The culture has room for both seriousness and humor, gallows and otherwise. I wonder sometimes if the ability to act with seriousness of purpose to forestall the worst is even possible anymore. Instead, we’re absorbed by distractions and cheap entertainments that divert our attention to trifles. (Why am I aware in even the slightest way of the Kardashians?) Perhaps it’s a true expression of the Zeitgeist: we know deep down that the dominoes are tipping over and we’re lined up to take hit after hit until everything has crumbled around us. So why not laugh and party right up to the bitter end?

I’m not quite yet done with the idea behind this post, namely, that certain insidious ideas permit problems that wiser thinking might avoid. If I were less judicious, I might say that lousy ideas generate many of our problems, but cause-and-effect and correlation links are too subtle to draw unambiguous conclusions. Along those lines, I’ve been puzzling over the Middle East, including (of course) Israel and North Africa, for the last few weeks. Everyone seems to have a pet theory about how to put an end to the endless violence and mayhem in the region. Most theories call for (further) bombing, strategic or otherwise, of one faction or another. Clearly, that’s not really a solution, since wreaking even more havoc and violence solves nothing, and it’s equally obvious that no pat solution exists. The situation has become a multigenerational, multinational conflict that perpetuates itself, the original provocation(s) having been long forgotten or subsumed into more recent events. Such events include no small amount of meddling and destabilization by the United States and its allies, plus economic difficulties that have people in the streets agitating for a reasonable share of what’s available, which is diminishing rapidly as overpopulation and industrial collapse ramp up in the region.

Reasons why conflict arises are many, but let’s not lose sight of our response. Statesmen of an earlier era might have been predisposed toward diplomatic and economic responses. Indeed, foreign aid and restructuring plans such as those that followed WWII might be examples of a better way to deploy our resources now to achieve desirable results for everyone (here and there). So why do today’s government policy- and decision-makers with their fingers on the buttons — those holding the presumed monopoly on the use of force — now so frequently resort to bombing and decades-long armed response, entailing boots on the ground, air strikes from carriers positioned in the region, and now drone warfare? Destroying people, infrastructure, industrial capacity, and with them the means of living peaceably does not make us safer at home, unless there is something they know that I don’t. Rather, considering the apparently unlimited availability of arms to various factions (in high contrast with, um, er, well, food and jobs), it seems obvious that we’re seeding revolution while radicalizing populations that might prefer to be left alone to work out their own problems, which frankly would probably involve armed conflict. So in effect, we’re painting the bullseye on our own backs (and have been for a long time as the self-appointed World Police with strategic interests extending quite literally across the globe), uniting disparate factions against a common enemy — us.

So let me ask again: what makes this possible? In an era of psychotic knowledge and managed perception (and to a far lesser extent, managed consent), many leaders have developed a bunker mentality, where everyone is a threat (even from within) and they (whoever they are, it hardly matters) are always poised to come for us and take away our vaunted freedoms (rhetoric alert). Never mind that the reverse is actually more true. I’ve argued before that bunker mentality goes hand-in-hand with the Cold War paranoia drummed into the heads of folks who were children in the 1950s and 60s. Too many duck-and-cover air raid drills during primary school left indelible marks on their souls. Younger G-men and -women are undoubtedly infected by the meme now, too, via frequent security briefings that make the world look far more dangerous (to us) than it actually is, not unlike so many police shows on TV that overstate by a large margin the frequency of, say, street shootouts. (Fatalities from automobile accidents and obesity far outstrip losses from terrorism and other existential threats. Go look it up.) The fruit of that propaganda is our current fight-or-flight response: always, always fight; never, ever take flight. The mouth-breathing public is on board with this, too, always ready to throw down with reckless, half-wit commentary such as “bomb them back to the Stone Age!” Yet a few noisy pundits are beginning to suggest that the U.S. transition back to a more isolationist policy, perhaps sitting out a conflict or two rather than engaging reflexively, thoughtlessly, and pointlessly. Isolationism was our stance prior to WWII, we having learned in the American Civil War and WWI that warfare absolutely sucks and should be avoided instead of relished. Living memory of those conflagrations is now gone, and we’re left instead with bullshit jingoism about the Greatest Generation having won WWII, quietly skipping over the wars we lost or gave up on in Korea and Vietnam.

For a long time, people have tried to draw connections between TV and videogame violence and actual crime. The same is true of pornography and rape. No direct links have been demonstrated convincingly using the tools of psychometrics, much to the chagrin of crusaders and moralists everywhere. Yet the commonsense connection has never really been dispelled: if the culture is positively saturated with images of violence and sexuality (as it is), whether actual, fabricated, or fictional (for the purposes of dramatic license and entertainment), then why wouldn’t vulnerable thinkers’ attitudes be shaped by irrational fear and lust? That’s nearly everyone, considering how few can truly think for themselves and resist the dominant paradigm. The imagery and rhetoric deployed against us throughout the mainstream media are undoubtedly hyperviolent and hypersexual, but we’re supposedly too smart as a people to succumb to such lures and lies? Sorry, even without peer-reviewed studies to show direct causation, that just doesn’t pass the straight-face test.

Following up on the idea of resource consumption discussed in this post, I stumbled across this infographic (click on graphic for full-size version):

[Infographic: rising consumption and declining resources, 1760–2010, by Haisam Hussein]
The infographic wasn’t published on Earth Day (April 22), but it probably should have been. Also, concern over what starting date to use when naming the current geological epoch after ourselves (the Anthropocene), while perhaps interesting, is more than a little self-congratulatory — but in the darkest sense, since we wrecked everything. I have nothing further to say about the futility of that naming exercise, considering that it marks our self-annihilation and that soon enough no one will be left to know or care.

Let me describe briefly what else the infographic shows. In the extremely tiny slice of geological time (1760–2010 CE) shown along the x-axis, we have been on a gradually rising trend of consumption (measured by human population, air pollution, energy use, large dams, and more recently, number of motor vehicles), which is mirrored by a decreasing trend in available resources (measured in tropical forest area and number of species). The author, Haisam Hussein, notes that around 1950, the trends began a steep acceleration (in both directions) that has not yet reached its limits. Of course, there are limits, despite what ideologues may say.

To recharacterize in slightly more recognizable terms, let’s say that the entire human population is the equivalent of Easter Islanders back in the day when they were cutting down now-extinct Rapa Nui palms as part of their ongoing project of building monuments to themselves. The main difference is that the whole planet stands in for Easter Island. And instead of palm trees, let’s say our signature resource is a money tree, because, after all, money makes the world go around and it grows on trees. Easter Island was completely forested up to about 1200 CE but became treeless by around 1650 CE. The trend was unmistakable, and the mind boggles now (hindsight being 20/20) at what must have been going on in the minds of the islanders who cut down the last tree. Here’s the equally obvious part: the planet (the money tree) is also a bounded (finite) ecosystem, though larger than Easter Island, and we’re in the process of harvesting it as fast as we can go because, don’t ya know, there’s profit to be made — something quite different from having enough to live comfortable, meaningful lives.

So we’re not yet down to our final tree, but we’re accelerating toward that eventuality. It’s unclear, too, what number of trees constitutes a viable population for reproductive purposes. When considering the entire planet as an interlocking ecosystem, the question might be better posed as the number of species needed to maintain the lives of, say, large mammals like us. Aggregate human activity keeps whittling away at those species. Of course, the last money tree isn’t a physical tree like the Rapa Nui palm; it’s a social construct where ROI on continued mining, drilling, manufacturing, harvesting, building, paving, transportation, distribution, etc. runs its course and all profit-making activity comes to a screeching halt. The so-called shale oil miracle that promised eventual U.S. energy independence only a few moons ago has already gone bust (it was going to anyway as production tailed off quickly), and job losses keep piling up (tens and hundreds of thousands worldwide). Consider that a small, inconsequential brake on accelerating trends. Where things get really interesting is when that bust/halt spreads to every sector and food/energy supplies are no longer available in your neighborhood, or possibly just about anywhere, unless you grow your own food well away from population centers.

Virtually every failed bygone civilization provides evidence that we, too, will proceed heedlessly doing what we’re doing: cutting down trees until at last there are no more. Again, the mind boggles: what could possibly be going on in the minds of those holding the reins of power, who know where we’re headed (to oblivion) yet keep us pointed there steadfastly? And why don’t more of us regular folks also recognize our trajectory and take our leaders to task for failing to divert us from the trip into the dustbin of history?

I’m not in the business of offering pat solutions to intractable problems plaguing modern society. That’s the mandate of government, which aims to manage the affairs of men and women as well as possible — yet typically fails abysmally. (Recurring war is an obvious example of the inability to discover better solutions, except that war is now relished as a profit engine in a world gone mad, so it’s become desirable.) Rather, my interest is directed toward attempting to understand the ways the world actually works. Perhaps that is too pointlessly abstract until such understanding is put into practice and acted upon, not just in policy formulations but in legislation, incentives, and behaviors that make differences in how we live. However, so many well-meaning policies of the past have led us down the primrose path that I hesitate to suggest my understandings are in any way superior to, or would lead to more effective policy than, those formulated by folks whose mandate it is to address problems of social organization head-on.

Accordingly, teasing cause-and-effect or correlation out of the hypercomplex interactions of myriad moving parts (people mostly), each possessing agency and ambition, is tantamount to chasing a chimera. Curiously, Paul Chefurka, whose writing and thinking I admire, revealed in comments at Nature Bats Last (April 2015) that he only recently abandoned his search for root causes behind the imminent collapse of industrial civilization, and with them, plausible solutions and/or ways out of the quagmire. Whereas some might take such an epistemological collapse as their cue to punt and say, “fuck it, let’s party,” I still find myself struggling to make sense of things. So it occurs to me that without recommending root causes or solutions of my own, it may nonetheless be worthwhile to observe that specific problems often proceed from generalized attitudes that may be amenable to change. Put another way, it’s our own attitudes that give rise to our problems or make those problems possible. Let me take as an example just one issue, which is one of the larger elephants in the room.


When any given technology reaches maturity, one might think that it’s time perhaps to stop innovating. A familiar, reliable example is the codex, also known as the book, now many centuries old and an obvious improvement over clay tablets and paper scrolls. Its low cost and sheer utility have yet to be surpassed. Yet damn it all if we don’t have inferior alternatives being shoved down our throats all the time, accompanied ad nauseam by the marketers’ eternal siren song: “new and improved.” Never mind that novelty or improvement wasn’t even slightly needed. A more modern example might be Microsoft Word 5.1, dating from 1992, which dinosaurs like me remember fondly for its elegance and ease of use. More than 20 years later, Microsoft Office (including MS Word) is widely considered to be bloatware, which is to say, it’s gone backwards from its early maturity.

So imagine my consternation when yet another entirely mature technology, one near and dear to the hearts of music lovers (those with taste, anyway), received another obligatory attempt at an update. Behold the preposterous, ridiculous, 3D-printed, 2-string, piezoelectric violin designed by Monad Studio:

Someone teach the poor model, chosen for her midriff no doubt, how to hold the bow! The view from the opposite side offers no improvement.

The comic below alerted me some time ago to the existence of Vaclav Smil, whose professional activity includes nothing less than inventorying the planet’s flora and fauna.

Although the comic (more infographic, really, since it’s not especially humorous) references Smil’s book The Earth’s Biosphere: Evolution, Dynamics, and Change (2003), I picked up instead Harvesting the Biosphere: What We Have Taken from Nature (2013), which has a somewhat more provocative title. Smil observes early in the book that mankind has had a profound, some would even say geological, impact on the planet:

Human harvesting of the biosphere has transformed landscapes on vast scales, altered the radiative properties of the planet, impoverished as well as improved soils, reduced biodiversity as it exterminated many species and drove others to a marginal existence, affected water supply and nutrient cycling, released trace gases and particulates into the atmosphere, and played an important role in climate change. These harvests started with our hominin ancestors hundreds of thousands of years ago, intensified during the era of Pleistocene hunters, assumed entirely new forms with the adoption of sedentary life ways, and during the past two centuries transformed into global endeavors of unprecedented scale and intensity. [p. 3]

Smil’s work is essentially a gargantuan accounting task: measuring the largest possible amounts of biological material (biomass) in both their current state and then across millennia of history in order to observe and plot trends. In doing so, Smil admits that accounts are based on far-from-perfect estimates and contain wide margins of error. Some of the difficulty owes to lack of methodological consensus among scientists involved in these endeavors as to what counts, how certain entries should be categorized, and what units of measure are best. For instance, since biomass contains considerable amounts of water (percentages vary by type of organism), inventories are often expressed in terms of fresh or live weight (phytomass and zoomass, respectively) but then converted to dry weight and converted again to biomass carbon.
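The conversion chain itself is simple arithmetic once the water and carbon fractions are assumed; the hard part, as Smil emphasizes, is that every assumed fraction carries its own uncertainty, so the margins of error compound. Here is a minimal sketch in Python of that chain, using illustrative round-number fractions (the 60% water content and 45% carbon share below are placeholders for demonstration, not Smil’s figures):

    def fresh_to_carbon(fresh_weight_t, water_fraction, carbon_fraction_dry):
        """Convert fresh (live) weight to dry weight, then to biomass carbon.

        fresh_weight_t      -- fresh or live weight, in tonnes
        water_fraction      -- share of fresh weight that is water (varies by organism)
        carbon_fraction_dry -- carbon share of dry matter (also an assumption)
        """
        dry_weight_t = fresh_weight_t * (1.0 - water_fraction)
        carbon_t = dry_weight_t * carbon_fraction_dry
        return dry_weight_t, carbon_t

    # Hypothetical example: 100 t of fresh phytomass, 60% water, 45% carbon in dry matter.
    dry, carbon = fresh_to_carbon(100.0, water_fraction=0.60, carbon_fraction_dry=0.45)
    print("dry weight: %.1f t, biomass carbon: %.1f t" % (dry, carbon))  # 40.0 t dry, 18.0 t carbon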
