Following up on the idea of resource consumption discussed in this post, I stumbled across this infographic:

The infographic wasn’t published on Earth Day (April 22), but perhaps it should have been. Also, concern over what starting date to use when naming the current geological epoch after ourselves (the Anthropocene), while perhaps interesting, is more than a little self-congratulatory — but in the darkest sense, since we wrecked everything. I have nothing further to say about the futility of naming a geological epoch after ourselves, considering that it marks our self-annihilation and that soon enough no one will be left to know or care.

Let me describe briefly what else the infographic shows. In the extremely tiny slice of geological time (1760–2010 CE) shown along the x-axis, we have been on a gradually rising trend of consumption (measured by human population, air pollution, energy use, large dams, and more recently, number of motor vehicles), mirrored by a decreasing trend in available resources (measured in tropical forest area and number of species). The author, Haisam Hussein, notes that around 1950, both trends began a steep acceleration that has not yet reached its limits. Of course, there are limits, despite what ideologues may say.

To recharacterize in slightly more recognizable terms, let’s say that the entire human population is the equivalent of the Easter Islanders back in the day when they were cutting down now-extinct Rapa Nui palms as part of their ongoing project of building monuments to themselves. The main difference is that the whole planet stands in for Easter Island. And instead of palm trees, let’s say our signature resource is a money tree because, after all, money makes the world go around and it grows on trees. Easter Island was completely forested up to about 1200 CE but became treeless by around 1650 CE. The trend was unmistakable, and the mind boggles now (hindsight being 20/20) at what must have been going on in the minds of the islanders who cut down the last tree. Here’s the equally obvious part: the planet (the money tree) is also a bounded (finite) ecosystem, though larger than Easter Island, and we’re harvesting it as fast as we can because, don’t ya know, there’s profit to be made — something quite different from having enough to live comfortable, meaningful lives.

So we’re not yet down to our final tree, but we’re accelerating toward that eventuality. It’s unclear, too, what number of trees constitutes a viable population for reproductive purposes. When considering the entire planet as an interlocking ecosystem, the question might be better posed as the number of species needed to maintain the lives of, say, large mammals like us. Aggregate human activity keeps whittling away at those species. Of course, the last money tree isn’t a physical tree like the Rapa Nui palm; it’s a social construct where ROI on continued mining, drilling, manufacturing, harvesting, building, paving, transportation, distribution, etc. runs its course and all profit-making activity comes to a screeching halt. The so-called shale oil miracle that promised eventual U.S. energy independence only a few moons ago has already gone bust (it was going to anyway as production tailed off quickly), and job losses keep piling up (tens of thousands, even hundreds of thousands, worldwide). Consider that a small, inconsequential brake on accelerating trends. Where things get really interesting is when that bust/halt spreads to every sector and food and energy supplies are no longer available in your neighborhood, or possibly just about anywhere, unless you grow your own food well away from population centers.

Virtually every failed bygone civilization provides evidence that we, too, will proceed heedlessly doing what we’re doing: cutting down trees until at last there are no more. Again, the mind boggles: what could possibly be going on in the minds of those who hold the reins of power, who know where we’re headed (to oblivion), yet keep us pointed there steadfastly? And why don’t more of us regular folks recognize our trajectory and take our leaders to task for failing to divert us from our trip into the dustbin of history?

I’m not in the business of offering pat solutions to the intractable problems plaguing modern society. That’s the mandate of government, which aims to manage the affairs of men and women as well as possible — yet typically fails abysmally. (Recurring war is an obvious example of the inability to discover better solutions, except that war is now relished as a profit engine in a world gone mad, so it’s become desirable.) Rather, my interest is directed toward attempting to understand the ways the world actually works. Perhaps that is too pointlessly abstract until such understanding is put into practice and acted upon, not just in policy formulation but in legislation, incentives, and behaviors that make differences in how we live. However, so many well-meaning policies of the past have led us down the primrose path that I hesitate to suggest my understanding is in any way superior, or would lead to more effective policy, than that of folks whose mandate it is to address problems of social organization head on.

Accordingly, teasing cause and effect or correlation out of the hypercomplex interactions of myriad moving parts (people, mostly), each possessing agency and ambition, is tantamount to chasing a chimera. Curiously, Paul Chefurka, whose writing and thinking I admire, revealed in comments at Nature Bats Last (April 2015) that he only recently abandoned his search for the root causes behind the imminent collapse of industrial civilization, and with them, plausible solutions and/or ways out of the quagmire. Whereas some might take such an epistemological collapse as their cue to punt and say, “fuck it, let’s party,” I still find myself struggling to make sense of things. So it occurs to me that, without recommending root causes or solutions of my own, it may nonetheless be worthwhile to observe that specific problems often proceed from generalized attitudes that may be amenable to change. Put another way, it’s our own attitudes that give rise to our problems or make those problems possible. Let me take as an example just one issue, which is one of the larger elephants in the room.


When any given technology reaches maturity, one might think it time, perhaps, to stop innovating. A familiar, reliable example is the codex, also known as the book, now many centuries old and an obvious improvement over clay tablets and paper scrolls. Its low cost and sheer utility have yet to be surpassed. Yet damn it all if we don’t have inferior alternatives shoved down our throats all the time, accompanied ad nauseam by the marketers’ eternal siren song: “new and improved.” Never mind that novelty or improvement wasn’t even slightly needed. A more modern example might be Microsoft Word 5.1, dating from 1992, which dinosaurs like me remember fondly for its elegance and ease of use. More than 20 years later, Microsoft Office (including MS Word) is widely considered bloatware, which is to say, it’s gone backwards from its early maturity.

So imagine my consternation when yet another entirely mature technology, one near and dear to the hearts of music lovers (those with taste, anyway), received another obligatory attempt at an update. Behold the preposterous, ridiculous, 3D-printed, 2-string, piezoelectric violin designed by Monad Studio:

Someone teach the poor model, chosen for her midriff no doubt, how to hold the bow! The view from the opposite side offers no improvement.

The comic below alerted me some time ago to the existence of Vaclav Smil, whose professional activity includes nothing less than inventorying the planet’s flora and fauna.

Although the comic (more infographic, really, since it’s not especially humorous) references Smil’s book The Earth’s Biosphere: Evolution, Dynamics, and Change (2003), I picked up instead Harvesting the Biosphere: What We Have Taken from Nature (2013), which has a somewhat more provocative title. Smil observes early in the book that mankind has had a profound, some would even say geological, impact on the planet:

Human harvesting of the biosphere has transformed landscapes on vast scales, altered the radiative properties of the planet, impoverished as well as improved soils, reduced biodiversity as it exterminated many species and drove others to a marginal existence, affected water supply and nutrient cycling, released trace gases and particulates into the atmosphere, and played an important role in climate change. These harvests started with our hominin ancestors hundreds of thousands of years ago, intensified during the era of Pleistocene hunters, assumed entirely new forms with the adoption of sedentary life ways, and during the past two centuries transformed into global endeavors of unprecedented scale and intensity. [p. 3]

Smil’s work is essentially a gargantuan accounting task: measuring the largest possible amounts of biological material (biomass) both in their current state and across millennia of history in order to observe and plot trends. In doing so, Smil admits that accounts are based on far-from-perfect estimates and contain wide margins of error. Some of the difficulty owes to a lack of methodological consensus among the scientists involved as to what counts, how certain entries should be categorized, and what units of measure are best. For instance, since biomass contains considerable amounts of water (percentages vary by type of organism), inventories are often expressed in terms of fresh or live weight (phytomass and zoomass, respectively) but then converted to dry weight and converted again to biomass carbon.
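
To make that conversion chain concrete, here is a minimal sketch of the arithmetic, with illustrative coefficients of my own choosing rather than Smil’s: fresh plant matter is often around half water, and dry plant matter is very roughly 45% carbon, though both fractions vary widely by organism and tissue.

```python
# Sketch of the biomass accounting conversions described above.
# The water and carbon fractions are illustrative assumptions,
# not Smil's figures; both vary widely by organism and tissue.

def fresh_to_dry(fresh_weight_t: float, water_fraction: float = 0.5) -> float:
    """Convert fresh (live) weight to dry weight, in tonnes."""
    return fresh_weight_t * (1.0 - water_fraction)

def dry_to_carbon(dry_weight_t: float, carbon_fraction: float = 0.45) -> float:
    """Convert dry weight to biomass carbon, in tonnes of C."""
    return dry_weight_t * carbon_fraction

fresh = 1000.0                # tonnes of fresh phytomass (hypothetical)
dry = fresh_to_dry(fresh)     # 500.0 tonnes of dry matter
carbon = dry_to_carbon(dry)   # 225.0 tonnes of biomass carbon
print(f"{fresh:.0f} t fresh -> {dry:.0f} t dry -> {carbon:.0f} t C")
```

The point of the chain is that each published inventory may stop at a different link (fresh weight, dry weight, or carbon), which is one reason the estimates are so hard to reconcile.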


I have always remembered a striking line from the movie The Dancer Upstairs, in which the police investigator tracking the leader of Shining Path in Peru in the 1980s says (paraphrasing from Spanish), “I think there is a revolution going on.” Elsewhere on the globe today, Arab Spring has morphed from a series of U.S.-instigated regime changes into an emerging Arab state (ISIS), though its establishment has been violent and medieval. According to Tom Engelhardt, even the U.S. has a new political system rising out of the ruins of its own dysfunction. Unless I’m mistaken, a revolution is a political system being overthrown by mass uprising of the citizenry, whereas a coup is a powerful splinter within the current regime (often the military wing) seizing administrative control. What Engelhardt describes is more nearly a coup, and like the revolution in the quote above, it appears to be coalescing around us in plain sight, though that conclusion is scarcely spoken aloud. It may well be that Engelhardt has succeeded in crystallizing the moment. His five principal arguments are these:

  1. 1% Elections — distortion of the electoral system by dollars and dynasties.
  2. Privatization of the State — proper functions of the state transferred into the hands of privateers (especially mercenaries and so-called warrior corporations — nice neologism).
  3. De-legitimization of Congress and the Presidency — fundamental inability to govern, regulate, and/or prosecute at the Federal level, opening up a power vacuum.
  4. Rise of the National Security State (Fourth Branch of Government) — the dragnet complex revealed (in part) by whistle-blower Edward Snowden but plain to see post-9/11.
  5. Demobilization of the American People — surprising silence of the public in the face of such unwholesome developments.

Please read the article for yourself; it is very well written. (I am no great fan of the journalistic style but must acknowledge that Engelhardt’s work is terrific.) I especially like Engelhardt’s suggestion that a grand conspiracy (e.g., New World Order) is not necessary, but that instead it’s all being improvised on the run. Let me offer a couple of observations of my own.

Power has several attributes, such as the position to influence events, the resources to get things done, and the ability to motivate (or quell) the public through active management of perception. High offices (both government and boardroom, both elected and appointed) are the positions, the U.S. Treasury and the wealth of the 1% are the resources, and charismatic storytelling (now outright lying) is management of perception. Actors (a word chosen purposely) across the American stage have been maneuvering for generations to wield power, often for its own sake but more generally in the pursuit of wealth. One might assume that once personal wealth has been acquired, motivations would slacken; instead, in not a few psychopaths, they divert into the maniacal building of multigenerational dynasties.

Pulling the levers of state in one capacity or another is a timeworn mechanism for achieving the proxy immortality of the American statesman. However, as dysfunction in the political arena has grown, corporations (including banks) have assumed the reins. Despite corporate personhood being conferred and recently expanded, largely via judicial fiat, the profit motive has reasserted itself as primary, since there is no such thing as a fully self-actualized corporation. Thus, we have the Federal Reserve System acting as a de facto corporation within government — but without conscience. Multiply that hundreds of times over and voilà: an American corporatocracy.

The effect has been extrapolated in numerous movies and television shows, all offering dystopic warnings of things to come, where people, domestic and alien, are expendable as power seeks to perpetuate itself. How far this can go before financial collapse, climate change, energy scarcity, or a host of other looming calamities overtakes us is yet to be seen. Some hold out hope for true revolution, but I believe that possibility has been contained. Considering how the world has been accelerating toward ecocide, I venture that at most a few more decades of desperate negotiations with fate are in store for us. Alternatively, I find it entirely feasible that the delicate web of interconnections that maintains life in all its manifestations could suffer a phase shift rather quickly, at which point all bets are off. Either way, in no one’s wildest imagination could our current civilization be considered the best we can do, much less the best of all possible worlds.

This is an unapologetic interdiction directed at the influx of new followers to this blog. It is meant to be at least partly true and maybe a bit humorous (if I can strike the right tone, which is unlikely). It is also inspired by Leavergirl’s recent post called “Pulling the Plug,” though I’d been mulling this post for at least a week prior to reading hers. Make of it what you will.

To Follow or Unfollow — That is the Question

Over the past few months, I have received a steady trickle of new followers/subscribers to this blog. The count is now over 250 (still pretty modest, I know, so why am I complaining?). Unlike most bloggers, Facebookers, and pundits, who revel in the increased attention that hits, likes, friending, thumbs up/down, votes, ratings, rankings, links, referrals, trackbacks, reblogs, and follows/subscriptions would suggest, I care about none of those. Evidence that anyone wandering into The Spiral Staircase is actually reading what’s written is mostly absent. (A surprisingly large number of Filipinos find this blog by searching for Scheler’s Hierarchy in Google, where my post is currently the fourth hit returned. None of them stop to comment.) Real proof would be a thoughtful comment that addresses the subject of the post. Agreement and disagreement are both welcome but not really the point. I get some comments, but not many. However, if this blog were to receive scores of comments like successful blogs do (measured solely by numbers, of course), I would not be able to keep up. Therefore, I’m not especially desirous of voluminous commentary. Like the fellow who blogs at Gin and Tacos (see blogroll), I’d probably end up throwing up a post for consideration and then ignoring the comments (or at least not deigning to reply, which I consider tantamount to the same thing). Admittedly, I don’t always have a reply.

I recognize that among the millions and billions of people out there surfing the Internet, lots of intelligent, thoughtful, sensitive, humane people do exist. The proportion of them who can construct a good English sentence and have something worthwhile to say, on the other hand, is suspiciously small. I don’t quite know why (reckless conjecture withheld). So seriously, what the hell are you doing here? If your blog is in a foreign language (non-English) or is an obvious content farm, I’m not returning any favors. If you write a series of inspirational posts (religion, self-help, life coaching, careers, fashion, etc.) or muse on daily life, I’m not reading your posts. If you’re selling vinyl siding somewhere in Canada, probably Ontario (I’ve actually got one such follower), I’m not even remotely interested in buying. Think about going somewhere else. If you’re selling SEO, then please DIAF.

Giving Back to the People

Posted: February 21, 2015 in Debate, Economics, Politics

/rant on

I’m usually content to allow abstention to be my protest vote, but in the latest round of municipal elections here in Chicago, I have been sufficiently motivated to cast an actual ballot (an absentee ballot, meaning it won’t even be counted until after all the shouting is done). So what motivates me to break my voting silence? Simply put, the mayor (small m, always small).

Chicago’s Feb. 24, 2015, municipal election might as well be called the 2015 Mayoral Re-Election considering what’s at stake and how the outcome is mostly a foregone conclusion thanks to modern polling practices. Besides re-electing the mayor, three other officials are running unopposed (varies by precinct/ward) and there are four pointless nonbinding referenda. Pretty slim ballot. We already know that the four challengers to the incumbent mayor will most likely split the minority vote and thus be unable to force the runoff necessary to narrow the field to just two candidates (or one viable challenger). My interest in removing Rahm Emanuel from office (an intent echoed plainly by his challengers) stems mainly from reporting in The Chicago Reader by Ben Joravsky. I trust Joravsky’s view of local issues, published in an independent newspaper, far more than anything that might appear in the city’s two leading rags (The Chicago Tribune and The Chicago Sun-Times — no links), both of which I ignore. The lesser influence is Kari Lydersen’s book Mayor 1%: Rahm Emanuel and the Rise of Chicago’s 99%. I admit I haven’t read the book (I refuse to wade into that sewer), but I have read reviews, which are a nearly unanimous chorus of disgust at “Rotten Rahm.”

All this brings me back yet again to wondering why public office is a desirable destination. Half of the political rhetoric is about “taking back” (for the people), acknowledging that government at all levels has been hijacked, while the other half is about “giving back” (to the people), a presumed bourgeois (Kennedyesque) devotion to public service shared only by those who are decidedly not bourgeois (read: rich, powerful, and insulated from the masses). It’s largely incumbents on one side, challengers on the other, but that’s not wholly true, as Illinois’ newly elected and installed governor (small g, always small) was a challenger. I find it difficult to judge motivations; results are more transparent. The nastiness of those results, judged individually and over time (the early 1980s is a typical jumping-off point when political economics are discussed), demonstrates that it’s been a radically uneven system of rewards and punishments. The underclass and minorities (large overlap there) are by turns abandoned to their fates and punished for their lack of success, the middle class continues to be squeezed out of existence by policies and practices that proceed with the inexorable power of demographics, and the rich get the spoils. It’s unclear whether any challenger to Chicago’s current mayor will act for or against the people, just as next year’s presidential (small p, always small) election will likely shape up as a battle of political intents and promises, but I’m all for moving on from those whose results clearly demonstrate a different battle being waged and won.

/rant off

Update: Well, color me surprised! The incumbent mayor (small m, always small) failed to achieve a majority, so there will be a runoff election in April against top challenger Jesus “Chuy” Garcia. I couldn’t be more pleased. Even the media is reporting on Rahm Emanuel’s flailing attempts to polish the turd that is his administration. I guess a $16 million campaign war chest and rebranding effort proved insufficient to overcome all the ill will he has earned over the past four years.

Yes, we’re always still at war. With whom or what exactly, in the absence of formal declarations of war, is still up for grabs. While nominally a Global War on Terror or terrorism (shades of other not-really-wars on Drugs and Poverty — each made more important by using caps), our objective remains poorly defined beyond blanket justification for an expanded national security state operating both domestically and abroad, as well as the recognition that departure of U.S. forces from foreign theaters of war would almost certainly lead to even worse civil wars and power struggles among competing warlords and emerging nation-states. So the U.S. military continues to strike against diverse targets and still has boots on the ground in the Middle East and Afghanistan. Although Pres. Obama, the Commander in Chief, inherited our military escapades from his predecessor (as do most chief executives) and campaigned on promises to, among other things, close the U.S. military base and torture site at Guantanamo Bay and end the wars, the U.S. has not yet abandoned its misadventures, even after numerous timetables for withdrawal have been set and surpassed.

After more than a decade, the U.S. public has grown tired of news reports on wars on multiple fronts and the mainstream media no longer reports on U.S. operations with the same diligence or breathless excitement. We have all succumbed to war fatigue. I, too, no longer track or pay attention to such old news. The same inattention is characteristic of the Fukushima nuclear disaster, truly a gift that keeps giving (and giving and giving for a thousand years). Updates on Fukushima can be found here, though I hesitate to believe fully what is presented because the truth is normally spun before being released or simply withheld. Updates and news on current operations relating to war can be found here and here, but the same caveat applies.

It’s not an innocent or passive question: why do these wars on multiple fronts continue to be prosecuted? Unlike Fukushima, they can be turned off, right? Well, in a word (or three), no, they can’t. The reason is that way, way, way too much money is made from war profiteering. The U.S. Department of Defense (DoD) consumes 22% ($496 billion) of the Federal budget for FY 2015:

This figure is only the base budget for defense, however. Costs of foreign wars are kept on separate ledgers, such as Overseas Contingency Operations (OCO), which for FY 2015 adds another $64 billion. Like the DoD base budget, the actual amount depends on where one seeks information and varies considerably among proposed, requested, granted, and actual (not yet known) figures. This link actually dares the reader, in its subtitle, to “Guess How Much America Spends on Defense.” Exactitude is not especially important, but trying to obtain a clear and mostly accurate picture is certainly a trip down the rabbit hole. See, for instance, this graphic based on data collected from various sources, which adds the interesting category of non-DoD defense spending:

All this is our tax dollars at work. If we spent these dollars on building a stable, equitable society instead of basically blowing up other people’s shit, I wonder what the U.S. would now look like? Of course, that hypothetical is absurd, because other countries that have been content to allow the U.S. to almost single-handedly police the world and shoulder the costs, keeping their own security costs minimal, have not fared a whole lot better. Apparently, it is not necessary for a country to operate as a full-blown military-industrial complex to own its share of corruption and inequity.
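
For a rough sense of scale, here is a minimal back-of-envelope sketch using only the numbers quoted above; the implied total budget is derived from the stated 22% share, not taken from any official source:

```python
# Back-of-envelope check on the FY 2015 figures quoted above.
# The implied total budget is derived from the stated 22% share;
# it is an approximation, not an official number.

dod_base = 496e9    # DoD base budget, in dollars
dod_share = 0.22    # DoD's stated share of the federal budget
oco = 64e9          # Overseas Contingency Operations add-on

implied_total = dod_base / dod_share    # ~ $2.25 trillion
base_plus_oco = dod_base + oco          # $560 billion
combined_share = base_plus_oco / implied_total

print(f"Implied federal budget: ${implied_total / 1e12:.2f} trillion")
print(f"Base + OCO: ${base_plus_oco / 1e9:.0f} billion "
      f"({combined_share:.1%} of the implied total)")
```

Even on these loose numbers, folding OCO back in pushes defense toward a quarter of the implied budget, which is the whole point about separate ledgers obscuring the true total.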

Since the eruption of bigotry against Islam on Bill Maher’s show Real Time last October, I have been bugged by the ongoing tide of vitriol and fear-mongering as radical Islam becomes this century’s equivalent of the 20th century’s Nazis. There is no doubt that the Middle East is a troubled region of the world and that many of its issues are wrapped up in Islamic dogma (e.g., jihad) that has been hijacked by extremists. Oppression, misogyny, violence, and terrorism will get no apologetics from me. However, the fact that deplorable behaviors often have an Islamic flavor does not, to my mind, excuse bigotry aimed at Islam as a whole. Yet that is precisely the argument offered by many pundits and trolls.

Bill Maher did not get the ball rolling, exactly, but he gave it a good shove, increasing its momentum and seeming rightness among weak thinkers who take their cues and opinions from television personalities. Maher wasn’t alone, however, as Sam Harris was among his guests and argued that Islam is “the mother lode of bad ideas.” The notable exception on the panel that episode was Ben Affleck (Nicholas Kristof also made good points, though far more diplomatically), who called bullshit on Islam-baiting but failed to convince Maher or Harris, whose minds were already made up. Maher’s appeals to authoritative “facts” and “reality” (a sad bit of failed rhetoric he trots out repeatedly) failed to convince in the other direction.


A Surfeit of Awards

Posted: January 29, 2015 in Culture, Education, Idle Nonsense, Tacky, Taste

/rant on

I get alumni magazines from two colleges/universities I attended. These institutional organs are unapologetic boosters of the accomplishments of alumni, faculty, and students. They also trumpet never-ending capital campaigns, improvements to facilities, and new and refurbished buildings. The latest round of news from my two schools features significant new and rebuilt structures, accompanied by the naming of these structures after the foundations, contributors, and faculty/administrators associated with their execution. Well and good, you might surmise, but I always have mixed feelings. No doubt there are certain thresholds that must be met for programs to function and excel: stadia and gyms, locker rooms, concert halls and theaters, practice and rehearsal spaces, equipment, computer labs, libraries and their holdings, etc. Visiting smaller schools with inadequate facilities always brought that point home. Indeed, that’s one of the reasons anyone chooses a school: the facilities.

Since the late sixties or so, I have witnessed one school after another (not just in higher education) becoming what I think of as lifestyle schools. Facilities are not merely sufficient or superior; they range into the lap of luxury and excess. It’s frankly embarrassing that the quality and furnishings of dormitories now exceed what most students will enjoy for decades post-graduation. In my college years, no one found it the slightest bit embarrassing to have meager accommodations. That’s not why one was there. Now the expectation is to luxuriate. Schools clearly compete to attract students using a variety of enticements, but delivering the best lifestyle while in attendance was formerly not one of them. But the façades and accoutrements are much easier to evaluate than the academic programs, which have moved in the opposite direction. Both are now fraudulent at many schools; it’s a game of dress-up.

That rant, however, may be only the tip of the proverbial iceberg. I cannot escape the sense that we celebrate ourselves and our spurious accomplishments with amazing disregard for their irrelevance. Unlike many who dream of achieving immortality through proxy, I am confounded by the desire to see one’s name on the side of a building, in a hall of fame, on an endowed chair, etched in a record book, or otherwise gouged into posterity. Yet I can’t go anywhere without finding another new feature named after someone, usually posthumously but not always, whose memory must purportedly be preserved. (E.g., Chicago recently renamed the Circle Interchange after its first and only female mayor, Jane Byrne, causing some confusion due to inadequate signage.) The alumni magazines were all about newly named buildings, chairs, scholarships, halls, bricks, and waste cans. It got to be sickening. The reflex is now established: someone gives a pile of money or teaches (or administers) for a time, and something gets named after him or her. And as we enter championship and awards season in sports and cinema, the surfeit of awards doled out, often just for showing up and doing one’s job, is breathtaking.

Truly memorable work and achievement need no effusive praise. They are perpetuated through subscription. Yet even they, as Shelley reminds us, pass from memory eventually. Such is the way of the world across the long stretches of time (human history) we have inhabited it. Readers of this blog will know that, in fairly awful terms, that time is rapidly drawing to a close due to a variety of factors, but primarily because of our own prominence. So one might wonder: why all this striving and achieving and luxuriating and self-celebrating when the end of it all is our own destruction?

/rant off