This is an unapologetic interdiction directed at the influx of new followers to this blog. It is meant to be at least partly true and maybe a bit humorous (if I can strike the right tone, which is unlikely). It is also inspired by Leavergirl’s recent post called “Pulling the Plug,” though I’d been mulling this post for at least a week prior to reading hers. Make of it what you will.

To Follow or Unfollow — That is the Question

Over the past few months, I have received a steady trickle of new followers/subscribers to this blog. The count is now over 250 (still pretty modest, I know, so why am I complaining?). Unlike most bloggers, Facebookers, and pundits, who revel in the increased attention suggested by hits, likes, friendings, thumbs up/down, votes, ratings, rankings, links, referrals, trackbacks, reblogs, and follows/subscriptions, I care about none of those. Evidence that anyone wandering into The Spiral Staircase is actually reading what’s written is mostly absent. (There is a surprisingly large number of Filipinos who find this blog by searching Google for Scheler’s Hierarchy, where my post is currently the fourth hit returned. None of them stop to comment.) Real proof would be a thoughtful comment that addresses the subject of the post. Agreement and disagreement are both welcome but not really the point. I get some comments, but not many. However, if this blog were to receive scores of comments, as successful blogs do (measured solely by numbers, of course), I would not be able to keep up. Therefore, I’m not especially desirous of voluminous commentary. Like the fellow who blogs at Gin and Tacos (see blogroll), I’d probably end up throwing up a post for consideration and then ignoring the comments (or at least not deigning to reply, which I consider tantamount to the same thing). Admittedly, I don’t always have a reply.

I recognize that among the millions and billions of people out there surfing the Internet, lots of intelligent, thoughtful, sensitive, humane people do exist. The proportion of them who can construct a good English sentence with something worthwhile to say, on the other hand, is suspiciously small. I don’t quite know why (reckless conjecture withheld). So seriously, what the hell are you doing here? If your blog is in a foreign language (non-English) or is an obvious content farm, I’m not returning any favors. If you write a series of inspirational posts (religion, self-help, life coaching, careers, fashion, etc.) or muse on daily life, I’m not reading your posts. If you’re selling vinyl siding somewhere in Canada, probably Ontario (I’ve actually got one such follower), I’m not even remotely interested in buying. Think about going somewhere else. If you’re selling SEO, then please DIAF.

Giving Back to the People

Posted: February 21, 2015 in Debate, Economics, Politics

/rant on

I’m usually content to allow abstention to be my protest vote, but in the latest round of municipal elections here in Chicago, I have been sufficiently motivated to cast my protest vote (via absentee ballot, meaning it won’t even be counted until after all the shouting is done). So what motivates me to break my voting silence? Simply put, the mayor (small m, always small).

Chicago’s Feb. 24, 2015, municipal election might as well be called the 2015 Mayoral Re-Election considering what’s at stake and how the outcome is mostly a foregone conclusion thanks to modern polling practices. Besides re-electing the mayor, three other officials are running unopposed (varies by precinct/ward) and there are four pointless nonbinding referenda. Pretty slim ballot. We already know that the four challengers to the incumbent mayor will most likely share the minority vote and thus be unable to force the runoff necessary to narrow the field to just two candidates (or one viable challenger). My interest in removing Rahm Emanuel from office (an intent echoed plainly by his challengers) stems mainly from reporting in The Chicago Reader by Ben Joravsky. I trust Joravsky’s view of local issues, published in an independent newspaper, far more than anything that might appear in the city’s two leading rags (The Chicago Tribune and The Chicago Sun-Times — no links), both of which I ignore. The lesser influence is Kari Lydersen’s book Mayor 1%: Rahm Emanuel and the Rise of Chicago’s 99%. I admit I haven’t read the book (I refuse to wade into that sewer), but I have read reviews, which are a nearly unanimous chorus of disgust at “Rotten Rahm.”

All this brings me back yet again to wondering why public office is a desirable destination. Half of the political rhetoric is about “taking back” (for the people), acknowledging that government at all levels has been hijacked, while the other half is about “giving back” (to the people), a presumed bourgeois (Kennedyesque) devotion to public service shared only by those who are decidedly not bourgeois (read: rich, powerful, and insulated from the masses). It’s largely incumbents on one side, challengers on the other, but that’s not wholly true, as Illinois’ newly elected and installed governor (small g, always small) was a challenger. I find it difficult to judge motivations; results are more transparent. The nastiness of the results, judged individually and over time (the early 1980s is a typical jumping-off point when political economics are discussed), demonstrates that it’s been a radically uneven system of rewards and punishments. The underclass and minorities (large overlap there) are by turns abandoned to their fates and punished for their lack of success, the middle class continues to be squeezed out of existence by policies and practices that proceed with the inexorable power of demographics, and the rich get the spoils. It’s unclear whether any challenger to Chicago’s current mayor will act for or against the people, just as next year’s presidential (small p, always small) election will likely shape up as a battle of political intents and promises, but I’m all for moving on from those whose results clearly demonstrate a different battle being waged and won.

/rant off

Update: Well, color me surprised! The incumbent mayor (small m, always small) failed to achieve a majority, so there will be a runoff election in April against top challenger Jesus “Chuy” Garcia. I couldn’t be more pleased. Even the media is reporting on Rahm Emanuel’s flailing attempts to polish the turd that is his administration. I guess a $16 million campaign war chest and rebranding effort proved insufficient to overcome all the ill will he has earned over the past four years.

Yes, we’re always still at war. With whom or what exactly, in the absence of formal declarations of war, is still up for grabs. While nominally a Global War on Terror or terrorism (shades of other not-really-wars on Drugs and Poverty — each made more important by using caps), our objective remains poorly defined beyond blanket justification for an expanded national security state operating both domestically and abroad, as well as the recognition that departure of U.S. forces from foreign theaters of war would almost certainly lead to even worse civil wars and power struggles among competing warlords and emerging nation-states. So the U.S. military continues to strike against diverse targets and still has boots on the ground in the Middle East and Afghanistan. Although Pres. Obama, the Commander in Chief, inherited our military escapades from his predecessor (as do most chief executives) and campaigned on promises to, among other things, close the U.S. military base (and torture site) at Guantanamo Bay and end the wars, the U.S. has not yet abandoned its misadventures even after numerous timetables for withdrawal have been set and surpassed.

After more than a decade, the U.S. public has grown tired of news reports on wars on multiple fronts and the mainstream media no longer reports on U.S. operations with the same diligence or breathless excitement. We have all succumbed to war fatigue. I, too, no longer track or pay attention to such old news. The same inattention is characteristic of the Fukushima nuclear disaster, truly a gift that keeps giving (and giving and giving for a thousand years). Updates on Fukushima can be found here, though I hesitate to believe fully what is presented because the truth is normally spun before being released or simply withheld. Updates and news on current operations relating to war can be found here and here, but the same caveat applies.

It’s not an innocent or passive question: why do these wars on multiple fronts continue to be prosecuted? Unlike Fukushima, they can be turned off, right? Well, in a word (or three), no, they can’t. The reason is that way, way, way too much money is made off war profiteering. The U.S. Department of Defense (DoD) consumes 22% ($496 billion) of the federal budget for FY 2015.

That figure is only the base budget for defense, however. Costs of foreign wars are kept on separate ledgers, such as Overseas Contingency Operations (OCO), which for FY 2015 adds another $64 billion (bringing the combined total to roughly $560 billion). Like the DoD base budget, the actual amount depends on where one seeks information and varies considerably between proposed, asked for, granted, and actual (not yet known). This link actually dares the reader to “Guess How Much America Spends on Defense” in its subtitle. Exactitude is not especially important, but trying to obtain a clear and mostly accurate picture is certainly a trip down the rabbit hole. See, for instance, this graphic based on data collected from various sources, which adds the interesting category of non-DoD defense spending.

All this is our tax dollars at work. If we spent these dollars on building a stable, equitable society instead of basically blowing up other people’s shit, I wonder what the U.S. would look like now. Of course, that hypothetical is absurd, because other countries that have been content to allow the U.S. to almost single-handedly police the world and shoulder the costs, keeping their own security costs minimal, have not fared a whole lot better. Apparently, it is not necessary for a country to operate as a full-blown military-industrial complex to own its share of corruption and inequity.

Since the eruption of bigotry against Islam on Bill Maher’s show Real Time last October, I have been bugged by the ongoing tide of vitriol and fear-mongering as radical Islam becomes this century’s equivalent of 20th-century Nazis. There is no doubt that the Middle East is a troubled region of the world and that many of its issues are wrapped up in Islamic dogma (e.g., jihad) that has been hijacked by extremists. Oppression, misogyny, violence, and terrorism will get no apologetics from me. However, the fact that deplorable behaviors often have an Islamic flavor does not, to my mind, excuse bigotry aimed at Islam as a whole. Yet that is precisely the argument offered by many pundits and trolls.

Bill Maher did not get the ball rolling, exactly, but he gave it a good shove, increasing its momentum and seeming rightness among weak thinkers who take their cues and opinions from television personalities. Maher wasn’t alone, however, as Sam Harris was among his guests and argued that Islam is “the mother lode of bad ideas.” The notable exception on the panel that episode was Ben Affleck (Nicholas Kristof also made good points, though far more diplomatically), who called bullshit on Islam-baiting but failed to convince Maher or Harris, whose minds were already made up. Maher’s appeals to authoritative “facts” and “reality” (a sad bit of failed rhetoric he trots out repeatedly) failed to convince in the other direction.


A Surfeit of Awards

Posted: January 29, 2015 in Culture, Education, Idle Nonsense, Tacky, Taste

/rant on

I get alumni magazines from two colleges/universities I attended. These institutional organs are unapologetic boosters of the accomplishments of alumni, faculty, and students. They also trumpet never-ending capital campaigns, improvements to facilities, and new and refurbished buildings. The latest round of news from my two schools features significant new and rebuilt structures, accompanied by the naming of these structures after the foundations, contributors, and faculty/administrators associated with their execution. Well and good, you might surmise, but I always have mixed feelings. No doubt there are certain thresholds that must be met for programs to function and excel: stadia and gyms, locker rooms, concert halls and theaters, practice and rehearsal spaces, equipment, computer labs, libraries and their holdings, etc. Visiting smaller schools with inadequate facilities always brought that point home. Indeed, that’s one of the reasons why anyone chooses a school: for the facilities.

Since the late sixties or so, I have witnessed one school after another (not just in higher education) becoming what I think of as lifestyle schools. Facilities are not merely sufficient or superior; they range into the lap of luxury and excess. It’s frankly embarrassing that the quality and furnishings of dormitories now exceed what most students will enjoy for decades post-graduation. In my college years, no one found it the slightest bit embarrassing to have meager accommodations. That’s not why one was there. Now the expectation is to luxuriate. Schools clearly compete to attract students using a variety of enticements, but delivering the best lifestyle while in attendance was formerly not one of them. But the façades and accoutrements are much easier to evaluate than the academic programs, which have moved in the opposite direction. Both are now fraudulent at many schools; it’s a game of dress-up.

That rant, however, may be only the tip of the proverbial iceberg. I cannot escape the sense that we celebrate ourselves and our spurious accomplishments with amazing disregard for their irrelevance. Unlike many, who dream of achieving immortality by proxy, I am confounded by the desire to see one’s name on the side of a building, in a hall of fame, on an endowed chair, etched in a record book, or otherwise gouged into posterity. Yet I can’t go anywhere without finding another new feature named after someone, usually posthumously but not always, whose memory must purportedly be preserved. (E.g., Chicago recently renamed the Circle Interchange after its first and only female mayor, Jane Byrne, causing some confusion due to inadequate signage.) The alumni magazines were all about newly named buildings, chairs, scholarships, halls, bricks, and waste cans. It got to be sickening. The reflex is now established: someone gives a pile of money or teaches (or administers) for a time, so something gets named after him or her. And as we enter championship and awards season in sports and cinema, the surfeit of awards doled out, often just for showing up and doing one’s job, is breathtaking.

Truly memorable work and achievement need no effusive praise. They are perpetuated through subscription. Yet even they, as Shelley reminds us, pass from memory eventually. Such is the way of the world in the long stretches of time (human history) we have inhabited it. Readers of this blog will know that, in fairly awful terms, that time is rapidly drawing to a close due to a variety of factors, but primarily because of our own prominence. So one might wonder, why all this striving and achieving and luxuriating and self-celebrating when its end is our own destruction?

/rant off

Dave Pollard, who blogs at How to Save the World, published an article in Shift Magazine called “See No Evil: The Morality of Collapse.” The subtitle in particular intrigued me because devising ways to respond to collapse is more realistic than forestalling it or attempting to fix things. Pollard often organizes his thinking in terms of infographics and offers the doomer taxonomy shown below. He admits considerable overlap between categories and migration between them as individuals confront the issues and learn more about what is either possible or desirable. The categories divide nearly in half (by type, not population) into collapsniks and salvationists, with two additional categories of fence-sitters at opposite ends of the vertical axis, which represents hope or optimism as one ascends to the top.

Difficult Pleasures

Posted: January 14, 2015 in Consumerism, Culture, Idle Nonsense, Taste

Everyone has acquaintance with what I call the menu problem. One goes to an unfamiliar restaurant and studies the menu to make a selection from among a broad range of options. Those options may change (or not) based on seasonal availability of quality ingredients or some chef/menu designer at the home office in Columbus, Ohio, changing things up to create buzz in a bid to attract greater market share. (McDonald’s rolls out or resurrects moribund sandwiches with surprising regularity.) No matter, one must content oneself with the eventual decision or else suffer buyer’s remorse.

But the problem lies not so much in the grass-is-greener syndrome of food choices not elected but in the presumption that an optimal choice is possible from myriad options. Put another way, if one can make a poor choice (say, something not to one’s taste), then it’s implicit that one can make a superior choice, maybe even one leading to a peak experience — all this simply by showing up and paying (or overpaying) the bill. A similar quandary lies behind the problem of brand competition and fragmentation, where available options for the right (or wrong) toothpaste, cola, cell phone and plan, credit card, TV show, movie, etc. multiply and create bewilderment if one deliberates too much. Even for someone with effectively unlimited funds, time limitations result in the inability to evaluate even a majority of slated offerings.

And therein lies the rub: conspicuous consumption is too easy and carries with it the suggestion of ecstasy if only one chooses well. Little may be done to earn the enjoyment or reward, and without some struggle, getting what one wants often feels hollow. In contrast, honest gratification over even meager portions, quality, or results often follows on hardship and extended effort. For example, those who have roughed it in the wilderness know that doing without for a spell can transform something as pedestrian as granola into an unexpectedly superlative experience. Or in the classroom, an easy A is unappreciated precisely because students know intuitively that it isn’t really evidence of learning or achievement, whereas a hard-won B– might be the source of considerable pride. Under the right conditions, one might even feel some justifiable righteousness, though in my experience, hardships endured tend to produce humility.

One of the sure-fire ways I discovered of triggering euphoria is endurance racing. When the finish line finally swings into view, I recognize that in a few more moments, I will have accomplished the distance and be able to stop pushing. My time and place relative to my age group are irrelevant. I also know that I can’t have that rush if I don’t first sacrifice and suffer for it. Further, contentment and euphoria cannot be sustained for long. Rather, they typically come at the end of something, inviting a nostalgic glow that fades as normalcy reasserts itself.

I’m writing about this because I have rubbed elbows with some folks for whom the most perfect, exquisite pleasure is their expectation for everything all the time because they use their wealth to procure only the best at every turn. Maybe they subscribe to some form of lifestyle service, such as this one, and have others pampering and praising them round the clock. I contend that they don’t actually know how to be happy or to enjoy themselves because, when something goes awry or falls short, they throw fits and conniptions that would embarrass a three-year-old. See, for example, this. Such shameful behavior also puzzles me because the current marketplace is a veritable cornucopia (not yet the notorious deathtrap from The Hunger Games). Improved distribution and delivery make stuff available cheaply and easily to nearly everyone with a computer and a credit card. Yet many take it all for granted, grind away miserably at service providers who fail their standards, and fail to recognize that most of it is poised to go away. The current Age of Abundance is shifting toward an epoch-making contraction, but in the meantime, some feel only discontentment because their appetites can never be sated. They don’t understand that difficult pleasures and cessation of suffering are far more gratifying than the lap of luxury.

Nafeez Ahmed has published a two-part article at Motherboard entitled “The End of Endless Growth” (see part 1 and part 2). Commentary there is, as usual, pretty nasty, so I only skimmed and won’t discuss it. Ahmed’s first part says that things are coming to their useful ends after an already extended period of decline, but the second argues instead that we’re already in the midst of a phase shift as (nothing less than) civilization transforms itself, presumably into something better. Ahmed can apparently already see the end of the end (at the start of a new year, natch). In part 1, he highlights primarily the work of one economist, Mauro Bonaiuti of the University of Turin (Italy), despite Bonaiuti standing on the shoulders of numerous scientists far better equipped to read the tea leaves, diagnose, and prognosticate. Ahmed (via Bonaiuti) acknowledges that crisis is upon us:

It’s the New Year, and the global economic crisis is still going strong. But while pundits cross words over whether 2015 holds greater likelihood of a recovery or a renewed recession, new research suggests they all may be missing the bigger picture: that the economic crisis is symptomatic of a deeper crisis of industrial civilization’s relationship with nature.

“Civilization’s relationship with nature” is precisely what Ahmed misunderstands throughout the two articles. His discussions of declining EROEI and of exponential increases in population, resource extraction and consumption, energy use and CO2 emissions, and species extinction are good starting points, but he connects the wrong dots. He cites Bonaiuti’s conclusion that “endless growth on a finite planet is simply biophysically impossible, literally a violation of one of the most elementary laws of physics: conservation of energy, and, relatedly, entropy.” Yet he fails to understand what that means beyond the forced opportunity to reset, adapt, and reorganize according to different social models.

At no point does Ahmed mention the rather obvious scenario where many billions of people die from lack of clean water, food, and shelter when industrial civilization grinds to a halt — all this before we have time to complete our phase shift. At no point does Ahmed mention the likelihood of widespread violence sparked by desperate populations facing immediate survival pressure. At no point does Ahmed mention the even worse likelihood of multiple nuclear disasters (hundreds!) when infrastructure fails and nuclear plants start popping like firecrackers.

What does Ahmed focus on instead? He promises “cheap, distributed clean energy” (going back up the EROEI slope) and a transition away from industrial agriculture toward relocalization and agroecology. However, these are means of extending population, consumption, and despoliation further into overshoot, not plans for sustainability at a far lower population. Even more worrisome, Ahmed also cites ongoing shifts in information, finance, and ethics, all of which are sociological constructs that have been reified in the modern world. These shifts are strikingly “same, only different,” except perhaps for the ethics revolution. Ahmed says we’re already witnessing a new ethics arising: “a value system associated with the emerging paradigm is … supremely commensurate with what most of us recognize as ‘good’: love, justice, compassion, generosity.” I just don’t see it yet. Rather, I see continued accumulation of power and wealth among oligarchs and plutocrats, partly through the use of institutionalized force (looking increasingly like mercenaries and henchmen).

Also missing from Ahmed’s salve for our worries is discussion of ecological collapse in the form of climate change and its host of associated causes and effects. At a fundamental level, the biophysical conditions for life on earth are changing from the relative steady state of the last 200,000 years or so that humans have existed, or more broadly, the 65 million years since the last major extinction event. The current rate of change is far too rapid for evolution and culture to adapt. New ways of managing information, economics, and human social structures simply cannot keep up.

All that said, well, sure, let’s get going and do what can be done. I just don’t want to pretend that we’re anywhere close to a new dawn.

This is a rant about, more than a review of, Peter Jackson’s recently completed movie trilogy based on J.R.R. Tolkien’s novel The Hobbit. My review of Tolkien’s novel is here.

Cinema and literature differ, the former being predominantly visual and the latter being textually descriptive. Oddly, the textual approach often yields better results when more is left to the reader’s imagination, sort of like simple darkness, the bogeyman in the closet, or the monster lurking under the bed but never seen. Tolkien’s Middle Earth inspired a rich tradition of illustration from the outset. Readers wanted to see what they imagined, and conceptual artists complied. I remember Middle Earth calendars from the 1970s featuring various characters, architectures, and landscapes, which now form the basis for the design aesthetic of Jackson’s endeavors. (Middle Earth calendars from the last decade often feature pictures of the actors, New Zealand, and/or the film sets, which are quite dissatisfying to me.)

Attempts to bring Middle Earth to life in cinema were exercises in failure before Jackson’s The Lord of the Rings (LoTR) trilogy appeared (2001–2003). Until then, appealing as Tolkien’s characters and stories are, they were considered unfilmable in a film era where casts of thousands no longer exist. Technological innovation (i.e., computer-generated imagery, or CGI) enabled Jackson to overcome many limitations. Casts and costumes could be abbreviated, and sets could be made out of foam or rendered digitally. Sadly, those same innovations have returned us to an era when Tolkien is unfilmable, precisely because the unbelievable image now overwhelms the characters and story. I suspect that the combination of attributes (e.g., studio oversight of an untried director, fidelity to source material, superior conceptual design, and narrative solution-finding) that functioned so well to make LoTR successful is undercut in The Hobbit trilogy, which is another exercise in failure on a number of levels.


In a recent comments thread at Nature Bats Last, I reopened the question (in a short-lived discussion) of whether those of us convinced of the truth of climate change and the specter of Near-Term Extinction (NTE — always contingent upon events yet to occur but nonetheless foreseeable) should be raising awareness now that it’s too late to do anything meaningful about it. Specifically, Guy McPherson continues to travel around reporting the science and drawing conclusions, with epistemologically unsettling, confidence-shaking effects on his audiences. Q&A sessions that often follow his presentations are problematical because participants, if they’re honest, find no refuge from the death sentence levied against humanity. Yet many of them, perhaps confronting the issue for the first time in earnest (it’s been out there at the fringes for decades), cling desperately to the faith and/or hope that something — anything — must be done to appeal, reverse, or forestall the inevitable. The mental gymnastics required to do so are obvious, and members of the public have plenty of company in ongoing media and political debate that has succeeded for decades in blocking responses to the negative impacts of our own behaviors in favor of business as usual. So as evidence continues to mount and manifest before our eyes with, for example, habitat loss, collapsing animal populations, disappearing sea ice, and increased frequency and intensity of destructive weather events and trends, the expectation is that, on the heels of a presentation or revelation, someone will have a moment of severe existential crisis (waiting for all of us, frankly) and perhaps decide to kill the messenger.

A few days ago, I became aware (via Ugo Bardi’s blog Resource Crisis) of one such messenger: a fake expert being interviewed by a fake news anchor on a fake news show. I’ve embedded the fictional scene below.