Archive for June, 2011

I’ve never been up close to a windfarm, having only seen them in pictures. Like many engineering and infrastructure projects, they are pretty remarkable in terms of what the clever ape is able to accomplish, transforming the environment to suit his needs, usually with the true costs unknown and delayed in their effect.

Most criticisms of alternative energy point out that, because we’re still in the era of cheap oil and coal, the economics of producing energy via nuclear, solar, or wind sources are not yet advantageous. That’s the short view. The long view is that the economics of energy production will change soon enough (for the worse) that alternatives must be brought online now, with no delay, lest we experience shortages for lack of foresight and planning. The option of doing without is unthinkable.

Another criticism is that windfarms are not scalable, especially if they were needed to supply all anticipated demand. This was addressed in the response to a question at The Straight Dope about whether windfarms could alter the weather. Admittedly, the answer is pure extrapolation, but it gets at the heart of the issue with a few telltale facts:

It’s estimated that meeting world energy demand (not just electricity) is going to take something like 44 terawatts of capacity in 2100. There’s talk of generating 10 percent of that with wind power — 4.4 terawatts … The Department of Energy estimates that meeting 20 percent of the country’s electricity demand with wind power in 2030 will require 300 gigawatts of generating capacity. That translates to 150,000 turbines in 46 states.
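
For a sense of scale, here is my own back-of-envelope extension of those figures (this is not the columnist’s math, and it assumes the same average turbine size throughout):

300 gigawatts ÷ 150,000 turbines ≈ 2 megawatts per turbine
4,400 gigawatts ÷ 2 megawatts per turbine ≈ 2.2 million turbines worldwide

That’s the density at which the weather question gets asked.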

Could so many windfarms alter the weather? The conclusion, based on scientific modeling, was yes. I don’t find that astounding, but many who believe the volumes of the atmosphere and oceans too large for mankind to affect substantially do, and some go so far as to deny the possibility despite plenty of accumulating evidence. To me, it’s merely the proverbial death by a thousand cuts inflicted on the body, only in this case, the body is the planet. Even when we know the disastrous effects, we still plunge headlong into the abyss, as reported in this article at The Daily Mail Online about offshore windfarms being built in the North Atlantic.

So what is a windfarm like up close? According to the folks in the documentary film Windfall (based on the trailer anyway, since I’ve known about the film for some time but have not yet watched it), living in proximity to even one wind turbine is subtly unnerving. I suspect those spinning blades produce infrasound, which is known to create a sense of unease, and the disturbances to light and shadow would cause the human nervous system to react reflexively as if to movement. I can only imagine what a whole field or seascape full of them would be like. Depending on whose interpretation can be trusted, the movie is either a cautionary tale or a surrealist mock-horror documentary. The surrounding issue (where is our energy going to come from?) could also be understood as either cautionary or surreal.

Articles signaling the death of the liberal arts degree, or more generally, the liberal education it’s meant to signify, seem to be popping up with alarming regularity, especially since the job market for new college graduates has tanked at the same time their indebtedness has skyrocketed. As with most debates, both sides have their points, which I won’t pretend aren’t worthy of consideration. However, I discern significant confusion about what a liberal education is, as distinguished from what one gets from a trade school or professional school, or indeed, what one might expect by forgoing higher education or even dropping out of high school.

This confusion is on full display in a Salon article by Kim Brooks entitled “Is It Time to Kill the Liberal Arts Degree?” (I’ll use title case even though the article doesn’t.) The author takes the stupid, obvious, journalistic approach: she poses the question and articulates the pro and con arguments but then refuses to answer. Brooks is effectively talking out of both sides of her mouth on the topic. Still, the implied conclusion (yes, kill it!) is clear from the way the article is front-loaded with infantile demands that liberal education justify itself in financial terms and from the dismissal of its true merit as irrelevant, permanently one might suppose, in the first two sentences of the passage quoted below:

I’m not asking if a college education has inherent value, if it makes students more thoughtful, more informed, more enlightened and critical-minded human beings. These are all interesting questions that don’t pay the rent. What I’m asking is far more banal and far more pressing. What I’m asking is: Why do even the best colleges fail so often at preparing kids for the world?


Back to book blogging. Iain McGilchrist in The Master and His Emissary discusses music and language as they relate to brain activity and cognition. He never really defines either term, music or language, so their usual definitions presumably apply. However, while the meaning of language as a system of communication using symbols, syntax, and inflection is fairly standard, I infer that McGilchrist’s concept of music does not necessarily refer to the Western musical forms (some highbrow, some populist) with which most of us are familiar. Rather, he appears to mean a more global style of expression that relies primarily on pitch and rhythm and is typically vocal in production. In fact, he cites evidence that music had a significant role in human prehistory, long before language developed, and that language may actually have grown out of vocal utterances, called musilanguage, not wholly dissimilar to birdsong.

Sloppy thinkers often call music a language or wax poetic, saying that music is the language of pure emotion. McGilchrist calls out these mistakes in a number of passages. For example, he points out that while language reveals meaning, music frequently conceals it. That may be surprising to some, yet it should be obvious that music does not use well-understood symbols as spoken language does and accordingly does not mean anything in particular outside of itself (except, of course, when music uses words). Music communicates, in a sense, but is preverbal and abstract. It’s worth pointing out that McGilchrist treats music as a mode of expression or style of thought and so far ignores the question of artistic merit, which is irrelevant to his purpose.


Status Seeking

Posted: June 18, 2011 in Culture, Idealism, Idle Nonsense

I blogged briefly on esteem needs, which are closely related to status seeking, and I’m bothered by continued conjecture in some quarters about the issue, not that I ever expect anything to be settled. Human nature is too complex to boil down to a few rules, and everyone has a pet theory or plausible explanation about what motivates action in life, me no less than others. Here are a couple of ideas I’ve seen floated (mostly at Ran Prieur’s blog, see his entry on June 1, 2011) about status seeking, which I will paraphrase.

Seeking and protecting status is an evolutionary holdover from a time when those of high status were more insulated from threat and violence than rank-and-file members of social systems. The basic problem I have with this is that evolution is a primarily biological process that has been misapplied to other things where change over time can be observed, which is often conflated with progress. Computer operating systems have changed over time, but they haven’t evolved in any meaningful sense. (Some would snark they haven’t progressed, either.) Status is an artifact of culture, and to speak of cultural evolution is to introduce distortions into one’s thinking. Insulation from threat is still operative, even though we live in a supposedly egalitarian society. For instance, high-ranking military officers and corporate executives are pretty well removed from flying bullets and grunt work experienced by foot soldiers and laborers. Some of high rank even have security details to block any possibility of exposure to the rabble. It’s obvious that some are valued more than others.

Because democratic societies flatten social hierarchies, one can essentially behave like an alpha male (or female) continuously, and few will challenge the bluff. First, conformity is not the same as equality. It’s pretty clear that while we have a narrow range of narratives provided to explain and describe modern life, we are certainly not over hero worship or raising a few individuals to the level of idols. Athletes, actors, singers, politicians, and anyone with a modicum of fame develop a kind of floating aura that confers status over the plebes. A few people of extraordinary confidence can probably carry off the bluff necessary to establish high rank where none has actually been earned or awarded, but the rest of us would look egotistical and absurd and would likely be ignored.

The only people who don’t care about status are those who have never experienced high status and accordingly diminish what others value. Status is a social construct, like many others, and may have some biological origins (like pecking orders within wolf and dog packs). However, among humans, social ordering doesn’t typically take root until adolescence. Most of us who attended public school are familiar with the fairly radical shift in social orientation from grade school, with its homogeneous homeroom communities of around 30 students, to middle school or junior high, where students quickly autosort themselves into cliques. A top dog (or several) often emerges, leaving some riding on coattails and others totally disenfranchised. It’s pretty intense at that stage of life, but once through the gauntlet, status typically returns to background noise as one joins the flow of adult life. Those who become fame whores and seek to perpetuate their status are ironically the “cool kids” who experience a sense of loss when others realize it’s mostly a bunch of nonsense. The rest of us just grow out of it.

Hipsters are infuriating because they’ve figured out how to hack the system and obtain status markers too easily. This assumes everyone wants or admires status, which I find suspect, and relies on envy of those who supposedly have status. However, hipster credibility is primarily a commercial behavior, e.g., wearing the right clothing or eating at the right restaurants to induce others to imitate the same behaviors. Marketers employ “cool hunters” to identify and reinforce emergent trends, so manipulation of what’s cool is at work and fluctuates constantly as things rise and fall in popularity. It’s worth noting that post-ironic chic is an assiduous repudiation of style quite popular among hipsters.

No one of high status is allowed to question the dominant narrative, and anyone who does is stripped of status. This one I agree with, but as suggested above, this is really one example of the intersection between conformity and status. It’s rare to see someone who is highly regarded turn around and impeach the very thing from which he or she benefits. For instance, someone might refuse an award out of humility or to make a political statement, but the Oscar acceptance speech we will never hear is the one where the winner laughs at the audience and The Academy for buying into the notion that the Oscar (or any one of a million other benefits heaped on celebrities for being celebrated — an incestuously self-reinforcing cycle if ever there was one) means anything.

Frumpy Old Men

Posted: June 13, 2011 in Cinema, Idle Nonsense, Narrative

Hollywood has long had a problem with its leading men and women aging to the point that it no longer knows quite what to do with them or indeed has much use for them. The problem is much worse for women than men, as women are usually limited to playing grande dames and grandmothers, as though their lives in old age (past the, um, MILF or hotness years, unless you happen to be Helen Mirren) are defined solely by their offspring. Men sometimes get to play elder statesmen, tycoons, or wizened patriarchs, and even less frequently men of action (Patrick Stewart and Ian McKellen as two battling old dudes in X-Men (or is that too old dudes?), or McKellen again in The Lord of the Rings). In today’s cinema, though, they’re increasingly tasked with playing frumpy old men lost in the world, in some misguided attempt at cinéma vérité meant to reflect a wider cultural issue: people lose relevance and intrinsic interest with age and are finally reduced to biding their time waiting for the reaper. If movie characters don’t die in the course of the story or aren’t introduced as being terminal (The Bucket List), they almost always have a death scare and a trip to the hospital. That’s what old people are good for in movies: death gestures. But this isn’t a case of Hollywood storytelling dealing with the issue responsibly; it’s really just the injection of cheap drama without the bother of developing worthy antagonists.

With Meet the Parents, Meet the Fockers, Little Fockers, and now Everybody’s Fine (it could have been titled Everybody’s Goode, since the main character is named “Frank Goode”; how obvious is that?), Robert De Niro seems to have finally evolved from playing high-strung cops, mobsters, and psychopaths to playing a frumpy old dude on daily medication. Jack Nicholson could handle being cast against type in About Schmidt, and Clint Eastwood made his lonely, abandoned character in Gran Torino abundantly entertaining, but De Niro is badly miscast in Everybody’s Fine. His entire prior screen persona is about overwhelming others with intensity, and he fares poorly playing someone diminished in retirement, widowed, and ignored by his own children. In fact, very little about this character-driven movie was entertaining at all, as it was largely an extended portrayal of people mistreating and lying to De Niro as Frank. The supposed catharsis at the end rang false because even though all the children were caught in their lies, their father’s bland unmasking of their deceit cost them nothing, so there was no dramatic tension to defuse and no apparent contrition, just a meaningless Hallmark moment around the near-death bed. In fact, the movie was so lackluster that one wonders why it was made at all. And why bother blogging about it?

Germany has decided to decommission its nuclear power plants by 2022. In this WaPo article, the decision is reported with econometric dispassion, like a business story. That’s one way to approach the subject. In a related WaPo editorial, the decision is openly called a blunder, an overreaction to the Fukushima disaster. (No mention is made of Germany’s prior experience with contamination from Chernobyl fallout.) The principal editorial objection comes in the form of an implicit question: if not through nuclear energy, how will Germany get its electrical power? Although the German public and its leaders can certainly imagine it, I guess American journalists at The Washington Post simply can’t fathom a low-energy future and therefore regard nuclear power as the best or maybe a necessary alternative to burning fossil fuels, because of, ya know, that nasty carbon problem contributing to global warming.

The debate about how energy is generated and used will undoubtedly heat up further as power eventually and inevitably gets so expensive it becomes a losing proposition. Germany has decided that, considering the risks involved and despite its being old technology by now, nuclear energy is no longer tenable. What emerges from this is a classic confrontation over which future must be forestalled: one of austerity, where energy abundance ebbs away, or one of unavoidable catastrophic technological failures born of hubris and desperation. It should be obvious that these options aren’t mutually exclusive.

For a different, more circumspect consideration of Germany’s decision to give up its nukes, see this article at openDemocracy. The author, Holger Nehring, attempts to construct a cultural narrative about Germany’s decades-long relationship with nuclear energy and other destructive technologies, but in the process, he peers into the minds of individuals, as opposed to the wider culture, and falls prey to a version of the intentional fallacy. His article also reads as though he has leapt upon the recent news as an opportunity to dump his doctoral dissertation or a book he’s preparing into a news article, where the plethora of references and complexity of themes probably don’t belong. In truth, I’m very sympathetic to the sort of analysis he appears to be undertaking, but I don’t find what he’s argued very cogent, nor do those offering commentary at the website.

Nonetheless, I’m heartened by the indication that some people are heeding history lessons, abandoning futile projects, and adopting a post-materialist agenda with stronger social concerns than economic ones. As J.H. Kunstler says with some frequency, our immediate challenge is to figure out how to manage economic contraction, not to restart the perpetual growth machine.

I’ve been blogging for a little over five years, which is a long time by blogging standards. The obvious exceptions are the big blogs with multiple writers, lots of traffic, and financial support. I’ve no intention yet of hanging it up, unlike my cohorts over at Creative Destruction. This entry is my 250th post, which in five years probably isn’t a lot. But then, I lavish attention on my entries. This blog has attracted 411 comments thus far, including my own, and about 40,000 hits or views. The daily hits have edged up a bit and normally run between 20 and 50, though it’s not uncommon to spike above 60 for reasons I can’t anticipate or fathom. How many of those hits are spambots is unclear, but with 250 entries, the search engines have picked up phrases I use and direct traffic my way. The top posts, which are about refuse, skyscrapers, and darkened skies, haven’t changed.

I activated controls for feedback and various networking sites a while back, but other than collecting a few “likes” and a few votes, no one has used the networking controls (at least as far as I can tell). Of course, I don’t tweet and don’t have a Facebook account, so it was just an experiment to see what would happen, which predictably enough was nothing. I tried one poll and got not even a single vote.

I’ve used the same WordPress theme since the start, but I decided it’s time to change it. I’m tired of looking at the same color scheme. So I chose a theme called Greyzed, which has one feature I really like: the number of comments appears at the top right of each post in a call-out bubble. I also like that the published date is right below the title of each entry rather than below the text. I wanted something with inverted text color (not black on white but white on gray or something darker) to avoid too much light glaring from the screen. None of the WordPress themes have quite the right combination of color scheme, layout, and customization without paying for features, so I compromised in the interest of something different.

I’ve mentioned a few times in various posts that although I’m a doomer, I can’t blog all the time about the awfulness I expect to come. It’s too much like staring into the sun. I continue to mention the collapse of industrial civilization a lot and even have some lengthy diatribes. In fact, most of what I have to say consists of complaints about things that appear egregiously wrong to me. So the tag line "Are you climbing or descending?" and the original choice of the spiral staircase as a metaphor for improvement (cyclical ascent) or decay (descent) have been overtaken by forbidding expectations for human history.

I got carried away in a face-to-face exchange recently and brought too much ammunition to the debate, which resulted in my blindsided interlocutor shutting down and dismissing me rather than suffer the conclusions my arguments required. I clearly overwhelmed her, which is good sport on talk TV, though overwhelming someone is usually accomplished more by being louder and more insistent than by being convincing, and in retrospect, I felt some regret at being such a needless bully.

The episode made me think about how I can sometimes be insufferable, not because I’m in error necessarily (though that’s always possible) but because I’m prone to turn over rocks others would rather leave unexamined. As further ideas cascaded out of my head, I quickly hit upon three terrible truly true truths that typically cause others to dislike or even hate me for pointing them out. Under normal circumstances, I have the good sense not to shove truth in others’ faces. But I’m easily excited by ideas in an abstract way and can become intemperate. With all that in mind, I will unburden myself of the first terrible truly true truth and leave the other two unrevealed as too controversial. The first is uncontroversial yet is among the most enduring problems of philosophy: the foreknowledge of our own deaths, or what is sometimes called the human condition. Rest assured, I don’t have an exhaustive treatment. Just a few comments.

Fear of death is a sort of existential dread peculiar to humans, who alone among the animals (or so we think) can conceptualize time, escape the eternal present, and project forward to our own nonexistence. Death ranks as one of our greatest fears, but unlike the others, it offers nothing to project the fear onto or to rebel against. So we tend to put it out of mind or sweep it under the rug. Most of us are spared detailed knowledge of the time and manner of our demise, but as death approaches in advanced age, I’ve witnessed how some people begin to get more comfortable with it, losing the dread, perhaps even finally welcoming the inevitable.

In other cultures and other times, death probably did not haunt people through their days quite like it does now. For instance, those who obtain meat by slaughtering animals are far more intimately acquainted with death than those of us who get meat in hygienic, cellophane-wrapped packaging at the grocery. Medical practitioners are also undoubtedly keenly aware of human mortality. Like many truths these days, however, death for many of us is stowed away and marginalized as some far-off news event that happens to other people, and even when it appears in our personal lives in, say, the loss of a family member, our grieving process is markedly dysfunctional.

My expectation is that the day may soon come when overpopulation begins to rebalance and carts will traverse the streets manned by public workers calling out “Bring out yer dead!” This isn’t a prediction of The Rapture, the rollover of the Mayan calendar, or some other end-of-times scenario. Rather, it’s a simple extrapolation of the likely effects of the decreasing availability of energy to run civilization, coupled with disruptions including climate change and ecological collapse. This, by the way, is not one of the three terrible truly true truths, though perhaps it ought to be.