Archive for April, 2011

Persistence Hunting

Posted: April 30, 2011 in Environment, History, Nomenclature

Idle conjecture on two subjects crops up continually in the blogs I frequent: the hunter-gatherer lifestyle and mechanisms of competition among and within species. Beyond the basic frameworks, arguments tend to proceed with astonishing confidence that by virtue of a mere thought experiment (often extrapolation from some scientific report itself based on conjecture) we can uncover and know the details of how hunter-gatherers lived and thought, or how certain human behaviors confer advantages that natural selection then amplifies over time, both today and in antiquity. I remain spectacularly unconvinced by most arguments offered by novices on these subjects, who typically wade in armed with more hubris than evidence. Some ideas manage to be plausible and exciting enough not to be discarded out of hand, but they remain theories rather than facts proven or adopted by consensus.

So I was especially intrigued by a story in Outside Magazine about persistence hunting, in which hunters don’t outrun their prey in short bursts so much as wear it down over a period of hours until it collapses and is easy to finish off. Follow the link if you want additional details in unnecessarily protracted form, but long story short, the folks at Outside set up a trial in New Mexico to see whether marathoners and other endurance runners could actually outlast a pronghorn antelope on the run and hunt it down. The runners succeeded, though they didn’t dispatch the animal at the end.

David Attenborough has a YouTube video on the same subject featuring San bushmen and an African kudu:

To say the very least, I’m impressed that persistence hunting is doable, not just an idle thought experiment, and awed by the physical prowess of the runners. How prevalent this hunting technique was back in the Stone Age is still an open question, but as modern-day practitioners demonstrate, it’s definitely not for the faint of heart.

We’re all familiar with the term false modesty, where someone feigns humility but really has none. Usually the faker knows he’s faking, but not always. (The best liars lie best to themselves.) Is there a mirror term for people who believe they possess skills they never really acquired? One might suggest fool or idiot, but those are too generic. I ask because there is a growing body of tech that simulates or replicates some skill, leading users to believe (falsely) that they actually possess it. I doubt someone would attempt to fly a plane solely on the strength of time spent in a flight simulator, much less a flight simulator video game. The risks are obviously too high. However, I can well imagine someone taking that risk with an automobile, confident that time riding as a passenger or driving a virtual car in a video game substitutes adequately for actual instruction and development of the driver’s feel for the vehicle in motion. No doubt they would be right some nontrivial percentage of the time.

If simulators were once intended as instructional aids, they crossed over at some point into being games and ends in themselves, meaning that skill development is no longer required to do the job. This is part of a larger trend toward a fully engineered environment in which the knowledge, expertise, and attention formerly required to perform tasks are mediated by tech that does some large portion of the work for you (presumably to free up your mind for other things, but more likely just turning your head into sawdust). Cooking from scratch is a skill; cooking from a boxed mix is a false one, often not even cooking so much as heating, like “baking” a frozen pizza. Another example is GarageBand for the iPad, software that simulates playing the guitar. The user/player doesn’t have to own an instrument or know beans about music to record tracks and imagine him- or herself an accomplished musician. No need to put in the time to acquire actual skills, much less good taste. The disconnect between simulation and skill has been demonstrated many times over (or so I hear) by players of Guitar Hero and later Rock Band, who get good at the game and then say to each other, “Hey, let’s form a band!” only to discover that playing instruments is a far sight different from playing their game counterparts. But hey, if you can trick yourself into believing you have skills you don’t, well, what’s the harm?

Modern life is replete with tech assists that obviate skill. Calculators and spell checkers are two obvious, mundane examples. (Spell checkers haven’t yet solved the homophone problem.) Grammar checkers and writing templates work even less well. But don’t tell the attorneys, whose writing is often full of flaws. They are required to take numerous writing classes over the course of their law school careers, and they typically graduate with the inflated sense that they’re highly competent writers. Fine, let ’em believe whatever they want. But the same pattern repeats across all disciplines: students who rely too easily on tech to simulate skills they will undoubtedly need but never really develop, and who then find themselves later in life in situations for which they’re wholly unprepared. Had they cheated their way through school, they might at least know when they’re out of their depth. But as the products of esteem-building exercises that now pass as education, they often don’t realize when they’re lost at sea, or if they do, they don’t know why. It’s not simple overconfidence; it’s false hubris, or some better term I haven’t yet discovered.

Update: I didn’t search for it, but I knew it was out there. Only one day after posting the blog above, I learned of MasterWriter, a software program that helps anyone imagine him- or herself a poet, lyricist, or novelist. This is the blurb accompanying the software:

Masterwriter is simply the most powerful collection of writing tools ever assembled in one program. Whether you’re writing a song, a poem, or a novel, MasterWriter will unlock all the English language has to offer, to help you express yourself in a more unique and meaningful way.

I don’t doubt that writers can use the software advantageously, much like one would use a dictionary or thesaurus, but the launching point for being a writer is having something worthwhile to say and then crafting one’s ideas. It’s not a matter of mere word choice, and software does not substitute for ideas. Ugh.

Dino Walk

Posted: April 22, 2011 in Idle Nonsense, Technophilia

This YouTube video is rather startling:

There is a reason why the entire room is screaming continuously: they’re scared! And they ought to be. It took me a little while to spot the human legs in the middle of the dinosaur, which has probably been rescaled to fit the operator. We tend to focus on the dino’s eyes and extrapolate the direction of its gaze.

I can’t help wondering, though, whether it isn’t on some level extremely foolish to have kids petting fake dinosaurs and having footraces with them. Sure, it’s titillating and exciting, but it also inhibits our natural response to large animals, which is fear and wariness. The effects of desensitization have been obvious for years: people who want to feed bears or pet tigers or swim with sharks. (OK, swimming with sharks is probably less dangerous than we’re led to believe.) They take risks of profound stupidity, like jumping into the lion habitat at the zoo to get a cell phone video with their new buddies. Exhibits and environments that teach insensitivity to risk reliably snare fools, those who depend wholly on the engineering of such environments to keep them safe. “Well, they wouldn’t build it unless it’s safe, would they?” says the idiot jumping up and down on the Grand Canyon Skywalk.

Not generally, no. But inevitably, people are injured or killed in titillating exhibits, such as roller coasters and other thrill rides. That risk is probably a lot more reasonable than getting up close and intimate with animals possessing the unpredictable power to kill us.

Inscrutable Enemies

Posted: April 18, 2011 in Artistry, Cinema, Tacky

I made the mistake of watching Battle: LA. Save yourselves the trouble: it’s an utter waste of effort. I have complained in the past about CGI festivals, films whose primary purpose is to showcase technical effects. Similar movies include Black Hawk Down, District 9, Children of Men, and Independence Day. The last three in that modest list at least have stories on which to hang their prolonged battle scenes, but Battle: LA and Black Hawk Down are nearly all battle, as if to say that the drama of men fighting and dying but always eventually triumphing is by itself enough to green-light a movie. A rather formidable list of films could easily be assembled, set in the Middle East, Afghanistan, and elsewhere, that tread the same well-worn ground: blind commitment to more firepower and victory over an inscrutable enemy. I wouldn’t say that such films explore, examine, or develop these themes much except perhaps accidentally. (They lack even the knowing irony of Starship Troopers, which was regrettably lost in the sequels, a classic case of the underlying message of Heinlein’s novel being coopted and rendered impotent by cinematic noise.) Rather, jingoistic portrayal of American fighting men as heroes is the rule, and moral ambivalence about the justness, pointlessness, or waste of war rarely appears. I can think of a few exceptions.

The inscrutable enemy is what most interests me, though. Alien invasion is a common enough theme, but it’s interesting that aliens are almost always humanoid, meaning they are mirrors or reflections of ourselves. (Steven Spielberg, I’m lookin’ at you!) After all, who can really work up indignation at the advance of mindless, soulless pea pods, corn stalks, bugs, or even blobs, whose only motivation is growth? No, we need enemies with faces and limbs to blow off, especially if they are sufficiently different to keep us from recognizing ourselves fully in them.

This calls to mind Walt Kelly’s famous quote (and 1972 book title): “We have met the enemy and he is us.”

These movies all rely on dehumanizing our enemies so that we’re anesthetized to the horrors we inflict on them, much like we regard the terrorists we torture as subhuman. Yet underneath it all, we must know that we’re always really just warring against ourselves, the most inscrutable enemy. In the modern age, we’re also warring against the power complexes and technologies that have reduced human identity to being cogs in machines and processes we serve rather than being served by them. This last theme is explored repeatedly and continuously at The Compulsive Explainer, which repetition is necessary because it’s a subtle idea not easily grasped. I don’t recall the word misanthropy being used there, but I’ll use it here to observe that our obsessive film subjects, expressions of the Zeitgeist, frequently revolve around depictions of mayhem and destruction, which are at root self-destruction. This is true because at our core, we hate ourselves and the world we’ve made.

Collective Nouns

Posted: April 14, 2011 in Idle Nonsense, Nomenclature, Writing

Neologisms and archaic usage have always intrigued me. While I’m slow to adopt every hip new coinage that finds its way into use, my vocabulary is nonetheless always expanding, which I believe confers greater expressive and emotive power, even if that power is lost on most readers and interlocutors. Indeed, finding a young punk teenager with whom to have a reasonable conversation today is pretty difficult, not just because youngsters substitute attitude for knowing (probably true of youngsters from any era) but because their vocabularies are disastrously small.

Some of the more interesting examples of colorful vocabulary are collective nouns (sometimes called collective plurals, nouns of assemblage, or nouns of multitude) used to identify a group or plurality of things, ideas, concepts, or animals. Most English speakers are familiar with at least a few of the collective nouns for animals, or terms of venery, some of the more colorful ones being listed below:

  • a sleuth of bears
  • a murder of crows
  • a troop of monkeys
  • a clowder of cats
  • a leap of leopards
  • a lodge of beavers

Although usually mistaken for mere turns of phrase, some familiar collective nouns for things and concepts are listed below:

  • a wad of bills
  • a wealth of information
  • a fleet of airplanes
  • a bouquet of flowers
  • a bevy of beauties
  • an embarrassment of riches
  • a deck of cards
  • a body of knowledge
  • a pack of lies
  • a sheaf of papers
  • a course of bricks

Fairly expansive lists of collective nouns are not difficult to find on the Web, and I was surprised to learn from the Wikipedia article that many clever writers have purposely attempted to coin new collective nouns. In The Chronicles of Thomas Covenant, which I’ve been reading off and on for a few years, Stephen Donaldson frequently takes imaginative poetic license suggestive of collective nouns, usually referring to human emotion. With Donaldson’s phraseology in mind, I started expanding his usage with my own, some of which aren’t strictly plurals. If any of these ever make it into common use, I’d be surprised.

  • a wilderness of despair (Donaldson)
  • a ruin of hopes
  • an accusation of injustices
  • a chorus of freedoms
  • an apocalypse of hate (Donaldson)
  • a gruel of hatred
  • a wisp of souls
  • a coil of mortality (or mortals)
  • a mien of meanings
  • a fund of techniques
  • a trial of initiatives

Review: Babies

Posted: April 13, 2011 in Artistry, Cinema, Culture, Idle Nonsense

A movie trailer functions not as a synopsis, digest, or distillation but as an advertisement for the film from which it is drawn. As such, trailers often make films appear far more tantalizing than they turn out to be. My initial assessment based on a trailer doesn’t usually lead me astray, but from time to time, I’ll get suckered into a movie that is either pointless or just plain bad. (Bad that crosses over into inspired camp is a rare special case.) I had seen the trailer for the French documentary Babies and knew immediately that I was not part of its target audience. The inborn emotional response most people feel toward babies is absent in me. I don’t find them adorable or irresistible or desirable, nor do I enjoy holding them (based on those few occasions when someone foisted one into my arms, totally insensitive to my not having asked). They’re just extremely young people to me.

So I happened to spy the DVD on the shelf at the library, and despite my self-acknowledged indifference, I decided to check it out (literally).


This is the second of two posts (the first is here) where I delayed blogging about an idea long enough that someone else wrote it first — and probably better than I could. I have no pretensions about being either a journalist or academic, nor do I spend my days writing for a living. Accordingly, what criticism I offer is of the armchair variety.

This time, two writers offered their take on the so-called happy chapter that concludes most works of social criticism: David Greenberg in The New York Times wrote “Why Last Chapters Disappoint” and Morris Berman at his blog Dark Ages America (see blogroll) wrote “No Exit” (the second citing the first, as it happens). Greenberg notes that every work of serious criticism seems to conclude with the happy chapter, and he uses Walter Lippmann’s Public Opinion (1922) as his first example:

Lippmann’s experience will be familiar to almost anyone who has written a book aspiring to analyze a social or political problem. Practically every example of that genre, no matter how shrewd or rich its survey of the question at hand, finishes with an obligatory prescription that is utopian, banal, unhelpful or out of tune with the rest of the book. When it comes to social criticism, no one, it seems, has an exit strategy.

Greenberg goes on to identify notables Allan Bloom, Al Gore, Upton Sinclair, Eric Schlosser, Daniel Boorstin, and Robert Putnam, all of whom fall prey to the impulse to offer solutions to the problems they identify. It’s so obvious a response to lodging complaints that it’s virtually a formal requirement. The alternative is to bitch and moan and then admit “but whatcha gonna do?” Berman adds journalist Chris Hedges to the list but is relatively graceful about why “pulling a rabbit out of the hat at the eleventh hour” is inevitable.

Maybe it’s worthwhile to wax philosophic for a bit, to observe that life is frequently about loss and leads eventually to death. How men and women face intractable difficulty and deal with death stalking them all their lives are frequent themes in philosophy and fiction. Whereas one sometimes triumphs over the former, one always loses to the latter.

One type of man with whom we’re morbidly fascinated these days, the superhero and/or the superspy, barely registers pain (physical or psychic, though it’s becoming fashionable to explore inner conflict) and typically emerges at the end of his story remarkably free of any haunting memories of the mayhem inflicted to finally vanquish the baddie(s) and protect/win the girl. (Strangely, the few female heroes are often hardened even more than the men and usually don’t immediately activate their libidos in celebration of victory. Instead, they reject everyone and begin brooding in preparation for the sequels.) After a fashion, it’s another form of the happy chapter, this time as an epilogue.

Closer to reality, however, some of us feel loss keenly as the culture and world around us stumble toward oblivion, and a few of us possess the integrity to recognize and state honestly that there’s nothing much to be done, really, but watch and witness. We’re along for the ride, as it were, as the institutions of our own making drive history without anyone actually being in the driver’s seat.

Update: I just finished rereading Allan Bloom’s book, and there’s no happy chapter or paragraphs at the end. Why he is included in Greenberg’s list of authors who unaccountably shift their critical gaze is unclear.

I sent this link to an online discussion group in which I participate (never mentioned here before) along with the brief comment below.

I wrote:

It’s not quite over yet, but the brief era of manned space flight is drawing to a close. Perhaps we’re finally ready to admit that everything else out there is just too far away to be of more than theoretical interest to us and the cost of maintaining human life in space is too great. The glory of our empire in space was the stuff of dreams for a while, before we agreed to share it. Now it’s nearly over with.

The responses were excoriating. I don’t think those folks would appreciate being quoted here, so I’ll withhold, but I was raked pretty heavily. I was called incredibly pessimistic and nihilistic and accused of dancing on the grave of something not yet dead. I was also schooled on the history and motivation behind the space program (as though I knew nothing of them) and told flatly that it’s not over, that distance and cost are just numbers. After all, we’re explorers.

Oddly enough, my initial remark was meant to be wistful, not damning. I was simply observing that the romance is over, that the frontier is finally closed. But I wasn’t interpreted that way at all. Frankly, I’m a little surprised I sparked such a uniform response. Who knew that Kennedy-era idealism about manned space flight and exploration still burns so hotly? Turns out I unwittingly gored an ox that is still to many an idol.

For a far more comprehensive description of what may happen next with the U.S. space program, see “NASA’s Course Correction” at The New Atlantis.