Archive for July, 2008

The Great Cracker Controversy

Posted: July 28, 2008 in Religion, Tacky

I learned about The Great Cracker Controversy, or Crackerclysm, after it had already faded, but since this stuff is ongoing in religious and public life, I don’t mind coming a little late to the party. A bit of background first:

It seems some hapless fellow (Webster Cook) at the University of Central Florida was witnessed absconding with a communion wafer instead of ingesting it per the Catholic sacrament and now faces impeachment as a member of the student senate. He claims he wanted to show the communion wafer to a friend but failed to consider that it had become a sacred object and had to be consumed immediately (according to Catholic doctrine). As things spun out of control, the Catholic League denounced him and he received numerous death threats. PZ Myers, a biology professor who writes a snarky science blog called Pharyngula, picked up the thread and added his own log to the fire by announcing his intention to desecrate the host, which is after all only a cracker (according to atheist doctrine). Myers drew his own death threats but made good on his promise to offend the sacred cracker.

In this shitstorm, there is no lack of bad behavior on all sides. Cook’s initial action may have been innocent, but he refused to return the Eucharist when asked. Students, student government, and Catholics of all sorts piled on with varying levels of intensity. What is it, BTW, with issuing death threats? Do Catholics seriously believe the offense deserves death and that their cowardly threats magically transform into some sort of fatwa? Then PZ Myers fans the flames, making sport of it all. I suspect he is entirely correct not to take the death threats seriously. And how stupid do you have to be to issue a threat via e-mail or in a comment on a website? Similarly, many of the comments on Pharyngula say things such as “I pray for your soul” and “may god grant what you so richly deserve.” Is it normal to wield prayer and invoke one’s deity as a rhetorical weapon?

Considering that Myers is a college professor, one might expect him to exercise some restraint. But as with so many pundits and bloggers and media whores, he finds it more entertaining to deride his targets than to treat them with compassion. This paragraph was particularly nasty:

I think if I were truly evil, I would have to demand that all of my acolytes be celibate, but would turn a blind eye to any sexual depravities they might commit. If I wanted to be an evil hypocrite, I’d drape myself in expensive jeweled robes and live in an ornate palace while telling all my followers that poverty is a virtue. If I wanted to commit world-class evil, I’d undermine efforts at family planning by the poor, especially if I could simultaneously enable the spread of deadly diseases. And if I wanted to be so evil that I would commit a devastating crime against the whole of the human race, twisting the minds of children into ignorance and hatred, I would be promoting the indoctrination of religion in children’s upbringing, and fomenting hatred against anyone who dared speak out in defiance.

I actually agree with this paragraph, but within the context of the cracker controversy, this isn’t educating or even tweaking — it’s a full-on attack against the head of the Catholic Church.

Religious leaders have enjoyed centuries of deference. In the last decade, a growing number of atheists have written books, made speeches, and otherwise challenged the automatic pass religion receives. It goes too far, though, when those challenges are childish taunts.

Darkened Skies

Posted: July 24, 2008 in Philosophy, Science

The Hubble telescope has helped cosmologists establish that the universe is expanding at an accelerating rate, which goes against the theory of cyclical expansion and contraction of the universe leading to multiple big bangs. (The cyclical-universe theory had nearly hardened into dogma among those of us of a certain age.) Popular science magazines and university academic departments have been all over this new development for a few years now, and a few are even beginning to question its implications. One such implication is that we humans are living during a lucky sliver of cosmological time when evidence of the universe around us is available — visible even to the naked eye in the form of the night sky.

current night sky

However, as the universe continues to expand over the eons, stars and galaxies will recede from view and the skies will go dark, except of course for our own sun, which will long since have swollen into a red giant, scorched the Earth to a cinder, and eventually burned out. There will also be a supergalaxy formed from the collision of the Milky Way and Andromeda galaxies, but otherwise, nothing else will be visible. This slideshow by Scientific American offers a glimpse of what once was, what is, and what’s to come for the vastly aged universe. (The pictures are oversaturated, as almost all photos of the cosmos are these days, presumably to add unnecessary punch and appeal to something that is already pretty awe-inspiring.)

A fuller telling of this story in Scientific American is here. In short, as the universe expands, it will eventually erase the evidence of its own existence, and anyone or anything that persists on Earth will be part of an island galaxy surrounded by an almost infinite void — a discredited theory from the early 1900s that will ironically come to be true in cosmological time.


Lingua Nova 01

Posted: July 22, 2008 in Nomenclature

Or more simply, new lingo. The rapidity with which new words, language, terminology, jargon, lexicon, patois, nomenclature, argot, idiom, slang, coinage, what-have-you are created never ceases to amaze me. It was reported recently that the English language is nearing 1 million words — a big round milestone with no real meaning or value. As with the list above, many of them are tautologies with no useful distinction from readily available words. Consider the many different terms available to express the idea of fast: rapid, quick, accelerated, speedy, hurried, swift, alacritous, brisk, expeditious, breakneck, celeritous, hasty, fleet, and precipitous. Some have distinct nuances; others are mere repetition or useless variation or meaningless noise or heedless padding or … well, you get the idea.

It was a surprise for me to learn that an authority (self-appointed, I wonder?) exists for counting words: The Global Language Monitor. The Oxford English Dictionary calls itself the definitive record of the English language. Undoubtedly someone has to decide when something becomes a legitimate word. The OED publishes a quarterly new word list, which includes some entries that are already in heavy popular use. For instance, subprime was only just admitted to the language, at least in the official sense.

Other dictionaries make their own determinations, of course, and I suspect it’s pointless to argue over which authority is correct when the new word lists don’t match up. The spell-checking features of MS Word and WordPress are certainly overmatched by this post. I don’t even dare wade into the dangerous waters of what qualifies as a legal Scrabble play. According to this article, Merriam-Webster admitted more than 100 new entries in its new Collegiate Dictionary. Among them are racino, a racetrack at which slot machines are available for gamblers; pescatarian, a vegetarian whose diet includes fish; and mondegreen, a word or phrase that results from a mishearing of something said or sung. I was charmed by the etymology behind mondegreen, but pescatarian seems to me useless hairsplitting and racino is just way too hip.

Based purely on personal preference, there are plenty of new words (or perhaps mere usages, if not yet officially admitted to the language) to which I’ll never submit. For instance, I’ll never use webinar to describe a seminar broadcast over the web or staycation to describe a stay-at-home vacation (usually one taken because of financial constraints). On the other hand, I’m prone to recover and use archaic or clever vocabulary no one uses, such as fescennine to describe something scurrilous or obscene and lethean to mean forgetful or lost to oblivion. In doing so, of course, I risk sounding not like the erudite gentleman I am but like a verbose asshole, but them’s the breaks.

New Philanthropy

Posted: July 14, 2008 in Debate, Economics, Environment, Science

The article at this link presents a point of view explaining why we don’t need to worry about global warming right away. If I understand the premise correctly, the author believes that the direction trends are leading is no cause for alarm because, even if the worst-case scenario is granted, we (or more accurately, our children and grandchildren) will be better equipped to respond then for two reasons: they’ll be so much better off economically than we are now, and they’ll have at their disposal technological developments we can’t even dream of. So adapting to any scenario will be far cheaper and more effective for them than for us to act now to forestall negative outcomes, which will afflict them far worse than us. I didn’t read far into the comments, but they appear to be divided. Number 6 makes sense to me; in fact, it resonates with truth, unlike the article, which is fatuous and self-serving. Even if the article presents a hopeful counterbalance to the relentless doomer view, my sense that we’re in deep doo-doo has not been allayed.

I’ve learned in the past couple of years, for instance, that the marine ecosystem is being systematically destroyed. First, there’s all that floating plastic. Second, there’s the fishing industry, which, among other things, has developed its practices to the point that it now vacuums the ocean floor and drags nets up to a mile long, discarding overcatch (other marine life killed in the process — collateral damage, if you wish) in both operations like slag from some strip-mining process rather than living creatures. Third, the phytoplankton population is collapsing, which, because phytoplankton sit at the bottom of the food chain, spells the demise of most of the rest of marine life higher up. By some estimates, there will be no such thing as wild ocean fish by midcentury. Those fish that survive will be corralled within marine farms and fed by corporations that later harvest them to sell as food. The oceans will be dead; we’re murdering them.

Considering that two-thirds of our planet’s surface is water, removing, harvesting, or destroying life from a functioning ecosystem is not something to which we’re likely to be able to adapt by, for instance, forgoing tuna. Material processes trump any abstract human institution, economics included. When it’s gone, it will be gone permanently. The fantasy (see comment 6) of endless economic growth making problems far simpler to solve in the future is nice in the abstract, but it doesn’t square with reality.

The bizarre idea that we could simply wait to address problems could be extrapolated ad absurdum to mean that we never have to act to fix anything now because it will always be easier to do so in some conceivable future. How, then, to account for the fact that among the superrich, the idea of philanthropy has shifted from constructing self-exalting monuments and institutions (buildings, schools, etc., named after themselves) to fixing the world’s problems? Donald Trump is certainly from the old self-aggrandizing school. The first philanthropist to move instead toward being a fixer was probably Ted Turner, when he pledged $1 billion toward U.N. causes. Richard Branson later pledged $3 billion toward solving the problems of climate change. Warren Buffett decided a couple of years ago to donate $37 billion worth of shares in his firm, Berkshire Hathaway, to five charitable foundations. And just this month, Bill Gates retired from Microsoft to work full-time for the Bill & Melinda Gates Foundation, which specializes in global health and education projects.

Maybe these guys are proof of the premise that it is possible to address our problems via the support of the superrich rather than with public funds and that government can therefore bank on “free market” solutions to our ailments. If Ted or Richard or Warren or Bill & Melinda don’t find something pressing enough to be fixed, well then, maybe it isn’t all that pressing after all. And if the philanthropic emphasis has begun shifting from fine arts, universities, and affirmative action to general health, education, and environmental concerns, will that mean that in the U.S., which already steadfastly underfunds public institutions such as museums, libraries, Amtrak, and NPR, all of which deliver tremendous value to the public, those institutions will be starved out of existence?

Sooner rather than later, we’re going to face a wholesale reevaluation of what we want to keep and what we want to jettison from the so-called American way of life. I saw projections recently that gasoline will rise to $7 per gallon by 2010 (which is too far off — I predict it will happen sooner), which would cause the share of income spent on fuel by someone now earning $25k per year to rise from 7% to 20% just to function within our current transportation model. Those projections assume, I think, that food, heating, cooling, manufacturing, and other costs married to the price of energy won’t similarly spike. The cost of the military would spike, too, considering that the U.S. military is among the largest fuel consumers in the country. Before it all unravels, I hope to have an ethical response worked out. In the meantime, I hope it’s enough to recognize and acknowledge our problems and expose the insanity of arguments such as the one linked above.

Donovan’s Brain

Posted: July 12, 2008 in Consciousness, Philosophy, Science

This passage from E.O. Wilson’s book Consilience piqued my interest:

Without the stimulus and guidance of emotion, rational thought slows and disintegrates. The rational mind does not float above the irrational; it cannot free itself to engage in pure reason. There are pure theorems in mathematics but no pure thoughts that discover them. In the brain-in-the-vat fantasy of neurobiological theory and science fiction, the organ in its nutrient bath has been detached from the impediments of the body and liberated to explore the inner universe of the mind. But that is not what would ensue in reality. All the evidence from the brain sciences points in the opposite direction, to a waiting coffin-bound hell of the wakened dead, where the remembered and imagined world decays until chaos mercifully grants oblivion.

Wilson does a good job synthesizing consciousness studies and taking a stab at describing the physical and psychological bases of the mind. This paragraph, though, stuck out like a sore thumb from the chapter and book, in part because he calls back to the midcentury notion of disembodied brains controlling the world and in part because he waxes poetic in his dismissal of that idea. (Wish I could write so well ….) I vaguely remember reading Donovan’s Brain back in my youth. There is even a Wikipedia entry for the novel, which was made into a movie at least twice, one with Nancy Reagan. More familiar to the TV generation is the same basic story idea that found its way into a Star Trek episode called “Spock’s Brain.” The terseness of both titles is a humorous curiosity.

The larger point, I guess, is that the brain’s various routines, which operate in parallel to construct consciousness, are inseverable from the body’s sensory inputs and from the emotional context used to create narrative. As such, it’s important to recognize that there is no divine Truth or Objectivity out there beyond our grasp or perception, even though we may sometimes hope for such things. (We adopt a posture of objectivity to reduce the distortions of emotion and irrational thinking, but that’s not the same thing.) There is a physical reality that is somewhat discontinuous with our perception of everyday “reality.” Such limitations are built into our physiology, just as we lack the eagle’s acute vision or the dog’s acute sense of smell.

If the brain can’t exist without the body, how much of the body can be missing before the brain fails to function? It’s a strange question, and the answer is quite a lot. None of the extremities are essential, nor are most of the sensory organs (ears, eyes, tongue, nose) — at least separately. Few examples exist of individuals deprived of multiple senses, but Helen Keller is an obvious one. Her blindness and deafness (not from birth but from an illness at nineteen months) delayed her cognitive development until she broke through the symbolic barrier and developed language later in childhood. Over 70 cases of deafblindness are known, some congenital and others acquired. If an example exists of a fully developed mind later deprived of all sensory experience, it would be curious to know how long a person would stay rational before the descent into madness begins.