Archive for the ‘Health’ Category

Once in a while, a comment sticks with me and requires additional response, typically in the form of a new post. This is one of those comments. I wasn’t glib in my initial reply, but I thought it was inadequate. When looking for something more specific about Neil Postman, I found Janet Sternberg’s presentation called Neil Postman’s Advice on How to Live the Rest of Your Life (link to PDF). The 22 recommendations that form Postman’s final lecture given to his students read like aphorisms and the supporting paragraphs are largely comical, but they nonetheless suggest ways of coping with the post-truth world. Postman developed this list before Stephen Colbert had coined the term truthiness. I am listing only the recommendations and withholding additional comment, though there is plenty to reinforce or dispute. See what you think.

  1. Do not go to live in California.
  2. Do not watch TV news shows or read any tabloid newspapers.
  3. Do not read any books by people who think of themselves as “futurists,”
    such as Alvin Toffler.
  4. Do not become a jogger. If you are one, stop immediately.
  5. If you are married, stay married.
  6. If you are a man, get married as soon as possible. If you are a woman,
    you need not be in a hurry.
  7. Establish as many regular routines as possible.
  8. Avoid multiple and simultaneous changes in your personal life.
  9. Remember: It is more likely than not that as you get older you will get
    dumber.
  10. Keep your opinions to a minimum.
  11. Carefully limit the information input you will allow.
  12. Seek significance in your work, friends, and family, where potency and
    output are still possible.
  13. Read’s Law: Do not trust any group larger than a squad, that is, about
    a dozen.
  14. With exceptions to be noted further ahead, avoid whenever possible
    reading anything written after 1900.
  15. Confine yourself, wherever possible, to music written prior to 1850.
  16. Weingartner’s Law: 95% of everything is nonsense.
  17. Truman’s Law: Under no circumstances ever vote for a Republican.
  18. Take religion more seriously than you have.
  19. Divest yourself of your belief in the magical powers of numbers.
  20. Once a year, read a book by authors like George Orwell, E.B. White, or
    Bertrand Russell.
  21. Santha Rama Rau’s Law: Patriotism is a squalid emotion.
  22. Josephson’s Law: New is rotten.

Acid Added

Posted: February 11, 2016 in Health, Taste

I traveled to Europe recently, which I haven’t done in a couple decades, and was reminded immediately of a danger that tends to go unacknowledged: the reduction of foreign lands and peoples to a series of clichés or stereotypes. Tourist guides and websites reinforce the effect. This tendency may be forgivable with respect to food, considering that one has multiple meals per day, which thus occupy a significant portion of one’s time and attention abroad. My rediscovery of truly fresh-baked bread (given the superior European tradition of daily shopping for bakery goods, I suspect that the propionic acid and sodium propionate used as preservatives in American bread and other baked goods were not present) called to mind a book recommended at Gin and Tacos (see blogroll) entitled Combat-Ready Kitchen: How the U.S. Military Shapes the Way You Eat by Anastacia Marx de Salcedo. Although obvious perhaps in hindsight, it was surprising to learn (via book blurbs and recommendations) that demand for foods that could withstand transport times to combat troops without spoilage was a principal driver of innovation in food processing technologies, which have been further refined over the decades by Big Ag.

Here’s a good example: peeled and skinless tangerines and mandarin oranges used in salads are passed through steam or hot water at about 90°C for 2–3 minutes to loosen the peel and make it easier to separate from the segments. Segments are then separated and the segmental membrane is removed by chemical treatment — meaning it’s dissolved in an acid solution, which is neutralized in turn with an alkaline solution. This is described in greater detail in expired U.S. Patent No. 4,294,861 entitled “Method of Separating and Taking Out Pulp from Citrus Fruits.” Here’s the abstract:

A method and an apparatus for processing citrus fruits into a drink, which is the juice of the fruits containing separate juice vesicles, or sacs, of the pulp, by cutting the fruits into pieces and directing jets of a fluid against the cut surfaces, thereby separating and forcing the pulp in the form of separate sacs away from the peel and segmental membrane of the fruit pieces.

Of course, citric acid is naturally occurring in, well, citrus fruits. (Citric acid also makes for a surprisingly potent cleaning agent.) But I find it more than a little ooky to treat foods in acid baths, or to add acids to ingested foods (is there another kind?) as preservatives. Admittedly, all sorts of acids are present naturally in foods: malic acid in apples and cherries; tartaric acid in grapes, pineapples, potatoes, and carrots; acetic acid in vinegar; oxalic acid in cocoa and pepper; tannic acid in tea and coffee; and benzoic acid in cranberries, prunes, and plums. Less natural but wholly familiar to typical Murricans, corrosive phosphoric acid (also known as orthophosphoric acid) is used as an acidifying agent in soft drinks (which also contain relatively harmless carbonic acid) and jams to provide a tangy flavor. Otherwise, the syrup/sugar content alone would be enough to make one vomit. Fumaric acid is also used in noncarbonated soft drinks.

Maybe none of these rise to the level of a universal acid that eats through everything, including stomach linings, or to the sulfuric acid found in batteries (or more simply, battery acid). Still, our food is nonetheless suffused with acids, and the idea of adding more to bakery goods to make them shelf stable may account for why European bakery goods made for that day only are so far superior to most American bakery goods able to sit in one’s breadbox almost indefinitely. Cue the periodic newsbit about a McDonald’s meal allowed to sit out for some extended period of time (often years) without spoiling in the least.

I remember that sinking feeling when the Deepwater Horizon oil well blew out in April 2010 and gushed oil into the Gulf of Mexico for 87 days at an estimated rate of 62,000 barrels per day (9,900 m³/d) until it was reportedly capped (but may not have been fully contained). That feeling was more intense than the disgust I felt at discovering the existence of the Great Pacific Garbage Patch (and subsequently others in the Atlantic and Indian Oceans). For reasons that make no particular sense, slo-mo ecological disasters in the oceans didn’t sicken me as much as the high-speed despoliation of the Gulf. More recently, I’ve been at a loss, unable to process things, actually, at two new high-speed calamities: the contaminated tap water flowing from public waterworks in Flint, MI, and the methane leak from an underground well in the Porter Ranch neighborhood of Los Angeles, CA (no links provided, search for yourself). Whereas the first two examples turned my stomach at the mere knowledge of them, the second two are quite literally sickening people.

These examples could be part of a daily diet of stomach-churning news if I had the nerve to gather further examples. Indeed, the doomer sites I habituate at intervals (no longer daily) gather them together for me. As with the examples above, many are industrial chemical spills and contamination; others are animal and plant populations dying en masse (e.g., bee colony collapse disorder); yet others are severe weather events (e.g., the California drought) on the rise due to the onset of climate change (which has yet to go nonlinear). Miserable examples keep piling up, yet business as usual continues while it can. Death tolls are difficult to assess, but thus far the impacts appear to fall on nonhuman species with greater ferocity. Some characterize this as Mother Nature doing her necessary work by gradually removing the plant and animal species on which humans, the apex predator, depend. That means eventually removing us, too. I don’t care for such a romantic anthropomorphism. Rather, I observe that we humans are doing damage to the natural world and to ourselves in perhaps the slowest slo-mo disaster, the most likely endpoint being near-term extinction.

As much, then, as the alarm has been sounding adequately with respect to high-speed disasters stemming from human greed, incompetence, and frailty, I find that the even worse calamity awaiting us has yet to penetrate the popular mind. Admittedly, it’s awfully hard to get one’s head around: the extinction of the human species. Those who resign themselves to speaking the truth of inevitability are still characterized as kooks, wackos, conspiracy mongers, and worse, leaders of death cults. From my resigned side of the fence, the proper characterization appears to be the very opposite: those who actively ruin nature for profit and power are the death cult leaders, while those who prophesy doom are merely run-of-the-mill Cassandras. The ranks of the latter, BTW, seem to be growing, while critical thought still survives in small, isolated oases.

This xkcd comic bugged me when I first saw it, but I didn’t give it much thought because its dismissive approach to media is quite familiar and a bit tiresome:

On reflection, however, and in combination with other nonsense I’ve been reading, the irksome joke/not joke hasn’t faded from my thinking. So I’ll be very unhip and respond without the appropriate ironic detachment that modern life demands of us, where everything is cool and chill and like, dude, whatever ….


Sitting in Cars

Posted: November 29, 2014 in Culture, Health, Idle Nonsense

I’m fortunate to spend minimal time in my car. I’ve logged fewer than 5,000 miles a year for at least the last five years, but I still find owning a vehicle indispensable for some of my activities. So when I’m out and about, it’s more likely that I’m walking on the sidewalk, pedaling my bicycle, or riding public transportation. Each has its own dynamic, but I have been noticing lately that there is a surprising number of people sitting in their cars, engines running. The bicycle in particular requires hypervigilance on my part so that someone doesn’t open a car door into my path, and as I ride by I take notice of an unexpectedly large number of occupants of cars going nowhere.

Because of the durations involved, my sense is that car sitters aren’t in the process of entering and exiting; maybe they’re waiting on another person. There is almost always a smartphone or tablet active in front of their faces. Sometimes, they are sitting and smoking, listening to the radio, or talking on the phone. Nonetheless, the cabin of an automobile strikes me as an odd place to hang out (and burn fuel).

I can’t profess to understand fully what’s at work here, but it’s clearly not as exceptional as I might have believed. My conjecture is that the interior of the car represents a defined personal space or refuge. It may not be entering a cocoon or reentering the womb exactly, but both comparisons spring to mind. I note, too, that walling oneself off from the external world, however temporarily, and exerting total control over the immediate environment may provide a fleeting sense of security, privacy, and wholeness lost in the wider world we inhabit. I can’t say if a growing number of automobile hermits are using their cars to regain personal equilibrium before rejoining and confronting the world anew every day. Perhaps one of my readers can provide an explanation or point to some research.

“Human beings are ashamed to have been born instead of made.” – Günther Anders

For a fertile mind, nearly everything is a potential metaphor or microcosm for something else. It gets tiresome, really. Still, I couldn’t help but reflect on this post at On an Overgrown Path as a particularly on-point example of what I’ve been working out over numerous blog posts, namely, that our discontentment over being human, with its inherent limitations, is boiling over. Case in point: music is now routinely given a slick, post-production shove toward hyperreality. That assertion is probably not clear to anyone wandering into The Spiral Staircase without the benefit of prior familiarity with my themes, so let me unpack it a bit.

The essence of the linked blog post above is that media have altered musical perspective (e.g., stage perspective, podium perspective, audience perspective, stereo hifi perspective, and in- or over-ear perspective) to such a degree that acoustics developed intuitively over generations (and hardened into convention) to enhance natural sound must now be supplanted by subtle (or not so subtle) amplification and digital processing to satisfy a generation that may never have stepped inside a concert hall and is instead acculturated to the isolating, degraded sound of earbuds and headphones playing back mp3s. Reorienting concert soundscapes and recordings to model immersive, inside-the-head experience (VR tricks the eye in a similar fashion) is promulgated as inevitable if music presenters wish to attract new generations of concertgoers and thus retain audiences. The blogger follows up later with another post entitled “Technology Reveals Information but Annuls Perception,” which appears to be in conflict with his earlier contentions. (He also dismisses my corrective comment, but no matter.)

I don’t really care much about audience building or the business and marketing aspects of music; others can attend to those concerns. However, the engineering and construction of virtual space, head space, and/or hyperreality, proceeding in slow, incremental steps, is of grave concern to me. We are turning our backs on the world (the body and the sensorium) and fleeing into our heads and ideation. How fully does the gradual disappearance of natural sound in the ear (namely, wearing earbuds 24/7) signify the dire condition of humanity? Impossible to quantify, of course, but considering how omnipresent technology retrains attention and focus away from the environment toward itself in the form of playback devices and handheld screens, I would say that to be part of the modern world means agreeing to be media (and consumer) slaves. Furthermore, the faux reality found there is edited and distorted to achieve maximum impact in minimal time, but the result is overstimulation giving way to catatonia.

When I was a boy, I felt the shut-down reflex in response to the venerable three-ring circus that came to town periodically: too much everything, so ultimately very little or nothing. The same overkill aesthetic is true now of most media, which are saturated with blinkered content to rivet attention — a bubbling, pseudo-glamorous effervescence — but nonetheless fail to register on stripped-out senses. I can think of no better example than events where amplified sound is bone-crushingly loud, i.e., hammering the small, conductive bones of the middle ear, leaving unprotected listeners’ ears ringing temporarily and, over time, damaging hearing permanently. The sheer volume has the effect of isolating everyone (alone in a crowd) and reducing them to voiceless, gesticulating grunts. For example, I have attended concerts (indoor and outdoor), dance clubs, wedding receptions, and fundraisers where the sound level was well above the 85 dB sufficient to cause hearing loss, yet people just stand there and take it. The disconnect from reality and failure to react to the aural onslaught (by leaving or putting in earplugs) is astonishing. There is no sane reason to believe such conditions are enlivening and inevitable, yet those are in fact fashionable behaviors and recommendations.

Admittedly, destroying one’s ears is not the same as wrecking concert hall acoustics or recording perspective, but they are part and parcel of the same underlying mentality: a discontentment with human limitation. Cinema is going the same direction with gimmicky use of CGI and eye-popping camera effects that deliver views and perspectives that have lost all relation to mundane reality. The desire to transcend the banal is terrific when guided by a seasoned aesthetic. When motivated by boredom or shame at our inability to be superhuman, well, that’s something quite different.

I am, as usual, late getting to the latest controversy in academe, which has been argued to death before I even became aware of it. Inside Higher Ed appears to have gotten there first, followed by editorials at The New York Times, The Los Angeles Times, and The Washington Post. At issue are trigger warnings, a neologism for what might otherwise be called parental advisories (thinking in loco parentis here), to be placed in syllabi and on classroom materials, at first for fiction readings but potentially for history lessons (and frankly, just about anything else), that might trigger a panic attack or some other dolorous response from a student with a phobia or memory of a traumatic experience. The opinion articles linked above (Inside Higher Ed is more purely reporting) are all in agreement that trigger warnings are a bad idea.

Although articles in news organs are more nearly broadcasting and thus lack discussion (unless one ventures into the swamp of the comments section, which I rarely do), I indulged in a long discussion of the subject with fellow alumni of one of the institutions named in the reports. As with other issues, it developed so many facets that a snapshot determination became impossible if one attempted to accommodate or address all perspectives. Therein lies the problem: accommodation. Left-leaning liberals are especially prone to hypersensitivity to identity politics, which gained prominence in the late 1970s or early 80s. I quickly run afoul of anyone who takes such a perspective because I am notoriously white, male, well-educated, and middle class, so I must constantly “check my privilege.” When someone like me refuses others accommodation, it looks to others like raising the ladder behind me after I’ve safely ascended. I can appreciate, I think, how frustrating it must be to have one’s earnestness thwarted, and yet I admit I just don’t get it. At the risk of offending (trigger warning here), let me blunder ahead anyway.

The world (or as I’m beginning to call it more simply, reality) is a messy place, and each of us inevitably carries some phobia, trauma, or history that is unsavory. From one celebrated perspective, what doesn’t kill us makes us stronger; from another, we are trained to request accommodation. Accommodation used to be primarily for physical disabilities; now it’s for feelings, which some argue are just as debilitating. This is the province of every self-selected minority and special interest group, which has spawned predictable backlashes among various majority groups (e.g., the men’s movement, resurgent white supremacy). Naturally, any lobby, whether part of a minority or majority, will seek to promote its agenda, but I regard the brouhaha over trigger warnings as an example of the growing incidence of what’s been called the Strawberry Generation. It’s remarkable that students now regard themselves as dainty flowers in need of special protection lest they be trampled by, well, reality. So trigger warnings are being requested by the students themselves, not by others on their behalf. With so many examples throughout even recent history of flagrant social injustice and oppression, it’s clear that everyone wants to proclaim their special combination of disadvantages and receive accommodation, all the better if multiplied by inclusion in several protected classes. It’s a claim of victimhood before the fact or perhaps permanent victimhood if one is a survivor of some nastiness. (Disclaimer: real oppression and victimhood do exist, which I don’t intend to minimize, but they don’t stem from reading fiction or learning history, scandalous as those may be.)

In addition, what exactly is accomplished by virtue of warnings that one is about to encounter — what should it be called — messy material? Does one steel oneself against impact and thus limit its educational value, or does one expect to be excused from facing reality and receive an alternative assignment minus the offending material? Both are the antithesis of higher education. Arguments in the abstract are easy to ignore, so here are two specific examples: substitution or elimination of the words nigger and injun in modernized editions of Mark Twain’s Adventures of Huckleberry Finn and biology textbooks that give consideration to (literally) unscientific accounts of creation and evolution. If one’s racial or religious background gives rise to excess discomfort over the use of one or another egregious trigger word (nigger in particular now having been reclaimed and repurposed for all sorts of uses but with caveats) borne out of ripe historical context or what science (as opposed to religion) teaches, well, that’s the world (reality) we live in. Sanitizing education to avoid discomfort (or worse) does no one any favors. Discomfort and earnest questioning are inevitable if one is to learn anything worthwhile in the course of getting an education.

I have warned and been warned plenty about how the weird wired world is remapping our brains, our minds, and our consciousness in ways we don’t yet understand and probably never will, considering how the target keeps moving, eluding our attempts to uncover its inner nature. So it came as no surprise to me at least that, as reported in The Telegraph, there are now toddlers so deep into their iPads they require therapy. The epidemic of screenheads and pixelbrains has tracked all the way down to 3- and 4-year-olds:

Psychiatrists estimate that the number of people who have become digitally dependent has risen by 30 per cent over the past three years.

A survey last week revealed that more than half of parents allowed their babies to play with their phone or tablet device.

One in seven of more than 1,000 parents questioned by website admitted that they let them use the gadgets for four or more hours a day.

Um, what’s with the one-sentence paragraphs at The Telegraph? I also find it a little strange how the report describes the problem in terms of addiction. Lumping more and more things into that category (alcohol, drugs, smoking, and gambling being among the mainstays, but sex, porn, gaming, and now iPads?) does no one any credit but does transform the problem into something done to a person rather than something one does to him- or herself. This is certainly the case with toddlers, who really have no responsibility for what happens to them. Their agency is quite limited; parents and caregivers are the ones who set up the kids to need therapy by providing them inappropriate devices as toys/pacifiers. The iPad babysitter is not the equivalent of handing toddlers real guns with which to play, but logical effects do appear to manifest within a relatively short time frame. So parents are pulling a trigger with a delayed effect, not unlike poor diet and hygiene, which are child endangerment issues sufficient to remove the children from their parents’ care.

If a cult deprogramming style of intervention becomes necessary, which is lightly being called “digital detox,” toddlers may have to be weaned, but parents should be sat down in a circle to have overwhelming pressure applied until their characters break down and they can be taught some parenting skills. Indeed, this is an example of why misanthropes suggest that procreation should be limited, not open to anyone with the right functional biological parts. That restriction will never come to pass, of course, but something ought to happen to adults who ruin their own children, unwittingly or not.

That, too, will never happen, in part because the postmodern world is in a long-term project to put everything behind glass: first the glass of the microscope and telescope, then the glass of the film projector and television, then the glass of the computer screen, smartphone, e-reader, tablet, and virtual reality headset. No small thing, then, that Google calls its eyeglass-mounted display the Google Glass. Corning had a disgusting promotional video some time back about glass countertops and appliance façades with computer displays behind them. I suspect such devices are mere baby steps until engineers figure out how to do holographic displays and bioengineers provide the data feed directly into the nervous system — the creepy Google Implant. Despite claims that these developments are about making the virtual world real, I insist it’s really about our retreat into fantasy, where the pixelated objects of our desire are antiseptic and pure, unlike, say, the messy, nauseating fecundity of biology, where plants and animals kill to eat, poop, and decay after death. It’s transhumanism carried beyond wishful thinking, and the toddlers in the linked news report above show that we won’t tolerate living without access to the feed (while it lasts), even if it proves harmful.

Satiety Signals

Posted: April 26, 2012 in Consumerism, Culture, Health, Philosophy

This post is rewritten and expanded from a blog comment I made here.

In human biology, the part of the brain that controls appetite is called the hypothalamus, and it responds to four different hormones: insulin, leptin, CCK, and ghrelin. The first three signal when one has had enough to eat, and the last inhibits the function of the first three. With normal foodstuffs, the satiety signal appears readily enough. However, food manufacturers, through the application of science, can now outwit our hormones so that, for instance, a 32-oz Big Gulp no longer seems like a lot of liquid to drink because the normal “I’m full” signal never reaches the brain even though the liquid reaches the bladder. Fructose and high-fructose corn syrup (HFCS, which is roughly equal parts fructose and glucose), major ingredients in soda, fail in particular to stimulate production of these hormones and are accordingly regarded as toxins or poisons by many dieticians. It might sound conspiratorial to suggest that food makers have purposely substituted HFCS for other sweeteners precisely to forestall the feeling of fullness, but then, no one believed the conspiracy to load cigarettes with addictive nicotine for a long while, either.


Nature Encounter

Posted: October 15, 2011 in Environment, Health, Idle Nonsense

I visited The River Trail Nature Center and grounds recently in Glenview, Illinois, which is part of the larger Forest Preserve District of Cook County serving the City of Chicago and its surrounding suburbs. A similar forest preserve district is located in Lake County just to the north. Cook County maintains an impressive network of public parks and forest preserves, many situated along Lake Michigan, the Chicago River, and the Des Plaines River, though I only infrequently venture into a forest preserve except when I’m driving somewhere in my car and must traverse one. On another recent day, I rode my bike on a path through the Caldwell Woods, which follows the Des Plaines River. I used to ride the lake shore bike path (along Lake Michigan) quite a bit, but it’s in some ways a victim of its own success and now too crowded with people and traffic. Now I ride mostly city streets. Both of these recent experiences made curious impressions on me.

The first felt strangely artificial, as though the woods had been emptied out and sanitized for suburbanites with delicate sensibilities — people who want to be out in nature without truly encountering it. Signage warned (among other things) not to stray off the gravel paths, not to jog, and not to stay out after sunset. Sounds of passing traffic (from just beyond the grounds) intruded into the otherwise serene landscape. The Nature Center itself was a little like a radically abbreviated zoo, with its educational thrust, child-friendliness, and gawker’s sensibility: signs and pictures identified flora and fauna with clinical accuracy but only questionable relevance or intrinsic interest.

The second was a joy. The leaves had changed and begun to fall, which provided welcome sights and smells of the season. It still being Indian summer in Chicago, I raced and sweated through about 16 miles of trail, pausing at street and train crossings. Although the Caldwell Woods are situated within Chicago proper (at the southern end) and have mown grass and asphalt paths, they felt somehow less empty, less artificial than the Glenview forest preserve. I even rode past (a family of?) four deer, who were entirely unperturbed by the bicyclists speeding by.

I have commented to family and friends that Chicago is a concrete city (and brick, glass, and steel). There is precious little lawn space other than in the parks, and whenever I drive beyond the city and suburbs, I always appreciate the sense of leaving behind the city’s population density, self-conscious architecture, and clogged urban transportation infrastructure. So venturing into the woods, even those set aside specifically for our use, felt worthwhile. Still, walking and riding the paths was a far cry from tramping truly wild woods as I did when I was a Boy Scout many years ago.

Estrangement and alienation from nature are frequent themes discussed in many of the books, articles, and blogs I read. Like more than 80% of Americans, I’m an urban dweller. Only 110 years ago, a majority of Americans resided outside of cities, and a large share were agrarian. Connections to nature and its processes were automatic and inevitable; almost no one would have thought to “visit nature” because they already lived there. Similarly, almost no one would have gone to a gym for a workout because their lives were already full of physical activity and labor that kept them fit and active. Combining the two (as with trail riding or golfing in tamed and coiffed versions of nature) is presumably even better. That we now do both and consider it normal and desirable as a retreat from the difficulty of our modern lives is not a marker of healthy adaptation to our environment — for me no less than others. The fact that I could even have such curious impressions in encounters with nature demonstrates, however, how adapted I am to urban life despite being aware of a different sort of world beckoning.