Archive for the ‘Health’ Category

From the not-really-surprising-news category comes a New Scientist report earlier this month that the entire world was irradiated by follow-on effects of the Fukushima disaster. Perhaps it’s exactly as the article states: the equivalent of one X-ray. I can’t know with certainty, nor can bupkis be done about it by the typical Earth inhabitant (or the atypical inhabitant, I might add). Also earlier this month, a tunnel collapse at the Dept. of Energy’s Hanford nuclear waste storage site in Washington State gave everyone a start regarding a possible release of radiation nearby. Similar to Fukushima, I judge there is little by way of trust regarding accurate news or disclosure and fuck all anyone can do about any of it.

I’m far too convinced of collapse by now to worry too much about these Tinkerbells, knowing full well that what’s to come will be worse by many orders of magnitude when the firecrackers start popping due to inaction and inevitability. Could be years or decades away still; but as with other aspects of collapse, who knows precisely when? Risky energy plant operations and nuclear waste disposal issues promise to be with us for a very long time indeed. Makes it astonishing to think that we plunged full-steam ahead without realistic (i.e., politically acceptable) plans to contain the problems before creating them. Further, nuclear power is still not economically viable without substantial government subsidy. The likelihood of abandoning this technological boondoggle seems pretty remote, though perhaps not as remote as the prospect of paying the enormous expense of decommissioning all the sites currently operating.

These newsbits and events also reminded me of the despair I felt in 1986 on the heels of the Chernobyl disaster. Maybe in hindsight it’s not such a horrible thing to cede entire districts to nature for a period of several hundred years as what some have called exclusion or sacrifice zones. Absent human presence, such regions demonstrate remarkable resilience and profundity in a relatively short time. Still, it boggles the mind, doesn’t it, to think of two exclusion zones now, Chernobyl and Fukushima, where no one should go until, at the very least, several half-lives of the radioactive contamination have passed? Interestingly, that light at the end of the tunnel, so to speak, seems to be telescoping even farther away from the date of the disaster, a somewhat predictable shifting of the goalposts. I’d conjecture that’s because contamination has not yet ceased and is actually ongoing, but again, what do I know?

On a lighter note, all this also put me in mind of the hardiness of various foodstuffs. God knows we consume loads of crap that can hardly be called food anymore, from shelf-stable fruit juices and bakery items (e.g., Twinkies) that never go bad to not-cheese used by Taco Bell and nearly every burger joint in existence to McDonald’s burgers and fries that refuse to spoil even when left out for months to test that very thing. It gives me considerable pause to consider that foodstuff half-lives have been radically and unnaturally extended by creating abominable Frankenfoods that beggar the imagination. For example, strawberries and tomatoes used to be known to spoil rather quickly and thus couldn’t withstand long supply lines from farm to table; nor were they available year round. Rather sensibly, people grew their own when they could. Today’s fruits and veggies still spoil, but interventions undertaken to extend their stability have frequently come at the expense of taste and nutrition. Organic and heirloom markets have sprung up to fill those niches, which suggests the true cost of growing and distributing everyday foods that will not survive a nuclear holocaust.

I picked up a copy of Daniel Siegel’s book Mind: A Journey to the Heart of Being Human (2017) to read and supplement my ongoing preoccupation with human consciousness. Siegel’s writing is the source of considerable frustration. Now about 90 pp. into the book (I am considering putting it aside), he has committed several grammatical errors (where are book editors these days?), doesn’t really know how to use a comma properly, and doesn’t write in recognizable paragraph form. He has a bad habit of posing questions to suggest the answers he wants to give and drops constant hints of something soon to be explored like news broadcasts that tease the next segment. He also deploys a tired, worn metaphor that readers are on a journey of discovery with him, embarked on a path, exploring a subject, etc. Yecch. (A couple Amazon reviews also note that grayish type on parchment (cream) paper poses a legibility problem due to poor contrast even in good light — undoubtedly not really Siegel’s fault.)

Siegel’s writing is also irritatingly circular, casting and recasting the same sentences in repetitious series of assertions that have me wondering frequently, “Haven’t I already read this?” Here are a couple examples:

When energy flows inside your body, can you sense its movement, how it changes moment by moment?

then only three sentences later

Energy, and energy-as-information, can be felt in your mental experience as it emerges moment by moment. [p. 52]

Another example:

Seeing these many facets of mind as emergent properties of energy and information flow helps link the inner and inter aspect of mind seamlessly.

then later in the same paragraph

In other words, mind seen this way could be in what seems like two places at once as inner and inter are part of one interconnected, undivided system. [p. 53]

This is definitely a bug, not a feature. I suspect the book could easily be condensed from 330 pp. to less than 200 pp. if the writing weren’t so self-indulgent. Indeed, while I recognize a healthy dose of repetition is an integral part of narrative form (especially in music), Siegel’s relentless repetition feels like propaganda 101, where guileless insistence (of lies or merely the preferred story one seeks to plant in the public sphere) wears down the reader rather than convinces him or her. This is also marketing 101 (e.g., Coca-Cola, McDonald’s, Budweiser, etc. continuing to advertise what are by now exceedingly well-established brands).


Once in a while, a comment sticks with me and requires additional response, typically in the form of a new post. This is one of those comments. I wasn’t glib in my initial reply, but I thought it was inadequate. When looking for something more specific about Neil Postman, I found Janet Sternberg’s presentation called Neil Postman’s Advice on How to Live the Rest of Your Life (link to PDF). The 22 recommendations that form Postman’s final lecture given to his students read like aphorisms and the supporting paragraphs are largely comical, but they nonetheless suggest ways of coping with the post-truth world. Postman developed this list before Stephen Colbert had coined the term truthiness. I am listing only the recommendations and withholding additional comment, though there is plenty to reinforce or dispute. See what you think.

  1. Do not go to live in California.
  2. Do not watch TV news shows or read any tabloid newspapers.
  3. Do not read any books by people who think of themselves as “futurists,”
    such as Alvin Toffler.
  4. Do not become a jogger. If you are one, stop immediately.
  5. If you are married, stay married.
  6. If you are a man, get married as soon as possible. If you are a woman,
    you need not be in a hurry.
  7. Establish as many regular routines as possible.
  8. Avoid multiple and simultaneous changes in your personal life.
  9. Remember: It is more likely than not that as you get older you will get
    dumber.
  10. Keep your opinions to a minimum.
  11. Carefully limit the information input you will allow.
  12. Seek significance in your work, friends, and family, where potency and
    output are still possible.
  13. Read’s Law: Do not trust any group larger than a squad, that is, about
    a dozen.
  14. With exceptions to be noted further ahead, avoid whenever possible
    reading anything written after 1900.
  15. Confine yourself, wherever possible, to music written prior to 1850.
  16. Weingartner’s Law: 95% of everything is nonsense.
  17. Truman’s Law: Under no circumstances ever vote for a Republican.
  18. Take religion more seriously than you have.
  19. Divest yourself of your belief in the magical powers of numbers.
  20. Once a year, read a book by authors like George Orwell, E.B. White, or
    Bertrand Russell.
  21. Santha Rama Rau’s Law: Patriotism is a squalid emotion.
  22. Josephson’s Law: New is rotten.

Acid Added

Posted: February 11, 2016 in Health, Taste

I traveled to Europe recently, which I haven’t done in a couple decades, and was reminded immediately of a danger that tends to go unacknowledged: the reduction of foreign lands and peoples to a series of clichés or stereotypes. Tourist guides and websites reinforce the effect. This tendency may be forgivable with respect to food, considering that one has multiple meals per day, which thus occupy a significant portion of one’s time and attention abroad. My rediscovery of truly fresh-baked bread (given the superior European tradition of daily shopping for bakery goods, I suspect that the propionic acid and sodium propionate used as preservatives in American bread and other baked goods were not present) called to mind a book recommended at Gin and Tacos (see blogroll) entitled Combat-Ready Kitchen: How the U.S. Military Shapes the Way You Eat by Anastacia Marx de Salcedo. Although obvious perhaps in hindsight, it was surprising to learn (via book blurbs and recommendations) that demand for foods that could withstand transport times to combat troops without spoilage was a principal driver of innovation in food processing technologies, which have been further refined over the decades by Big Ag.

Here’s a good example: peeled and skinless tangerines and mandarin oranges used in salads are passed through steam or hot water at about 90°C for 2–3 minutes to loosen the peel and make it easier to separate from the segments. Segments are then separated and the segmental membrane is removed by chemical treatment — meaning it’s dissolved in an acid solution, which is neutralized in turn with an alkaline solution. This is described in greater detail in expired U.S. Patent No. 4,294,861, entitled “Method of Separating and Taking Out Pulp from Citrus Fruits.” Here’s the abstract:

A method and an apparatus for processing citrus fruits into a drink, which is the juice of the fruits containing separate juice vesicles, or sacs, of the pulp, by cutting the fruits into pieces and directing jets of a fluid against the cut surfaces, thereby separating and forcing the pulp in the form of separate sacs away from the peel and segmental membrane of the fruit pieces.

Of course, citric acid is naturally occurring in, well, citrus fruits. (Citric acid also makes for a surprisingly potent cleaning agent.) But I find it more than a little ooky to treat foods in acid baths, or to add acids to ingested foods (is there another kind?) as preservatives. Admittedly, all sorts of acids are present naturally in foods: malic acid in apples and cherries; tartaric acid in grapes, pineapples, potatoes, and carrots; acetic acid in vinegar; oxalic acid in cocoa and pepper; tannic acid in tea and coffee; and benzoic acid in cranberries, prunes, and plums. Less natural but wholly familiar to typical Murricans, corrosive phosphoric acid (also known as orthophosphoric acid) is used as an acidifying agent in soft drinks (which also contain relatively harmless carbonic acid) and jams to provide a tangy flavor. Otherwise, the syrup/sugar content alone would be enough to make one vomit. Fumaric acid is also used in noncarbonated soft drinks.

Maybe none of these rise to the level of universal acid that eats through everything, including stomach linings, or to sulfuric acid found in batteries (or more simply, battery acid). Still, our food is nonetheless suffused in acids, and the idea of adding more to bakery goods to make them shelf stable may account for why European bakery goods made for that day only are so far superior to most American bakery goods able to sit in one’s breadbox almost indefinitely. Cue the periodic newsbit about a McDonald’s meal allowed to sit out for some extended period of time (often years) without spoiling in the least.

I remember that sinking feeling when the Deepwater Horizon oil well blew out in April 2010 and gushed oil into the Gulf of Mexico for 87 days at an estimated rate of 62,000 barrels per day (9,900 m³/d) until it was reportedly capped (but may not have been fully contained). That feeling was more intense than the disgust I felt at discovering the existence of the Great Pacific Garbage Patch (and subsequently others in the Atlantic and Indian Oceans). For reasons that make no particular sense, slo-mo ecological disasters in the oceans didn’t sicken me as much as high-speed despoliation of the Gulf. More recently, I’ve been at a loss, unable to process things, actually, at two new high-speed calamities: the contaminated tap water flowing from public waterworks in Flint, MI, and the methane leak from an underground well in the Porter Ranch neighborhood of Los Angeles, CA (no links provided, search for yourself). Whereas the first two examples turned my stomach at the mere knowledge, the second two are quite literally sickening people.

These examples could be part of a daily diet of stomach-churning news if I had the nerve to gather further examples. Indeed, the doomer sites I frequent at intervals (no longer daily) gather them together for me. As with the examples above, many are industrial chemical spills and contamination; others are animal and plant populations dying en masse (e.g., bee colony collapse disorder); yet others are severe weather events (e.g., the California drought) on the rise due to the onset of climate change (which has yet to go nonlinear). Miserable examples keep piling up, yet business as usual continues while it can. Death tolls are difficult to assess, but at present, they appear to be impacting nonhuman species with greater ferocity. Some characterize this as Mother Nature doing her necessary work by gradually removing the plant and animal species on which humans depend as the apex predator. That means eventually removing us, too. I don’t care for such a romantic anthropomorphism. Rather, I observe that we humans are doing damage to the natural world and to ourselves in perhaps the slowest slo-mo disaster, the most likely endpoint being near-term extinction.

As much, then, as the alarm has been sounding adequately with respect to high-speed disasters stemming from human greed, incompetence, and frailty, I find that even worse calamity awaiting us has yet to penetrate the popular mind. Admittedly, it’s awfully hard to get one’s head around: the extinction of the human species. Those who resign themselves to speaking the truth of inevitability are still characterized as kooks, wackos, conspiracy mongers, and worse, leaders of death cults. From my resigned side of the fence, proper characterization appears to be the very opposite: those who actively ruin nature for profit and power are the death cult leaders, while those who prophesy doom are merely run-of-the-mill Cassandras. The ranks of the latter, BTW, seem to be gaining while critical thought still exists in small, isolated oases.

This xkcd comic bugged me when I first saw it, but I didn’t give it too much thought at first because its dismissive approach to media is quite familiar and a bit tiresome:

On reflection, however, and in combination with other nonsense I’ve been reading, the irksome joke/not joke hasn’t faded from my thinking. So I’ll be very unhip and respond without the appropriate ironic detachment that modern life demands of us, where everything is cool and chill and like, dude, whatever ….


Sitting in Cars

Posted: November 29, 2014 in Culture, Health, Idle Nonsense

I’m fortunate to spend minimal time in my car. I’ve logged fewer than 5,000 miles in each of the last 5 years or more, but I still find owning a vehicle indispensable for some of my activities. So when I’m out and about, it’s more likely that I’m walking on the sidewalk, pedaling my bicycle, or riding public transportation. Each has its own dynamic, but I have been noticing lately that there are a surprising number of people sitting in their cars, engines running. The bicycle in particular requires hypervigilance on my part so that someone doesn’t open the car door into my path, and as I ride by I take notice of an unexpectedly large number of occupants of cars going nowhere.

Because of the durations involved, my sense is that car sitters aren’t in the process of entering and exiting; maybe they’re waiting on another person. There is almost always a smartphone or tablet active in front of their faces. Sometimes, they are sitting and smoking, listening to the radio, or talking on the phone. Nonetheless, the cabin of an automobile strikes me as an odd place to hang out (and burn fuel).

I can’t profess to understand fully what’s at work here, but it’s clearly not as exceptional as I might have believed. My conjecture is that the interior of the car represents a defined personal space or refuge. It may not be entering a cocoon or reentering the womb exactly, but both comparisons spring to mind. I note, too, that walling oneself off from the external world, however temporarily, and exerting total control over the immediate environment may provide a fleeting sense of security, privacy, and wholeness lost in the wider world we inhabit. I can’t say if a growing number of automobile hermits are using their cars to regain personal equilibrium before rejoining and confronting the world anew every day. Perhaps one of my readers can provide an explanation or point to some research.

“Human beings are ashamed to have been born instead of made.” – Günther Anders

For a fertile mind, nearly everything is a potential metaphor or microcosm for something else. It gets tiresome, really. Still, I couldn’t help but reflect on this post at On an Overgrown Path as a particularly on-point example of what I’ve been working out over numerous blog posts, namely, that our discontentment over being human, with its inherent limitations, is boiling over. Case in point: music is now routinely given a slick, post-production shove toward hyperreality. That assertion is probably not clear to anyone wandering into The Spiral Staircase without the benefit of prior familiarity with my themes, so let me unpack it a bit.

The essence of the linked blog post above is that media have altered musical perspective (e.g., stage perspective, podium perspective, audience perspective, stereo hifi perspective, and in- or over-ear perspective) to such a degree that acoustics developed intuitively over generations (and hardened into convention) to enhance natural sound must now be supplanted by subtle (or not so subtle) amplification and digital processing to satisfy a generation that may never have stepped inside a concert hall and is instead acculturated to the isolating, degraded sound of earbuds and headphones playing back mp3s. Reorienting concert soundscapes and recordings to model immersive, inside-the-head experience (VR tricks the eye in a similar fashion) is promulgated as inevitable if music presenters wish to attract new generations of concertgoers and thus retain audiences. The blogger follows up later with another post entitled “Technology Reveals Information but Annuls Perception,” which appears to be in conflict with his earlier contentions. (He also dismisses my corrective comment, but no matter.)

I don’t really care much about audience building or the business and marketing aspects of music; others can attend to those concerns. However, the engineering and construction of virtual space, head space, and/or hyperreality, proceeding in slow, incremental steps, is of grave concern to me. We are turning our backs on the world (the body and the sensorium) and fleeing into our heads and ideation. How fully does the gradual disappearance of natural sound in the ear (namely, wearing earbuds 24/7) signify the dire condition of humanity? Impossible to quantify, of course, but considering how omnipresent technology retrains attention and focus away from the environment toward itself in the form of playback devices and handheld screens, I would say that to be part of the modern world means agreeing to be media (and consumer) slaves. Furthermore, the faux reality found there is edited and distorted to achieve maximum impact in minimal time, but the result is overstimulation giving way to catatonia.

When I was a boy, I felt the shut-down reflex in response to the venerable three-ring circus that came to town periodically: too much everything, so ultimately very little or nothing. The same overkill aesthetic is true now of most media, which are saturated with blinkered content to rivet attention — a bubbling, pseudo-glamorous effervescence — but nonetheless fail to register on stripped-out senses. I can think of no better example than events where amplified sound is bone-crushingly loud, i.e., battering the small, conductive bones of the middle ear, leaving unprotected listeners’ ears ringing temporarily, and over time, damaging hearing permanently. The sheer volume has the effect of isolating everyone (alone in a crowd) and reducing them to voiceless, gesticulating grunts. For example, I have attended concerts (indoor and outdoor), dance clubs, wedding receptions, and fundraisers where the sound level was well above the 85 dB at which sustained exposure causes hearing loss, yet people just stand there and take it. The disconnect from reality and failure to react to the aural onslaught (by leaving or putting in earplugs) is astonishing. There is no sane reason to believe such conditions are enlivening and inevitable, yet those are in fact fashionable behaviors and recommendations.

Admittedly, destroying one’s ears is not the same as wrecking concert hall acoustics or recording perspective, but they are part and parcel of the same underlying mentality: a discontentment with human limitation. Cinema is headed in the same direction with gimmicky use of CGI and eye-popping camera effects that deliver views and perspectives that have lost all relation with mundane reality. The desire to transcend the banal is terrific when guided by a seasoned aesthetic. When motivated by boredom or shame at our inability to be superhuman, well, that’s something quite different.

I am, as usual, late getting to the latest controversy in academe, which has been argued to death before I even became aware of it. Inside Higher Ed appears to have gotten there first, followed by editorials at The New York Times, The Los Angeles Times, and The Washington Post. At issue are trigger warnings, a neologism for what might otherwise be called parental advisories (thinking in loco parentis here), to be placed in syllabi and on classroom materials, at first for fiction reading but potentially for history lessons (and frankly, just about anything else), that might trigger a panic attack or some other dolorous response from a student with a phobia or memory of a traumatic experience. The opinion articles linked above (Inside Higher Ed is more purely reporting) are all in agreement that trigger warnings are a bad idea.

Although articles in news organs are more nearly broadcasting and thus lack discussion (unless one ventures into the swamp of the comments section, which I rarely do), I indulged in a long discussion of the subject with fellow alumni of one of the institutions named in the reports. As with other issues, it developed so many facets that a snapshot determination became impossible if one attempted to accommodate or address all perspectives. Therein lies the problem: accommodation. Left-leaning liberals are especially prone to hypersensitivity to identity politics, which gained prominence in the late 1970s or early 80s. I quickly run afoul of anyone who takes such a perspective because I am notoriously white, male, well-educated, and middle class, so I must constantly “check my privilege.” When someone like me refuses others accommodation, it looks to others like raising the ladder behind me after I’ve safely ascended. I can appreciate, I think, how frustrating it must be to have one’s earnestness thwarted, and yet I admit I just don’t get it. At the risk of offending (trigger warning here), let me blunder ahead anyway.

The world (or as I’m beginning to call it more simply, reality) is a messy place, and each of us inevitably carries some phobia, trauma, or history that is unsavory. From one celebrated perspective, what doesn’t kill us makes us stronger; from another, we are trained to request accommodation. Accommodation used to be primarily for physical disabilities; now it’s for feelings, which some argue are just as debilitating. This is the province of every self-selected minority and special interest group, which has spawned predictable backlashes among various majority groups (e.g., the men’s movement, resurgent white supremacy). Naturally, any lobby, whether part of a minority or majority, will seek to promote its agenda, but I regard the brouhaha over trigger warnings as an example of the growing incidence of what’s been called the Strawberry Generation. It’s remarkable that students now regard themselves as dainty flowers in need of special protection lest they be trampled by, well, reality. So trigger warnings are being requested by students, not on their behalf. With so many examples throughout even recent history of flagrant social injustice and oppression, it’s clear that everyone wants to proclaim their special combination of disadvantages and receive accommodation, all the better if multiplied by inclusion in several protected classes. It’s a claim of victimhood before the fact or perhaps permanent victimhood if one is a survivor of some nastiness. (Disclaimer: real oppression and victimhood do exist, which I don’t intend to minimize, but they’re not because of reading fiction or learning history, scandalous as they may be.)

In addition, what exactly is accomplished by virtue of warnings that one is about to encounter — what should it be called — messy material? Does one steel oneself against impact and thus limit its educational value, or does one expect to be excused from facing reality and receive an alternative assignment minus the offending material? Both are the antithesis of higher education. Arguments in the abstract are easy to ignore, so here are two specific examples: substitution or elimination of the words nigger and injun in modernized editions of Mark Twain’s Adventures of Huckleberry Finn and biology textbooks that give consideration to (literally) unscientific accounts of creation and evolution. If one’s racial or religious background gives rise to excess discomfort over the use of one or another egregious trigger word (nigger in particular now having been reclaimed and repurposed for all sorts of uses but with caveats) borne out of ripe historical context or what science (as opposed to religion) teaches, well, that’s the world (reality) we live in. Sanitizing education to avoid discomfort (or worse) does no one any favors. Discomfort and earnest questioning are inevitable if one is to learn anything worthwhile in the course of getting an education.

I have warned and been warned plenty about how the weird wired world is remapping our brains, our minds, and our consciousness in ways we don’t yet understand and probably never will, considering how the target keeps moving, eluding our attempts to uncover its inner nature. So it came as no surprise to me, at least, that, as reported in The Telegraph, there are now toddlers so deep into their iPads they require therapy. The epidemic of screenheads and pixelbrains has tracked all the way down to 3- and 4-year-olds:

Psychiatrists estimate that the number of people who have become digitally dependent has risen by 30 per cent over the past three years.

A survey last week revealed that more than half of parents allowed their babies to play with their phone or tablet device.

One in seven of more than 1,000 parents questioned by babies.co.uk website admitted that they let them use the gadgets for four or more hours a day.

Um, what’s with the one-sentence paragraphs at The Telegraph? I also find it a little strange how the report describes the problem in terms of addiction. Lumping more and more things into that category (alcohol, drugs, smoking, and gambling being among the mainstays, but sex, porn, gaming, and now iPads?) does no one any credit but does transform the problem into something done to a person rather than something one does to him- or herself. This is certainly the case with toddlers, who really have no responsibility for what happens to them. Their agency is quite limited; parents and caregivers are the ones who set up the kids to need therapy by providing them inappropriate devices as toys/pacifiers. The iPad babysitter is not the equivalent of handing toddlers real guns with which to play, but logical effects do appear to manifest within a relatively short time frame. So parents are pulling a trigger with a delayed effect, not unlike poor diet and hygiene, which are child endangerment issues sufficient to remove the children from their parents’ care.

If a cult deprogramming style of intervention becomes necessary, which is being blithely called “digital detox,” toddlers may have to be weaned, but parents should be sat down in a circle to have overwhelming pressure applied until their characters break down and they can be taught some parenting skills. Indeed, this is an example of why misanthropes suggest that procreation should be limited, not open to anyone with the right functional biological parts. That restriction will never come to pass, of course, but something ought to happen to adults who ruin their own children, unwittingly or not.

That, too, will never happen, in part because the postmodern world is in a long-term project to put everything behind glass: first the glass of the microscope and telescope, then the glass of the film projector and television, then the glass of the computer screen, smartphone, e-reader, tablet, and virtual reality headset. No small thing, then, that Google calls its eyeglass-mounted display the Google Glass. Corning had a disgusting promotional video some time back about glass countertops and appliance façades with computer displays behind them. I suspect such devices are mere baby steps until engineers figure out how to do holographic displays and bioengineers provide the data feed directly into the nervous system — the creepy Google Implant. Despite claims that these developments are about making the virtual world real, I insist it’s really about our retreat into fantasy, where the pixelated objects of our desire are antiseptic and pure, unlike, say, the messy, nauseating fecundity of biology, where plants and animals kill to eat, poop, and decay after death. It’s transhumanism carried beyond wishful thinking, and the toddlers in the linked news report above show that we won’t tolerate living without access to the feed (while it lasts), even if it proves harmful.