Archive for the ‘Taste’ Category

The movie Gladiator depicts the protagonist Maximus addressing spectators directly at gladiatorial games in the Roman Colosseum with this meme-worthy challenge: “Are you not entertained?” Setting the action in an ancient civilization renowned for its decadent final phase prior to collapse, referred to as Bread and Circuses, allows us to share vicariously in the protagonist’s righteous disgust with the public’s blood lust while shielding us from any implication of our own shame because, after all, who could possibly entertain blood sports in the modern era? Don’t answer that.


But this post isn’t about our capacity for cruelty and barbarism. Rather, it’s about the public’s insatiable appetite for spectacle — both fictional and absolutely for real — served up as entertainment. Professional wrestling is fiction; boxing and mixed martial arts are reality. Audiences that consume base entertainment, and in the process deplete the performers who provide it, extend well beyond combat sports, however. For instance, it’s not uncommon for pop musicians to slowly destroy themselves once pulled into the attendant celebrity lifestyle. Three examples spring to mind: Elvis Presley, Michael Jackson, and Whitney Houston. Others go on hiatus or retire altogether from the pressure of public performance, such as Britney Spears, Miles Davis, and Barbra Streisand.

To say that the public devours performers and discards what remains of them is no stretch, I’m afraid. Who remembers countdown clocks tracking when female actors turn 18 so that perving on them is at last okay? A further example is the young starlet who is presumably legitimized as a “serious” actor once she does nudity and/or portrays a hooker but is then forgotten in favor of the next. If one were to seek the full depth of such devouring impulses, I suggest porn is the industry to have all one’s illusions shattered. For rather modest sums, there is absolutely nothing some performers won’t do on film (these days on video at RedTube), and naturally, there’s an audience for it. Such appetites are as bottomless as they come. Are you not entertained?

Speaking of Miles Davis, I take note of his hiatus from public performance in the late 1970s before his return to the stage in 1981 and early death in 1991 at age 65. He had cemented a legendary career as a jazz trumpeter but in interviews (if memory serves) dismissed the notion that he was somehow a spokesperson for others, saying dryly “I’m just a trumpet player, man ….” What galled me, though, were Don Cheadle’s remarks in the liner notes of the soundtrack to the biopic Miles Ahead (admittedly a deep pull):

Robert Glasper and I are preparing to record music for the final scene of Miles Ahead — a possible guide track for a live concert that sees the return of Miles Davis after having been flushed from his sanctuary of silence and back onto the stage and into his rightful light. My producers and I are buzzing in disbelief about what our audacity and sheer will may be close to pulling off ….

What they did was record a what-might-have-been track: how it might have sounded had Miles incorporated rap or hip hop (the categories blur) into his music. It’s unclear to me whether the “sanctuary of silence” refers to his inactivity or his death, but Miles was essentially forced onstage by proxy. “Flushed” is a strange word to use in this context, as one “flushes” an enemy or prey unwillingly from hiding. The decision to recast him in his “rightful light” strikes me as being in rather poor taste — a case of cultural appropriation worse than merely donning a Halloween costume.

This is the wave of the future, of course, now that images of dead celebrities can be invoked, say, to sell watches (e.g., Steve McQueen) and holograms of dead musicians are made into singing zombies, euphemized as “virtual performance” (e.g., Tupac Shakur). Software can now create digitized versions of people saying and doing whatever we desire of them, such as when celebrity faces are superimposed onto porn actors (so-called “deepfakes”). It might be difficult to argue that in doing so content creators are stealing the souls of others, as used to be believed in the early days of photography. I’m less concerned with those meeting demand than with the demand itself. Are we becoming demons, the equivalents of the succubus/incubus, frivolously devouring or destroying the objects of our enjoyment? Are you not entertained?


I’m currently reading Go Wild by John Ratey and Richard Manning. It has some rather astounding findings on offer. One I’ll draw out is that the human brain evolved not for thinking, as one might imagine, but for coordinating complex physiological movements:

… even the simplest of motions — a flick of a finger or a turn of the hand to pick up a pencil — is maddeningly complex and requires coordination and computational power beyond electronics’ abilities. For this you need a brain. One of our favorite quotes on this matter comes from the neuroscientist Rodolfo Llinás: “That which we call thinking is the evolutionary internalization of movement.” [p. 100]

Almost all the computation is unconscious, or maybe preconscious, and it’s learned over a period of years in infancy and early childhood (for basic locomotion) and then supplemented throughout life (for skilled motions, e.g., writing cursive or typing). Moreover, those able to move with exceptional speed, endurance, power, accuracy, and/or grace are admired and sometimes rewarded in our culture. The obvious example is sports. Whether league sports with wildly overcompensated athletes, Olympic sports with undercompensated athletes, or combat sports with a mixture of both, thrill attaches to watching someone move effectively within the rule-bound context of the sport. Other examples include dancers, musicians, circus artists, and actors who specialize in physical comedy and action. Each develops specialized movements that are graceful and beautiful, which Ratey and Manning write may also account for nonsexual appreciation and fetishization of the human body, e.g., fashion models, glammed-up actors, and nude photography.

I’m being silly saying that jocks figgered it first, of course. A stronger case could probably be made for warriors in battle, such as a skilled swordsman. But it’s jocks who are frequently rewarded all out of proportion with others who specialize in movement. True, their genetics and training enable a relatively brief career (compared to, say, surgeons or pianists) before abilities ebb away and a younger athlete eclipses them. But a fundamental lack of equivalence is clear with artisans and artists, whose value lies less in their bodies than in the outputs their movements produce.

Regarding computational burdens, consider the various mechanical arms built for grasping and moving objects, some of them quite large. The mechanisms (frame and hydraulics substituting for bone and muscle) are themselves quite complex, but they’re typically controlled by a human operator rather than automated. (Exceptions abound, but they’re highly specialized, such as circuit board manufacture or textile production.) More recently, robots have demonstrated considerable advancement in locomotion without a human operator, but they’re also narrowly focused in comparison with the flexibility of motion a human body readily possesses. Further, robots either operate in wide open space (in the case of flying drones) or use four or more legs for stability (in the case of those designed to move like dogs or insects). The latter are typically built to withstand quite a lot of bumping and jostling. Upright bipedal motion is still quite clumsy in comparison with humans, excepting perhaps wheeled robots that obviously don’t move like humans do.

Curiously, the movie Pacific Rim (sequel just out) takes notice of the computational or cognitive difficulty of coordinated movement: two mind-linked humans are needed to operate each of the giant battle robots built to fight Godzilla-like interdimensional monsters. Maybe it’s a simple coincidence — a plot device to position humans in the middle of the action (and the robot) rather than killing from a distance, such as via drone or clone — or maybe not. Hollywood screenwriters are quite clever at exploiting all sorts of material without necessarily divulging the source of inspiration. It’s art imitating life, knowingly or not.

Fan Service

Posted: December 27, 2017 in Artistry, Cinema, Culture, Idle Nonsense, Media, Taste

Having just seen the latest installment of the supermegahit Star Wars franchise, my thinking drifted ineluctably to the issue of fan service. There is probably no greater example of the public claiming ownership of popular culture than Star Wars, which has been a uniquely American phenomenon for 40 years and has risen to the level of a new mythology. Never mind that it was invented out of whole cloth. (Some argue that the major religions are also invented, but that’s a different subject of debate.) Other invented, serialized mythologies include Rowling’s Harry Potter series (books before movies), Tolkien’s Lord of the Rings (books before movies), Martin’s Game of Thrones (books before TV show), and Wagner’s Ring of the Nibelung (operas). It’s little surprise (to me, at least) that the new American mythology stems from cinema rather than literature or music.

Given the general public’s deep knowledge of the Star Wars canon, it’s inevitable that some portion of each installment of the franchise must cite and rhyme recognizable plots, dialogue, and thematic elements, which is roughly analogous to one’s favorite band playing its hits rather than offering newly composed music at every concert. With James Bond (probably the first movie franchise, though book series by Sir Arthur Conan Doyle and Agatha Christie long ago established the model for recurring characters), story elements were formalized rather early in its history and form the foundation of each later story. Some regard the so-called formula as a straitjacket, whereas others derive considerable enjoyment from familiar elements. So, too, with Star Wars. The light sabers, the spaceships, the light and dark sides of the force, the plucky rebels, the storm troopers, the disfigured villains, and the reluctant hero all make their appearances and reappearances in different guises. What surprised me most about The Last Jedi is how frequently and skillfully fan service was handled, typically undercutting each bit to simultaneously satisfy and taunt viewers. Some indignant fanboys (and -girls) have actually petitioned to have The Last Jedi struck from the Star Wars canon for defying franchise conventions so flagrantly.

New media have enabled regular folks to indulge their pet theories of the Star Wars universe in public fora, and accordingly, no shortage of overexcited analysis exists regarding plots, family relationships, cat-and-mouse strategies, and of course, possible stories to be told in an ever-expanding cinematic universe. New films are promised with nauseating regularity for the foreseeable future, or at least so long as the intellectual property owners can wring giant profits out of the series. This is what cinematic storytelling has become: setting up a series and wringing every last bit of value out of it before leaving it fallow and untended for a decade or more and then rebooting the entire stinking mess. The familiar criticism is Hollywood Out of Ideas, which often rings true except when one considers that only a few basic narrative structures exist in the first place. All the different manifestations are merely variations upon familiar themes, another form of fan service.

Back in undergraduate college, when just starting on my music education degree, I received an assignment in which students were asked to formulate a philosophy of education. My thinking then was influenced by a curious textbook I picked up: A Philosophy of Music Education by Bennett Reimer. Of course, it was the wrong time for an undergraduate to perform this exercise, as we had neither maturity nor understanding equal to the task. However, in my naïveté, my answer was all about learning/teaching an aesthetic education — one that focused on appreciating beauty in music and the fine arts. This requires the cultivation of taste, which used to be commonplace among the educated but is now anathema. Money is the preeminent value now. Moreover, anything that smacks of cultural programming and thought control is now repudiated reflexively, though such projects are nonetheless undertaken continuously and surreptitiously through a variety of mechanisms. As a result, the typical American’s sense of what is beautiful and admirable is stunted. Further, knowledge of the historical context in which the fine arts exist is largely absent. (Children are ahistorical in this same way.) Accordingly, many Americans are coarse philistines whose tastes rarely extend beyond those acquired naturally during adolescence (including both biophilia and biophobia), thus the immense popularity of comic book movies, rock and roll music, and all manner of electronica.

When operating with a limited imagination and undeveloped ability to perceive and discern (and disapprove), one is a sitting duck for what ought to be totally unconvincing displays of empty technical prowess. Mere mechanism (spectacle) then possesses the power to transfix and amaze credulous audiences. Thus, the ear-splitting volume of amplified instruments substitutes for true emotional energy produced in exceptional live performance, ubiquitous CGI imagery (vistas and character movements, e.g., fight skills, that simply don’t exist in reality) in cinema produces wonderment, and especially, blinking lights and animated GIFs deliver the equivalent of a sugar hit (cookies, ice cream, soda) when they’re really placebos or toxins. Like hypnosis, the placebo effect is real and pronounced for those unusually susceptible to induction. Sitting ducks.

Having given the fine arts (including their historical contexts) a great deal of my academic attention and acquired an aesthetic education, my response to the video below fell well short of the blasé relativism most exhibit; I actively dislike it. (more…)

For a variety of reasons, I go to see movies in the theater only a handful of times in any given year. The reasons are unimportant (and obvious), and I recognize that, by eschewing the theater, I’m giving up the crowd experience. Still, I relented recently and went to see a movie at a new AMC Dolby Cinema, which I didn’t even know existed. The first thing to appreciate was that it was a pretty big room, which used to be standard when cinema was first getting established in the 1920s but gave way sometime in the 1970s to multiplex theaters able to show more than one title at a time in little shoebox compartments with limited seating. Spaciousness was a welcome throwback. The theater also had oversized, powered, leather recliners rather than cloth, fold-down seats with shared armrests. The recliners were quite comfortable but also quite unnecessary (except for now-typical Americans unable to fit their fat asses in what used to be a standard seat). These characteristics are shared with AMC Prime theaters that dress up the movie-going experience and charge accordingly. Indeed, AMC now offers several types of premium cinema, including RealD 3D, Imax, Dine-In, and BigD.

Aside I: A friend only just reported on her recent trip to the drive-in theater, a dated and unenhanced cinema experience that is somewhat degraded yet retains its nostalgic charm for those of us old enough to remember as kids the shabby chic of bringing one’s own pillows, blankets, popcorn, and drinks to a double feature and sprawling out on the hood and/or roof of the car (e.g., the family station wagon). My friend actually brought her dog to the drive-in and said she remembered and sorta missed the last call on dollar hot dogs at 11 PM that used to find all the kids madly, gleefully rushing the concession stand before food ran out.

What really surprised me, however, was how the Dolby Cinema experience turned into a visual, auditory, and kinesthetic assault. True, I was watching Wonder Woman (sorry, no review), which is set in WWI and features lots of gunfire and munitions explosions in addition to the usual invincible superhero punchfest, so I suppose the point is partly to be immersed in the environment, a cinematic stab at verisimilitude. But the immediacy of all the wham-bam, rock ’em-sock ’em action made me feel more like a participant in a theater of war than a viewer. The term shell shock (a/k/a battle fatigue a/k/a combat neurosis) refers to the traumatized disorientation one experiences in moments of high stress and overwhelming sensory input; it applies here. Even the promo before the trailers and feature, offered to demonstrate the theater’s capabilities, was off-putting because of its unnecessary and overweening volume and impact. Unless I’m mistaken, the seats even have built-in subwoofers to rattle theatergoers from below when loud, concussive events occur, which is often because, well, filmmakers love their spectacle as much as audiences do.

Aside II: One real-life lesson to be gleaned from WWI, or the Great War as it was called before WWII, went well beyond the simplistic truism that war is hell. It was that civility (read: civilization) had failed and human progress was a chimera. Technical progress, however, had made WWI uglier in many respects than previous warfare. It was an entirely new sort of horror. Fun fact: there are numerous districts in France, known collectively as la Zone Rouge, where no one is allowed to live because of all the unexploded ordnance (100 years later!). Wonder Woman ends up having it both ways: acknowledging the horrific nature of war on the one hand yet valorizing and romanticizing personal sacrifice and eventual victory on the other. Worse, perhaps, it establishes that there’s always another enemy in the wings (otherwise, how could there be sequels?), so keep fighting. And for the average viewer, uniformed German antagonists are easily mistaken for Nazis of the subsequent world war, a historical gloss I’m guessing no one minds … because … Nazis.

So here’s my problem with AMC’s Dolby Cinema: why settle for a routine or standard theater experience when it can be amped up to the point of offense? Similarly, why be content with the tame and fleeting though reliable beauty of a sunset when one can enjoy a widescreen, hyperreal view of cinematic worlds that don’t actually exist? Why settle for the subtle, old-timey charm of the carousel (painted horses, dizzying twirling, and calliope music) when instead one can strap in and get knocked sideways by roller coasters so extreme that riders leave wobbly and crying at the end? (Never mind the risk of being stranded on the tracks for hours, injured, or even killed by a malfunction.) Or why bother attending a quaint symphonic band concert in the park or an orchestral performance in the concert hall when instead one can go to Lollapalooza and see/hear/experience six bands in the same cacophonous space grinding it out at ear-splitting volume, along with laser light shows and flash-pot explosions for the sheer sake of goosing one’s senses? Coming soon are VR goggles that trick the wearer’s nervous system into accepting that they are actually in the virtual game space, often first-person shooters depicting the killing of bugs or aliens or criminals without compunction. Our arts and entertainments have truly gotten out of hand.

If those criticisms don’t register, consider my post more than a decade ago on the Paradox of the Sybarite and Catatonic, which argues that our senses are so overwhelmed by modern life that we’re essentially numb from overstimulation. Similarly, let me reuse this Nietzsche quote (used before here) to suggest that on an aesthetic level, we’re not being served well in display and execution of refined taste so much as being whomped over the head and dragged willingly? through ordeals:

… our ears have become increasingly intellectual. Thus we can now endure much greater volume, much greater ‘noise’, because we are much better trained than our forefathers were to listen for the reason in it. All our senses have in fact become somewhat dulled because we always inquire after the reason, what ‘it means’, and no longer for what ‘it is’ … our ear has become coarsened. Furthermore, the ugly side of the world, originally inimical to the senses, has been won over for music … Similarly, some painters have made the eye more intellectual, and have gone far beyond what was previously called a joy in form and colour. Here, too, that side of the world originally considered ugly has been conquered by artistic understanding. What is the consequence of this? The more the eye and ear are capable of thought, the more they reach that boundary line where they become asensual. Joy is transferred to the brain; the sense organs themselves become dull and weak. More and more, the symbolic replaces that which exists … the vast majority, which each year is becoming ever more incapable of understanding meaning, even in the sensual form of ugliness … is therefore learning to reach out with increasing pleasure for that which is intrinsically ugly and repulsive, that is, the basely sensual. [italics not in original]

From the not-really-surprising-news category comes a New Scientist report earlier this month that the entire world was irradiated by follow-on effects of the Fukushima disaster. Perhaps it’s exactly as the article states: the equivalent of one X-ray. I can’t know with certainty, nor can bupkis be done about it by the typical Earth inhabitant (or the atypical inhabitant, I might add). Also earlier this month, a tunnel collapse at the Dept. of Energy’s Hanford nuclear waste storage site in Washington State gave everyone a start regarding a possible nearby release of radiation. Similar to Fukushima, I judge there is little by way of trust regarding accurate news or disclosure and fuck all anyone can do about any of it.

I’m far too convinced of collapse by now to worry too much about these Tinkerbells, knowing full well that what’s to come will be worse by many orders of magnitude when the firecrackers start popping due to inaction and inevitability. Could be years or decades away still; but as with other aspects of collapse, who knows precisely when? Risky energy plant operations and nuclear waste disposal issues promise to be with us for a very long time indeed. Makes it astonishing to think that we plunged full-steam ahead without realistic (i.e., politically acceptable) plans to contain the problems before creating them. Further, nuclear power is still not economically viable without substantial government subsidy. The likelihood of abandoning this technological boondoggle seems pretty remote, not least because of the enormous expense of decommissioning all the sites currently operating.

These newsbits and events also reminded me of the despair I felt in 1986 on the heels of the Chernobyl disaster. Maybe in hindsight it’s not such a horrible thing to cede entire districts to nature for a period of several hundred years, creating what some have called exclusion or sacrifice zones. Absent human presence, such regions demonstrate remarkable resilience and fecundity in a relatively short time. Still, it boggles the mind, doesn’t it, to think of two exclusion zones now, Chernobyl and Fukushima, where no one should go until, at the very least, the radioactive half-life has expired? Interestingly, that light at the end of the tunnel, so to speak, seems to be telescoping even farther away from the date of the disaster, a somewhat predictable shifting of the goalposts. I’d conjecture that’s because contamination has not yet ceased and is actually ongoing, but again, what do I know?
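An aside with some back-of-the-envelope arithmetic (my own, not drawn from the reports above): the half-life framing understates the problem, since radioactive decay follows

N(t) = N_0 \left( \tfrac{1}{2} \right)^{t / t_{1/2}}

so the passing of one half-life doesn’t mean contamination has “expired”; it means half of it remains. Taking cesium-137, a signature contaminant at both Chernobyl and Fukushima, with t_{1/2} of roughly 30 years, a full century after release still leaves (1/2)^{100/30} ≈ 0.1, or about 10 percent. Small wonder the all-clear date keeps receding.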

On a lighter note, all this also put me in mind of the hardiness of various foodstuffs. God knows we consume loads of crap that can hardly be called food anymore, from shelf-stable fruit juices and bakery items (e.g., Twinkies) that never go bad to not-cheese used by Taco Bell and nearly every burger joint in existence to McDonald’s burgers and fries that refuse to spoil even when left out for months to test that very thing. It gives me considerable pause to consider that foodstuff half-lives have been radically and unnaturally extended by creating abominable Frankenfoods that beggar the imagination. For example, strawberries and tomatoes used to be known to spoil rather quickly and thus couldn’t withstand long supply lines from farm to table; nor were they available year round. Rather sensibly, people grew their own when they could. Today’s fruits and veggies still spoil, but interventions undertaken to extend their stability have frequently come at the expense of taste and nutrition. Organic and heirloom markets have sprung up to fill those niches, which suggests the true cost of growing and distributing everyday foods that will not survive a nuclear holocaust.

The Internet is now a little more than two decades old (far more actually, but I’m thinking of its widespread adoption). Of late, it’s abundantly clear that, in addition to producing a wholesale change in the way we disseminate and gather information and conduct business, the Internet hosts live social experiments bearing psychological influence, some subtle, some invasive, much like the introduction of other media such as radio, cinema, and TV back in the day. About six years ago, psychologists coined the term digital crowding, which I just discovered, referring to an oppressive sense of knowing too much about people, which in turn provokes antisocial reactions. In effect, it’s part of the Dark Side of social media (trolling and comments sections being other examples), one of numerous live social experiments.

I’ve given voice to this oppressive knowing-too-much on occasion by wondering why, for instance, I know anything — largely against my will, mind you — about the Kardashians and Jenners. This is not the sole domain of celebrities and reality TV folks but of anyone who tends to overshare online, typically via social media such as Facebook, less typically in the celebrity news media. Think of digital crowding as the equivalent of seeing something you would really prefer not to have seen, something no amount of figurative eye bleach can erase, something that now simply resides in your mind forever. It’s the bell that can’t be unrung. The crowding aspect is that now everyone’s dirty laundry is getting aired simultaneously, creating pushback and defensive postures.

One might recognize in this the familiar complaint of Too Much Information (TMI), except that the information in question is not the discomfiting stuff such as personal hygiene, medical conditions, or sexual behaviors. Rather, it’s an unexpected over-awareness of everyone’s daily minutiae as news of it presses for attention and penetrates our defenses. Add it to the deluge that is causing some of us to adopt information avoidance.

Even before I begin, you must know what the title means. It’s the proliferation of options that induces dread in the toothpaste aisle of the store. Paste or gel? Tartar control or extra whitening? Plain, mint, cinnamon, or bubble gum? The matrix of combinations is enough to reduce the typical shopper to a quivering state of high anxiety lest the wrong toothpaste be bought. Oh, how I long for the days when choices ran solely between plain Crest and Colgate. I can’t say whether the toothpaste effect originated with oral hygiene. A similarly bewildering host of choices confronts shoppers in the soft drink aisle. Foodstuffs seem especially prone to brand fragmentation. Woe betide the retailer forced to shelve all 38 Heinz products on this page. (True, some are just different packaging of the same basic item, but still.)
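For illustration (my own arithmetic, assuming the options combine freely), the multiplication principle is what makes the matrix grow so fast:

2 \ (\text{paste/gel}) \times 2 \ (\text{tartar control/whitening}) \times 4 \ (\text{flavors}) = 16 \ \text{distinct tubes}

and every additional independent option multiplies the total again, so n binary add-ons by themselves yield 2^n variants. No wonder the shopper quivers.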

Purveyors of alcoholic beverages are on the bandwagon, too. I rather like the bygone cliché of the cowboy/gunslinger riding off the range, swinging into the saloon, and ordering simply “whisky.” Nowadays, even a poorly stocked bar is certain to have a dozen or so whiskies (see this big brand list, which doesn’t include sub-brands or craft distillers). Then come all the varieties of schnapps, rum, and vodka, each brand further fragmented with infusions and flavorings of every imaginable type. Some truly weird ones are found here. Who knew that these spirits were simply blank canvases awaiting the master distiller’s crazy inventiveness?

/rant on

What really gets my bile flowing on this issue, however, is the venerable Lay’s potato chip. Seriously, Frito-Lay, what are you thinking? You arguably perfected the potato chip, much like McDonald’s perfected the French fry. (Both are fried potato, interestingly.) Further, you have a timeless, unbeatable slogan: “betcha can’t eat just one.” The plain, salted chip, the “Classic” of the Lay’s brand, cannot be improved upon and is a staple comfort food. Yet you have succumbed to the toothpaste effect and gone haywire with flavorings (I won’t even countenance the Wavy, Poppables, Kettle-Cooked, Ruffles, and STAX varieties). For variety’s sake, I’d be content with a barbecue chip, maybe even salt & vinegar, but you’ve piled on past the point of ridiculousness:

  • cheddar & sour cream (a favorite of mine)
  • Chile limón
  • deli style
  • dill pickle
  • flamin’ hot
  • honey barbecue
  • limón
  • pico de gallo
  • salt & vinegar (not to my taste)
  • sour cream & onion (a good alternative)
  • sweet Southern heat barbecue
  • Southern biscuits & gravy
  • Tapatío (salsa picante)

(more…)

The first Gilded Age in the U.S. and the Roaring Twenties were periods that produced an overabundance of wealth for a handful of people. Some of them became synonymous with the term robber baron precisely for their ability to extract and accumulate wealth, often using tactics that, to say the least, lacked scruples when they weren’t downright criminal. The names include Rockefeller, Carnegie, Astor, Mellon, Stanford, Vanderbilt, Duke, Morgan, and Schwab. All have their names associated in posterity with famous institutions. Some are colleges and universities, others are banking and investment behemoths, yet others are place names and commercial establishments. Perhaps the philanthropy they practiced was not entirely generous, as captains of industry (then and today) seem to enjoy burnishing their legacies with a high level of name permanence. Still, one can observe that most of the institutions bearing their names are infrastructure useful to the general public, making them public goods. This may be partly because the early 20th century was still a time of nation building, whereas today is arguably a time of decaying empire.

The second Gilded Age in the U.S. commenced in the 1980s and is still going strong as measured by wealth inequality. However, the fortunes of today’s tycoons appear to be directed less toward public enrichment than toward self-aggrandizement. The very nature of philanthropy has shifted. Two modern philanthropists appear to be transitional: Bill Gates and Ted Turner. The Gates Foundation has a range of missions, including healthcare, education, and access to information technology. Ted Turner’s well-publicized $1 billion gift to the United Nations Foundation in 1997 was an open dare to encourage similar philanthropy among the ultrarich. The Turner Foundation website’s byline is “protecting & restoring the natural world.” Not to be ungrateful or uncharitable, but both figureheads are renowned for highhandedness in the fashion in which they gathered up their considerable fortunes and are redirecting some portion of their wealth toward pet projects that can be interpreted as a little self-serving. Philanthropic efforts by Warren Buffett appear to be less about giving away his own fortune to charities or establishing institutions bearing his name than about using his notoriety to raise charitable funds from other sources and thus stimulate charitable giving. The old saying applies especially to Buffett: “no one gets rich by giving it away.” More galling, perhaps, is another group of philanthropists, who seem to be more interested in building shrines to themselves. Two entries stand out: the Lucas Museum (currently seeking a controversial site in Chicago) and The Walmart Museum. Neither resembles a public good, though their press packages may try to convince otherwise.

Charity has also shifted toward celebrity giving, with this website providing philanthropic news and profiles of celebrities complete with their causes and beneficiaries. With such a wide range of people under consideration, it’s impossible to make any sweeping statements about the use or misuse of celebrity, the way entertainers are overcompensated for their talents, or even how individuals such as Richard Branson and Elon Musk have been elevated to celebrity status primarily for being rich. (They undoubtedly have other legitimate claims to fame, but they’re overshadowed in a culture that celebrates wealth before any other attribute.) And then there are the wealthy contributors to political campaigns, such as the Koch brothers, George Soros, and Sheldon Adelson, just to name a few. It’s fair to say that every contributor wants some bang for their buck, but I daresay that political contributors (not strictly charity givers) expect a higher quotient of influence, or in terms more consistent with their thinking, a greater return on investment.

None of this takes into account the charitable work and political contributions stemming from corporations and unions, or indeed the umbrella corporations that exist solely to raise funds from the general public, taking a sizeable share in administrative fees before passing some portion on to the eventual beneficiary. Topical charities and scams also spring up in response to whatever is the latest natural disaster or atrocity. What’s the average citizen to do when the pittance they can donate pales in comparison to that offered by the 1% (which would be over 3 million people in the U.S. alone)? Or indeed how does one guard against being manipulated by scammers (including the burgeoning number of street panhandlers) and political candidates into throwing money at fundamentally insoluble problems? Are monetary gifts really the best way of demonstrating charity toward the needy? Answers to these questions are not forthcoming.

Update: Closure has been achieved on the question of the Lucas Museum coming to Chicago. After two years of litigation blocking any building on his proposed site on the lakefront, George Lucas has decided to seek a site in California instead. Both sides had to put their idiotic PR spin on the result, but most people I know are relieved not to have George Lucas making inroads into Chicago architecture. Now if only we could turn back time and stop Donald Trump.

Acid Added

Posted: February 11, 2016 in Health, Taste

I traveled to Europe recently, which I haven’t done in a couple of decades, and was reminded immediately of a danger that tends to go unacknowledged: the reduction of foreign lands and peoples to a series of clichés or stereotypes. Tourist guides and websites reinforce the effect. This tendency may be forgivable with respect to food, considering that one has multiple meals per day, which thus occupy a significant portion of one’s time and attention abroad. My rediscovery of truly fresh-baked bread (given the superior European tradition of daily shopping for bakery goods, I suspect that the propionic acid and sodium propionate used as preservatives in American bread and other baked goods were not present) called to mind a book recommended at Gin and Tacos (see blogroll) entitled Combat-Ready Kitchen: How the U.S. Military Shapes the Way You Eat by Anastacia Marx de Salcedo. Although obvious perhaps in hindsight, it was surprising to learn (via book blurbs and recommendations) that demand for foods that could withstand transport times to combat troops without spoilage was a principal driver of innovation in food processing technologies, which have been further refined over the decades by Big Ag.

Here’s a good example: peeled, skinless tangerines and mandarin oranges used in salads are passed through steam or hot water at about 90°C for 2–3 minutes to loosen the peel and make it easier to separate from the segments. Segments are then separated, and the segmental membrane is removed by chemical treatment — meaning it’s dissolved in an acid solution, which is neutralized in turn with an alkaline solution. This is described in greater detail in expired U.S. Patent No. 4,294,861, entitled “Method of Separating and Taking Out Pulp from Citrus Fruits.” Here’s the abstract:

A method and an apparatus for processing citrus fruits into a drink, which is the juice of the fruits containing separate juice vesicles, or sacs, of the pulp, by cutting the fruits into pieces and directing jets of a fluid against the cut surfaces, thereby separating and forcing the pulp in the form of separate sacs away from the peel and segmental membrane of the fruit pieces.
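The chemical step described above is ordinary acid-base chemistry. The patent abstract doesn’t name the reagents, so take these as illustrative only, but membrane removal of this kind typically means a dilute hydrochloric acid bath followed by a sodium hydroxide rinse, which neutralize each other to salt and water:

\text{HCl} + \text{NaOH} \rightarrow \text{NaCl} + \text{H}_2\text{O}

In other words, what remains of the acid bath is, in principle, a trace of common table salt, which is presumably why the process passes as benign.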

Of course, citric acid occurs naturally in, well, citrus fruits. (Citric acid also makes for a surprisingly potent cleaning agent.) But I find it more than a little ooky to treat foods in acid baths, or to add acids to ingested foods (is there another kind?) as preservatives. Admittedly, all sorts of acids are present naturally in foods: malic acid in apples and cherries; tartaric acid in grapes, pineapples, potatoes, and carrots; acetic acid in vinegar; oxalic acid in cocoa and pepper; tannic acid in tea and coffee; and benzoic acid in cranberries, prunes, and plums. Less natural but wholly familiar to typical Murricans, corrosive phosphoric acid (also known as orthophosphoric acid) is used as an acidifying agent in soft drinks (which also contain relatively harmless carbonic acid) and jams to provide a tangy flavor. Otherwise, the syrup/sugar content alone would be enough to make one vomit. Fumaric acid is also used in noncarbonated soft drinks.

Maybe none of these rise to the level of the universal acid that eats through everything, including stomach linings, or of the sulfuric acid found in batteries (or more simply, battery acid). Still, our food is suffused with acids, and the practice of adding more to bakery goods to make them shelf stable may account for why European bakery goods made for that day only are so far superior to most American bakery goods able to sit in one’s breadbox almost indefinitely. Cue the periodic newsbit about a McDonald’s meal allowed to sit out for some extended period of time (often years) without spoiling in the least.