Archive for the ‘Consumerism’ Category

In the lost decades of my youth (actually, early adulthood, but to an aging fellow like me, that era now seems like youth), I began to acquire audio equipment and recordings (LPs, actually) to explore classical music as an alternative to frequent concert attendance. My budget allowed only consumer-grade equipment, but I did my best to choose wisely rather than guess and end up with flashy front-plates that distract from inferior sound (still a thing, as a visit to Best Buy demonstrates). In the decades since, I’ve indulged a modest fetish for high-end electronics that fits neither my budget nor lifestyle but nonetheless results in my simple two-channel stereo (not the surround-sound set-ups many favor) of individual components providing fairly astounding sonics. When a piece exhibits problems or a connection gets interrupted, I often resort to older, inferior back-up equipment before troubleshooting and identifying the problem. Once the correction is made, the return to premium sound is an unmistakable improvement. When forced to resort to less-than-stellar components, I’m sometimes reminded of a remark a friend once made, namely, that when listening, he tries to hear the quality in the performance despite degraded reproduced sound (e.g., surface noise on the LP).

Though others may argue, I insist that popular music does not require high fidelity to be enjoyed. The truth in that statement is evidenced by how most people listen to music on multifunction devices such as phones and computers. Many influencers scoff at the idea that anyone would buy physical media or quality equipment anymore; everything now is streamed to their devices using services such as Spotify, Apple Music, or Amazon Prime. From my perspective, they’re fundamentally insensitive to subtle gradations of sound. Thumping volume (a good beat) is all that’s needed or understood.

However, multifunction devices do not aim at high fidelity. Moreover, clubs and outdoor festivals typically use equipment designed for sheer volume rather than quality. Loud jazz clubs might be the worst offenders, especially because intimate, acoustic performance (now mostly abandoned) set an admirable artistic standard only a few decades ago. High volume creates the illusion of high energy, but diminishing returns set in quickly as the human auditory system reacts to extreme volume by blocking as much sound as possible to protect itself from damage, or more simply, by going deaf slowly or quickly. Reports of performers whose hearing is wrecked from short- or long-term overexposure to high volume are legion. Profound hearing loss is already appearing throughout the general public the same way enthusiastic sunbathers are developing melanoma.

As a result of technological change, notions of how music is meant to sound are shifting. Furthermore, the expectation that musical experiences are to be shared by audiences of more than, say, a few people at a time is giving way to the singular, private listening environment enabled by headphones and earbuds. (The same thing happened with reading.) Differences between music heard communally in a purposed performance space (whether live or reproduced) and music reproduced in the ear canal (earbuds) or over the ear (headphones) — now portable and ubiquitous — are leading audio engineers to shift musical perspective yet again (just as they did at the onset of the radio and television eras) to accommodate listeners with distorted expectations of how music should sound.

No doubt, legitimate musical experiences can be had through reproduced sound, though degraded means produce lesser approximations of natural sound and authenticity as equipment descends in price and quality or the main purpose is simply volume. Additionally, most mainstream popular musics require amplification, as opposed to traditional acoustic forms of musicmaking. Can audiences/listeners actually get beyond degradation and experience artistry and beauty? Or must we be content with facsimiles that no longer possess the intent of the performers or a robust aesthetic experience? These may well be questions for the ages for which no solid answers obtain.


See this post on Seven Billion Day only a few years ago as a launching point. We’re now closing in on 7.5 billion people worldwide according to the U.S. Census Bureau. At least one other counter indicates we’ve already crossed that threshold. What used to be called the population explosion or the population bomb has lost its urgency and become, generically, population growth. By now, application of euphemism to mask intractable problems should be familiar to everyone. I daresay few are fooled, though plenty are calmed enough to stop paying attention. If there is anything to be done to restrain ourselves from proceeding down this easily recognized path to self-destruction, I don’t know what it is. The unwillingness to accept restraints in other aspects of human behavior demonstrates pretty well that consequences be damned — especially if they’re far enough delayed in time that we get to enjoy the here and now.

Two additional links (here and here) provide abundant further information on population growth if one desires to delve more deeply into the topic. The tone of these sites is sober, measured, and academic. As with climate change, hysterical and panic-provoking alarmism is avoided, but dangers known decades and centuries ago have persisted without serious redress. While it’s true that the growth rate (a/k/a replacement rate) has decreased considerably since its peak in 1960 or so (the height of the postwar baby boom), absolute numbers continue to climb. The lack of immediate concern reminds me of Al Bartlett’s articles and lectures on the failure to understand the exponential function in math (mentioned in my prior post). Sure, boring old math about which few care. The metaphor that applies is yeast growing in a culture with a doubling factor that makes everything look just peachy until the final doubling that kills everything. In this metaphor, people are the unthinking yeast who believe there’s plenty of room and food and other resources in the culture (i.e., on the planet) and keep consuming and reproducing until everyone dies en masse. How far away in time that final human doubling is no one really knows.
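Bartlett’s point is easy enough to verify with trivial arithmetic. Here is a toy sketch in Python (the numbers are wholly made up; only the shape of the curve matters, and this is no demographic projection):

```python
# Sketch of the exponential doubling Al Bartlett lectured about.
# All numbers are illustrative, not projections of anything real.

def doublings_until(limit, start=1):
    """Count how many doublings a population needs to reach a limit."""
    count = 0
    population = start
    while population < limit:
        population *= 2
        count += 1
    return count

# The unsettling property: one doubling before the culture is full,
# it is only half full and everything still looks just peachy.
capacity = 1024
print(doublings_until(capacity))          # 10 doublings from 1 to capacity
print(doublings_until(capacity // 2))     # 9: half full just one doubling earlier
```

The punchline is in those last two lines: a single doubling before capacity, the culture is at only half capacity, which is exactly why everything looks fine right up until the end.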

Which brings me to something rather ugly: hearings to confirm Brett Kavanaugh’s appointment to the U.S. Supreme Court. No doubt conservative Republican presidents nominate similarly conservative judges just as Democratic presidents nominate progressive centrist judges. That’s to be expected. However, Kavanaugh is being asked pointed questions about settled law and legal precedents perpetually under attack by more extreme elements of the right wing, including Roe v. Wade from 1973. Were we (in the U.S.) to revisit that decision and remove legal abortion (already heavily restricted), public outcry would be horrific, to say nothing of the return of so-called back-alley abortions. Almost no one undertakes such actions lightly. A look back through history, however, reveals a wide range of methods to forestall pregnancy, end pregnancies early, and/or end newborn life quickly (infanticide). Although repugnant to almost everyone, attempts to legislate abortion out of existence and/or punish lawbreakers will succeed no better than did Prohibition or the War Against Drugs. (Same can be said of premarital and underage sex.) Certain aspects of human behavior are frankly indelible despite the moral indignation of one or another political wing. Whether Kavanaugh truly represents the linchpin that will bring new upheavals is impossible to know with certainty. Stay tuned, I guess.

Abortion rights matter quite a lot when placed in context with population growth. Aggregate human behaviors routinely drive out of existence all sorts of plant and animal populations. This includes human populations (domestic and foreign) reduced to abject poverty and mad, often criminal scrambles for survival. The view from on high is that those whose lives fall below some measure of worthwhile contribution are useless eaters. (I don’t recommend delving deeper into that term; it’s a particularly ugly ideology with a long, tawdry history.) Yet removing abortion rights would almost certainly swell those ranks. Add this topic to the growing list of things I just don’t get.

The movie Gladiator depicts the protagonist Maximus addressing spectators directly at gladiatorial games in the Roman Colosseum with this meme-worthy challenge: “Are you not entertained?” Setting the action in an ancient civilization renowned for its decadent final phase prior to collapse, referred to as Bread and Circuses, allows us to share vicariously in the protagonist’s righteous disgust with the public’s blood lust while shielding us from any implication of our own shame because, after all, who could possibly entertain blood sports in the modern era? Don’t answer that.


But this post isn’t about our capacity for cruelty and barbarism. Rather, it’s about the public’s insatiable appetite for spectacle — both fictional and absolutely for real — served up as entertainment. Professional wrestling is fiction; boxing and mixed martial arts are reality. Audiences consuming base entertainment and, in the process, depleting performers who provide that entertainment extend well beyond combat sports, however. For instance, it’s not uncommon for pop musicians to slowly destroy themselves once pulled into the attendant celebrity lifestyle. Three examples spring to mind: Elvis Presley, Michael Jackson, and Whitney Houston. Others take a hiatus or retire altogether under the pressure of public performance, such as Britney Spears, Miles Davis, and Barbra Streisand.

To say that the public devours performers and discards what remains of them is no stretch, I’m afraid. Who remembers countdown clocks tracking when female actors turn 18 so that perving on them is at last okay? A further example is the young starlet who is presumably legitimized as a “serious” actor once she does nudity and/or portrays a hooker but is then forgotten in favor of the next. If one were to seek the full depth of such devouring impulses, I suggest porn is the industry to have all one’s illusions shattered. For rather modest sums, there is absolutely nothing some performers won’t do on film (these days on video at RedTube), and naturally, there’s an audience for it. Such appetites are as bottomless as they come. Are you not entertained?

Speaking of Miles Davis, I take note of his hiatus from public performance in the late 1970s before his limited return to the stage in 1981 and early death in 1991 at age 65. He had cemented a legendary career as a jazz trumpeter but in interviews (as memory serves) dismissed the notion that he was somehow a spokesperson for others, saying dryly “I’m just a trumpet player, man ….” What galled me, though, were Don Cheadle’s remarks in the liner notes of the soundtrack to the biopic Miles Ahead (admittedly a deep pull):

Robert Glasper and I are preparing to record music for the final scene of Miles Ahead — a possible guide track for a live concert that sees the return of Miles Davis after having been flushed from his sanctuary of silence and back onto the stage and into his rightful light. My producers and I are buzzing in disbelief about what our audacity and sheer will may be close to pulling off ….

What they did was record a what-might-have-been track had Miles incorporated rap or hip hop (categories blur) into his music. It’s unclear to me whether the “sanctuary of silence” was inactivity or death, but Miles was essentially forced onstage by proxy. “Flushed” is a strange word to use in this context, as one “flushes” an enemy or prey unwillingly from hiding. The decision to recast him in such “rightful light” strikes me as rather poor taste — a case of cultural appropriation worse than merely donning a Halloween costume.

This is the wave of the future, of course, now that images of dead celebrities can be invoked, say, to sell watches (e.g., Steve McQueen) and holograms of dead musicians are made into singing zombies, euphemized as “virtual performance” (e.g., Tupac Shakur). Newly developed software can now create digitized versions of people saying and doing whatever we desire of them, such as when celebrity faces are superimposed onto porn actors (called “deepfakes”). It might be difficult to argue that in doing so content creators are stealing the souls of others, as used to be believed in the early days of photography. I’m less concerned with those meeting demand than with the demand itself. Are we becoming demons, the equivalents of the succubus/incubus, frivolously devouring or destroying the objects of our enjoyment? Are you not entertained?

Haven’t purged my bookmarks in a long time. I’ve been collecting material about technological dystopia already now operating but expected to worsen. Lots of treatments out there and lots of jargon. My comments are limited.

Commandeering attention. James Williams discusses his recognition that interference media (all modern media now) keep people attuned to their feeds and erode free will, ultimately threatening democratic ideals by estranging people from reality. An inversion has occurred: information scarcity and attention abundance have become information abundance and attention scarcity.

Outrage against the machines. Ran Prieur (no link) takes a bit of the discussion above (probably where I got it) to illustrate how personal responsibility about media habits is confused, specifically, the idea that it’s okay for technology to be adversarial.

In the Terminator movies, Skynet is a global networked AI hostile to humanity. Now imagine if a human said, “It’s okay for Skynet to try to kill us; we just have to try harder to not be killed, and if you fail, it’s your own fault.” But that’s exactly what people are saying about an actual global computer network that seeks to control human behavior, on levels we’re not aware of, for its own benefit. Not only has the hostile AI taken over — a lot of people are taking its side against their fellow humans. And their advice is to suppress your biological impulses and maximize future utility like a machine algorithm.

Big Data is Big Brother. Here’s a good TED Talk by Zeynep Tufekci on how proprietary machine-learning algorithms we no longer control or understand, ostensibly used to serve targeted advertising, possess the power to influence elections and radicalize people. I call the latter down-the-rabbit-hole syndrome, where one innocuous video or news story is followed by another of increasing extremity until the viewer or reader reaches a level of outrage and indignation activating an irrational response.


rant on/

Four years ago, the Daily Mail published an article with the scary title “HALF the world’s wild animals have disappeared in 40 years” [all caps in original just to grab your eyeballs]. This came as no surprise to anyone who’s been paying attention. I blogged on this very topic in my review of Vaclav Smil’s book Harvesting the Biosphere, which observed at the end a 50% decrease in wild mammal populations in the last hundred years. The estimated numbers vary according to which animal population and what time frame are under consideration. For instance, in 2003, CNN reported that only 10% of big ocean fish remain compared to 47 years prior. Predictions indicate that the oceans could be without any fish by midcentury. All this is old news, but it’s difficult to tell what we humans are doing about it other than worsening already horrific trends. The latest disappearing act is flying insects, whose numbers have decreased by 75% in the last 25 years according to this article in The Guardian. The article says, um, scientists are shocked. I don’t know why; these articles and indicators of impending ecological collapse have been appearing regularly for decades. Similar Malthusian prophecies are far older. Remember colony collapse disorder? Are they surprised it’s happening now, as opposed to the end of the 21st century, safely after nearly everyone now alive is long dead? C’mon, pay attention!

Just a couple days ago, the World Meteorological Organization issued a press release indicating that greenhouse gases have surged to a new post-ice-age record. Says the press release rather dryly, “The abrupt changes in the atmosphere witnessed in the past 70 years are without precedent.” You don’t say. Even more astoundingly, the popular online news site Engadget had this idiotic headline: “Scientists can’t explain a ‘worrying’ rise in methane levels” (sourcing Professor Euan Nisbet of Royal Holloway University of London). Um, what’s to explain? We’ve been burning the shit out of planetary resources, temperatures are rising, and methane formerly sequestered in frozen tundra and below polar sea floors is seeping out. As I said, old news. How far up his or her ass has any reputable scientist’s head got to be to make such an outta-touch pronouncement? My answer to my own question: suffocation. Engadget made up that dude just for the quote, right? Nope.

Not to draw too direct a connection between these two issues (wildlife disappearances and greenhouse gases — hey, I said pay attention!) because, ya know, reckless conjecture and unproven conclusions (the future hasn’t happened yet, duh, it’s the future, forever telescoping away from us), but a changing ecosystem means evolutionary niches that used to support nature’s profundity are no longer doing so reliably. Plus, we just plain ate a large percentage of the animals or drove them to extinction, fully or nearly (for now). As these articles routinely and tenderly suggest, trends are “worrying” for humans. After all, how are we gonna put seafood on our plates when all the fish have been displaced by plastic?

rant off/

Back in undergraduate college, when just starting on my music education degree, I received an assignment where students were asked to formulate a philosophy of education. My thinking then was influenced by a curious textbook I picked up: A Philosophy of Music Education by Bennett Reimer. Of course, it was the wrong time for an undergraduate to perform this exercise, as we had neither maturity nor understanding equal to the task. However, in my naïveté, my answer was all about learning/teaching an aesthetic education — one that focused on appreciating beauty in music and the fine arts. This requires the cultivation of taste, which used to be commonplace among the educated but is now anathema. Money is the preeminent value now. Moreover, anything that smacks of cultural programming and thought control is now repudiated reflexively, though such projects are nonetheless undertaken continuously and surreptitiously through a variety of mechanisms. As a result, the typical American’s sense of what is beautiful and admirable is stunted. Further, knowledge of the historical context in which the fine arts exist is largely absent. (Children are ahistorical in this same way.) Accordingly, many Americans are coarse philistines whose tastes rarely extend beyond those acquired naturally during adolescence (including both biophilia and biophobia), thus the immense popularity of comic book movies, rock and roll music, and all manner of electronica.

When operating with a limited imagination and undeveloped ability to perceive and discern (and disapprove), one is a sitting duck for what ought to be totally unconvincing displays of empty technical prowess. Mere mechanism (spectacle) then possesses the power to transfix and amaze credulous audiences. Thus, the ear-splitting volume of amplified instruments substitutes for true emotional energy produced in exceptional live performance, ubiquitous CGI imagery (vistas and character movements, e.g., fight skills, that simply don’t exist in reality) in cinema produces wonderment, and especially, blinking lights and animated GIFs deliver the equivalent of a sugar hit (cookies, ice cream, soda) when they’re really placebos or toxins. Like hypnosis, the placebo effect is real and pronounced for those unusually susceptible to induction. Sitting ducks.

Having given the fine arts (including their historical contexts) a great deal of my academic attention and acquired an aesthetic education, my response to the video below fell well short of the blasé relativism most exhibit; I actively dislike it.

For a variety of reasons, I go to see movies in the theater only a handful of times in any given year. The reasons are unimportant (and obvious), and I recognize that, by eschewing the theater, I’m giving up the crowd experience. Still, I relented recently and went to see a movie at a new AMC Dolby Cinema, which I didn’t even know existed. The first thing to appreciate was that it was a pretty big room, which used to be standard when cinema was first getting established in the 1920s but gave way sometime in the 1970s to multiplex theaters able to show more than one title at a time in little shoebox compartments with limited seating. Spaciousness was a welcome throwback. The theater also had oversized, powered, leather recliners rather than cloth, fold-down seats with shared armrests. The recliners were quite comfortable but also quite unnecessary (except for now-typical Americans unable to fit their fat asses in what used to be a standard seat). These characteristics are shared with AMC Prime theaters that dress up the movie-going experience and charge accordingly. Indeed, AMC now offers several types of premium cinema, including RealD 3D, IMAX, Dine-In, and BigD.

Aside I: A friend only just reported on her recent trip to the drive-in theater, a dated cinema experience that is somewhat degraded (read: unenhanced) yet retains its nostalgic charm for those of us old enough to remember as kids the shabby chic of bringing one’s own pillows, blankets, popcorn, and drinks to a double feature and sprawling out on the hood and/or roof of the car (e.g., the family station wagon). My friend actually brought her dog to the drive-in and said she remembered and sorta missed the last call on dollar hot dogs at 11 PM that used to find all the kids madly, gleefully rushing the concession stand before food ran out.

What really surprised me, however, was how the Dolby Cinema experience turned into a visual, auditory, and kinesthetic assault. True, I was watching Wonder Woman (sorry, no review), which is set in WWI and features lots of gunfire and munitions explosions in addition to the usual invincible superhero punchfest, so I suppose the point is partly to be immersed in the environment, a cinematic stab at verisimilitude. But the immediacy of all the wham-bam, rock ’em-sock ’em action made me feel more like a participant in a theater of war than a viewer. The term shell shock (a/k/a battle fatigue a/k/a combat neurosis) refers to the traumatized disorientation one experiences in moments of high stress and overwhelming sensory input; it applies here. Even the promo before the trailers and feature, offered to demonstrate the theater’s capabilities, was off-putting because of unnecessary and overweening volume and impact. Unless I’m mistaken, the seats even have built-in subwoofers to rattle theatergoers from below when loud, concussive events occur, which is often because, well, filmmakers love their spectacle as much as audiences do.

Aside II: One real-life lesson to be gleaned from WWI, or the Great War as it was called before WWII, went well beyond the simplistic truism that war is hell. It was that civility (read: civilization) had failed and human progress was a chimera. Technical progress, however, had made WWI uglier in many respects than previous warfare. It was an entirely new sort of horror. Fun fact: there are numerous districts in France, known collectively as la Zone Rouge, where no one is allowed to live because of all the unexploded ordnance (100 years later!). Wonder Woman ends up having it both ways: acknowledging the horrific nature of war on the one hand yet valorizing and romanticizing personal sacrifice and eventual victory on the other. Worse, perhaps, it establishes that there’s always another enemy in the wings (otherwise, how could there be sequels?), so keep fighting. And for the average viewer, the uniformed German antagonists are easily mistaken for Nazis of the subsequent world war, a historical gloss I’m guessing no one minds … because … Nazis.

So here’s my problem with AMC’s Dolby Cinema: why settle for routine or standard theater experience when it can be amped up to the point of offense? Similarly, why be content with the tame and fleeting though reliable beauty of a sunset when one can enjoy a widescreen, hyperreal view of cinematic worlds that don’t actually exist? Why settle for the subtle, old-timey charm of the carousel (painted horses, dizzying twirling, and calliope music) when instead one can strap in and get knocked sideways by roller coasters so extreme that riders leave wobbly and crying at the end? (Never mind the risk of being stranded on the tracks for hours, injured, or even killed by a malfunction.) Or why bother attending a quaint symphonic band concert in the park or an orchestral performance in the concert hall when instead one can go to Lollapalooza and see/hear/experience six bands in the same cacophonous space grinding it out at ear-splitting volume, along with laser light shows and flash-pot explosions for the sheer sake of goosing one’s senses? Coming soon are VR goggles that trick the wearer’s nervous system into accepting they are actually in the virtual game space, often first-person shooters depicting killing bugs or aliens or criminals without compunction. Our arts and entertainments have truly gotten out of hand.

If those criticisms don’t register, consider my post more than a decade ago on the Paradox of the Sybarite and Catatonic, which argues that our senses are so overwhelmed by modern life that we’re essentially numb from overstimulation. Similarly, let me reuse this Nietzsche quote (used before here) to suggest that on an aesthetic level, we’re not being served well in display and execution of refined taste so much as being whomped over the head and dragged willingly? through ordeals:

… our ears have become increasingly intellectual. Thus we can now endure much greater volume, much greater ‘noise’, because we are much better trained than our forefathers were to listen for the reason in it. All our senses have in fact become somewhat dulled because we always inquire after the reason, what ‘it means’, and no longer for what ‘it is’ … our ear has become coarsened. Furthermore, the ugly side of the world, originally inimical to the senses, has been won over for music … Similarly, some painters have made the eye more intellectual, and have gone far beyond what was previously called a joy in form and colour. Here, too, that side of the world originally considered ugly has been conquered by artistic understanding. What is the consequence of this? The more the eye and ear are capable of thought, the more they reach that boundary line where they become asensual. Joy is transferred to the brain; the sense organs themselves become dull and weak. More and more, the symbolic replaces that which exists … the vast majority, which each year is becoming ever more incapable of understanding meaning, even in the sensual form of ugliness … is therefore learning to reach out with increasing pleasure for that which is intrinsically ugly and repulsive, that is, the basely sensual. [italics not in original]

When I first wrote about this topic back in July 2007, I had only just learned of the Great Pacific Garbage Patch (and similar garbage gyres in other oceans). Though I’d like to report simply that nothing has changed, the truth is that conditions have worsened. Some commentators have rationalized (read: contextualized) the issue by observing that the Earth, the environment, the ecosphere, the biosphere, Gaia, or whatever one wishes to call the natural world has always been under assault by humans, that we’ve never truly lived in balance with nature. While that perspective may be true in a literal sense, I can’t help gnashing my teeth over the sheer scale of the assault in the modern industrial age (extending back 250+ years but really getting going once the steam engine was utilized widely). At that point, production and population curves angled steeply upwards, where they continue to point as though there were no biophysical limits to growth or to the amount and degree of destruction that can be absorbed by the biosphere. Thus, at some undetermined point, industrial scale became planetary scale and humans became terraformers.

News reports came in earlier this month that the remote and uninhabited (by humans) Henderson Island in the Pacific is now an inadvertent garbage dump, with estimates of over 17 tons of debris littering its once-pristine shores.


This despoliation is a collateral effect of human activity, not the predictable result of direct action, such as with the Alberta Tar Sands, another ecological disaster (among many, many others). In the U.S., the Environmental Protection Agency (EPA) describes its mission as protecting human health and the environment and has established a Superfund to clean up contaminated sites. Think of this as a corporate subsidy, since the principal contaminators typically inflict damage in the course of doing business and extracting profit then either move on or cease to exist. Standard Oil is one such notorious entity. Now that the EPA is in the process of being defunded (and presumably on its way to being deauthorized) by the current administration of maniacs, the ongoing death-by-a-thousand-cuts suffered by the natural world will likely need to be revised to death-by-millions-of-cuts, a heedless acceleration of the death sentence humans have set in motion. In the meantime, industry is being given a freer hand to pollute and destroy. What could possibly go wrong?

If all this weren’t enough, another development darkened my brow recently: the horrific amount of space debris from decades of missions to put men, communications and surveillance satellites, and (one would presume) weapons in orbit. (Maybe the evil brainchild of inveterate cold warriors known unironically as “Star Wars” never actually came into being, but I wouldn’t place any bets on that.) This video from the Discovery Network gives one pause, no?

Admittedly, the dots are not actual size and so would not be as dense or even visible from the point of view of the visualization, but the number of items (20,000+ pieces) is pretty astonishing. (See this link as well.) This report describes some exotic technologies being bandied about to address the problem of space junk. Of course, that’s just so that more satellites and spacecraft can be launched into orbit as private industry takes on the mantle once enjoyed exclusively by NASA and the Soviet space program. I suppose the explorer’s mindset never diminishes even as the most remote places on and now around Earth are no longer untouched but strewn with human refuse.

From the not-really-surprising-news category comes a New Scientist report earlier this month that the entire world was irradiated by follow-on effects of the Fukushima disaster. Perhaps it’s exactly as the article states: the equivalent of one X-ray. I can’t know with certainty, nor can bupkis be done about it by the typical Earth inhabitant (or the atypical inhabitant, I might add). Also earlier this month, a tunnel collapse at the Dept. of Energy’s Hanford nuclear waste storage site in Washington State gave everyone a start regarding the possible release of radiation nearby. Similar to Fukushima, I judge there is little by way of trust regarding accurate news or disclosure and fuck all anyone can do about any of it.

I’m far too convinced of collapse by now to worry too much about these Tinkerbells, knowing full well that what’s to come will be worse by many orders of magnitude when the firecrackers start popping due to inaction and inevitability. Could be years or decades away still; but as with other aspects of collapse, who knows precisely when? Risky energy plant operations and nuclear waste disposal issues promise to be with us for a very long time indeed. Makes it astonishing to think that we plunged full-steam ahead without realistic (i.e., politically acceptable) plans to contain the problems before creating them. Further, nuclear power is still not economically viable without substantial government subsidy. The likelihood of abandonment of this technological boondoggle seems pretty remote, though perhaps not as remote as the enormous expense of decommissioning all the sites currently operating.

These newsbits and events also reminded me of the despair I felt in 1986 on the heels of the Chernobyl disaster. Maybe in hindsight it’s not such a horrible thing to cede entire districts to nature for a period of several hundred years as what some have called exclusion or sacrifice zones. Absent human presence, such regions demonstrate remarkable resilience and profundity in a relatively short time. Still, it boggles the mind, doesn’t it, to think of two exclusion zones now, Chernobyl and Fukushima, where no one should go until, at the very least, the radioactive half-life has expired? Interestingly, that light at the end of the tunnel, so to speak, seems to be telescoping even farther away from the date of the disaster, a somewhat predictable shifting of the goalposts. I’d conjecture that’s because contamination has not yet ceased and is actually ongoing, but again, what do I know?

On a lighter note, all this also put me in mind of the hardiness of various foodstuffs. God knows we consume loads of crap that can hardly be called food anymore, from shelf-stable fruit juices and bakery items (e.g., Twinkies) that never go bad to not-cheese used by Taco Bell and nearly every burger joint in existence to McDonald’s burgers and fries that refuse to spoil even when left out for months to test that very thing. It gives me considerable pause to consider that foodstuff half-lives have been radically and unnaturally extended by creating abominable Frankenfoods that beggar the imagination. For example, strawberries and tomatoes used to be known to spoil rather quickly and thus couldn’t withstand long supply lines from farm to table; nor were they available year round. Rather sensibly, people grew their own when they could. Today’s fruits and veggies still spoil, but interventions undertaken to extend their stability have frequently come at the expense of taste and nutrition. Organic and heirloom markets have sprung up to fill those niches, which suggests the true cost of growing and distributing everyday foods that will not survive a nuclear holocaust.

I picked up a copy of Daniel Siegel’s book Mind: A Journey to the Heart of Being Human (2017) to read and supplement my ongoing preoccupation with human consciousness. Siegel’s writing is the source of considerable frustration. Now about 90 pp. into the book (I am considering putting it aside), he has committed several grammatical errors (where are book editors these days?), doesn’t really know how to use a comma properly, and doesn’t write in recognizable paragraph form. He has a bad habit of posing questions to suggest the answers he wants to give and drops constant hints of something soon to be explored, like news broadcasts that tease the next segment. He also deploys a tired, worn metaphor that readers are on a journey of discovery with him, embarked on a path, exploring a subject, etc. Yecch. (A couple of Amazon reviews also note that grayish type on parchment (cream) paper poses a legibility problem due to poor contrast even in good light — undoubtedly not really Siegel’s fault.)

Siegel’s writing is also irritatingly circular, casting and recasting the same sentences in repetitious series of assertions that have me wondering frequently, “Haven’t I already read this?” Here are a couple examples:

When energy flows inside your body, can you sense its movement, how it changes moment by moment?

then only three sentences later

Energy, and energy-as-information, can be felt in your mental experience as it emerges moment by moment. [p. 52]

Another example:

Seeing these many facets of mind as emergent properties of energy and information flow helps link the inner and inter aspect of mind seamlessly.

then later in the same paragraph

In other words, mind seen this way could be in what seems like two places at once as inner and inter are part of one interconnected, undivided system. [p. 53]

This is definitely a bug, not a feature. I suspect the book could easily be condensed from 330 pp. to fewer than 200 pp. if the writing weren’t so self-indulgent. Indeed, while I recognize a healthy dose of repetition is an integral part of narrative form (especially in music), Siegel’s relentless repetition feels like propaganda 101, where guileless insistence (of lies or merely the preferred story one seeks to plant in the public sphere) wears down the reader rather than convinces him or her. This is also marketing 101 (e.g., Coca-Cola, McDonald’s, Budweiser, etc. continuing to advertise what are by now exceedingly well-established brands).

(more…)