Archive for the ‘Idle Nonsense’ Category

Some phrases have a wide range of applicability, such as the book title “______ for Dummies.” The popular Netflix show Orange is the New Black is another, claiming cachet in criminality. Let me jump on the bandwagon and observe how Transgression is the New Chic. There are two aspects to how transgression has become the new “it” thing: committing a transgression and being transgressed. Seems these days everyone is positioning themselves along one axis or the other, sometimes both.

Not many of us possess the ability to transgress others without consequences. To do so basically requires fuck-you money. Celebrity also helps. With those characteristics, however, one can get away with an awful lot of mischief and make oneself look pretty damn cool in the process (if one is impressed by such foolishness). At the top of the heap is our Bully-in-Chief, who is busy testing another man-child in a reckless exercise in brinkmanship that could easily blow up in our faces (and theirs, too, which would be criminal considering the mismatch of power — like a billionaire stealing from a fast food worker). Yet the impulse to puff up one’s chest and appear unwavering in resolve or whatever other silly justification enters the minds of status seekers is awfully strong. To rational minds, it looks like insanity. Sadly, the masses do not possess rational minds and so give the game credibility.

On the flip side, claiming victimization at imagined transgressions is another fantasy league populated by the emotionally needy. Snowflakes. Or the Strawberry Generation of people prone to spoil at the slightest whiff of life’s difficulties. The so-called microaggression and the demand for safe spaces are frequent power plays used by star players, where points are scored by cowing into submission administrators too timid to call bullshit on the charade. Berkeley administrators offering counseling for students “terrorized” by a speech delivered by Ben Shapiro (whether students actually attend is beside the point) is a good example. Feigning offense works when lying to oneself, too, so the master player gets double points for transgressing him- or herself. Well played. When the cycle of blaming and bullying will subside is anyone’s guess.


rant on/

As the next in an as-yet unnumbered series of Storms of the Century (I predict more than a dozen at least) is poised to strike nearly the entirety of the State of Florida, we know with confidence from prior experience, recent and not so recent, that any lessons we might take regarding the folly of human habitation situated along or near coastlines vulnerable to extreme weather events, now occurring with increasing frequency and vehemence, will remain intransigently unlearned. Instead, we’ll begin rebuilding on the very same sites as soon as construction labor and resources can be mustered and deployed. Happened in New Orleans and New Jersey; is about to happen in Houston; and will certainly happen all across Florida — even the fragile Florida Keys. I mean, shit, we can’t do without The Magic Kingdom and other attractions in the central-Florida tourist mecca, now can we?

This predictable spin around the dance floor might look like a tragicomic circus waltz (e.g., The Daring Young Man on the Flying Trapeze), or even out-of-tune, lopsided calliope music from the carousel, except that positioning ourselves right back in harm’s way would be better characterized as a danse macabre. I dub it the Builder’s Waltz, which could also be the Rebuilder’s Rumba, the Catastrophe Tango, the Demolition Jive … take your pick.

Obstinate refusal to apprehend reality as it slams into us is celebrated as virtue these days. Can’t lose hope even as dark forces coalesce all around us, right? Was it always so? Still, an inkling might be dawning on some addle-brained deniers that perhaps science-informed global warming and climate change news might actually be about something with real-world impact, such as dramatic reduction of oil refinery output or a lost citrus crop. So much for illusions of business as usual continuing unhindered into the foreseeable future. Instead, our future looks more like dominoes lined up to fall — like the line of hurricanes formed in the Atlantic. Good luck hunkering down and weathering once-in-a-lifetime storms that just keep coming. And rebuilding the same things in the same places, well, just let it go, man, ’cuz it’s already gone.

rant off/

Having been asked to contribute to a new group blog (Brains Unite), this is my first entry, which is cross-posted here at The Spiral Staircase. The subject of this post is the future of transportation. I’ve no expertise in this area, so treat this writing with the authority it deserves, which is to say, very little.

Any prediction of what awaits us must inevitably consider what has preceded us. Britain and the United States were both in the vanguard during the 19th and early 20th centuries when it came to innovation, and this is no less true of transportation than of any other good or service. I’m not thinking of the routine travels one makes in the course of a day (e.g., work, church, school) but rather long excursions outside one’s normal range, a radius that has expanded considerably since then. (This holds true for freight transportation, too, but I am dropping that side of the issue in favor of passenger transit.) What is interesting is that travel tended to be communal, something we today call mass transit. For example, the Conestoga wagon, the stagecoach, the riverboat, and the rail car are each iconic of the 19th-century American West.

Passenger rail continued into the mid-20th century but was gradually replaced (in the U.S.) by individual conveyance as the automobile became economically available to the masses. Air travel commenced about the same time, having transitioned fairly quickly from 1 or 2 seats in an exposed cockpit to sealed fuselages capable of transporting 30+ people (now several hundred) at once. Still, as romantic as air travel may once have been (it’s lost considerable luster since deregulation as airlines now treat passengers more like freight), nothing beats the freedom and adventure of striking out on the road in one’s car to explore the continent, whether alone or accompanied by others.

The current character of transportation is a mixture of individual and mass transit, but without consulting numbers at the U.S. Dept. of Transportation, I daresay that the automobile is the primary means of travel for most Americans, especially those forced into cars by meager mass transit options. My prediction is that the future of transportation will be a gradual return to mass transit for two reasons: 1) the individual vehicle will become too costly to own and operate and 2) the sheer number of people moving from place to place will necessitate large transports.

While techno-utopians continue to conjure new, exotic (unfeasible) modes of transportation (e.g., the Hyperloop, which will purportedly enable passengers to make a 100-mile trip in about 12 minutes), they are typically aimed at individual transport and are extremely vulnerable to catastrophic failure (like the Space Shuttles) precisely because they must maintain human environments in difficult spaces (low orbit, underwater, inside depressurized tubes, etc.). They are also aimed at circumventing the congestion of conventional ground transportation (a victim of its own success now that highways in many cities resemble parking lots) and shortening transit times, though the extraordinary costs of such systems far exceed the benefit of time saved.

Furthermore, as climate change ramps up, we will witness a diaspora from regions inundated by rising waters, typically along the coasts where some 80% of human population resides (so I’ve read, can’t remember where). Mass migration out of MENA is already underway and will be joined by large population flows out of, for example, the Indian subcontinent, the Indonesian archipelago, and dustbowls that form in the interiors of continents. Accordingly, the future of transportation may well be the past:

photo: National Geographic

and

photo: UN Refugee Agency

Back in undergraduate college, when just starting on my music education degree, I received an assignment where students were asked to formulate a philosophy of education. My thinking then was influenced by a curious textbook I picked up: A Philosophy of Music Education by Bennett Reimer. Of course, it was the wrong time for an undergraduate to perform this exercise, as we had neither maturity nor understanding equal to the task. However, in my naïveté, my answer was all about learning/teaching an aesthetic education — one that focused on appreciating beauty in music and the fine arts. This requires the cultivation of taste, which used to be commonplace among the educated but is now anathema. Money is the preeminent value now. Moreover, anything that smacks of cultural programming and thought control is now repudiated reflexively, though such projects are nonetheless undertaken continuously and surreptitiously through a variety of mechanisms. As a result, the typical American’s sense of what is beautiful and admirable is stunted. Further, knowledge of the historical context in which the fine arts exist is largely absent. (Children are ahistorical in this same way.) Accordingly, many Americans are coarse philistines whose tastes rarely extend beyond those acquired naturally during adolescence (including both biophilia and biophobia), thus the immense popularity of comic book movies, rock and roll music, and all manner of electronica.

When operating with a limited imagination and undeveloped ability to perceive and discern (and disapprove), one is a sitting duck for what ought to be totally unconvincing displays of empty technical prowess. Mere mechanism (spectacle) then possesses the power to transfix and amaze credulous audiences. Thus, the ear-splitting volume of amplified instruments substitutes for true emotional energy produced in exceptional live performance, ubiquitous CGI imagery (vistas and character movements, e.g., fight skills, that simply don’t exist in reality) in cinema produces wonderment, and especially, blinking lights and animated GIFs deliver the equivalent of a sugar hit (cookies, ice cream, soda) when they’re really placebos or toxins. Like hypnosis, the placebo effect is real and pronounced for those unusually susceptible to induction. Sitting ducks.

Having given the fine arts (including their historical contexts) a great deal of my academic attention and acquired an aesthetic education, my response to the video below fell well short of the blasé relativism most exhibit; I actively dislike it. (more…)

From the not-really-surprising-news category comes a New Scientist report earlier this month that the entire world was irradiated by follow-on effects of the Fukushima disaster. Perhaps it’s exactly as the article states: the equivalent of one X-ray. I can’t know with certainty, nor can bupkis be done about it by the typical Earth inhabitant (or the atypical inhabitant, I might add). Also earlier this month, a tunnel collapse at the Dept. of Energy’s Hanford nuclear waste storage site in Washington State gave everyone a start regarding a possible release of radiation nearby. Similar to Fukushima, I judge there is little by way of trust regarding accurate news or disclosure and fuck all anyone can do about any of it.

I’m far too convinced of collapse by now to worry too much about these Tinkerbells, knowing full well that what’s to come will be worse by many orders of magnitude when the firecrackers start popping due to inaction and inevitability. Could be years or decades away still; but as with other aspects of collapse, who knows precisely when? Risky energy plant operations and nuclear waste disposal issues promise to be with us for a very long time indeed. Makes it astonishing to think that we plunged full-steam ahead without realistic (i.e., politically acceptable) plans to contain the problems before creating them. Further, nuclear power is still not economically viable without substantial government subsidy. The likelihood of abandonment of this technological boondoggle seems pretty remote, though perhaps not as remote as the prospect of paying the enormous expense of decommissioning all the sites currently operating.

These newsbits and events also reminded me of the despair I felt in 1986 on the heels of the Chernobyl disaster. Maybe in hindsight it’s not such a horrible thing to cede entire districts to nature for a period of several hundred years as what some have called exclusion or sacrifice zones. Absent human presence, such regions demonstrate remarkable resilience and profundity in a relatively short time. Still, it boggles the mind, doesn’t it, to think of two exclusion zones now, Chernobyl and Fukushima, where no one should go until, at the very least, the radioactive half-life has expired? Interestingly, that light at the end of the tunnel, so to speak, seems to be telescoping even farther away from the date of the disaster, a somewhat predictable shifting of the goalposts. I’d conjecture that’s because contamination has not yet ceased and is actually ongoing, but again, what do I know?

On a lighter note, all this also put me in mind of the hardiness of various foodstuffs. God knows we consume loads of crap that can hardly be called food anymore, from shelf-stable fruit juices and bakery items (e.g., Twinkies) that never go bad to not-cheese used by Taco Bell and nearly every burger joint in existence to McDonald’s burgers and fries that refuse to spoil even when left out for months to test that very thing. It gives me considerable pause to consider that foodstuff half-lives have been radically and unnaturally extended by creating abominable Frankenfoods that beggar the imagination. For example, strawberries and tomatoes used to be known to spoil rather quickly and thus couldn’t withstand long supply lines from farm to table; nor were they available year round. Rather sensibly, people grew their own when they could. Today’s fruits and veggies still spoil, but interventions undertaken to extend their stability have frequently come at the expense of taste and nutrition. Organic and heirloom markets have sprung up to fill those niches, which suggests the true cost of growing and distributing everyday foods that will not survive a nuclear holocaust.

The Internet is now a little more than two decades old (far more actually, but I’m thinking of its widespread adoption). Of late, it’s abundantly clear that, in addition to being a wholesale change in the way we disseminate and gather information and conduct business, we’re running live social experiments bearing psychological influence, some subtle, some invasive, much like the introduction of other media such as radio, cinema, and TV back in the day. About six years ago, psychologists coined the term digital crowding, which I just discovered, referring to an oppressive sense of knowing too much about people, which in turn provokes antisocial reactions. In effect, it’s part of the Dark Side of social media (trolling and comments sections being other examples), one of numerous live social experiments.

I’ve given voice to this oppressive knowing-too-much on occasion by wondering why, for instance, I know anything — largely against my will, mind you — about the Kardashians and Jenners. This is not the sole domain of celebrities and reality TV folks but indeed anyone who tends to overshare online, typically via social media such as Facebook, less typically in the celebrity news media. Think of digital crowding as the equivalent of seeing something you would really prefer not to have seen, something no amount of figurative eye bleach can erase, something that now simply resides in your mind forever. It’s the bell that can’t be unrung. The crowding aspect is that now everyone’s dirty laundry is getting aired simultaneously, creating pushback and defensive postures.

One might recognize in this the familiar complaint of Too Much Information (TMI), except that the information in question is not the discomfiting stuff such as personal hygiene, medical conditions, or sexual behaviors. Rather, it’s an unexpected over-awareness of everyone’s daily minutiae as news of it presses for attention and penetrates our defenses. Add it to the deluge that is causing some of us to adopt information avoidance.

Nick Carr has an interesting blog post (late getting to it as usual) highlighting a problem with our current information environment. In short, the constant information feed to which many of us subscribe and read on smartphones, which I’ve frequently called a fire hose pointed indiscriminately at everyone, has become the new normal. And when it’s absent, people feel anxiety:

The near-universal compulsion of the present day is, as we all know and as behavioral studies prove, the incessant checking of the smartphone. As Begley notes, with a little poetic hyperbole, we all “feel compelled to check our phones before we get out of bed in the morning and constantly throughout the day, because FOMO — the fear of missing out — fills us with so much anxiety that it feels like fire ants swarming every neuron in our brain.” With its perpetually updating, tightly personalized messaging, networking, searching, and shopping apps, the smartphone creates the anxiety that it salves. It’s a machine almost perfectly designed to turn its owner into a compulsive … from a commercial standpoint, the smartphone is to compulsion what the cigarette pack was to addiction.

I’ve written about this phenomenon plenty of times (see here for instance) and recommended that wiser folks might adopt a practiced media ecology by regularly turning one’s attention away from the feed (e.g., no mobile media). Obviously, that’s easier for some of us than others. Although my innate curiosity (shared by almost everyone, I might add) prompts me to gather quite a lot of information in the course of the day/week, I’ve learned to be restrictive and highly judgmental about what sources I read, printed text being far superior in most respects to audio or video. No social media at all, very little mainstream media, and very limited “fast media” of the type that rushes to publication before enough is known. Rather, periodicals (monthly or quarterly) and books, which have longer paths to publication, tend to be more thoughtful and reliable. If I could never again be exposed to noise newsbits with, say, the word “Kardashian,” that would be an improvement.

Also, being aware that the basic economic structure underlying media from the advent of radio and television is to provide content for free (interesting, entertaining, and hyperpalatable perhaps, but simultaneously pointless ephemera) in order to capture the attention of a large audience and then load up the channel with advertisements at regular intervals, I now use ad blockers and streaming media to avoid being swayed by the manufactured desire that flows from advertising. If a site won’t display its content without disabling the ad blocker, which is becoming more commonplace, then I don’t give it my attention. I can’t avoid all advertising, much like I can’t avoid my consumer behaviors being tracked and aggregated by retailers (and others), but I do better than most. For instance, I never saw any Super Bowl commercials this year, which have become a major part of the spectacle. Sure, I’m missing out, but I have no anxiety about it. I prefer to avoid colonization of my mind by advertisers in exchange for cheap titillation.

In the political news media, Rachel Maddow has caught on that it’s advantageous to ignore a good portion of the messages flung at the masses like so much monkey shit. A further suggestion is that because of the pathological narcissism of the new U.S. president, denial of the rapt attention he craves by reinforcing only the most reasonable conduct of the office might be worth a try. Such an experiment would be like the apocryphal story of students conditioning their professor to lecture with his/her back to the class by using positive/negative reinforcement, paying attention and being quiet only when his/her back was to them. Considering how much attention is trained on the Oval Office and its utterances, I doubt such an approach would be feasible even if it were only journalists attempting to channel behavior, but it’s a curious thought experiment.

All of this is to say that there are alternatives to being harried and harassed by insatiable desire for more information at all times. There is no actual peril to boredom, though we behave as though an idle mind is either wasteful or fearsome. Perhaps we aren’t well adapted — cognitively or culturally — to the deluge of information pressing on us in modern life, which could explain (partially) this age of anxiety when our safety, security, and material comforts are as good as they’ve ever been. I have other thoughts about what’s really missing in modern life, which I’ll save for another post.

Punchfest

Posted: February 26, 2017 in Cinema, Culture, Idle Nonsense, Sports

Early in the process of socialization, one learns that the schoolyard cry “Fight!” is half an alert (if one is a bystander) to come see and half an incitement to violence (if one is just entering into conflict). Fascination with seeing people duke it out, ostensibly to settle conflicts, never seems to grow old, though the mixed message about violence never solving anything sometimes slows things down. (Violence does in fact at least put an end to things. But the cycle of violence continues.) Fights have also lost the respectability of yore, where the victor (as with a duel or a Game of Thrones fight by proxy) was presumed to be vindicated. Now we mostly know better than to believe that might makes right. Successful aggressors can still be villains. Still, while the primal instinct to fight can be muted, it’s more typically channeled into entertainment and sport, where it’s less destructive than, say, warrior culture extending all the way from clans and gangs up to professional militaries.

Fighting in entertainment, especially in cinema, often depicts invulnerability that renders fighting pointless and inert. Why bother hitting Superman, the Incredible Hulk, Wolverine, or indeed any number of Stallone, Schwarzenegger, Seagal, or Statham characters when there is no honest expectation of doing damage? They never get hurt, just irritated. Easy answer: because the voyeurism inherent in fighting endures. Even when the punchfest is augmented by guns, we watch, transfixed by conflict, even though outcomes are either predictable (heroes and good guys almost always win), moot, or an obvious set-up for the next big, stupid, pointless battle.

Fighting in sport is perhaps most classical in boxing, with weight classes evening out the competition to a certain degree. Boxing’s popularity has waxed and waned over time as charismatic fighters come and go, but like track and field, it’s arguably one of the purest expressions of sport, being about raw dominance. One could also argue that some team sports, such as hockey and American-style football, are as much about the collateral violence as about scoring goals. Professional wrestling, revealed to be essentially athletic acting, blends entertainment and sport, though without appreciable loss of audience appeal. As with cinema, fans seem to not care that action is scripted. Rising in popularity these days is mixed martial arts (MMA), which ups the ante over boxing by allowing all manner of techniques into the ring, including traditional boxing, judo, jiu-jitsu, wrestling, and straight-up brawling. Though brawling may work in the schoolyard and street against unwilling or inexperienced fighters, it rarely succeeds in the MMA ring. Skill and conditioning matter most, plus the lucky punch.

Every kid, boy or girl, is at different points bigger, smaller, or matched with someone else when things start to get ugly. So one’s willingness to engage and strategy are situational. In childhood, conflict usually ends quickly with the first tears or bloodied nose. I’ve fought on rare occasion, but I’ve never ever actually wanted to hurt someone. Truly wanting to hurt someone seems to be one attribute of a good fighter; another is the lack of fear of getting hit or hurt. Always being smaller than my peers growing up, if I couldn’t evade a fight (true for me most of the time), I would defend myself, but I wasn’t good at it. Reluctant willingness to fight was usually enough to keep aggressors at bay. Kids who grow up in difficult circumstances, fighting with siblings and bullies, and/or abused by a parent or other adult, have a different relationship with fighting. For them, it’s unavoidable. Adults who relish being bullies join the military and/or police or maybe become professional fighters.

One would have to be a Pollyanna to believe that we will eventually rise above violence and use of force. Perhaps it’s a good thing that in a period of relative peace (in the affluent West), we have alternatives to being forced to defend ourselves on an everyday basis and where those who want to can indulge their basic instinct to fight and establish dominance. Notions of masculinity and femininity are still wrapped up in how one expresses these urges, though in characteristic PoMo fashion, traditional boundaries are being erased. Now, everyone can be a warrior.

A long while back, I blogged about things I just don’t get, including on that list the awful specter of identity politics. As I was finishing my undergraduate education some decades ago, the favored term was “political correctness.” That impulse now looks positively tame in comparison to what occurs regularly in the public sphere. It’s no longer merely about adopting what consensus would have one believe is a correct political outlook. Now it’s a broad referendum centered on the issue of identity, construed through the lens of ethnicity, sexual orientation, gender identification, lifestyle, religion, nationality, political orientation, etc.

One frequent charge levied against offenders is cultural appropriation, which is the adoption of an attribute or attributes of a culture by someone belonging to a different culture. Here, the term “culture” is a stand-in for any feature of one’s identity. Thus, wearing a Halloween costume from another culture, say, a bandido, is not merely in poor taste but is understood to be offensive if one is not authentically Mexican. Those who are infected with the meme are often called social justice warriors (SJW), and policing (of others, natch) is especially vehement on campus. For example, I’ve read of menu items at the school cafeteria being criticized for not being authentic enough. Really? The won ton soup offends Chinese students?

In an opinion-editorial in the NY Times entitled “Will the Left Survive the Millennials?” Lionel Shriver described being sanctioned for suggesting that fiction writers not be too concerned about creating characters from backgrounds different from one’s own. He contextualizes the motivation of SJWs this way: (more…)

So the deed is done: the winning candidate has been duly delivered and solemnly sworn in as President of the United States. As I expected, he wasted no time and repaired to the Oval Office immediately after the inauguration (before the inaugural ball!) to sign an executive order aimed at the Affordable Care Act (a/k/a Obamacare), presumably to “ease the burden” as the legislative branch gets underway repealing and replacing the ACA. My only surprise is that he didn’t have a stack of similar executive orders awaiting signature at the very first opportunity. Of course, the president had not held back in the weeks up to the inauguration from issuing intemperate statements, or for that matter, indulging in his favorite form of attack: tweet storms against his detractors (lots of those). The culmination (on the very short term at least — it’s still only the weekend) may well have been the inaugural address itself, where the president announced that American interests come first (when has that ever not been the case?), which is being interpreted by many around the globe as a declaration of preemptive war.

The convention with each new presidential administration is to focus on the first hundred days. Back in November 2016, just after the election, National Public Radio (NPR) fact-checked the outline for the first hundred days provided by the campaign at the end of October 2016. With history speeding by, it’s unclear what portion of those plans have survived. Time will tell, of course, and I don’t expect it will take long — surely nowhere near 100 days.

So what is the difference between fulfilling one’s destiny and meeting one’s fate? The latter has a rather unsavory character to it, like the implied curse of the granted genie’s wish. The former smells vaguely of success. Both have a distinctly tragic whiff of inevitability. Either way, this new president appears to be hurrying headlong to effect changes promised during his campaign. If any wisdom is to be gathered at this most unpredictable moment, perhaps it should be a line offered today by fellow blogger the South Roane Agrarian (which may have in turn been stolen from the British version of House of Cards): “Beware of old men in a hurry.”

Aside: I was going to call this post “Fools Rush In,” but I already have one with that title and the slight revision above seems more accurate, at least until the bandwagon fills up.

Addendum: Seems I was partially right. There was a stack of executive orders ready to sign. However, they’ve been meted out over the course of the week rather than dumped in the hours shortly after the inauguration. What sort of calculation is behind that is pure conjecture. I might point out, though, that attention is riveted on the new president and will never subside, so there is no need, as in television, to keep priming the pump.