Archive for the ‘Idle Nonsense’ Category

Nick Carr has an interesting blog post (late getting to it as usual) highlighting a problem with our current information environment. In short, the constant information feed to which many of us subscribe and read on smartphones, which I’ve frequently called a fire hose pointed indiscriminately at everyone, has become the new normal. And when it’s absent, people feel anxiety:

The near-universal compulsion of the present day is, as we all know and as behavioral studies prove, the incessant checking of the smartphone. As Begley notes, with a little poetic hyperbole, we all “feel compelled to check our phones before we get out of bed in the morning and constantly throughout the day, because FOMO — the fear of missing out — fills us with so much anxiety that it feels like fire ants swarming every neuron in our brain.” With its perpetually updating, tightly personalized messaging, networking, searching, and shopping apps, the smartphone creates the anxiety that it salves. It’s a machine almost perfectly designed to turn its owner into a compulsive … from a commercial standpoint, the smartphone is to compulsion what the cigarette pack was to addiction

I’ve written about this phenomenon plenty of times (see here for instance) and recommended that wise folks might adopt a practiced media ecology by regularly turning their attention away from the feed (e.g., no mobile media). Obviously, that’s easier for some of us than others. Although my innate curiosity (shared by almost everyone, I might add) prompts me to gather quite a lot of information in the course of the day/week, I’ve learned to be restrictive and highly judgmental about what sources I read, printed text being far superior in most respects to audio or video. No social media at all, very little mainstream media, and very limited “fast media” of the type that rushes to publication before enough is known. Rather, periodicals (monthly or quarterly) and books, which have longer paths to publication, tend to be more thoughtful and reliable. If I could never again be exposed to noisy newsbits with, say, the word “Kardashian,” that would be an improvement.

Also, being aware that the basic economic structure underlying media from the advent of radio and television is to provide content for free (interesting, entertaining, and hyperpalatable perhaps, but simultaneously pointless ephemera) in order to capture the attention of a large audience and then load up the channel with advertisements at regular intervals, I now use ad blockers and streaming media to avoid being swayed by the manufactured desire that flows from advertising. If a site won’t display its content without disabling the ad blocker, which is becoming more commonplace, then I don’t give it my attention. I can’t avoid all advertising, much like I can’t avoid my consumer behaviors being tracked and aggregated by retailers (and others), but I do better than most. For instance, I never saw any Super Bowl commercials this year, which have become a major part of the spectacle. Sure, I’m missing out, but I have no anxiety about it. I prefer to avoid colonization of my mind by advertisers in exchange for cheap titillation.

In the political news media, Rachel Maddow has caught on that it’s advantageous to ignore a good portion of the messages flung at the masses like so much monkey shit. A further suggestion is that because of the pathological narcissism of the new U.S. president, denial of the rapt attention he craves by reinforcing only the most reasonable conduct of the office might be worth a try. Such an experiment would be like the apocryphal story of students conditioning their professor to lecture with his/her back to the class by using positive/negative reinforcement, paying attention and being quiet only when his/her back was to them. Considering how much attention is trained on the Oval Office and its utterances, I doubt such an approach would be feasible even if it were only journalists attempting to channel behavior, but it’s a curious thought experiment.

All of this is to say that there are alternatives to being harried and harassed by insatiable desire for more information at all times. There is no actual peril to boredom, though we behave as though an idle mind is either wasteful or fearsome. Perhaps we aren’t well adapted — cognitively or culturally — to the deluge of information pressing on us in modern life, which could explain (partially) this age of anxiety when our safety, security, and material comforts are as good as they’ve ever been. I have other thoughts about what’s really missing in modern life, which I’ll save for another post.

Punchfest

Posted: February 26, 2017 in Cinema, Culture, Idle Nonsense, Sports

Early in the process of socialization, one learns that the schoolyard cry “Fight!” is half an alert (if one is a bystander) to come see and half an incitement to violence (if one is just entering into conflict). Fascination with seeing people duke it out, ostensibly to settle conflicts, never seems to grow old, though the mixed message about violence never solving anything sometimes slows things down. (Violence does in fact at least put an end to things. But the cycle of violence continues.) Fights have also lost the respectability of yore, where the victor (as with a duel or a Game of Thrones fight by proxy) was presumed to be vindicated. Now we mostly know better than to believe that might makes right. Successful aggressors can still be villains. Still, while the primal instinct to fight can be muted, it’s more typically channeled into entertainment and sport, where it’s less destructive than, say, warrior culture extending all the way from clans and gangs up to professional militaries.

Fighting in entertainment, especially in cinema, often depicts invulnerability that renders fighting pointless and inert. Why bother hitting Superman, the Incredible Hulk, Wolverine, or indeed any number of Stallone, Schwarzenegger, Seagal, or Statham characters when there is no honest expectation of doing damage? They never get hurt, just irritated. Easy answer: because the voyeurism inherent in fighting endures. Even when the punchfest is augmented by guns, we watch, transfixed by conflict even though outcomes are either predictable (heroes and good guys almost always win), moot, or an obvious set-up for the next big, stupid, pointless battle.

Fighting in sport is perhaps most classical in boxing, with weight classes evening out the competition to a certain degree. Boxing’s popularity has waxed and waned over time as charismatic fighters come and go, but like track and field, it’s arguably one of the purest expressions of sport, being about raw dominance. One could also argue that some team sports, such as hockey and American-style football, are as much about the collateral violence as about scoring goals. Professional wrestling, revealed to be essentially athletic acting, blends entertainment and sport, though without appreciable loss of audience appeal. As with cinema, fans seem not to care that the action is scripted. Rising in popularity these days is mixed martial arts (MMA), which ups the ante over boxing by allowing all manner of techniques into the ring, including traditional boxing, judo, jiu-jitsu, wrestling, and straight-up brawling. While brawling may work in the schoolyard and street against unwilling or inexperienced fighters, it rarely succeeds in the MMA ring. Skill and conditioning matter most, plus the lucky punch.

Every kid, boy or girl, is at different points bigger, smaller, or matched with someone else when things start to get ugly. So one’s willingness to engage and strategy are situational. In childhood, conflict usually ends quickly with the first tears or bloodied nose. I’ve fought on rare occasion, but I’ve never ever actually wanted to hurt someone. Truly wanting to hurt someone seems to be one attribute of a good fighter; another is the lack of fear of getting hit or hurt. Always being smaller than my peers growing up, if I couldn’t evade a fight (true for me most of the time), I would defend myself, but I wasn’t good at it. Reluctant willingness to fight was usually enough to keep aggressors at bay. Kids who grow up in difficult circumstances, fighting with siblings and bullies, and/or abused by a parent or other adult, have a different relationship with fighting. For them, it’s unavoidable. Adults who relish being bullies join the military and/or police or maybe become professional fighters.

One would have to be a Pollyanna to believe that we will eventually rise above violence and use of force. Perhaps it’s a good thing that in a period of relative peace (in the affluent West), we have alternatives to being forced to defend ourselves on an everyday basis and where those who want to can indulge their basic instinct to fight and establish dominance. Notions of masculinity and femininity are still wrapped up in how one expresses these urges, though in characteristic PoMo fashion, traditional boundaries are being erased. Now, everyone can be a warrior.

A long while back, I blogged about things I just don’t get, including on that list the awful specter of identity politics. As I was finishing my undergraduate education some decades ago, the favored term was “political correctness.” That impulse now looks positively tame in comparison to what occurs regularly in the public sphere. It’s no longer merely about adopting what consensus would have one believe is a correct political outlook. Now it’s a broad referendum centered on the issue of identity, construed through the lens of ethnicity, sexual orientation, gender identification, lifestyle, religion, nationality, political orientation, etc.

One frequent charge levied against offenders is cultural appropriation, which is the adoption of an attribute or attributes of a culture by someone belonging to a different culture. Here, the term “culture” is a stand-in for any feature of one’s identity. Thus, wearing a Halloween costume from another culture, say, a bandido, is not merely in poor taste but is understood to be offensive if one is not authentically Mexican. Those who are infected with the meme are often called social justice warriors (SJW), and policing (of others, natch) is especially vehement on campus. For example, I’ve read of menu items at the school cafeteria being criticized for not being authentic enough. Really? The won ton soup offends Chinese students?

In an opinion-editorial in the NY Times entitled “Will the Left Survive the Millennials?” Lionel Shriver described being sanctioned for suggesting that fiction writers not be too concerned about creating characters from backgrounds different from their own. She contextualizes the motivation of SJWs this way: (more…)

So the deed is done: the winning candidate has been duly delivered and solemnly sworn in as President of the United States. As I expected, he wasted no time and repaired to the Oval Office immediately after the inauguration (before the inaugural ball!) to sign an executive order aimed at the Affordable Care Act (a/k/a Obamacare), presumably to “ease the burden” as the legislative branch gets underway repealing and replacing the ACA. My only surprise is that he didn’t have a stack of similar executive orders awaiting signature at the very first opportunity. Of course, the president had not held back in the weeks leading up to the inauguration from issuing intemperate statements, or for that matter, indulging in his favorite form of attack: tweet storms against his detractors (lots of those). The culmination (in the very short term at least — it’s still only the weekend) may well have been the inaugural address itself, where the president announced that American interests come first (when has that ever not been the case?), which is being interpreted by many around the globe as a declaration of preemptive war.

The convention with each new presidential administration is to focus on the first hundred days. Back in November 2016, just after the election, National Public Radio (NPR) fact-checked the outline for the first hundred days provided by the campaign at the end of October 2016. With history speeding by, it’s unclear what portion of those plans have survived. Time will tell, of course, and I don’t expect it will take long — surely nowhere near 100 days.

So what is the difference between fulfilling one’s destiny and meeting one’s fate? The latter has a rather unsavory character to it, like the implied curse of the granted genie’s wish. The former smells vaguely of success. Both have a distinctly tragic whiff of inevitability. Either way, this new president appears to be hurrying headlong to effect changes promised during his campaign. If any wisdom is to be gathered at this most unpredictable moment, perhaps it should be a line offered today by fellow blogger the South Roane Agrarian (which may have in turn been stolen from the British version of House of Cards): “Beware of old men in a hurry.”

Aside: I was going to call this post “Fools Rush In,” but I already have one with that title and the slight revision above seems more accurate, at least until the bandwagon fills up.

Addendum: Seems I was partially right. There was a stack of executive orders ready to sign. However, they’ve been meted out over the course of the week rather than dumped in the hours shortly after the inauguration. What sort of calculation is behind that is pure conjecture. I might point out, though, that attention is riveted on the new president and will never subside, so there is no need, as in television, to keep priming the pump.

I see plenty of movies over the course of a year but had not been to a theater since The Force Awakens came out slightly over a year ago. The reason is simple: it costs too much. With ticket prices nearing $15 and what for me had been obligatory popcorn and soda (too much of both the way they’re bundled and sold — ask anyone desperately holding back their pee until the credits roll!), the endeavor climbed to nearly $30 just for one person. Never mind that movie budgets now top $100 million routinely; the movie-going experience simply isn’t worth $30 a pop. Opening weekend crowds (and costumes)? Fuggedaboudit! Instead, I view films at home on DVD (phooey on Blu-ray) or via a streaming service. Although I admit I’m missing out on being part of an audience, which offers the possibility of being carried away on a wave of crowd emotion, I’m perfectly happy watching at home, especially considering most films are forgettable fluff (or worse) and filmmakers seem to have forgotten how to shape and tell good stories.

So a friend dragged me out to see Rogue One, somewhat late after its opening by most standards. Seeing Star Wars and other franchise installments now feels like an obligation just to stay culturally relevant. Seriously, soon enough it will be Fast & Furious Infinitum. We went to a newly built theater with individual recliners and waiters (no concession stands). Are film-goers no longer satisfied by popcorn and Milk Duds? No way would I order an $80 bottle of wine to go with Rogue One. It’s meant to be a premium experience, with everything served to you in the recliner, and accordingly, it charges premium prices. Too bad most films don’t warrant such treatment. All this is preliminary to the actual review, of course.

I had learned quite a bit about Rogue One prior to seeing it, not really caring about spoilers, and was pleasantly surprised it wasn’t as bad as some complain. Rogue One brings in all the usual Star Wars hallmarks: storm troopers, the Force, X-Wings and TIE Fighters, ray guns and light sabers, the Death Star, and familiar characters such as Grand Moff Tarkin, Darth Vader, Princess Leia, etc. Setting a story within the Star Wars universe makes most of that unavoidable, though some specific instances did feel like gratuitous fan service, such as the 3-second (if that) appearance of C3PO and R2D2. The appearance of things and characters I already knew about didn’t feel to me like an extra thrill, but how much I needed to already know about Star Wars just to make sense of Rogue One was a notable weakness. Thus, one could call Rogue One a side story, but it was by no means a stand-alone story. Indeed, characters old and new were given such slipshod introductions (or none at all!) that they functioned basically as chess pieces moved around to drive the game forward. Good luck divining their characteristic movements and motivations. Was there another unseen character manipulating everyone? The Emperor? Who knows? Who cares! It was all a gigantic, faceless, pawn sacrifice. When at last the main rebels died, there was no grief or righteousness over having at least accomplished their putative mission. Turns out the story was all about effects, not emotional involvement. And that’s how I felt: uninvolved. It was a fireworks display ending with a pointless though clichéd grand finale. Except I guess that watching a bunch of fake stuff fake blow up was the fake point.

About what passed for a story: the Rebellion learns (somehow?!) that they face total annihilation from a new superweapon called the Death Star. (Can’t remember whether that term was actually used in the film.) While the decision of leadership is to scatter and flee, a plucky band of rebels within the rebellion insist on flinging themselves against the enemy without a plan except to improvise once on site, whereupon leadership decides irrationally to do the same. The strategy, such as it is, comes straight out of The Return of the King: distract the enemy from the true mission objective. But the visual style is more like the opening of Saving Private Ryan, which is to say, full, straight-on bombardment and invasion. Visual callbacks to WWII infantry uniforms and formations couldn’t be more out of place. To call these elements charmless is to give them too much credit. Rather, they’re hackneyed. However, they probably fit well enough within the Saturday-morning cartoon, newsreel, swashbuckler sensibility that informed the original Star Wars films from the 1970s. Problem is, those 1970s kids are grown and want something with greater gravitas than live-action space opera. Newer Star Wars audiences are stuck in permanent adolescence because of what cinema has become, with its superhero franchises and cynical money grabs.

As a teenager when the first trilogy came out, I wanted more of the mystical element — the Force — than I wanted aerial battles, sword fights, or chase scenes. The goofy robots, reluctant heroes, and bizarre aliens were fun, but they were balanced by serious, steady leadership (the Jedi) and a couple really bad-ass villains. While it’s known George Lucas had the entire character arc of Anakin Skywalker/Darth Vader in mind from the start, it’s also fair to say that no one quite knew in Episode 4 just how iconic Vader the villain would become, which is why his story became the centerpiece of the first two trilogies (how many more to come?). However, Anakin/Vader struggled with the light/dark sides of the Force, which resonated with anyone familiar with the angel/demon nomenclature of Christianity. When the Force was misguidedly explained away as midi-chlorians (science, not mysticism), well, the bottom dropped out of the Star Wars universe. At that point, it became a grand WWII analogue populated by American GIs and Nazis — with some weird Medievalism and sci-fi elements thrown in — except that the wrong side develops the superweapon. Rogue One makes that criticism even more manifest, though it’s fairly plain to see throughout the Star Wars films.

Let me single out one actor for praise: Ben Mendelsohn as Orson Krennic. It’s hard for me to decide whether he chews the scenery, upstaging Darth Vader as a villain in the one scene they share, or whether he’s among a growing gallery of underactors whose flat line delivery and blandness invite viewers to project upon them characterization telegraphed through other mechanisms (costuming, music, plot). Either way, I find him oddly compelling and memorable, unlike the foolish, throwaway, sacrificial band of rebellious rebels against the rebellion and empire alike. Having seen Ben Mendelsohn in other roles, I find he possesses an unusual screen magnetism that reminds me of Sean Connery. He tends to play losers and villains and be a little one-note (not a bag of tricks but just one trick), but he is riveting on-screen for the right reasons compared to, say, the ookiness of the two gratuitous CGI characters in Rogue One.

So Rogue One is a modestly enjoyable and ephemeral romp through the Star Wars universe. It delivers and yet fails to deliver, which is about as charitable as I can be.

Anthropologists, pundits, armchair cultural critics (like me), and others sometimes offer an aspect or characteristic, usually singular, that separates the human species from other animals. (Note: humans are animals, not the crowning creation of god in his own image, the dogma of major religions.) Typical singular aspects include tool use (very early on, fire), language, agriculture, self-awareness (consciousness), and intelligence, that last including especially the ability to conceptualize time and thus remember and plan ahead. The most interesting candidate suggested to me is our ability to kill from a distance. Without going into a list of things we don’t think we share with other species but surprisingly do, it interests me that none other possesses the ability to kill at a distance (someone will undoubtedly prove me wrong on this).

Two phrases spring to mind: nature is red in tooth and claw (Tennyson) and human life is nasty, brutish, and short (Hobbes). Both encapsulate what it means to have to kill to eat, which is hardly unique to animals. All sorts of plants, insects, and microorganisms embed themselves in hosts, sometimes killing the host and themselves. Symbiotic relationships also exist. The instance that interests me, though, is the act of killing in the animal kingdom that requires putting one’s own body at risk in life-or-death attack. Examples falling short of killing abound, such as intimidation to establish hierarchy, but to eat, an animal must kill its prey.

Having watched my share of historical fiction (pre-1800, say, but especially sword-and-sandal and medieval epics) on the TeeVee and at the cinema, the dramatic appeal of warring armies slamming into each other never seems to get old. Fighting is hand-to-hand or sword-to-sword, which amount to the same thing. Archers’ arrows, projectiles launched from catapults and trebuchets, thrown knives, spears, and axes, and pouring boiling oil over parapets are killing from a relatively short distance, but the action eventually ends up being very close. The warrior code in fighting cultures honors the willingness to put oneself in harm’s way, to risk one’s own body. Leaders often exhibit mutual respect and may even share some intimacy. War may not be directly about eating, since humans are not cannibals under most circumstances; rather, it’s usually about control of resources, so secondarily about eating by amassing power. Those historical dramas often depict victors celebrating by enjoying lavish feasts.

Modern examples of warfare and killing from a distance make raining down death from above a bureaucratic action undertaken with little or no personal risk. Artillery, carpet bombing from 20,000 feet, drone strikes (controlled from the comfort of some computer lab in the Utah desert), and nuclear bombs are the obvious examples. No honorable warrior code attaches to such killing. Indeed, the chain of command separates the execution of kill orders from moral responsibility — probably a necessary disconnect when large numbers of casualties (collateral damage, if one prefers the euphemism) can be expected. Only war criminals, either high on killing or banally impervious to empathy and compassion, would dispatch hundreds of thousands at a time.

If killing from a distance is in most cases about proximity or lack thereof, one further example is worth mentioning: killing across time. While most don’t really conceptualize the space-time continuum as interconnected, the prospect of choices made today manifesting in megadeath in the foreseeable future is precisely the sort of bureaucratized killing from a distance that should be recognized and forestalled. Yet despite our supposed intellectual superiority over other species, we cannot avoid waging war, real and rhetorical, to control resources and narratives that enable us to eat. Eating the future would be akin to consuming seed corn, but that metaphor is not apt. Better perhaps to say that we’re killing the host. We’re embedded in the world, as indeed is everything we know to be alive, and rely upon the profundity of the biosphere for survival. Although the frequent charge is that humanity is a parasite or has become a cancer on the world, that tired assessment, while more accurate than not, is a little on the nose. A more charitable view is that humanity, as the apex predator, has expanded its habitat to include the entire biosphere, killing to eat, and is slowly consuming and transforming it into a place uninhabitable by us, just as a yeast culture consumes its medium and grows to fill the space before dying all at once. So the irony, or Pyrrhic victory, is that while we may fatten ourselves (well, some of us) in the short term, we have also created conditions leading to our own doom. Compared to other species whose time on Earth lasted tens of millions of years, human life on Earth turns out to be exactly what Hobbes said: nasty, brutish, and short.

Once in a while, a comment sticks with me and requires additional response, typically in the form of a new post. This is one of those comments. I wasn’t glib in my initial reply, but I thought it was inadequate. When looking for something more specific about Neil Postman, I found Janet Sternberg’s presentation called Neil Postman’s Advice on How to Live the Rest of Your Life (link to PDF). The 22 recommendations that form Postman’s final lecture given to his students read like aphorisms and the supporting paragraphs are largely comical, but they nonetheless suggest ways of coping with the post-truth world. Postman developed this list before Stephen Colbert had coined the term truthiness. I am listing only the recommendations and withholding additional comment, though there is plenty to reinforce or dispute. See what you think.

  1. Do not go to live in California.
  2. Do not watch TV news shows or read any tabloid newspapers.
  3. Do not read any books by people who think of themselves as “futurists,”
    such as Alvin Toffler.
  4. Do not become a jogger. If you are one, stop immediately.
  5. If you are married, stay married.
  6. If you are a man, get married as soon as possible. If you are a woman,
    you need not be in a hurry.
  7. Establish as many regular routines as possible.
  8. Avoid multiple and simultaneous changes in your personal life.
  9. Remember: It is more likely than not that as you get older you will get
    dumber.
  10. Keep your opinions to a minimum.
  11. Carefully limit the information input you will allow.
  12. Seek significance in your work, friends, and family, where potency and
    output are still possible.
  13. Read’s Law: Do not trust any group larger than a squad, that is, about
    a dozen.
  14. With exceptions to be noted further ahead, avoid whenever possible
    reading anything written after 1900.
  15. Confine yourself, wherever possible, to music written prior to 1850.
  16. Weingartner’s Law: 95% of everything is nonsense.
  17. Truman’s Law: Under no circumstances ever vote for a Republican.
  18. Take religion more seriously than you have.
  19. Divest yourself of your belief in the magical powers of numbers.
  20. Once a year, read a book by authors like George Orwell, E.B. White, or
    Bertrand Russell.
  21. Santha Rama Rau’s Law: Patriotism is a squalid emotion.
  22. Josephson’s Law: New is rotten.

The last traffic report observed the 10-year anniversary of this blog. For this traffic report, I am on the cusp of achieving another significant threshold: 1,000 subscribers (just five more to go). A while back, I tried (without success) to discourage others from subscribing to this blog in hopes that it would provide responsive traffic. Since then, more than 700 new subscribers have appeared, many of them commercial blogs hawking things like photography, technology services (especially SEO), fashion, and celebrity gossip. I used to at least have one look at them, but I no longer do. The most incongruent (to those who are familiar with the themes of this blog) are the testimonial blogs in praise of (someone’s) god. If I could unsubscribe others on my end, I probably would; but alas, my basic WordPress blog does not have that feature.

So what besides the almost 1,000 subscribers has occurred here since the last report? Not a whole lot besides my regular handwringing about things still wrong in the world. There was that small matter of the U.S. presidential election, which garnered some of my attention, but that really falls within the wider context of the U.S. destroying itself in fits and starts, or even more generally, the world destroying itself in fits and starts. More than usual, I’ve reblogged and updated several old posts, usually with the suffix redux. I haven’t had any multipart blogs exploring ideas at length.

The Numbers

Total posts (not counting this one) are 474. Unique visitors are 22,017. Daily hits (views) range from 10 to 60 or so. Total hits are 95,081. Annual hits had climbed to about 12,500 in 2013 but have since declined steadily. The most-viewed post by far continues to be Scheler’s Hierarchy, with most of the traffic coming from the Philippines.

Doom Never Dies

Whereas the so-called greatest story ever told refers to Jesus for most people, I think the most important story ever told (and ignored) is how we humans drove the planet into the Sixth Extinction and in the process killed ourselves. I find more and more people simply acknowledging the truth of climate change (though not yet NTE) even as Republicans continue to deny it aggressively. Now that Republicans will control both houses of Congress and the White House (debatable whether Trump is truly a Republican), those already convinced expect not just an acceleration of weather-related calamity but accelerated stoking of the engine powering it. I leave you with this relevant quote from an article in Harper’s called “The Priest in the Trees“:

What must die is the materialist worldview in which physical reality is viewed as just stuff: “The world is not merely physical matter we can manipulate any damn way we please.” The result of that outlook is not just a spiritual death but a real, grisly, on-the-cross kind of death. “We are erecting that cross even now,” he said.

Addendum

A meaningless milestone (for me at least), but a milestone nonetheless:

[image: 1000-followers]

Back in the day, I studied jazz improvisation. Like many endeavors, it takes dedication and continuous effort to develop the ear and learn to function effectively within the constraints of the genre. Most are familiar with the simplest form: the 12-bar blues. Whether more attuned to rhythm, harmony, lyrics, or structure doesn’t much matter; all elements work together to define the blues. As a novice improviser, structure is easy to grasp and lyrics don’t factor in (I’m an instrumentalist), but harmony and rhythm, simple though they may be to understand, are formidable when one is making up a solo on the spot. That’s improvisation. In class one day, after two passes through the chord changes, the instructor asked me how I thought I had done, and I blurted out that I was just trying to fill up the time. Other students heaved a huge sigh of recognition and relief: I had put my finger on our shared anxiety. None of us were skilled enough yet to be fluent or to actually have something to say — the latter especially the mark of a skilled improviser — but were merely trying to plug the hole when our turn came.

These days, weekends feel sorta the same way. On Friday night, the next two days often feel like a yawning chasm where I plan what I know from experience will be an improvisation, filling up the available time with shifting priorities, some combination of chores, duties, obligations, and entertainments (and unavoidable bodily functions such as eating, sleeping, etc.). Often enough I go back to work with stories to tell about enviable weekend exploits, but just as often I have a nagging feeling that I’m still a novice with nothing much to say or contribute, just filling up the time with noise. And as I contemplate what years and decades may be left to me (if the world doesn’t crack up first), the question arises: what big projects would I like to accomplish before I’m done? That, too, seems an act of improvisation.

I suspect recent retirees face these dilemmas with great urgency until they relax and decide “who cares?” What is left to do, really, before one finally checks out? If careers are completed, children are raised, and most of life’s goals are accomplished, what remains besides an indulgent second childhood of light hedonism? Or more pointedly, what about one’s final years keeps it from feeling like quiet desperation or simply waiting for the Grim Reaper? What last improvisations and flourishes are worth undertaking? I have no answers to these questions. They don’t press upon me just yet with any significance, and I suffered no midlife crisis (so far) that would spur me to address the questions head on. But I can feel them gathering in the back of my mind like a shadow — especially with the specters of American-style fascism, financial and industrial collapse, and NTE looming.

The U.S. election has come and gone. Our long national nightmare is finally over; another one is set to begin after a brief hiatus. (I’m not talking about Decision 2020, though that spectre has already reared its ugly head.) Although many were completely surprised by the result of the presidential race in particular, having placed their trust in polls, statistical models, and punditry to project a winner (who then lost), my previous post should indicate that I’m not too surprised. Michael Moore did much better taking the temperature of the room (more accurately, the nation) than all the other pundits, and even if the result had differed, the underlying sentiments remain. It’s fair to say, I think, that people voted with their guts more than their heads, meaning they again voted their fears, hates, and above all, for revolutionary change. No matter that the change in store for us will very likely be destructive and against self-interest. Truth is, it would have ended in destruction with any of the candidates on the ballot.

Given the result, my mind wandered to Hillary Clinton’s book It Takes a Village, probably because we, the citizens of the United States of America, have effectively elected the village idiot to the nation’s highest office. Slicing and dicing the voting tallies between the popular vote, electoral votes, and states and counties carried will no doubt be done to death. Paths to victory and defeat will be offered with the handsome benefit of hindsight. Little of that matters, really, when one considers lessons never learned despite ample opportunity. For me, the most basic lesson is that for any nation of people, leaders must serve the interests of the widest constituency, not those of a narrow class of oligarchs and plutocrats. Donald Trump addressed the people far more successfully than did Hillary Clinton (with her polished political doubletalk) and appealed directly to their interests, however base and misguided.

My previous post called Barstool Wisdom contained this apt quote from The Brothers Karamazov by Dostoevsky:

The more stupid one is, the closer one is to reality. The more stupid one is, the clearer one is. Stupidity is brief and artless, while intelligence squirms and hides itself.

We have already seen that our president-elect has a knack for stating obvious truths no one else dares utter aloud. His clarity in that regard, though coarse, contrasts completely with Hillary’s squirmy evasions. Indeed, her high-handed approach to governance, more comfortable in the shadows, bears a remarkable resemblance to Richard Nixon, who also failed to convince the public that he was not a crook. My suspicion is that as Donald Trump gets better acquainted with statecraft, he will also learn obfuscation and secrecy. Some small measure of that is probably good, actually, though Americans are pining for greater transparency, one of the contemporary buzzwords thrown around recklessly by those with no real interest in it. My greater worry is that through sheer stupidity and bullheadedness, other obvious truths, such as commission of war crimes and limits of various sorts (ecological, energetic, financial, and psychological), will go unheeded. No amount of barstool wisdom can overcome those.