Archive for the ‘Tacky’ Category

Back in college, when I was just starting on my music education degree, I received an assignment in which students were asked to formulate a philosophy of education. My thinking then was influenced by a curious textbook I picked up: A Philosophy of Music Education by Bennett Reimer. Of course, it was the wrong time for an undergraduate to perform this exercise, as we had neither the maturity nor the understanding equal to the task. However, in my naïveté, my answer was all about learning/teaching an aesthetic education — one that focused on appreciating beauty in music and the fine arts. This requires the cultivation of taste, which used to be commonplace among the educated but is now anathema. Money is the preeminent value now. Moreover, anything that smacks of cultural programming and thought control is now repudiated reflexively, though such projects are nonetheless undertaken continuously and surreptitiously through a variety of mechanisms. As a result, the typical American’s sense of what is beautiful and admirable is stunted. Further, knowledge of the historical context in which the fine arts exist is largely absent. (Children are ahistorical in this same way.) Accordingly, many Americans are coarse philistines whose tastes rarely extend beyond those acquired naturally during adolescence (including both biophilia and biophobia), thus the immense popularity of comic book movies, rock and roll music, and all manner of electronica.

When operating with a limited imagination and undeveloped ability to perceive and discern (and disapprove), one is a sitting duck for what ought to be totally unconvincing displays of empty technical prowess. Mere mechanism (spectacle) then possesses the power to transfix and amaze credulous audiences. Thus, the ear-splitting volume of amplified instruments substitutes for true emotional energy produced in exceptional live performance, ubiquitous CGI imagery (vistas and character movements, e.g., fight skills, that simply don’t exist in reality) in cinema produces wonderment, and especially, blinking lights and animated GIFs deliver the equivalent of a sugar hit (cookies, ice cream, soda) when they’re really placebos or toxins. Like hypnosis, the placebo effect is real and pronounced for those unusually susceptible to induction. Sitting ducks.

Having given the fine arts (including their historical contexts) a great deal of my academic attention and having acquired an aesthetic education, I responded to the video below with something well short of the blasé relativism most exhibit; I actively dislike it.


For a variety of reasons, I go to see movies in the theater only a handful of times in any given year. The reasons are unimportant (and obvious), and I recognize that, by eschewing the theater, I’m giving up the crowd experience. Still, I relented recently and went to see a movie at a new AMC Dolby Cinema, which I didn’t even know existed. The first thing to appreciate was that it was a pretty big room, which used to be standard when cinema was first getting established in the 1920s but gave way sometime in the 1970s to multiplex theaters able to show more than one title at a time in little shoebox compartments with limited seating. Spaciousness was a welcome throwback. The theater also had oversized, powered, leather recliners rather than cloth, fold-down seats with shared armrests. The recliners were quite comfortable but also quite unnecessary (except for the now-typical Americans unable to fit their fat asses in what used to be a standard seat). These characteristics are shared with AMC Prime theaters, which dress up the movie-going experience and charge accordingly. Indeed, AMC now offers several types of premium cinema, including RealD 3D, Imax, Dine-In, and BigD.

Aside I: A friend only just reported on her recent trip to the drive-in theater, a dated cinema experience that is somewhat degraded, being unenhanced, yet retains its nostalgic charm for those of us old enough to remember as kids the shabby chic of bringing one’s own pillows, blankets, popcorn, and drinks to a double feature and sprawling out on the hood and/or roof of the car (e.g., the family station wagon). My friend actually brought her dog to the drive-in and said she remembered and sorta missed the last call on dollar hot dogs at 11 PM, which used to find all the kids madly, gleefully rushing the concession stand before the food ran out.

What really surprised me, however, was how the Dolby Cinema experience turned into a visual, auditory, and kinesthetic assault. True, I was watching Wonder Woman (sorry, no review), which is set in WWI and features lots of gunfire and munitions explosions in addition to the usual invincible superhero punchfest, so I suppose the point is partly to be immersed in the environment, a cinematic stab at verisimilitude. But the immediacy of all the wham-bam, rock ’em-sock ’em action made me feel more like a participant in a theater of war than a viewer. The term shell shock (a/k/a battle fatigue a/k/a combat neurosis) refers to the traumatized disorientation one experiences in moments of high stress and overwhelming sensory input; it applies here. Even the promo before the trailers and feature, offered to demonstrate the theater’s capabilities, was off-putting because of its unnecessary and overweening volume and impact. Unless I’m mistaken, the seats even have built-in subwoofers to rattle theatergoers from below when loud, concussive events occur, which is often, because, well, filmmakers love their spectacle as much as audiences do.

Aside II: One real-life lesson to be gleaned from WWI, or the Great War as it was called before WWII, went well beyond the simplistic truism that war is hell. It was that civility (read: civilization) had failed and human progress was a chimera. Technical progress, however, had made WWI uglier in many respects than previous warfare. It was an entirely new sort of horror. Fun fact: there are numerous districts in France, known collectively as la Zone Rouge, where no one is allowed to live because of all the unexploded ordnance (100 years later!). Wonder Woman ends up having it both ways: acknowledging the horrific nature of war on the one hand yet valorizing and romanticizing personal sacrifice and eventual victory on the other. Worse, perhaps, it establishes that there’s always another enemy in the wings (otherwise, how could there be sequels?), so keep fighting. And for the average viewer, the uniformed German antagonists are easily mistaken for the Nazis of the subsequent world war, a historical gloss I’m guessing no one minds … because … Nazis.

So here’s my problem with AMC’s Dolby Cinema: why settle for a routine or standard theater experience when it can be amped up to the point of offense? Similarly, why be content with the tame and fleeting though reliable beauty of a sunset when one can enjoy a widescreen, hyperreal view of cinematic worlds that don’t actually exist? Why settle for the subtle, old-timey charm of the carousel (painted horses, dizzying twirling, and calliope music) when instead one can strap in and get knocked sideways by roller coasters so extreme that riders leave wobbly and crying at the end? (Never mind the risk of being stranded on the tracks for hours, injured, or even killed by a malfunction.) Or why bother attending a quaint symphonic band concert in the park or an orchestral performance in the concert hall when instead one can go to Lollapalooza and see/hear/experience six bands in the same cacophonous space grinding it out at ear-splitting volume, along with laser light shows and flash-pot explosions for the sheer sake of goosing one’s senses? Coming soon are VR goggles that trick the wearer’s nervous system into accepting that they are actually in the virtual game space, often a first-person shooter in which they kill bugs or aliens or criminals without compunction. Our arts and entertainments have truly gotten out of hand.

If those criticisms don’t register, consider my post from more than a decade ago on the Paradox of the Sybarite and Catatonic, which argues that our senses are so overwhelmed by modern life that we’re essentially numb from overstimulation. Similarly, let me reuse this Nietzsche quote (used before here) to suggest that, on an aesthetic level, we’re not being well served by the display and execution of refined taste so much as being whomped over the head and dragged (willingly?) through ordeals:

… our ears have become increasingly intellectual. Thus we can now endure much greater volume, much greater ‘noise’, because we are much better trained than our forefathers were to listen for the reason in it. All our senses have in fact become somewhat dulled because we always inquire after the reason, what ‘it means’, and no longer for what ‘it is’ … our ear has become coarsened. Furthermore, the ugly side of the world, originally inimical to the senses, has been won over for music … Similarly, some painters have made the eye more intellectual, and have gone far beyond what was previously called a joy in form and colour. Here, too, that side of the world originally considered ugly has been conquered by artistic understanding. What is the consequence of this? The more the eye and ear are capable of thought, the more they reach that boundary line where they become asensual. Joy is transferred to the brain; the sense organs themselves become dull and weak. More and more, the symbolic replaces that which exists … the vast majority, which each year is becoming ever more incapable of understanding meaning, even in the sensual form of ugliness … is therefore learning to reach out with increasing pleasure for that which is intrinsically ugly and repulsive, that is, the basely sensual. [italics not in original]

The video below, which shows a respectable celebrity, violinist/conductor Itzhak Perlman, being dicked around in an interview he probably undertook in good faith, came to my attention recently. My commentary follows.

Publicized pranks and gotchas are by no means rare. Some are good-natured and quite funny, but one convention of the prank is to unmask it pretty quickly. In the aftermath, the target typically either laughs it off, leaves without comment, or, less often, storms out in disgust. Andy Kaufman as “Tony Clifton” was probably among the first to sustain a prank well past the point of discomfort, never unmasking himself. Others have since gotten in on the antics, though the results probably involve no worse dickishness (dickery?) than Kaufman’s.

Fake interviews by comedians posing as news people are familiar to viewers of The Daily Show and its spinoff The Colbert Report (its run now completed). Zack Galifianakis does the same schtick in Between Two Ferns. It always surprises me when targets fall into the trap, exposing themselves as clueless ideologues willing to be hoisted with their own petards. However, Colbert in particular balanced his arch Republican stage persona with an unmistakable respect for his interview subject, which was at times inspired. Correspondents from The Daily Show are frequently pretty funny, but they almost never convey any respect for the subjects of the interview. Nick Canellakis (shown above) apparently has a whole series of interviews with classical musicians where he feigns idiocy and insult. Whereas some interview subjects are media savvy enough to get the joke and play along, I find this attempt at humor tasteless and unbearable.

Further afield, New Media Rockstars features a burgeoning list of media hosts who typically operate cheaply over the Web via YouTube, supported by an array of social media. At least one, Screen Junkies (the only one I watch), has recently grown into an entire suite of shows. I won’t accuse them all of being talentless hacks or of dicking people around for pointless yuks, but I often pause to wonder what makes the shows worth producing beyond the hosts’ embarrassingly encyclopedic knowledge of comics, cartoons, TV shows, movies, etc. They’re fanboys (and girls) who have leveraged their misspent youth and eternal adolescence to gush and gripe about their passions. Admittedly, this may not be so different from sports fanatics (especially human statisticians), opera geeks, and nerds of other stripes.

Throwaway media may have unintentionally smuggled in tasteless shenanigans such as those by Nick Canellakis. Various comedians (unnamed) have similarly offered humorless discomfort as entertainment. Reality TV shows explored this area a while back, which I called trainwreck television. Cheaply produced video served over the Web has unleashed a barrage of dreck in all these categories. Some shows may eventually find their footing and become worthwhile. In the meantime, I anticipate seeing plenty more self-anointed media hosts dicking around celebrities and audiences alike.

The English language has words for everything, and whenever something new comes along, we coin a new word. The latest neologism I heard is bolthole, which refers to the location one bolts to when collapse and civil unrest reach intolerable proportions. At present, New Zealand is reputed to be the location of boltholes purchased and kept by the ultrarich; it has the advantage of being in the Southern Hemisphere, remote from the hoi polloi yet reachable by private plane or oceangoing yacht. Actually, bolthole is an older term now being repurposed, but it seems hip and current enough to be new coin.

Banned words are the inverse of neologisms, not in the normal sense that they simply fall out of use but in that their use is actively discouraged. Every kid learns this early on when a parent or older sibling slips and lets an “adult” word pass his or her lips that the kid isn’t (yet) allowed to use. (“Mom, you said fuck!”) George Carlin made a whole routine out of dirty words (formerly) banned from TV. Standards have been liberalized since the 1970s, and now people routinely swear or refer to genitalia on TV and in public. Sit in a restaurant or ride public transportation (as I do), eavesdrop on a little of the speech within easy earshot (especially private cellphone conversations), and just count the casual F-bombs.

The worst field of banned-words nonsense is political correctness, which is intertwined with identity politics. All the slurs and epithets directed at, say, racial groups ought to fall into disuse, no doubt, but we overcompensate by renaming everyone (“____-American”) to avoid terms that carry little or no derogation. Even more ridiculous, at least one egregiously insulting term has been reclaimed as a badge of honor, an unbanned banned word, by the very group it was used to oppress. It takes Orwellian doublethink to hear that term — you all know what it is — used legitimately but exclusively by those allowed to use it. (I find it wholly bizarre yet fear to wade in with my own prescriptions.) Self-disparaging language, typically in a comedic context, gets an unwholesome pass, but only if one is within the identity group. (Women disparage women, gays trade on gay stereotypes, Jews indulge in jokey anti-Semitism, etc.) We all laugh and accept it as safe, harmless, and normal. President Obama is continuously mixed up in controversies over appearances (“optics”), or over what to call things — or not call them, as the case may be. For instance, his apparent refusal to call terrorism originating in the Middle East “Muslim terrorism” has been met with controversy.

I’m all for calling a thing what it is, but the term terrorism is too loosely applied to any violent act committed against (gasp!) innocent Americans. Recent events in Charleston, SC, garnered the terrorism label, though other terms would be more apt. Further, there is nothing intrinsically Muslim about violence and terrorism. Yeah, sure, Muslims have a word or doctrine — jihad — but it doesn’t mean what most think or are led to believe it means. Every religion across human history has some convenient justification for the use of force, mayhem, and nastiness to promulgate its agenda. Sometimes it’s softer and inviting, other times harder and more militant. Unlike Bill Maher, however, circumspect thinkers recognize that violence used to advance an agenda, like words used to shape narratives, is not the province of any particular hateful or hate-filled group. Literally everyone does it to some extent. Indeed, the passion with which anyone pursues an agenda is paradoxically celebrated and reviled depending on content and context, and it’s a long, slow, ugly process of sorting to arrive at some sort of Rightthink®, which then becomes conventional wisdom before crossing over into political correctness.

If I were to get twisted and strained over every example of idiocy on parade, I’d be permanently distorted. Still, a few issues have crossed my path that might be worth bringing forward.

Fealty to the Flag

An Illinois teacher disrespected the American flag during a classroom lesson on free speech. The context provided in this article is pretty slim, but it would seem to me that a lesson on free speech might be precisely the opportunity to demonstrate that tolerating discomfiting counter-opinion is preferable to the alternative: squelching it. Yet in response to complaints, the local school board voted unanimously to fire the teacher who gave the offending lesson. The ACLU ought to have a field day with this one, though I must admit there can be no convincing some people that desecrating the flag is protected free speech. Some will remember going round and round on this issue a few years ago over a proposed Constitutional amendment. Patriots stupidly insist on carving out an exception to free speech protections when it comes to the American flag, which shows quite clearly that they are immune to the concept behind the 1st Amendment, which says this:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances. [emphasis added]

Naturally, interpretations of the Bill of Rights vary widely, but it doesn’t take a Constitutional scholar to parse the absolute character of these rights. Rights are trampled all the time, of course, as the fired Illinois teacher just found out.

Fealty to the Wrong Flag

The Confederate battle flag has come back into the national spotlight following the racially motivated events in Charleston, SC. (Was it ever merely a quaint, anachronistic, cultural artifact of the American South?) CNN has a useful article separating fact from fiction, yet some Southerners steadfastly defend the flag. As a private issue of astonishingly poor taste, idiocy, and free speech, individuals should be allowed to say what they want and fly their flags at will, but as a public issue for states and/or institutions that still fly the flag or emblazon it on websites, letterhead, etc., it’s undoubtedly better to give up this symbol and move on.

I’m not a serious cineaste, but I have offered a few reviews on The Spiral Staircase. There are many, many cineastes out there, though, and although cinema is now an old medium (roughly 100 years old), cineastes tend to be on the younger side of 35 years. Sure, lots of established film critics are decidedly older, typically acting under the aegis of major media outlets, but I’m thinking specifically of the cohort who use new, democratized media (e.g., cheap-to-produce and -distribute YouTube channels) to indulge their predilections. For example, New Media Rockstars has a list of their top 100 YouTube channels (NMR No. 1 contains links to the rest). I have heard of almost none of them, since I don’t live online like so many born after the advent of the Information/Communications Age. The one I pay particular attention to is Screen Junkies (which includes Honest Trailers, the Screen Junkies Show, and Movie Fights), and I find their tastes run toward childhood enthusiasms that mire their criticism in a state of permanent adolescence and self-mocking geekdom. The preoccupation with cartoons, comic books, action figures, superheroes, and popcorn films couldn’t be more clear. Movie Fights presumes to award points based on the passion, wit, and rhetoric of the fighters rather than the quality of the films they choose to defend. However, adjudication is rarely neutral, since trump cards tend to get played when a superior film or actor is cited against an inferior one.

So I happened to catch three recent flicks that are central to the Screen Junkies canon: Captain America: Winter Soldier, The Avengers: Age of Ultron, and Transformers: Age of Extinction (links unnecessary). They all qualify as CGI festivals — films centered on hyperkinetic action rather than story or character (opinions differ, naturally). The first two originate from the MCU (acronym alert: MCU = Marvel Cinematic Universe, which is lousy with comic book superheroes) and the last is based on a Saturday-morning children’s cartoon. Watching grown men and a few women on Screen Junkies getting overexcited about content originally aimed at children gives me pause, yet I watch them to see what the fighters say, knowing full well that thoughtful remarks are infrequent.

Were I among the fighters (no chance, since I don’t have my own media fiefdom), I would likely be stumped when a question demands immediate recall (by number, as in M:I:3 for the third Mission Impossible film) of a specific entry from any of the numerous franchises pumping out films regularly, like those named above. Similarly, my choices would not be limited, as theirs are, to films released after 1990, a year that falls within the childhood of most of the fighters who appear. Nor would my analysis be so embarrassingly visual in orientation, since I understand good cinema to be more about story and character than whiz-bang effects.

Despite the visual feast fanboys adore (what mindless fun!), lazy CGI festivals suffer worst from overkill, far outstripping the eye’s ability to absorb onscreen action fully or effectively. Why bother with repeat viewings of films that offer little payoff in the first place? CGI characters were interesting in and of themselves the first few times they appeared in movies without breaking suspension of disbelief, but now they’re so commonplace that they feel like cheating. Worse, moviegoers are now faced with so many CGI crowds, clone and robot armies, zombie swarms, human-animal hybrids, et cetera ad nauseam, that little holds the interest of jaded viewers. Thus, because so few scenes resonate emotionally, sheer novelty substitutes (ineffectively) for meaning, not that most chases or slugfests in the movies offer much that is truly original. The complaint is heard all the time: we’ve seen it before.

Here’s my basic problem with the three CGI-laden franchise installments I saw recently: their overt hypermilitarism. When better storytellers such as Kubrick or Coppola make films depicting the horrors of war (or other existential threats, such as the ever-popular alien invasion), their perspective is indeed that war is horrible, and obvious moral and ethical dilemmas flow from there. When hack filmmakers pile up frenzied depictions of death and destruction, typically with secondary or tertiary characters whose dispatch means and feels like nothing, and with destroyed cities eliciting no emotional response because it’s all pure visual titillation, they have no useful, responsible, or respectable commentary. Even the Screen Junkies recognize that, unlike in, say, Game of Thrones, none of these putative superheroes really faces much more than momentary distress before saving the day in the third act, and certainly no lasting injury (a little make-up blood doesn’t convince me). Dramatic tension simply drains away, since happy resolutions are never in doubt. Characters taking fake beatdowns are laughter-inducing, sorta like professional wrestling after the sheepish admission that the wrestlers have been acting all along. Frankly, pretend drama with nothing at stake is a waste of effort and of the audience’s time and trust. That so many fanboys enjoy being goosed or that some films make lots of money is no justification. The latter is one reason why cinema so often fails to rise to the aspiration of art: it’s too bound up in grubbing for money.

A Surfeit of Awards

Posted: January 29, 2015 in Culture, Education, Idle Nonsense, Tacky, Taste

/rant on

I get alumni magazines from two colleges/universities I attended. These institutional organs are unapologetic boosters of the accomplishments of alumni, faculty, and students. They also trumpet never-ending capital campaigns, improvements to facilities, and new and refurbished buildings. The latest round of news from my two schools features significant new and rebuilt structures, accompanied by the naming of those structures after the foundations, contributors, and faculty/administrators associated with their execution. Well and good, you might surmise, but I always have mixed feelings. No doubt there are certain thresholds that must be met for programs to function and excel: stadia and gyms, locker rooms, concert halls and theaters, practice and rehearsal spaces, equipment, computer labs, libraries and their holdings, etc. Visiting smaller schools with inadequate facilities always brought that point home. Indeed, that’s one of the reasons why anyone chooses a school: for the facilities.

Since the late sixties or so, I have witnessed one school after another (not just in higher education) becoming what I think of as lifestyle schools. Facilities are not merely sufficient or superior; they range into the lap of luxury and excess. It’s frankly embarrassing that the quality and furnishings of dormitories now exceed what most students will enjoy for decades post-graduation. In my college years, no one found it the slightest bit embarrassing to have meager accommodations. That’s not why one was there. Now the expectation is to luxuriate. Schools clearly compete to attract students using a variety of enticements, but delivering the best lifestyle while in attendance was formerly not one of them. But the façades and accoutrements are much easier to evaluate than the academic programs, which have moved in the opposite direction. Both are now fraudulent at many schools; it’s a game of dress-up.

That rant, however, may be only the tip of the proverbial iceberg. I cannot escape the sense that we celebrate ourselves and our spurious accomplishments with amazing disregard for their irrelevance. Unlike many who dream of achieving immortality by proxy, I am confounded by the desire to see one’s name on the side of a building, in a hall of fame, on an endowed chair, etched in a record book, or otherwise gouged into posterity. Yet I can’t go anywhere without finding another new feature named after someone, usually posthumously but not always, whose memory must purportedly be preserved. (E.g., Chicago recently renamed the Circle Interchange after its first and only female mayor, Jane Byrne, causing some confusion due to inadequate signage.) The alumni magazines were all about newly named buildings, chairs, scholarships, halls, bricks, and waste cans. It got to be sickening. The reflex is now established: someone gives a pile of money or teaches (or administers) for a time, so something gets named after him or her. And as we enter championship and awards season in sports and cinema, the surfeit of awards doled out, often just for showing up and doing one’s job, is breathtaking.

Truly memorable work and achievement need no effusive praise. They are perpetuated through subscription. Yet even they, as Shelley reminds us, pass from memory eventually. Such is the way of the world over the long stretches of time (human history) we have inhabited it. Readers of this blog will know, in fairly awful terms, that this time is rapidly drawing to a close due to a variety of factors, but primarily because of our own prominence. So one might wonder: why all this striving and achieving and luxuriating and self-celebrating when the end of it all is our own destruction?

/rant off

I don’t normally concern myself overly much with B movies. I may watch one while munching my popcorn, but they hardly warrant consideration beyond the time lost while plopped in front of the screen. My first thought about World War Z is that there hasn’t been another case of a special effect in search of a story since, well, any of the films from the Transformers franchise (a new one is due out in a couple of weeks). WWZ is a zombie film — the kind with fast zombies (running, jumping, and busting their heads through glass instead of just lumbering around) who transform from the living into the undead in under 20 seconds. None of this works without the genre being well established for viewers. Yet World War Z doesn’t hew to the implicit understanding that it should essentially be a snuff film, concocting all manner of never-before-seen gore from dispatching them-no-longer-us. Instead, its main visual play is distant CGI crowd scenes (from helicopters — how exciting!) of self-building Jenga piles of zombies.

Two intertwined stories run behind the ostensible zombie dreck: (1) an investigation into the origin of the viral outbreak that made the zombies, leading to a pseudo-resolution (not quite a happy ending) Hollywood writers apparently find obligatory, and (2) reuniting the investigator with his family, from whom he has been separated because he’s the kind of reluctant hero with such special, unique skills that he’s extorted into service by his former employer. Why an A-list actor such as Brad Pitt agrees to associate himself with such moronic fare is beyond me. The character could have been played by any number of action stars aging past their ass-kicking usefulness as we watch: Bruce Willis, John Travolta, Nicolas Cage, Pierce Brosnan, Mel Gibson, Liam Neeson, Wesley Snipes, Keanu Reeves (who can at least project problem-solving acumen), and Sylvester Stallone, just to name a few. This list could actually go on quite a bit further.

This is the kind of film for which the term suspension of disbelief was coined. The implausibly fortunate survival of the hero through a variety of threats is assured, tying the story together from front to back, which is a cliché that drains dramatic tension out of the story despite everyone around him perishing. I was curious to read P.Z. Myers’ rant discussing the awful science of World War Z, which also observes plot holes and strategic WTFs. The bad science doesn’t stick in my craw quite like it does for Myers, but then, my science background is pretty modest. Like so many fight scenes in action movies where the hero is never really injured, I just sorta go with it.

What really interests me about WWZ, however, is that it presents yet another scenario (rather uninspired, actually) of what might happen when society breaks apart. Since the film features a fast crash where everything goes utterly haywire within hours — yet the electrical grid stays up — the first harrowing scene is the family fleeing, first in a car and then in a commandeered mobile home, before seeking temporary refuge in a tenement. The main character states early on that people on the move survive and people who hunker down are lost. That may be true in a theater of war, but I can’t judge whether it’s also true in a virulent-contagion scenario. In any case, the investigator alternates between movement and refuge as his situation changes.

Because the zombie horde is a functionally external threat, survivors (however temporary) automatically unite and cooperate. This behavior is borne out in various real-world responses to fast-developing events. However, slow-mo threats without a convenient external enemy, such as the protracted industrial collapse we’re now experiencing in the real world, provide a different picture: dog eating dog, everyone fighting to survive another day. Such alternatives cause many who foresee extraordinary difficulties in the decades ahead to wish for events to break over civilization like a tsunami, taking many all at once and uniting those unlucky enough to survive. But until that happens, we’re faced with slow death by a thousand cuts.

Any given species has its unique behaviors and preferred habitat, inevitably overlapping with others that are predator or prey. The human species has spread geographically to make nearly the entire world its habitat and every species its prey (sometimes unintentionally). But it’s a Pyrrhic success, because for the ecosystem to work as our habitat as well as theirs, diversity and abundance are needed. As our numbers have expanded to over 7 billion, nonhuman populations have often declined precipitously (when we don’t farm them for food). When we humans are not otherwise busy hunting, harvesting, and exterminating, we harass them and claim their habitats as uniquely our own. Our unwillingness to share space and/or tolerate their presence except on our own terms is audacious, to say the least.


Morris Berman came forward with an interesting bit about the New Monastic Individual (NMI), first described in his book The Twilight of American Culture. He wrote two additional books to complete his second trilogy: Dark Ages America (also the title of his blog) and Why America Failed, taking from the latter the initials WAF and, from those, the term Wafer to denote followers, commentators, acolytes, and habitués of his blog. I hesitate to quote too liberally, since Prof. Berman sometimes puts up copyright notices at the ends of his blog posts; I’ll redact this at the slightest whiff of an infringement challenge:

  1. Wafers recognize that 99% of those around them, if they are living in the United States, are basically stupid and nasty. This is not said so much as a judgment as a description: it’s simply the way things are, and these things are not going to change any time soon. Wafers know this, and they accept it.
  2. The lives of Wafers are driven by knowledge, not fear or fantasy. They are living in reality, in short, not drowning in the mass illusions of contemporary America.
  3. Wafers are serious about their lives. They are not here on this earth to waste time, to piss their lives away on other people’s agendas, as are most Americans — right up to and including the president. Their goals are truth, love, and joy, and they are dedicated to pursuing them.
  4. Finally, Wafers feel sorry for non-Wafers, and if they can, try to help them. They recognize, of course (see #1), that most cannot be helped; but if they come across someone who shows signs of potential Waferdom, of awakening to the three points mentioned above, they try to fish them out of the drink, so to speak, and set them on the path of dignity, intelligence, integrity, and self-respect. Noblesse oblige, that sort of thing.

Numbers 1–3 are well and good. I’ve been a subscriber since Twilight was published. Evidence for the negative assessments is obvious and easy to obtain. Carving out a special place for a few Wafers to congratulate themselves (no. 4), however, strikes me as pissy and ungracious. But this isn’t precisely what I want to blog about. Rather, it’s how a former intellectual model of mine has fallen into disgrace, not that he would recognize or admit it. (This is IMO worse than the irrelevance complained about at his blog.) Prof. Berman was among the first to awaken in me a real curiosity about the deeper stories behind the cheap façades offered by most historical accounts, which form a dissatisfying consensus reality. I don’t possess the academic wherewithal to emulate him, but I’m a critical reader and can synthesize a lot of information.
