I admit it: I’m a bit triggered. The storming of the U.S. Capitol Building last week, even though it was over in one day, sent a lot of us back to the drawing board, wondering how things could come to that. Not that civil unrest, attempted coups and secession, and even revolution haven’t been predicted for months. Still, the weirdness of this particular manifestation of citizen frustrations is hard to fathom. See, for instance, this blog post, which offers a reckoning not easy to face. Simply put, crowds that form into protests and physical occupations fully recognize their abandonment at the hands of oligarchs and political leaders and as a result act out their desperation and nihilism. Their question becomes “why not take over and occupy a building?” Doesn’t matter; nothing to lose anymore. It’s already all gone. Whether it’s a college administrative building, governor’s mansion, federal or state office building, or the U.S. Capitol Building, the sentiment appears to be the same: why the hell not? Doesn’t matter that there was no plan for what to do once the building was breached; doesn’t matter that it wasn’t occupied for long; doesn’t matter that property was damaged; doesn’t matter that lives were ruined and lost; doesn’t matter that no replacement government or executive was installed as a real coup or revolution would demand. It still works as an expression of outrage over the dysfunctions of society.

On the bright side, actual death and injury were quite limited compared to what might have obtained. Mayhem was largely limited to property destruction. Plus, it was a potent reminder to legislators (filmed scrambling for safety) that maybe they ought to fear backing the citizenry into corners with nowhere to turn. Conjecture that, had the racial make-up of the protesters been different, a massacre would have ensued remains just that: conjecture.


The end of every U.S. presidential administration is preceded by a spate of pardons and commutations — the equivalents of a get-out-of-jail-free card offered routinely to collaborators with the outgoing executive and general-purpose crony capitalists. This practice, along with diplomatic immunity and supranational elevation of people (and corporations-as-people) beyond the reach of prosecution, is a deplorable workaround circumventing the rule of law. Whose brilliant idea it was to offer special indulgence to miscreants is unknown to me, but it’s pretty clear that, with the right connections and/or with enough wealth, you can essentially be as bad as you wanna be with little fear of real consequence (a/k/a too big to fail a/k/a too big to jail). Similarly, politicians, whose very job it is to manage the affairs of society, are free to be incompetent and destructive in their brazen disregard for the needs of the citizenry. Only modest effort (typically a lot of jawing directed at the wrong things) is necessary to enjoy the advantages of incumbency.

In this moment of year-end summaries, I could choose from among an array of insane, destructive, counterproductive, and ultimately self-defeating nominees (behaviors exhibited by the elite powers that be) as the very worst, the baddest of the bad. For me, in the largest sense, that would be the abject failure of the rule of law (read: restraints), which has (so far) seen only a handful of high-office criminals prosecuted successfully for their misdeeds and malfeasance (special investigations leading nowhere and failed impeachments don’t count). I prefer to be more specific. Given my indignation over the use of torture, that would seem an obvious choice. However, those news stories — including the ongoing torture of Julian Assange for essentially revealing truths cynics like me already suspected and now know to be accurate — have been shoved to the back burner, where they generate little heat. Instead, I choose war as the very worst, an example of the U.S. (via its leadership) being as bad as it can possibly be. The recent election cycle offered a few candidates who bucked the consensus that U.S. involvement in every unnecessary, undeclared war since WWII is justified. They were effectively shut out by the military-industrial complex. And as the incoming executive tweeted on November 24, 2020, “America’s back, baby!” Ready to do our worst again (read: some more, since we [the U.S. military] never stopped [making war]). A sizeable portion of the American public is aligned with this approach, too.

So rule of law has failed and we [Americans] are infested with crime and incompetence at the highest levels. Requirements, rights, and protections found in the U.S. Constitution are handily ignored. That means every administration since Truman has been full of war criminals, because torture and elective war are crimes. The insult to my sensibilities is far worse than the unaffordability of war, the failure to win or end conflicts, or the lack of righteousness in our supposed cause. It’s that we [America, as viewed from outside] are belligerent, bellicose aggressors. We [Americans] are predators. And we [Americans, but really all humans] are stuck in an adolescent concept of conduct in the world shared with animals that must kill just to eat. We [humans] make no humanitarian progress at all. But the increasing scale of our [human] destructiveness is progress if drones, robots, and other DARPA-developed weaponry impress.


I’ve reached another crossroads. Chalk it up to pandemic exhaustion at being mostly cooped up for the better part of a year. Of course, this state is on top of other sources of exhaustion (politics, doom, the grinding news cycle) that drained my enthusiasm for things I used to do before meaningful (to me) endeavors were all cancelled and everyone was forced to search for meaning staring at surfaces (e.g., walls, pages, and screens — especially screens for most Americans, I daresay). So as the year and decade draw to a close, I anticipate a spate of lists and summaries as we move into 2021 with the hope it won’t be worse than 2020 — a faint hope, I might add, since nothing has been resolved except perhaps (!) which listless septuagenarian gets to sit in the Oval Office. The jury is still out whether vaccines will have the intended effect.

Aside: The calendar is not a timer or odometer. So although we change the calendar to 2021, the new year is the first year of the new decade (the third decade of the 21st century, obviously). We struggled with this issue at the end of the previous century/millennium, which turned over when 2000 became 2001, not, as more popularly celebrated, when 1999 became 2000. This discrepancy arises because calendars begin counting each month, year, etc. with 1, not 0. So the first ten counting numbers are 1–10, not 0–9, and all decades run from xx01 to xx10. However, timers and odometers begin counting at 0 and show elapsed intervals, so the first ten minutes or miles run from the start (at 0) through the end of 9, at which point the odometer in particular rolls over to 10 and begins a new sequence. I realize I’m being a pointy-headed snoot about this, but it’s a relatively easy concept to understand. Innumeracy evident among the public is a microcosm of all the other easy concepts so badly misunderstood.
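For the similarly pointy-headed, the off-by-one distinction is easy to verify in a few lines of code (a toy illustration of the counting rules above, nothing more):

```python
# Calendars are 1-indexed: each decade runs from xx01 through xx10.
def decade_of(year):
    """Return the (start, end) years of the calendar decade containing `year`."""
    start = ((year - 1) // 10) * 10 + 1
    return start, start + 9

# Odometers and timers are 0-indexed: the first ten miles are 0 through 9,
# rolling over to 10 at the start of the eleventh mile.
def odometer_decade(mile):
    start = (mile // 10) * 10
    return start, start + 9

print(decade_of(2000))     # (1991, 2000): 2000 closes the old decade
print(decade_of(2021))     # (2021, 2030): 2021 opens the new one
print(odometer_decade(9))  # (0, 9): mile 9 still belongs to the first ten
```

The `year - 1` in the calendar version is exactly the adjustment innumerate intuition skips.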

I’ve admitted to feelings of exhaustion and defeat numerous times, and indeed, hope eludes me whilst a narrow group of things still produces enjoyment. But my blogroll is no longer one of those things. I recently wrote the following to an acquaintance of mine:

Now that collapse narratives have matured, over two decades old for some (about 14 years for me), I notice that existential threats are still too remote and contingent for most to do more than signal some vague level of awareness and/or concern before returning to normal life. A few who sank into it deeply recognized that nothing positive comes out of it and have retreated from public life, or at least from ongoing tracking and reporting. Several of the sites I used to frequent for news and perspective have dried up, and my finding is that adding more awfulness to the pile doesn’t enhance my understanding anymore, so I’ve also largely stopped gathering information. I still cite collapse frequently at my doom blog, but I have other things to write about.

I’m one of those who sank into the collapse narrative rather deeply and blogged about it consistently. By now, the sole available positive outcome has manifested: the recognition (and with it, resignation) that nothing will or can be done to avert disaster. So I’m dumping the doom and inactive blogs from my blogroll. I’ll continue to blog about and bear witness to the gathering storm: the cascade failure of industrial civilization. It’s proven to be a more protracted process than expected (at least by me), but there are no promises that sea-level rise or the expiration of flora and fauna will hold off until the conclusion of the century in 2100. Human habitat will continue to diminish decade by decade, and at some point, so will human population — already shown to be rather precariously perched on an illusory safety and security we take as business as usual. I’ll keep a couple of the respectable truth-telling blogs just to have something to which to link. I have no links to add at this point.

David Sirota, author of Back to our Future: How the 1980s Explain the World We Live in Now — Our Culture, Our Politics, Our Everything (2011), came to my attention (how else?) through a podcast. He riffed pretty entertainingly on his book, now roughly one decade old, like a rock ‘n’ roller stuck (re)playing his or her greatest hits into dotage. However, his thesis was strong and appealing enough that I picked up a copy (read: borrowed from the library) to investigate despite the datedness of the book (and my tardiness). It promised to be an easy read.

Sirota’s basic thesis is that memes and meme complexes (a/k/a memeplexes, though Sirota never uses the term meme) developed in the 80s and deployed through a combination of information and entertainment media (thus, infotainment) form the narrative background we take for granted in the early part of the 21st century. Children fed a steady diet of clichés, catchphrases, one-liners, archetypes, and story plots have now grown to adulthood and are scarcely able to peer behind the curtain to question the legitimacy or subtext of the narrative shapes and distortions imbibed during childhood like mother’s milk. The table of contents lists four parts (boldface section titles are Sirota’s; descriptive text is mine):

  • Liking Ike, Hating Woodstock. How the 50s and 60s decades were (the first?) assigned reductive demographic signifiers, handily ignoring the true diversity of experience during those decades. More specifically, the boom-boom 50s (economics, births) were recalled nostalgically in 80s TV and films while the 60s were recast as being all about those dirty, hairy hippies and their music, drugs, and sexual licentiousness, all of which had to be invalidated somehow to regain lost wholesomeness. The one-man promotional vehicle for this pleasing self-deception was Michael J. Fox, whose screen personae (TV and film) during the 80s (glorifying the 50s but openly shitting on the 60s) were instrumental in reforming attitudes about our mixed history.
  • The Jump Man Chronicles. How the Great Man Theory of History was developed through glorification of heroes, rogues, mavericks, and iconoclasts who came into their own during the 80s. That one-man vehicle was Michael Jordan, whose talents and personal magnetism were so outsized that everyone aspired to be “like Mike,” which is to say, a superhero elevated beyond mere mortal rules and thus immortalized. The effect was duplicated many times over in popular culture, with various entertainment icons and political operatives subverting thoughtful consideration of real-world problems in favor of jingoistic portrayals.
  • Why We (Continue to) Fight. How the U.S. military was rehabilitated after losing the Vietnam War, gifting us with today’s hypermilitarism and permanent wars. Two principal tropes were deployed to shape public opinion: the Legend of the Spat-Upon Veteran and the Hands-Tied-Behind-Their-Backs Myth. Each was trotted out reliably whenever we needed to misremember our past as fictionalized in the 80s.
  • The Huxtable Effect. How “America’s dad” helped accommodate race relations to white anxiety, primarily to sell a TV show. In contrast with various “ghetto TV” shows of the 70s that depicted urban working poor (various ethnicities), The Cosby Show presented an upscale black family who transcended race by simply ignoring the issue — a privilege of wealth and celebrity. The Obama campaign and subsequent administration copied this approach, pretending American society had become postracial despite his never truly being able to escape the modifier black because the default (no modifier needed) in America is always white. This is the most fraught part of the book, demonstrating that despite whatever instructions we get from entertainment media and pundits, we remain stuck in an unresolved, unhealed, inescapable trap.


Returning to the subject of this post, I asserted that the modern era frustrates a deep, human yearning for meaning. As a result, the Medieval Period, and to a lesser degree, life on the highroad, became narrative fixations. Had I time to investigate further, I would read C.S. Lewis’ The Discarded Image (1964), but my reading list is already overfull. Nonetheless, I found an executive summary of how Lewis describes the Medieval approach to history and education:

Medieval historians varied in that some of them were more scientific, but most historians tried to create a “picture of the past.” This “picture” was not necessarily based in fact and was meant more to entertain curiosity than to seriously inform. Educated people in medieval times, however, had a high standard for education composed of The Seven Liberal Arts of grammar, dialectic, rhetoric, arithmetic, music, geometry, and astronomy.

In the last chapter, Lewis summarizes the influence of the Medieval Model. In general, the model was widely accepted, meaning that most people of the time conformed to the same way of thinking. The model, he reiterates, satisfied imagination and curiosity, but was not necessarily accurate or factual, specifically when analyzed by modern thinkers.

Aside: Regular readers of The Spiral Staircase may also recognize how consciousness informs this blog post. Historical psychology offers a glimpse into worldviews of bygone eras, with the Medieval Period perhaps being the easiest to contemplate due to proximity. Few storytellers (in cinema or literature) attempt to depict what the world was truly like in the past (best as we can know it) but instead resort to an ahistorical modern gloss on how men and women thought and behaved. One notable exception may be the 1986 film The Name of the Rose, which depicts the emerging rational mind in stark conflict with the cloistered Medieval mind. Sword-and-sandal epics set in ancient Rome and Greece get things even more wrong.


Unlike turtles, humans do not have protective shells into which we can withdraw when danger presents. Nor can we lift off, fly away, and elude danger the way birds do. These days, we’re sorely beset by an invisible pandemic spread by exposure to carriers (read: other people) and so are asked or forced to submit to being locked down and socially distanced. Thus, we withdraw into the protective shell of the home in cycles of varying intensity and obeisance to maintain health and safety. Yet life goes on, and with it, numerous physical requirements (ignoring psychological needs) that can’t be met virtually, demanding we venture out into the public sphere to gather resources, risking exposure to the scourge. Accordingly, the conduct of business has adapted to enable folks to remain in the protective shells of their vehicles, taking delivery through the car window and rarely if ever entering a brick-and-mortar establishment except in defiance or at the option of acceptable risk. In effect, we’re being driven into our cars ever more, and the vehicle is readily understood as a proxy for its inhabitant(s). Take note: pictures of people in bread lines during the Great Depression have been replaced by pictures of cars lined up for miles during the pandemic to get packaged meals from charitable organizations.

Reflecting on this aspect of modern life, I realized that it’s not exactly novel. The widespread adoption of the individual vehicle in the 1940s and 50s, as distinguished from mass transit, and the construction of the interstate highway system promised (and delivered) flexibility and freedom of tremendous appeal. While the shift into cars (along with air travel) doomed now-moribund passenger rail (except intracity in the few American cities with effective rail systems), it enabled the buildout of suburbs and exurbs now recognized as urban sprawl. And like all those packages now clogging delivery systems as we shift even more heavily to online shopping during the holiday season, a loss of efficiency was inevitable. All those individual cars and boxes create congestion that cries out for solutions.

Among the solutions (really nonsolutions) were the first drive-through banks of the 1970s. Is doing one’s banking without leaving the vehicle’s protective shell really an efficiency? Or is it merely an early acknowledgement and enabling of antisocial individualism? Pneumatic tubes that permitted drive-through banking did not speed up transactions appreciably, but the novel mechanism undoubtedly reinforced the psychological attachment Americans felt with their cars. That growing attachment was already apparent in the 1950s, with two bits of Americana from that decade still resonating: the drive-in theater and the drive-in restaurant. The drive-in theater was a low-fidelity efficiency and alternative to the grand movie houses built in the 1920s and 30s seating a few thousand people in one cavernous space. (A different sort of efficiency enabling choice later transformed most cinema establishments into multiplexes able to show 8–10 titles instead of one, handily diminishing audiences of thousands to hundreds or even tens and robbing the group experience of much of its inherent power. Now that premium streaming content is delivered to screens at home and we are disallowed assembly into large audiences, we have instead become something far more inert — viewers — with fully anticipatable degradation of the entertainment experience notwithstanding the handsome technologies found within the comforts of the home.) I’ve heard that drive-ins are experiencing a renaissance of sorts in 2020, with Walmart parking lots converted into showplaces, at least temporarily, to resemble (poorly) group experience and social connection. The drive-in restaurant of the 1950s, with its iconic carhops (sometimes on roller skates), is a further example of enabling car culture to proliferate. Never mind that eating in the car is actually kinda sad and maybe a little disgusting as odors and refuse collect in that confined space.
One might suspect that drive-ins were directed toward teenyboppers and cruisers of the 1950s exploring newfound freedom, mobility, and the illusion of privacy in their cars, parked in neat rows at drive-ins (and Lookout Points for smooch sessions) all across the country. However, my childhood memory is that the drive-in was also a family affair.

Inevitably, fast food restaurants followed the banks in the 1970s and quickly established drive-through lanes, reinforcing the degradation of the food experience into mere feeding (often on one’s lonesome) rather than dining in community. Curiously, the pandemic has made every restaurant still operating, even the upscale ones, a drive-through and has forced establishments with and without dedicated drive-through lanes alike to bring back the anachronistic carhop to serve the congestion. A trip to a local burger joint in Chicago last week revealed 40+ cars in queue and a dozen or so carhops on the exterior directing traffic and making deliveries through the car window (briefly penetrating the protective shell) so that no one would have to enter the building and expose themselves to virus carriers. I’ve yet to see a 2020 carhop wearing roller skates (now roller blades) or a poodle skirt.

Such arrangements are probably effective at minimizing pandemic risk and have become one of several new normals (discussion of political dysfunction deferred). Who can say how long they will persist? Still, it’s strange to observe the psychology of our response, even if only superficially and preliminarily. Car culture has been a curious phenomenon since at least the middle of the 20th century. New dynamics reinforcing our commitment to cars are surprising, perhaps, but a little unsurprising, too, considering how we made ourselves so dependent on them as the foundation of personal transportation infrastructure. As a doomer, I had rather expected that Peak Oil, occurring around 2006 or so, would spell the gradual (or sudden) end of happy motoring as prices at the pump, refusal to elevate standard fuel efficiency above 50 mpg, and the climbing average cost of new vehicles placed individual options beyond the reach of average folks. However, I’ve been genuinely surprised by fuel costs sinking to new lows (below the cost of production, even bizarrely inverting to the point that producers paid buyers to take inventory) and continued attempts to engineer (only partially) around the limitations of Peak Oil, if not indeed Peak Energy. I continue to believe these are mirages, like the record-setting bull market of 2020 occurring in the midst of simultaneous economic, social, and health crises.

Black Friday has over the past decades become the default kickoff of annual consumer madness associated with the holiday season and its gift-giving tradition. Due to the pandemic, this year has been considerably muted in comparison to other years — at least in terms of crowds. Shopping has apparently moved online fairly aggressively, which is an entirely understandable result of everyone being locked down and socially distanced. (Lack of disposable income ought to be a factor, too, but American consumers have shown remarkable willingness to take on substantial debt when able in support of mere lifestyle.) Nevertheless, my inbox has been deluged over the past week with incessant Black Friday and Cyber Monday advertising. Predictably, retailers continue feeding the frenzy.

Uncharacteristically, perhaps, this state of affairs is not the source of outrage on my part. I recognize that we live in a consumerist, capitalist society that will persist in buying and selling activities even in the face of increasing hardship. I’m also cynical enough to expect retailers (and the manufacturers they support, even if those manufacturers are Chinese) to stoke consumer desire through advertising, promotions, and discount sales. It’s simply what they do. Why stop now? Thus far, I’ve seen no rationalizations or other arguments excusing how it’s a little ghoulish to be profiting while so many are clearly suffering and facing individual and household fiscal cliffs. Instead, we rather blandly accept that the public needs to be served no less by mass market retailers than by, say, grocery and utility services. Failure by the private sector to maintain functioning supply lines (including nonessentials, I suppose) during a crisis would look too much like the appalling mismanagement of the same crisis by local, state, and federal governments. Is it ironic that centralized bureaucracies reveal themselves as incompetent at the very same time they consolidate power? Or more cynically, isn’t it outrageous that they barely even try anymore to address the true needs of the public?

One of the questions I’ve posed unrhetorically is this: when will it finally become undeniably clear that instead of being geared to growth we should instead be managing contraction? I don’t know the precise timing, but the issue will be forced on us sooner or later as a result of radically diminishing return on investment (ROI) in the energy sector (compared to a century ago, say). In short, we will be pulled back down to earth from the perilous heights we scaled as the resources needed to keep industrial civilization creaking along become ever more difficult to obtain. (Maybe we’ll have to start using the term unobtainium from the Avatar movies.) Physical resources are impossible to counterfeit at scale, unlike the enormous bogus increase in the fiat money supply via debt creation. If/when hyperinflation makes us all multimillionaires because everything is grossly overvalued, the absurd paradox of being cash rich yet resource poor ought to wake up some folks.

I’ve never before gone straight back with a redux treatment of a blog post. More typically, it takes more than a year before revisiting a given topic, sometimes several years. This time, supplemental information came immediately, though I’ve delayed writing about it. To wit, a Danish study published November 18, 2020, in the Annals of Internal Medicine indicates our face mask precautions against the Coronavirus may be ineffective:

Our results suggest that the recommendation to wear a surgical mask when outside the home among others did not reduce, at conventional levels of statistical significance, the incidence of SARS-CoV-2 infection in mask wearers in a setting where social distancing and other public health measures were in effect, mask recommendations were not among those measures, and community use of masks was uncommon. Yet, the findings were inconclusive and cannot definitively exclude a 46% reduction to a 23% increase in infection of mask wearers in such a setting. It is important to emphasize that this trial did not address the effects of masks as source control or as protection in settings where social distancing and other public health measures are not in effect.

The important phrase there is “did not reduce, at conventional levels of statistical significance,” which is followed by the caveat that the study was partial and so is inconclusive. To say something is statistically insignificant means that the observed effect cannot be distinguished from chance — it does not exceed the calculated margin of error. A fair bit of commentary follows the published study, which I have not reviewed.
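For the statistically curious, what makes the quoted result inconclusive is that its confidence interval straddles the no-effect value. A minimal sketch (the 0.54–1.23 risk-ratio bounds are my rough translation of “a 46% reduction to a 23% increase,” used here purely for illustration):

```python
# A result is "statistically insignificant" when its confidence interval
# includes the no-effect value. For a risk ratio, no effect means 1.0.
ci_low, ci_high = 0.54, 1.23   # roughly: 46% reduction ... 23% increase

def significant(low, high, null=1.0):
    """An effect is significant only if the CI excludes the null value."""
    return not (low <= null <= high)

print(significant(ci_low, ci_high))  # False: "no effect" cannot be ruled out
```

An interval of, say, 0.54–0.90 would have excluded 1.0 and counted as a significant reduction; the reported one does not.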

We’re largely resorting to conventional wisdom with respect to mask wearing. Most businesses and public venues (if open at all) have adopted the mask mandate out of conformity and despite wildly conflicting reports of masks’ utility. Compared to locking down all nonessential social and economic activity, however, I remain resigned to their adoption even though I’m suspicious (as any cynic or skeptic should be) that they don’t work — at least not after the virus is running loose. There is, however, another component worth considering, namely, the need to be seen doing something, not nothing, to address the pandemic. Some rather bluntly call that virtue signalling, such as the pathologist at this link.

In the week since publication of the Danish study and the pathologist’s opinion (note the entirely misleading title), there has been a deluge of additional information, editorials, and protests (no more links, sorry) calling into question recommendations from health organizations and responses by politicians. Principled and unprincipled dissent has been underway since May 2020 and grows with each month hardship persists. Of particular note is the Supreme Court’s 5–4 decision against New York Gov. Andrew Cuomo’s mandate that religious services be restricted to no more than 10 people in red zones and no more than 25 in orange zones. Score one for the Bill of Rights being upheld even in a time of crisis.

I’ve mentioned the precautionary principle several times, most notably here. Little of our approach to precautions has changed in the two years since that blog post. At the same time, climate change and Mother Nature batter us aggressively. Eventualities remain predictable. Different precautions are being undertaken with respect to the pandemic currently gripping the planet. Arguably, the pandemic is either a subset of Mother Nature’s fury or, if the virus was created in a lab, a self-inflicted wound. Proper pandemic precautions have been confounded by undermining of authority, misinformation, lack of coordination, and politically biased narratives. I’m as confused as the next poor sap. However, low-cost precautions such as wearing masks are entirely acceptable, notwithstanding refusals of many Americans to cooperate after authorities muddied the question of their effectiveness so completely. More significant precautions such as lockdowns and business shutdowns have morphed into received wisdom among government bodies yet are questioned widely as being a cure worse than the disease, not to mention administrative overreach (conspiratorial conjecture withheld).

Now comes evidence published in the New England Journal of Medicine on November 11, 2020, that costly isolation is flatly ineffective at stemming infection rates. Here are the results and conclusions from the abstract of the published study:

Results
A total of 1848 recruits volunteered to participate in the study; within 2 days after arrival on campus, 16 (0.9%) tested positive for SARS-CoV-2, 15 of whom were asymptomatic. An additional 35 participants (1.9%) tested positive on day 7 or on day 14. Five of the 51 participants (9.8%) who tested positive at any time had symptoms in the week before a positive qPCR test. Of the recruits who declined to participate in the study, 26 (1.7%) of the 1554 recruits with available qPCR results tested positive on day 14. No SARS-CoV-2 infections were identified through clinical qPCR testing performed as a result of daily symptom monitoring. Analysis of 36 SARS-CoV-2 genomes obtained from 32 participants revealed six transmission clusters among 18 participants. Epidemiologic analysis supported multiple local transmission events, including transmission between roommates and among recruits within the same platoon.
Conclusions
Among Marine Corps recruits, approximately 2% who had previously had negative results for SARS-CoV-2 at the beginning of supervised quarantine, and less than 2% of recruits with unknown previous status, tested positive by day 14. Most recruits who tested positive were asymptomatic, and no infections were detected through daily symptom monitoring. Transmission clusters occurred within platoons.

So an initial 0.9% tested positive, then an additional 1.9%. This total 2.8% compares to 1.7% in the control group (tested but not isolated as part of the study). Perhaps the experimental and control groups are a bit small (1848 and 1554, respectively), and it’s not clear why the experimental group infection rate is higher than that of the control group, but the evidence points to the uselessness of trying to limit the spread of the virus by quarantining and/or isolation. Once the virus is present in a population, it spreads despite precautions.
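The percentages quoted above can be checked with quick arithmetic. Below is a crude pooled two-proportion z statistic on the abstract’s raw counts — a back-of-the-envelope sketch that ignores the study’s clustered design and is no substitute for the authors’ analysis:

```python
from math import sqrt

# Counts quoted from the abstract: 16 + 35 positives among 1848 study
# participants vs. 26 of 1554 recruits who declined to participate.
pos_a, n_a = 16 + 35, 1848   # supervised-quarantine study group
pos_b, n_b = 26, 1554        # declined-to-participate comparison group

p_a, p_b = pos_a / n_a, pos_b / n_b
print(f"{p_a:.1%} vs {p_b:.1%}")   # 2.8% vs 1.7%, matching the text

# Pooled two-proportion z statistic; |z| > 1.96 is nominally significant
# at the conventional 95% level.
p = (pos_a + pos_b) / (n_a + n_b)
z = (p_a - p_b) / sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
print(round(z, 2))  # 2.12 — and in the "wrong" direction: the monitored
                    # group's rate is the higher one
```

That the crude test flags a higher rate in the quarantined group is exactly the oddity noted above, and one reason to treat the figures cautiously.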

A mantra is circulating that we should “trust the science.” Are these results to be trusted? Can we call off all the lockdowns and closures? It’s been at least eight months that the virus has been raging throughout the U.S. Although there might be some instances of isolated populations with no infection, the wider population has by now been exposed. Moreover, some individuals who self-isolated effectively may not have been exposed, but in all likelihood, most of us have been. Accordingly, renewed lockdowns, school and business closures, and destruction of entire industries are a pretense of control we never really had. Their costs are enormous and ongoing. A stay-at-home order (advisory, if you prefer) just went into effect for the City of Chicago on November 16, 2020. My anecdotal observation is that most Chicagoans are ignoring it and going about their business similar to summer and fall months. It’s nothing like the ghost town effect of March and April 2020. I daresay they may well be correct to reject the received wisdom of our civic leaders.

Medieval Pilgrimage

Posted: November 16, 2020 in Artistry, Culture, Music

Listening to the recording shown at left, my mind drifted to various cinematic treatments of Medievalism, including The Lord of the Rings, Game of Thrones, The Chronicles of Narnia, and too many others to cite. Other associations came tumbling out of memory, including my review of The Hobbit (the book, not the movie, though I reviewed both) and a previous blog post called “What’s Missing.” That post was a rumination on community and meaning lost in modern technocratic societies. In light of the fetishization of the Medieval Period, including, for example, the popularity of Renaissance Faires, there seems to be more to say about what’s missing.

The Llibre Vermell de Montserrat (English: Red Book of Montserrat), known as such because of its 19th-century binding and its being held at the Monastery of Montserrat in Catalonia (a region of Spain), is a collection of devotional texts also containing late Medieval songs. The Wikipedia article indicates that the monastery also holds the shrine of the Virgin of Montserrat, a major site of pilgrimage at the time the Red Book was compiled. Accordingly, its songs and dances were probably intended for pilgrims to the shrine and were part of a well-developed oral folk tradition. The 14th-century manuscript does not identify authors or composers. Furthermore, it predates modern musical notation, so performances and recordings today are reconstructions.

The music on the recording fuses sacred and secular (folk) elements and strongly suggests communal participation. In contrast, the modern concert hall has become the scene of rigid propriety. Audience members try to sit in stone silence (notwithstanding inevitable cell phone interruptions) while performers demonstrate their, um, emotionless professionalism. Live concerts of popular musics (multiple genres) instead feature audiences dancing and singing along, creating an organic experience that transforms the concertgoer into a participant situated in the middle of the flow rather than at the distant receiving end. Middle ground, such as when symphony orchestras perform film or video game music, often draws untutored audiences who may want to participate and in doing so frankly offend others trained to be still.

Is there a cultural connection between pilgrimages, processions, and parades? The first is plainly religious in motivation, such as visits to Catholic shrines, the Wailing Wall in Jerusalem, or Mecca. Processions are more ceremonial and may not be religious in orientation. A wedding procession is a good example. Parades are more nearly civil in character, unless one insists that nationalism (e.g., Independence Day in the U.S., Bastille Day in France, Victory Day in Russia) constitutes a civil religion. The sense of devotion, sacrifice, and hardship associated with pilgrimage, historical or modern, contrasts with the party atmosphere of a parade, where Carnival, Mardi Gras, and Día de Muertos in particular invite licentious participation. Typical U.S. holiday parades (e.g., Independence Day, Thanksgiving) feature spectators arrayed lazily along the streets. There is even a subgenre of march form (used in band concerts) called a “patrol” that employs a broad crescendo-diminuendo (getting louder then fading away) to depict a military column as it marches by.

I suspect that modern processions and parades are weak echoes of pilgrimage, a gradual transformation of one thing into something else. Yet the call of the open road (a/k/a wanderlust) resurfaces periodically even when not specifically religious in motivation. The great westward migration of Europeans to North America and then of Americans across the untamed frontiers attests to that venturing spirit. In literature, Jack London’s memoir The Road (1907) describes the hobo life hopping trains in the 1890s, while Jack Kerouac’s On the Road (1957) tells of traveling across America by car. Another expression of wanderlust was penned by forgotten American poet Vachel Lindsay in his self-published War Bulletin #3 (1909):

Let us enter the great offices and shut the desk lids and cut the telephone wires. Let us see that the skyscrapers are empty and locked, and the keys thrown into the river. Let us break up the cities. Let us send men on a great migration: set free, purged of the commerce-made manners and fat prosperity of America; ragged with the beggar’s pride, starving with the crusader’s fervor. Better to die of plague on the highroad seeing the angels, than live on iron streets playing checkers with dollars ever and ever.

Lindsay invites his readers to embrace a life better lived traversing the landscape in a voyage of self-discovery. His version lacks the religious orientation of pilgrimage, but like the Medieval cultures depicted in film and music from the period, possesses tremendous appeal for modern Westerners starved of meaning that arises naturally out of tradition.