Archive for the ‘Artistry’ Category

David Sirota, author of Back to our Future: How the 1980s Explain the World We Live in Now — Our Culture, Our Politics, Our Everything (2011), came to my attention (how else?) through a podcast. He riffed pretty entertainingly on his book, now roughly one decade old, like a rock ‘n’ roller stuck (re)playing his or her greatest hits into dotage. However, his thesis was strong and appealing enough that I picked up a copy (read: borrowed from the library) to investigate despite the datedness of the book (and my tardiness). It promised to be an easy read.

Sirota’s basic thesis is that memes and meme complexes (a/k/a memeplexes, though Sirota never uses the term meme) developed in the 80s and deployed through a combination of information and entertainment media (thus, infotainment) form the narrative background we take for granted in the early part of the 21st century. Children fed a steady diet of clichés, catchphrases, one-liners, archetypes, and story plots have now grown to adulthood and are scarcely able to peer behind the curtain to question the legitimacy or subtext of the narrative shapes and distortions imbibed during childhood like mother’s milk. The table of contents lists four parts (boldface section titles are Sirota’s; descriptive text is mine):

  • Liking Ike, Hating Woodstock. How the 50s and 60s were (perhaps the first decades to be) assigned reductive demographic signifiers, handily ignoring the true diversity of experience during those decades. More specifically, the boom-boom 50s (economics, births) were recalled nostalgically in 80s TV and films while the 60s were recast as being all about those dirty, hairy hippies and their music, drugs, and sexual licentiousness, all of which had to be invalidated somehow to regain lost wholesomeness. The one-man promotional vehicle for this pleasing self-deception was Michael J. Fox, whose screen personae (TV and film) during the 80s (glorifying the 50s but openly shitting on the 60s) were instrumental in reforming attitudes about our mixed history.
  • The Jump Man Chronicles. How the Great Man Theory of History was developed through glorification of heroes, rogues, mavericks, and iconoclasts who came into their own during the 80s. That one-man vehicle was Michael Jordan, whose talents and personal magnetism were so outsized that everyone aspired to be “like Mike,” which is to say, a superhero elevated beyond mere mortal rules and thus immortalized. The effect was duplicated many times over in popular culture, with various entertainment icons and political operatives subverting thoughtful consideration of real-world problems in favor of jingoistic portrayals.
  • Why We (Continue to) Fight. How the U.S. military was rehabilitated after losing the Vietnam War, gifting us with today’s hypermilitarism and permanent wars. Two principal tropes were deployed to shape public opinion: the Legend of the Spat-Upon Veteran and the Hands-Tied-Behind-Their-Backs Myth. Each was trotted out reliably whenever we needed to misremember our past as fictionalized in the 80s.
  • The Huxtable Effect. How “America’s dad” helped accommodate race relations to white anxiety, primarily to sell a TV show. In contrast with various “ghetto TV” shows of the 70s that depicted urban working poor (various ethnicities), The Cosby Show presented an upscale black family who transcended race by simply ignoring the issue — a privilege of wealth and celebrity. The Obama campaign and subsequent administration copied this approach, pretending American society had become postracial despite his never truly being able to escape the modifier black because the default (no modifier needed) in America is always white. This is the most fraught part of the book, demonstrating that despite whatever instructions we get from entertainment media and pundits, we remain stuck in an unresolved, unhealed, inescapable trap.


Medieval Pilgrimage

Posted: November 16, 2020 in Artistry, Culture, Music

Listening to the recording discussed below, my mind drifted to various cinematic treatments of Medievalism, including The Lord of the Rings, Game of Thrones, The Chronicles of Narnia, and too many others to cite. Other associations also came tumbling out of memory, including my review of The Hobbit (the book, not the movie, though I reviewed both) and a previous blog post called “What’s Missing.” That post was a rumination on community and meaning lost in modern technocratic societies. In light of the fetishization of the Medieval period, including for example the popularity of Renaissance Faires, there seems to be more to say about what’s missing.

The Llibre Vermell de Montserrat (English: Red Book of Montserrat), known as such because of its 19th-century binding and its being held at the Monastery of Montserrat in Catalonia (a region of Spain), is a collection of devotional texts also containing late Medieval songs. The Wikipedia article indicates that the monastery also holds the shrine of the Virgin of Montserrat, a major site of pilgrimage at the time the Red Book was compiled. Accordingly, its songs and dances were probably intended for pilgrims to the shrine and were part of a well-developed oral folk tradition. The 14th-century manuscript does not identify authors or composers. Furthermore, it predates modern musical notation, so performances and recordings today are reconstructions.

The music on the recording fuses sacred and secular (folk) elements and strongly suggests communal participation. In contrast, the modern concert hall has become the scene of rigid propriety. Audience members try to sit in stone silence (notwithstanding inevitable cell phone interruptions) while performers demonstrate their, um, emotionless professionalism. Live concerts of popular musics (multiple genres) instead feature audiences dancing and singing along, creating an organic experience that transforms the concertgoer into a participant situated in the middle of the flow rather than at the distant receiving end. A middle ground, such as when symphony orchestras perform film or video game music, often draws untutored audiences who may want to participate and, in doing so, frankly offend others trained to be still.

Is there a cultural connection between pilgrimages, processions, and parades? The first is plainly religious in motivation, such as visits to Catholic shrines, the Wailing Wall in Jerusalem, or Mecca. Processions are more ceremonial and may not be religious in orientation. A wedding procession is a good example. Parades are more nearly civil in character, unless one insists on nationalism (e.g., Independence Day in the U.S., Bastille Day in France, Victory Day in Russia) being a civil religion. The sense of devotion, sacrifice, and hardship associated with pilgrimage, historical or modern, contrasts with the party atmosphere of a parade, where Carnival, Mardi Gras, and Día de Muertos in particular invite licentious participation. Typical U.S. holiday parades (e.g., Independence Day, Thanksgiving) feature spectators arrayed lazily along the streets. There is even a subgenre of march form (used in band concerts) called a “patrol” that employs a broad crescendo-diminuendo (getting louder then fading away) to depict a military column as it marches by.

I suspect that modern processions and parades are weak echoes of pilgrimage, a gradual transformation of one thing into something else. Yet the call of the open road (a/k/a wanderlust) resurfaces periodically even when not specifically religious in motivation. The great westward migration of Europeans to North America and then of Americans across the untamed frontiers attests to that venturing spirit. In literature, Jack London’s memoir The Road (1907) describes the hobo life hopping trains in the 1890s, while Jack Kerouac’s On the Road (1957) tells of traveling across America by car. Another expression of wanderlust was penned by forgotten American poet Vachel Lindsay in his self-published War Bulletin #3 (1909):

Let us enter the great offices and shut the desk lids and cut the telephone wires. Let us see that the skyscrapers are empty and locked, and the keys thrown into the river. Let us break up the cities. Let us send men on a great migration: set free, purged of the commerce-made manners and fat prosperity of America; ragged with the beggar’s pride, starving with the crusader’s fervor. Better to die of plague on the highroad seeing the angels, than live on iron streets playing checkers with dollars ever and ever.

Lindsay invites his readers to embrace a life better lived traversing the landscape in a voyage of self-discovery. His version lacks the religious orientation of pilgrimage, but like the Medieval cultures depicted in film and music from the period, possesses tremendous appeal for modern Westerners starved of meaning that arises naturally out of tradition.

The Anton Bruckner symphony cycle recorded by the Berlin Philharmonic under Herbert von Karajan (the Wing Cycle to some collectors) is one I have long known and cherished. Based on Amazon reviews noting remastered and improved sound over previous releases of the same recorded performances, I decided the relatively low cost made it worth trying out Blu-ray Audio, my first such disc. Although the entire cycle fits on a single Blu-ray Audio disc, nine CDs are included in the box. Duplication seems unnecessary, but the inclusion of both may be desirable for some listeners. Pleasingly, original cover art (the aforementioned wing) from the LPs appears on the 2020 rerelease. Shamefully, like another recent set of Bruckner symphonies, DG put the conductor’s name above the composer’s. This practice ought to stop. This review compares versions/media in addition to assessing the performances. Caveat: a superior stereo system is a prerequisite. If listening on some device not intended for high fidelity (phone, computer, etc.), save your dollars and find the recordings on a streaming service. Both CD and Blu-ray players in my system are connected to the preamp via digital cables to use the better DAC in the preamp rather than those in the players.

My comparison involves four releases of the cycle: (1) original LPs from the 1970s and 80s, (2) 1990 CDs, (3) 2020 CDs, and (4) the sole 2020 Blu-ray Audio disc. All are the same works and same performances. Direct A/B/C/D comparisons are difficult, and I didn’t listen to every version of each, which would have required far too many hours. Rather, I focused on two representative movements well established in my ear: the 2nd movt. (Adagio) of the 5th and the 4th movt. (Finale) of the 8th. Because every recording has its own unique characteristics born of the era (typically divided by decade), the engineering team, and the producer, it’s normal for me to attempt to “hear through” all the particulars of a recording to the actual performance. For a recording to be badly flawed or unlistenable is fairly exceptional, and I still tend to make mental adjustments to accommodate what I’m hearing. A similar perceptual adjustment in the visual spectrum, known as “white balance,” compensates for how outdoor lighting and color shift as the sun transits across the sky. Accordingly, there is probably no such thing as authentic or natural sound, as one’s perceptual apparatus adjusts automatically, accommodating itself to what is heard to fit circumstance. That said, it’s obvious that some recorded media and listening environments are superior to others.


Ours is an era when individuals are encouraged to explore, amplify, and parade various attributes of their identities out in public, typically via social media. For those just coming of age or recently entering adulthood, whose identities are not yet fully formed, defining oneself is more nearly a demand. When identity is further complicated by unusual levels of celebrity, wealth, beauty, and athleticism (lots of overlap there), defining oneself is often an act of rebellion against the perceived demands of an insatiable public. Accordingly, it was unsurprising (to me at least) to learn of several well-known people unhappy with their lives and the burdens upon them.

Regular folks can’t truly relate to the glitterati, who are often held up as aspirational models. For example, many of us look upon the discomforts of Prince Harry and Meghan Markle with a combination of perverse fascination and crocodile tears. They were undoubtedly trapped in a strange, gilded prison before repudiating the duties expected of them as “senior royals,” attempting an impossible retreat to normalcy outside of England. It should be obvious that they will continue to be hounded while public interest in them persists. Similarly, Presley Gerber made news, fell out of the news, and then got back into the news as a result of his growing collection of tattoos. Were he simply some anonymous fellow, few would care. However, he has famous parents and had already launched a modeling career before his face tattoo announced his sense of being “misunderstood.” Pretty bold move. With all the presumed resources and opportunities at his disposal, many have wondered in comments and elsewhere whether another, better declaration of self might have been preferred.

Let me give these three the benefit of the doubt. Although they all have numerous enviable attributes, the accident of birth (or in Markle’s case, the decision to marry) landed them in exceptional circumstances. The percentage of celebrities who crack under the pressure of unrelenting attention and proceed to run off the rails is significant. Remaining grounded is no doubt easier if one attains celebrity (or absurdly immense wealth) after, say, the age of 25 or even later. (On some level, we’ve all lost essential groundedness in reality, but that’s another blog post.) Those who are children of celebrities or who become child prodigies may not all be consigned to character distortion or a life irrevocably out of balance, but such outcomes are at least so commonplace that the dangerous potential should be recognized and embraced only with wariness. I’ve heard of programs designed to help professional athletes who become sudden multimillionaires (and thus targets of gold diggers and scammers) make the transition. Good for them that structured support is available. Yet another way average folks can’t relate: we have to work things out for ourselves.

Here’s the example I don’t get: Taylor Swift. She was the subject of a Netflix biography called Miss Americana (2020) that paints her as, well, misunderstood. Thing is, Swift is a runaway success story, raking in money, fans, awards, attention, and, inevitably, detractors. That success is something she earnestly desired and struggled to achieve, only to learn that the glossy, popstar image sold especially but not exclusively to 14-year-old girls comes with a lot of heavy baggage. How can the tragic lives of so many musicians launched into superstardom from the late 1950s onward have escaped Swift’s awareness in our media-saturated world? Naming names is sorta tacky, so I demur, but there are lots of them. Swift obtained her heart’s desire, found her songwriting and political voice, maintains a high public profile, and shows no lack of productivity. Sure, it’s a life out of balance, not remotely normal the way most noncelebrities would understand. However, she signed up for it willingly (if naïvely) and by all accounts perpetuates it. She created her own distinctive gilded prison. I don’t envy her, nor do I particularly feel sorry for her, as the Netflix show appears to instruct.

The old saw goes that acting may be just fine as a creative endeavor, but given the opportunity, most actors really want to direct. A similar remark is often made of orchestral musicians, namely, that most rank-and-file players would really rather conduct. Directing and conducting may not be the central focus of creative work in their respective genres. After all, directors don’t normally appear onscreen and conductors make no sound. Instead, they coordinate the activities of an array of creative folks, putting them in a unique position to bring about a singular vision in otherwise collaborative work. A further example is the Will to Power (a concept associated with Friedrich Nietzsche, with antecedents in Arthur Schopenhauer) characteristic of those who wish to rule (as distinguished from those who wish to serve) such as regents, dictators, and autocrats. All of this sprang to mind because, despite the outward appearance of a free, open society in the U.S., recent history demonstrates that the powers that be have instituted a directed election and directed economy quite at odds with democracy or popular opinion.

The nearest analogy is probably the directed verdict, where a judge removes the verdict from the hands or responsibility of the jury by directing the jury to return a particular verdict. In short, the judge decides the case for the jury, making the jury moot. I have no idea how commonplace directed verdicts are in practice.

Directed Election

Now that progressive candidates have been run out of the Democratic primaries, the U.S. presidential election boils down to which stooge to install (or retain) in November. Even if Biden is eventually swapped out for another Democrat in a brokered nominating convention (highly likely according to many), it’s certain to be someone fully amenable to entrenched corporate/financial interests. Accordingly, the deciders won’t be the folks who dutifully showed up and voted in their state primaries and caucuses but instead party leaders. One could try to argue that as elected representatives of the people, party leaders act on behalf of their constituencies (governing by consent of the people), but some serious straining is needed to arrive at that view. Votes cast in the primaries thus far demonstrate persistent desire for something distinctly other than the status quo, at least in the progressive wing of the Democratic party. Applying the cinematic metaphor of the opening paragraph, voters are a cast of thousands (make that millions) being directed within a larger political theater toward a predetermined result.

Anyone paying attention knows that voters are rarely given options that aren’t in fact different flavors of the same pro-corporate agenda. Thus, no matter whom we manage to elect in November, the outcome has already been engineered. This is true not only by virtue of the narrow range of candidates able to maneuver successfully through the electoral gauntlet but also because of perennial distortions of the balloting process such as gerrymandering, voter suppression, and election fraud. Claims that both sides (really just one side) indulge in such practices so everything evens out don’t convince me.

Directed Economy

Conservative economists and market fundamentalists never seem to tire of arguments in the abstract that capitalist mechanisms of economics, left alone (unregulated, laissez-faire) to work their magic, deliver optimal outcomes when it comes to social and economic justice. Among the primary mechanisms is price discovery. However, economic practice never even remotely approaches the purity of abstraction because malefactors continuously distort and game economic systems out of self-interest (read: greed). Price discovery is broken, and equitable economic activity is made fundamentally fictitious. For example, the market for gemstones is famously inflated by a narrow consortium of sellers having successfully directed consumers to adopt a cultural standard of spending three months’ wages/salary on an engagement ring as a demonstration of one’s love and devotion. In the opposite direction, precious metal spot prices are suppressed despite very high demand and nearly nonexistent supply. Current quoted premiums over the spot silver price, even though no delivery is contemplated, range from roughly 20% to an absurd 2,000%. Supply and demand curves no longer function to aid in true price discovery (if such a thing ever existed). In a more banal sense, what people are willing to pay for a burger at a fast food joint or a loaf of bread at the grocery may affect the price charged more directly.
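To make the premium arithmetic concrete, here is a minimal worked example (the $18/oz spot figure is an assumed round number for illustration, not an actual quote):

\[
\text{premium} = \frac{P_{\text{ask}} - P_{\text{spot}}}{P_{\text{spot}}}, \qquad P_{\text{ask}} = P_{\text{spot}} \times (1 + \text{premium})
\]

With spot assumed at $18/oz, a 20% premium implies an ask of $18 × 1.2 = $21.60/oz, while a 2,000% premium implies $18 × 21 = $378/oz for the very same ounce of metal, a spread no functioning supply-and-demand mechanism would produce.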

Nowhere is it more true that we’ve shifted to a directed economy than with the stock market (i.e., Wall Street vs. Main Street). As with the housing market, a real-world application with which many people have personal experience, if a buyer of a property or asset fails to appear within a certain time frame (longer for housing, shorter for stocks, bonds, and other financial instruments), the seller is generally obliged to lower the price until a buyer finally appears. Some housing markets extraordinarily flush with money (e.g., Silicon Valley and Manhattan) trigger wild speculation and inflated prices that drive out all but the wealthiest buyers. Moreover, when the eventual buyer turns out to be a bank, corporation, or government entity willing to overpay for the property or asset using someone else’s money, the market becomes wholly artificial. This has been the case with the stock market for the last twelve years, with cheap money being injected nonstop via bailouts and quantitative easing to keep asset prices inflated. When fundamental instabilities began dragging the stock market down last fall, accelerating precipitously in early spring of this year and resulting in yet another crash (albeit brief), the so-called Plunge Protection Team sprang into action and wished trillions of dollars (taxpayer debt, actually, and over the objections of taxpayers in a classic fool-me-once scenario) into existence to perpetuate the casino economy and keep asset prices inflated for the foreseeable future, which isn’t very long.

The beneficiaries of this largesse are the same as they have always been when tax monies and public debt are concerned: corporations, banks, and the wealthy. Government economic supports are directed to these entities, leaving all others in the lurch. Claims that bailouts are needed to keep large corporate entities and wealthy individuals whole so that the larger economy doesn’t seize up and fail catastrophically are preposterous, because the larger economy has already seized up and failed catastrophically while the population is mostly quarantined, throwing many individuals out of work and shuttering many businesses. A reasonable expectation of widespread insolvency and bankruptcy lingers, waiting for the workouts and numbers to mount up.

The power of the purse possessed by the U.S. Congress hasn’t been used to help the citizenry since the New Deal era of FDR. Instead, military budgets and debts expand enormously while entitlements and services to the needy and vulnerable are whittled away. Citizen rebellions are already underway in small measure, mostly aimed at the quarantines. When bankruptcies, evictions, and foreclosures start to swell, watch out. Our leaders’ fundamental mismanagement of human affairs is unlikely to be swallowed quietly.

Purpose behind consumption of different genres of fiction varies. For most of us, it’s about responding to stimuli and experiencing emotions vicariously, which is to say, safely. For instance, tragedy and horror can be enjoyed, if that’s the right word, in a fictional context to tweak one’s sensibilities without significant effect outside the story frame. Similarly, fighting crime, prosecuting war, or repelling an alien invasion in a video game can be fun but is far removed from actually doing those things in real life (not fun). For less explicit narrative forms, such as music, feelings evoked are aesthetic and artistic in nature, which makes a sad song or tragic symphony enjoyable on its own merits without bleeding far into real sadness or tragedy. Cinema (now blurred with broadcast TV and streaming services) is the preeminent storytelling medium that provokes all manner of emotional response. After reaching a certain age (middle to late teens), emotional detachment from depictions of sexuality and violent mayhem makes it possible to digest such stimulation as entertainment, except in cases where prior personal trauma is triggered. Before that age, nightmare-prone children are best kept away.

Dramatic conflict is central to driving plot and story forward, and naturally, folks are drawn to some stories while avoiding others. Although I’m detached enough not to be upset by, say, zombie films where people and zombies alike are dispatched horrifically, I wouldn’t say I enjoy gore or splatter. Similarly, realistic portrayals of war (e.g., Saving Private Ryan) are not especially enjoyable for me despite the larger story, whether based on true events or entirely made up. The primary reason I abandon a movie or TV show partway through is that I simply don’t enjoy watching suffering.

Another category bugs me even more: when fiction intrudes on reality to remind me too clearly of actual horrors (or is it the reverse: reality intruding on fiction?). It doesn’t happen often. One of the first instances I recall was in Star Trek: The Next Generation when the story observed that (fictional) warp travel produced some sort of residue akin to pollution. The reminder that we humans are destroying the actual environment registered heavily with me and ruined my enjoyment of the fictional story. (I also much prefer the exploration and discovery aspects of Star Trek that hew closer to Gene Roddenberry’s original vision than the militaristic approach now central to the franchise.) A much more recent intrusion occurs in the rather adolescent TV show The 100, where a global nuclear exchange launched by an artificial intelligence has the follow-on effect a century later of remaining nuclear sites going critical, melting down, and irradiating the Earth, making it uninhabitable. This bothers me because that’s my expectation of what happens in reality, probably not too long (decades) after industrial civilization collapses and most or all of us are dead. This prospect served up as fiction is simply too close to reality for me to enjoy vicariously.

Another example of fiction intruding too heavily on my doomer appreciation of reality occurred retroactively. As high-concept science fiction, I especially enjoyed the first Matrix movie. As with Star Trek, the sequels degraded into run-of-the-mill war stories. But what was provocative about the original was the matrix itself: a computer-generated fiction situated within a larger reality. Inside the matrix was pleasant enough (though not without conflict), but reality outside the matrix was truly awful. It was a supremely interesting narrative and thought experiment when it came out in 1999. Now, twenty-one years later, it’s increasingly clear that we are living in a matrix-like, narrative-driven hyperreality, deluding ourselves with a pleasant equilibrium that simply isn’t in evidence. In fact, as societies and as a civilization, we’re careening out of control, no brakes, no steering. Caitlin Johnstone explores this startling after-the-fact realization in an article at Medium.com, which I found only a couple days ago. Reality is in fact far worse than the constructed hyperreality. No wonder no one wants to look at it.

That man is me. Thrice in the last month I’ve stumbled headlong into subjects where my ignorance left me grasping in the dark for a ledge or foothold lest I be swept into a maelstrom of confusion by someone’s claims. This sensation is not unfamiliar, but it’s usually easy to beat back. Whereas I possess multiple areas of expertise and as an autodidact am constantly absorbing information, I nonetheless recognize that even in areas where I consider myself qualified to act and/or opine confidently, others possess authority and expertise far greater than mine. Accordingly, I’ve always considered myself a generalist. (A jack of all trades is not quite the same thing IMO, but I decline to draw that distinction here.)

Decisions must inevitably be made on insufficient information. That’s true because more information can always be added on top, which leads to paralysis or infinite regress if one doesn’t simply draw an arbitrary line and stop dithering. This is also why I aver periodically that consciousness is based on sufficiency, meaning “good enough.” A paradox lurks here: a decision can be good enough to act on even though the information behind it is never complete enough for a full, balanced analysis (if such fullness can even be achieved). Knowledge is thus sufficient and insufficient at the same time. Banal, everyday purchasing decisions at the grocery store are low risk. Accepting a job offer, moving to a new city, and proposing marriage carry significant risks but are still decisions made on insufficient information precisely because they’re prospective. No way of knowing with certainty how things will turn out.

Color me surprised to learn that 45 is considering a new executive order mandating that the “classical architectural style shall be the preferred and default style” for new and upgraded federal buildings, revising the Guiding Principles for Federal Architecture issued in 1962. Assuredly, 45 can hardly be expected to make respectable aesthetic choices, considering his taste runs more toward gaudy, glitzy, ostentatious surface display (more Baroque) than toward restraint, dignity, poise, and balance (more Classical or Neoclassical).

Since I pay little attention to mainstream news propaganda organs, I learned of this from James Howard Kunstler’s blog Clusterfuck Nation (see blogroll) as though the order had already issued, but it’s apparently still in draft. Twas nice to read Kunstler returning to his roots in architectural criticism. He’s never left it behind entirely; his website has a regular feature called Eyesore of the Month, which I rather enjoy reading. He provides a brief primer on how architectural styles in the 20th century (all lumped together as Modernism) embody the Zeitgeist, namely, techno-narcissism. (I’m unconvinced that Modernism is a direct rebuke of 20th-century fascists who favored Classicism.) Frankly, with considerably more space at his disposal, Iain McGilchrist explores Modernist architecture better and with far greater erudition in The Master and his Emissary (2010), which I blogged through a while ago. Nonetheless, this statement by Kunstler deserves attention:

The main feature of this particular moment is that techno-industrial society has entered an epochal contraction presaging collapse due to over-investments in hyper-complexity. That hyper-complexity has come to be perfectly expressed in architecture lately in the torqued and tortured surfaces of gigantic buildings designed by computers, with very poor prospects for being maintained, or even being useful, as we reel into a new age of material scarcity and diminished expectations …

This is the life-out-of-balance statement in a nutshell. We are over-extended and wedded to an aesthetic of power that requires preposterous feats of engineering to build and continuous resource inputs to operate and maintain. (Kunstler himself avers elsewhere that an abundance of cheap, easily harvested energy enabled the Modern Era, so chalking up imminent collapse primarily to over-investment in hyper-complexity seems like substituting a secondary or follow-on effect for the main one.) My blogging preoccupation with skyscrapers demonstrates my judgment that the vertical dimension of the human-built world in particular is totally out of whack, an instantiation of now-commonplace stunt architecture. Should power ever fail for any sustained duration, reaching floors above, say, the 10th and delivering basic services to them, such as water for sinks and toilets, quickly becomes daunting.

However, that’s a technical hurdle, not an aesthetic consideration. The Modernist government buildings in question tend to be Brutalist designs, which often look like high-walled concrete fortresses or squat, impenetrable bunkers. (Do your own image search.) They project bureaucratic officiousness and unconcern if not open hostility toward the people they purport to serve. Basically, enter at your own risk. They share with the International Style a formal adherence to chunky geometric forms, often presented impassively (as pure abstraction) or in an exploded view (analogous to a cubist painting showing multiple perspectives simultaneously). Curiously, commentary at the links above is mostly aligned with perpetuating the Modernist project and aesthetic as described by Kunstler and McGilchrist. No interruptions, difficulties, or vulnerabilities are contemplated. Commentators must not be reading the same analyses I am, or they’re blithely supportive of progress in some vague sense, itself a myth we tell ourselves.

One of the victims of cancel culture, coming to my attention only days ago, is Kate Smith (1907–1986), a singer of American popular song. Though Smith had a singing career spanning five decades, she is best remembered for her version(s) of Irving Berlin’s God Bless America, which justifiably became a bit of Americana. The decades of Smith’s peak activity were the 1930s and 40s.

/rant on

I dunno what goes through people’s heads, performing purity rituals or character excavation on folks long dead. The controversy stems from Smith having a couple other songs in her discography: That’s Why Darkies Were Born (1931) and Pickaninny Heaven from the movie Hello, Everybody! (1933). Hate to break it to anyone still living under a rock, but these dates are not far removed from minstrelsy, blackface, and The Birth of a Nation (1915) — a time when typical Americans referred to blacks with a variety of terms we now consider slurs. Such references were still used during the American civil rights movement (1960s) and are in use among some virulent white supremacists even today. I don’t know the full context of Kate Smith having sung those songs, but I suspect I don’t need to. In that era, popular entertainment had few of the sensibilities regarding race we now have (culture may have moved on, but it’s hard to say with a straight face that it has evolved or progressed humanely), and uttering commonly used terms back then was not automatic evidence of any sort of snarling racism.

I remember having heard my grandparents, nearly exact contemporaries of Kate Smith, referring to blacks (the term I grew up with, still acceptable I think) with other terms we no longer consider acceptable. It shocked me, but to them, that’s simply what blacks were called (the term(s) they grew up with). Absolutely nothing in my grandparents’ character or behavior indicated a nasty, racist intent. I suspect the same was true of Kate Smith in the 1930s.

Back when I was a librarian, I also saw plenty of sheet music published before 1920 or so with the term darkie (or darkey) in the title. The Library of Congress still uses the subject headings “negro spirituals” (is there another kind?) and “negro songs” to refer to various subgenres of American folk song that include slave songs, work songs, spirituals, minstrel music, protest songs, etc. Maybe we should cancel the Library of Congress. Some published music titles from back then even call them coon songs. That last one is totally unacceptable today, but it’s frankly part of our history, and like changing character names in Mark Twain’s Huckleberry Finn, sanitizing the past does not make it go away or any less discomfiting. But if you wanna bury your head in the sand, go ahead, ostrich.

Also, if some person or entity ever does some questionably racist, sexist, or malign thing (even something short of abominable) situated contextually in the past, does that mean he, she, or it must be cancelled irrevocably? If that be the case, then I guess we gotta cancel composer Richard Wagner, one of the most notorious anti-Semites of the 19th century. Also, stop watching Pixar, Marvel, and Star Wars films (among others), because remember that time when Walt Disney Studios (now Walt Disney Company) made a racist musical film, Song of the South (1946)? Disney’s tainted legacy (extending well beyond that one movie) is at least as awful as, say, Kevin Spacey, and we’re certainly not about to rehabilitate him.

/rant off

Magnitude

Posted: January 6, 2020 in Artistry, Corporatism, Culture, Science

Something I read somewhere (lost track of what and when) sparked some modest inquiry into the mathematical concept of magnitude, or more specifically, the order of magnitude. I suspect, consistent with the doomer themes of this blog, that it was a statement to the effect that the sixth extinction (or Holocene extinction if you prefer) is proceeding at some order of magnitude faster than previous mass extinction events.

Within various scientific fields, magnitude has specific and specialized meanings. For instance, the Richter Scale, used to denote the power of earthquakes, is a familiar though poorly understood measure reported in the aftermath of an event. Magnitudes of distance and time are more immediately understood in the mundane sense of how far/long to travel somewhere (via foot, bicycle, car, train, plane, etc.) and more exotically outside Earth orbit as depicted in science fiction. Perhaps the most cognitively accessible illustration of magnitude, however, is scale. Arguably, size (absolute?) and scale (comparative?) are intertwined with distance, or even more broadly, time-space. I’ll leave that discussion to someone who knows better than I do.
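For concreteness, a brief worked note on what “order of magnitude” means (the background extinction rate below is a commonly cited rough figure, used purely for illustration):

\[
\text{rate} = 10^{n} \times \text{background rate}
\]

An order of magnitude is a factor of ten, so n orders of magnitude means a multiplier of 10^n. Taking a background rate of roughly 1 extinction per million species-years, a rate two orders of magnitude faster works out to 10² × 1 = 100 extinctions per million species-years. The Richter scale is built the same way: each whole step represents a tenfold increase in measured wave amplitude (about 31.6×, or 10^1.5, in released energy).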

All that said, I recalled from boyhood a short film depicting scale in terms of Powers of Ten. Unsurprisingly, I found it on YouTube (embedded below).

Perhaps it’s just my refurbishing of memory, but this film (now video) has a sense of wonder and amazement, sort of like how Disney properties (e.g., films, TV shows, theme parks, merchandise) from the 1960s and 70s retained an innocence from the time when Walt Disney himself was in charge. Early NASA orbital missions and moonshots had that quality, too, but NASA’s wonder years dissipated around the time space shuttles went into service, demonstrating that NASA’s primary objective was neither technical innovation nor exploration anymore but rather commerce, namely, putting satellites into orbit for communications services. Just this past year, the risible U.S. Space Force, wished into existence by 45 single-handedly over all reasonable objections (not unlike the border wall with Mexico), demonstrates a similar loss of innocence. It’s essentially an attempt to patrol and/or weaponize the commons. I’d like to believe that military personnel are dutifully obeying a pointless command from the commander-in-chief and will abandon or scuttle the new military branch once 45 is out of office. Time will tell.

Loss of innocence may be inevitable in the postmodern world given our jadedness, cynicism, and oh-so-hip ironic detachment. It’s not a good look on us. For instance, once Disney went corporate, the aesthetic pioneered and guided by old Walt changed for the worse. Relatively recent acquisitions of Pixar, Marvel, and Star Wars (among others) and expansion of theme parks and resorts reveal an entertainment behemoth geared more cynically toward money-making than artistry or inspiration. The apparent insufficiency of earlier incarnations of NASA and Disney finds a parallel in an updated version of Powers of Ten (not embedded), narrated by Morgan Freeman (because … why not?) and using the same basic script but employing whiz-bang graphics considerably enhanced over their 1977 counterparts. Even the pop-culture digital network Buzzfeed (not exactly a venerated news source) gets in on the action with its derivative, examination-lite treatment of cosmic scale (ignoring the microscopic and subatomic).

Going back to the idea of magnitude, I’m aware of four time-scales in common use: human history, evolutionary time, geological time, and cosmic time. Few contextualize the last 2–3 centuries this way, but human activity has had substantial effects that collapse events usually occurring over evolutionary or geological time into human history. We unwittingly launched a grand terraforming project but have yet to demonstrate overriding care for the eventual outcomes. The, um, magnitude of our error cannot be overstated.