Archive for the ‘Science’ Category

Watched Soylent Green (1973) a few days ago for the first time since boyhood. The movie, directed by Richard Fleischer, is based on Harry Harrison’s novel Make Room! Make Room! (which I haven’t read) and oddly enough has not yet been remade. How to categorize the film within familiar genres is tricky. Science fiction? Disaster? Dystopia? Police procedural? It checks all those boxes. Its chief messages, considering its early-70s origin, are pollution and overpopulation, though global warming is also mentioned, less pressingly. The opening montage looks surprisingly like what Godfrey Reggio did much better with Koyaanisqatsi (1982).

Soylent Green is set in 2022 — only a few months away now but a relatively remote future in 1973 — and the Earth is badly overpopulated, environmentally degraded, overheated, and struggling to support teeming billions mostly jammed into cities. Details are sketchy, and only old people can remember a time when the biosphere remained intact; whatever disaster occurred was already long past. Science fiction and futuristic films are often judged improperly by how correctly their prophecies turn out, as though enjoyment depended on predictive accuracy. Soylent Green fares well in that respect despite its clunky, dated, 70s production design. Vehicles, computer screens, phones, wardrobe, and décor are all, shall we say, quaintly vintage. But consider this: had collapse occurred in the 70s, who’s to say that cellphones, flat screens, and the Internet would ever have been developed? Maybe the U.S. (and the world) would have been stalled in the 70s much the way Cuba is stuck in the 50s (when the monumentally dumb, ongoing U.S. embargo commenced).

The film’s star is Charlton Heston, who had established himself as a handsomely bankable lead in science fiction, disaster, and dystopian films (e.g., The Omega Man and the Planet of the Apes series). Though serviceable, his portrayal is remarkably plain, revealing Heston as a poor man’s Sean Connery or John Wayne (both far more charismatic contemporaries of Heston’s even in lousy films). In Soylent Green, Heston plays Detective Robert Thorn, though he’s mostly called “Thorn” onscreen. Other characters below the age of 65 or so also go by only one name. They all grew up after real foodstuffs (the titular Soylent Green being a synthetic wafer reputedly made of plankton — the most palatable of three colors) and creature comforts became exceedingly scarce and expensive. Oldsters are given the respect of first and last names. Thorn investigates the assassination of a high-ranking industrialist to its well-known conspiratorial conclusion (hardly a spoiler anymore), culminating in that iconic line at the very end of the film: “Soylent Green is people!” Seems industrialists, to keep people fed, are making food of human corpses. That eventual revelation drives the investigation and the film forward, a device far tamer than today’s amped-up action thrillers where, for instance, a mere snap of the fingers can magically wipe out or restore half of the universe. Once the truth is proclaimed by Thorn (after first being whispered into a couple of ears), the movie ends rather abruptly. That’s also what makes it a police procedural set in a disastrous, dystopic, science-fiction future stuck distinctively in the past: once the crime/riddle is solved, the story and film are over with no dénouement whatsoever.

Some of the details of the film, entirely pedestrian to modern audiences, are modestly enjoyable throwbacks. For instance, today’s penchant for memes and slang renaming of commonplace things is employed in Soylent Green. The catchphrase “Tuesday is Soylent Green Day” appears but is not overdriven. A jar of strawberries costs “150D,” which I first thought might be future currency in the form of debits or demerits but is probably just short for dollars. Front-end loaders used for crowd control are called “scoops.” High-end apartment building rentals come furnished with live-in girls (prostitutes or gold-diggers, really) known as Furniture Girls. OTOH, decidedly 70s-era trash trucks (design hasn’t really changed much) are not emblazoned with the corporate name or logo of the Soylent Corporation (why not?). Similarly, (1) dressing the proles in dull, gray work clothes and brimless caps, (2) having them sleep on stairways or in church refuges piled on top of each other so that characters have to step gingerly through them, (3) being so crammed together in protest when the Tuesday ration of Soylent Green runs short that they can’t avoid the scoops, (4) dripped blood clearly made of thick, oversaturated paint (at least on the DVD), and (5) a sepia haze covering daytime outdoor scenes are fairly lazy nods to world-building on a low budget. None of this is particularly high-concept filmmaking, though the restraint is appreciated. The sole meme (introduced with no setup) that should have been better deployed is “going home,” a euphemism for reporting voluntarily to a processing plant (into Soylent Green, of course) at the end of one’s suffering life. Those who volunteer are shown 30 minutes of scenes, projected in a 360-degree theater that envelops the viewer, depicting the beauty and grandeur of nature before it had disappeared. This final grace offered to people (rather needlessly) serves the environmental message of the film well and could have been “driven home” a bit harder.

Like other aspects of the film’s back story, how agriculture systems collapsed is largely omitted. Perhaps such details (conjecture) are in the book. The film suggests persistent heat (no seasons), and accordingly, characters are made to look like they never stop sweating. Scientific projections of how global warming will manifest do in fact point to hothouse Earth, though seasons will still occur in temperate latitudes. Because such changes normally occur in geological time, it’s an exceedingly slow process compared to human history and activity. Expert inquiry into the subject prophesied long ago that human activity would trigger and accelerate the transition. How long it will take is still unknown, but industrial civilization is definitely on that trajectory, and humans have done little since the 70s to curb self-destructive appetites or behaviors — except of course talk, which in the end is just more hot air. Moreover, dystopian science fiction has shifted over the decades away from self-recrimination to a long, seemingly endless stream of superheroes fighting crime (and sometimes aliens). Considering film is entertainment meant to be enjoyed, the self-serious messages embedded in so many 70s-era disaster films warning us of human hubris are out of fashion. Instead, superpowers and supersuits rule cinema, transforming formerly uplifting science-fiction properties such as Star Trek into hypermilitaristic stories of interstellar social collapse. Soylent Green is a grim reminder that we once knew better, even in our entertainments.

Coming back to this topic after some time (pt. 1 here). My intention was to expand upon demands for compliance, and unsurprisingly, relevant tidbits continuously pop up in the news. The dystopia American society is building for itself doesn’t disappoint — not that anyone is hoping for such a development (one would guess). It’s merely that certain influential elements of society reliably move toward consolidation of power, and credulous citizens predictably forfeit their freedom and autonomy with little or no hesitation. The two main examples to discuss are Black Lives Matter (BLM) and the response to the global pandemic, which have occurred simultaneously but are not particularly related.

The BLM movement began in summer 2013 but boiled over in summer 2020 on the heels of the George Floyd killing, with protests spilling over into straightforward looting, mayhem, and lawlessness. That fit of high emotional pique found many protesters accosting random strangers in public and demanding a raised fist in support of the movement, which was always ideologically disorganized but became irrational and power-hungry as Wokedom discovered its ability to submit others to its will. In response, many businesses erected what I’ve heard called don’t-hurt-me walls in apparent support of BLM and celebration of black culture so that windows would not be smashed and stores ransacked. Roving protests in numerous cities demanded shows of support (though of what exactly was never clear) from anyone encountered. Ultimately, protests morphed into a sort of protection racket, and agitators learned to enjoy making others acquiesce to arbitrary demands. Many schools and corporations now conduct mandatory training to, among other things, identify unconscious bias, which has the distinct aroma of original sin that can never be assuaged or forgiven. It’s entirely understandable that many individuals, under considerable pressure to conform as moral panic seized the country, play along to keep the peace or keep their jobs. Backlash is building, of course.

The much larger example affecting everyone, nationwide and globally, is the response to the pandemic. Although quarantines have been used in the past to limit regional outbreaks of infectious disease, the global lockdown of business and travel was something entirely new. Despite a lack of evidence of efficacy, the precautionary principle prevailed, and nearly everyone was forced into home sequestration and later, after an embarrassingly stupid scandal (in the U.S.), made to don masks when venturing out in public. As waves of viral infection and death rolled across the globe, political leaders learned to enjoy making citizens acquiesce to capricious and often contradictory demands. Like BLM, a loose consensus emerged about the “correct” way to handle the needs of the moment, but the science and demographics of the virus produced widely varying interpretations of such correctness. A truly coordinated national response in the U.S. never coalesced, and hindsight has judged the whole morass a fundamentally botched job of maintaining public health in most countries.

But political leaders weren’t done demanding compliance. An entirely novel vaccine protocol was rushed into production after emergency use authorization was obtained and indemnification (against what?) was granted to the pharma companies that developed competing vaccines. Whether this historical moment will turn out to be something akin to the thalidomide scandal remains to be seen, but at the very least, the citizenry is being driven heavily toward participation in a global medical experiment. Some states even offer million-dollar lotteries to incentivize individuals to comply and take the jab. Open discussion of risks associated with the new vaccines has been largely off limits, and a two-tier society is already emerging: the vaccinated and the unclean (which is ironic, since many of the unclean have never been sick).

Worse yet (and like the don’t-hurt-me walls), many organizations are adopting as-yet-unproven protocols and requiring vaccination for participants in their activities (e.g., schools, sports, concerts) or simply to keep one’s job. The mask mandate was a tolerable discomfort (though not without many principled refusals), but forcing others to become experimental test subjects is well beyond the pale. Considering how the narrative continues to evolve and transform, thoughtful individuals trying to evaluate competing truth claims for themselves are unable to get clear, authoritative answers. Indeed, it’s hard to imagine a situation where authorities in politics, medicine, science, and journalism could have worked so assiduously to undermine their own credibility. Predictably, heads (or boards of directors) of many organizations are learning to enjoy the newly discovered power to transform their organizations into petty fiefdoms and demand compliance from individuals — usually under the claim of public safety (“for the children” being unavailable this time). Considering how little efficacy has yet been truly demonstrated with any of the various regimes erected to contain or stall the pandemic, the notion that precautions undertaken have been worth giving injudicious authority to people up and down various power hierarchies to compel individuals remains just that: a notion.

Tyrants and bullies never seem to tire of watching others do the submission dance. In the next round, be ready to hop on one leg and/or bark like a dog when someone flexes on you. Land of the free and home of the brave no longer.

Addendum

The CDC just announced an emergency meeting to be held (virtually) June 18 to investigate reports (800+ via the Vaccine Adverse Event Reporting System (VAERS), which almost no one had heard of only a month ago) of heart inflammation in adolescents following vaccination against the covid virus. Significant underreporting is anticipated following the circular logic that since authorities declared the vaccines safe prematurely (without standard scientific evidence to support such a statement), the effects cannot be due to the vaccine. What will be the effect of over 140 million people having been assured that vaccination is entirely safe, having taken the jab, and then having discovered “wait! maybe not so much …”? Will the complete erosion of trust in what we’re told by officialdom and its mouthpieces in journalism spark widespread, organized, grassroots defiance once the bedrock truth is laid bare? Should it?

A listicle called “10 Things We Have Learned During the Covid Coup,” supporting text abbreviated ruthlessly:

1. Our political system is hopelessly corrupt …

2. Democracy is a sham. It has been a sham for a very long time …

3. The system will stop at nothing to hold on to its power …

4. So-called radical movements are usually nothing of the sort …

5. Any “dissident” voice you have ever heard of through corporate media is probably a fake …

6. Most people in our society are cowards …

7. The mainstream media is nothing but a propaganda machine for the system …

8. Police are not servants of the public but servants of a powerful and extremely wealthy minority …

9. Scientists cannot be trusted …

10. Progress is a misleading illusion …

Wanted to provide an update to the previous post in my book-blogging project on Walter Ong’s Orality and Literacy to correct something that wasn’t clear to me at first. The term chirographic refers to writing, but I conflated writing more generally with literacy. Ong actually distinguishes chirographic (writing) from typographic (type or print) and includes another category: electronic media.

Jack Goody … has convincingly shown how shifts hitherto labeled as shifts from magic to science, or from the so-called ‘prelogical’ to the more and more ‘rational’ state of consciousness, or from Lévi-Strauss’s ‘savage’ mind to domesticated thought, can be more economically and cogently explained as shifts from orality to various stages of literacy … Marshall McLuhan’s … cardinal gnomic saying, ‘The medium is the message’, registered his acute awareness of the importance of the shift from orality through literacy and print to electronic media. [pp. 28–29]

So the book’s primary contrast is between orality and literacy, but literacy has a sequence of historical developments: chirographic, typographic, and electronic media. These stages are not used interchangeably by Ong. Indeed, they exist simultaneously in the modern world and all contribute to overall literacy while each possesses unique characteristics. For instance, reading from handwriting (printing or cursive, the latter far less widely used now except for signatures) is different from reading from print on paper or on the screen. Further, writing by hand, typing on a typewriter, typing into a word-processor, and composing text on a smartphone each has its effects on mental processes and outputs. Ong also mentions remnants of orality that have not yet been fully extinguished. So the exact mindset or style of consciousness derived from orality vs. literacy is neither fixed nor established universally but contains aspects from each category and subcategory.

Ong also takes a swing at Julian Jaynes. Considering that Jaynes’ book The Origin of Consciousness in the Breakdown of the Bicameral Mind (1977) (see this overview) was published only five years prior to Orality and Literacy (1982), the impact of Jaynes’ thesis must have still been felt quite strongly (as it is now among some thinkers). Yet Ong disposes of Jaynes rather parsimoniously, stating

… if attention to sophisticated orality-literacy contrasts is growing in some circles, it is still relatively rare in many fields where it could be helpful. For example, the early and late stages of consciousness which Julian Jaynes (1977) describes and relates to neuro-physiological changes in the bicameral mind would also appear to lend themselves largely to much simpler and more verifiable descriptions in terms of a shift from orality to literacy. [p. 29]

In light of the details above, it’s probably not accurate to say (as I did before) that we are returning to orality from literacy. Rather, the synthesis of characteristics is shifting, as it always has, in relation to new stimuli and media. Since the advent of cinema and TV — the first screens, now supplemented by the computer and smartphone — the way humans consume information is undergoing yet another shift. Or perhaps it’s better to conclude that it’s always been shifting, not unlike how we have always been and are still evolving, though the timescales are usually too slow to observe without specialized training and analysis. Shifts in consciousness arguably occur far more quickly than biological evolution, and the rate at which new superstimuli are introduced into the information environment suggests radical discontinuity with even the recent past — something that used to be called the generation gap.

I’ve always wondered what media theorists such as McLuhan (d. 1980), Neil Postman (d. 2003), and now Ong (d. 2003) would make of the 21st century had they lived long enough to witness what has been happening, with 2014–2015 being the significant inflection point according to Jonathan Haidt. (No doubt there are other media theorists working on this issue who have not risen to my attention.) Numerous other analyses point instead to the early 20th century as the era when industrial civilization harnessed fossil fuels and turned the mechanisms and technologies of innovators decidedly against humanity. Pick your branching point.

Returning to the subject of this post, I asserted that the modern era frustrates a deep, human yearning for meaning. As a result, the Medieval Period, and to a lesser degree, life on the highroad, became narrative fixations. Had I time to investigate further, I would read C.S. Lewis’ The Discarded Image (1964), but my reading list is already overfull. Nonetheless, I found an executive summary of how Lewis describes the Medieval approach to history and education:

Medieval historians varied in that some of them were more scientific, but most historians tried to create a “picture of the past.” This “picture” was not necessarily based in fact and was meant more to entertain curiosity than to seriously inform. Educated people in medieval times, however, had a high standard for education composed of The Seven Liberal Arts of grammar, dialectic, rhetoric, arithmetic, music, geometry, and astronomy.

In the last chapter, Lewis summarizes the influence of the Medieval Model. In general, the model was widely accepted, meaning that most people of the time conformed to the same way of thinking. The model, he reiterates, satisfied imagination and curiosity, but was not necessarily accurate or factual, specifically when analyzed by modern thinkers.

Aside. Regular readers of The Spiral Staircase may also recognize how consciousness informs this blog post. Historical psychology offers a glimpse into worldviews of bygone eras, with the Medieval Period perhaps being the easiest to contemplate due to proximity. Few storytellers (cinema or literature) attempt to depict what the world was truly like in the past (as best we can know) but instead resort to an ahistorical modern gloss on how men and women thought and behaved. One notable exception may be the 1986 film The Name of the Rose, which depicts the emerging rational mind in stark conflict with the cloistered Medieval mind. Sword-and-sandal epics set in ancient Rome and Greece get things even more wrong.


Unlike turtles, humans do not have protective shells into which we can withdraw when danger presents. Nor can we lift off, fly away, and elude danger the way birds do. These days, we’re sorely beset by an invisible pandemic spread by exposure to carriers (read: other people) and so asked or forced to submit to being locked down and socially distanced. Thus, we are withdrawn into the protective shell of the home in cycles of varying intensity and obeisance to maintain health and safety. Yet life goes on, and with it numerous physical requirements (ignoring psychological needs) that can’t be met virtually, demanding that we venture out into the public sphere to gather resources, risking exposure to the scourge. Accordingly, the conduct of business has adapted to enable folks to remain in the protective shells of their vehicles, taking delivery through the car window and rarely if ever entering a brick-and-mortar establishment except in defiance or at the option of acceptable risk. In effect, we’re being driven into our cars ever more, and the vehicle is readily understood as a proxy for its inhabitant(s). Note how pictures of people in bread lines during the Great Depression have been replaced by pictures of cars lined up for miles during the pandemic to get packaged meals from charitable organizations.

Reflecting on this aspect of modern life, I realized that it’s not exactly novel. The widespread adoption of the individual vehicle in the 1940s and 50s, as distinguished from mass transit, and the construction of the interstate highway system promised (and delivered) flexibility and freedom of tremendous appeal. While the shift into cars (along with air travel) doomed now-moribund passenger rail (except intracity in the few American cities with effective rail systems), it enabled the buildout of suburbs and exurbs now recognized as urban sprawl. And like all those packages now clogging delivery systems as we shift even more heavily during the holiday season to online shopping, a loss of efficiency was inevitable. All those individual cars and boxes create congestion that cries out for solutions.

Among the solutions (really nonsolutions) were the first drive-through banks of the 1970s. Is doing one’s banking without leaving the vehicle’s protective shell really an efficiency? Or is it merely an early acknowledgement and enabling of antisocial individualism? Pneumatic tubes that permitted drive-through banking did not speed up transactions appreciably, but the novel mechanism undoubtedly reinforced the psychological attachment Americans felt with their cars. That growing attachment was already apparent in the 1950s, with two bits of Americana from that decade still resonating: the drive-in theater and the drive-in restaurant. The drive-in theater was a low-fidelity efficiency and alternative to the grand movie houses built in the 1920s and 30s seating a few thousand people in one cavernous space. (A different sort of efficiency enabling choice later transformed most cinema establishments into multiplexes able to show 8–10 titles instead of one, handily diminishing audiences of thousands to hundreds or even tens and robbing the group experience of much of its inherent power. Now that premium streaming content is delivered to screens at home and we are disallowed assembly into large audiences, we have instead become something far more inert — viewers — with fully anticipatable degradation of the entertainment experience notwithstanding the handsome technologies found within the comforts of the home.) I’ve heard that drive-ins are experiencing a renaissance of sorts in 2020, with Walmart parking lots converted into showplaces, at least temporarily, to resemble (poorly) group experience and social connection. The drive-in restaurant of the 1950s, with its iconic carhops (sometimes on roller skates), is a further example of enabling car culture to proliferate. Never mind that eating in the car is actually kinda sad and maybe a little disgusting as odors and refuse collect in that confined space. One might suspect that drive-ins were directed toward teenyboppers and cruisers of the 1950s exploring newfound freedom, mobility, and the illusion of privacy in their cars, parked in neat rows at drive-ins (and Lookout Points for smooch sessions) all across the country. However, my childhood memory is that it was also a family affair.

Inevitably, fast food restaurants followed the banks in the 1970s and quickly established drive-through lanes, reinforcing the degradation of the food experience into mere feeding (often on one’s lonesome) rather than dining in community. Curiously, the pandemic has made every restaurant still operating, even the upscale ones, a drive-through and forced those with and without dedicated drive-through lanes to bring back the anachronistic carhop to serve the congestion. A trip to a local burger joint in Chicago last week revealed 40+ cars in queue and a dozen or so carhops on the exterior directing traffic and making deliveries through the car window (briefly penetrating the protective shell) so that no one would have to enter the building and expose oneself to virus carriers. I’ve yet to see a 2020 carhop wearing roller skates (now roller blades) or a poodle skirt.

Such arrangements are probably effective at minimizing pandemic risk and have become one of several new normals (discussion of political dysfunction deferred). Who can say how long they will persist? Still, it’s strange to observe the psychology of our response, even if only superficially and preliminarily. Car culture has been a curious phenomenon since at least the middle of the 20th century. New dynamics reinforcing our commitment to cars are surprising, perhaps, but a little unsurprising, too, considering how we made ourselves so dependent on them as the foundation of personal transportation infrastructure. As a doomer, I had rather expected that Peak Oil occurring around 2006 or so would spell the gradual (or sudden) end of happy motoring as prices at the pump, refusal to elevate standard fuel efficiency above 50 mpg, and the climbing average cost of new vehicles placed individual options beyond the reach of average folks. However, I’ve been genuinely surprised by fuel costs sinking to new lows (below the cost of production, even bizarrely inverting to the point that producers paid buyers to take inventory) and continued attempts to engineer (only partially) around the limitations of Peak Oil, if not indeed Peak Energy. I continue to believe these are mirages, like the record-setting bull market of 2020 occurring in the midst of simultaneous economic, social, and health crises.

I’ve mentioned the precautionary principle several times, most notably here. Little of our approach to precautions has changed in the two years since that blog post. At the same time, climate change and Mother Nature batter us aggressively. Eventualities remain predictable. Different precautions are being undertaken with respect to the pandemic currently gripping the planet. Arguably, the pandemic is either a subset of Mother Nature’s fury or, if the virus was created in a lab, a self-inflicted wound. Proper pandemic precautions have been confounded by undermining of authority, misinformation, lack of coordination, and politically biased narratives. I’m as confused as the next poor sap. However, low-cost precautions such as wearing masks are entirely acceptable, notwithstanding refusals of many Americans to cooperate after authorities muddied the question of their effectiveness so completely. More significant precautions such as lockdowns and business shutdowns have morphed into received wisdom among government bodies yet are questioned widely as being a cure worse than the disease, not to mention administrative overreach (conspiratorial conjecture withheld).

Now comes evidence published in the New England Journal of Medicine on November 11, 2020, that costly isolation is flatly ineffective at stemming infection rates. Here are the results and conclusions from the abstract of the published study:

Results
A total of 1848 recruits volunteered to participate in the study; within 2 days after arrival on campus, 16 (0.9%) tested positive for SARS-CoV-2, 15 of whom were asymptomatic. An additional 35 participants (1.9%) tested positive on day 7 or on day 14. Five of the 51 participants (9.8%) who tested positive at any time had symptoms in the week before a positive qPCR test. Of the recruits who declined to participate in the study, 26 (1.7%) of the 1554 recruits with available qPCR results tested positive on day 14. No SARS-CoV-2 infections were identified through clinical qPCR testing performed as a result of daily symptom monitoring. Analysis of 36 SARS-CoV-2 genomes obtained from 32 participants revealed six transmission clusters among 18 participants. Epidemiologic analysis supported multiple local transmission events, including transmission between roommates and among recruits within the same platoon.
Conclusions
Among Marine Corps recruits, approximately 2% who had previously had negative results for SARS-CoV-2 at the beginning of supervised quarantine, and less than 2% of recruits with unknown previous status, tested positive by day 14. Most recruits who tested positive were asymptomatic, and no infections were detected through daily symptom monitoring. Transmission clusters occurred within platoons.

So an initial 0.9% tested positive, then an additional 1.9%. This total 2.8% compares to 1.7% in the control group (tested but not isolated as part of the study). Perhaps the experimental and control groups are a bit small (1848 and 1554, respectively), and it’s not clear why the experimental group infection rate is higher than that of the control group, but the evidence points to the uselessness of trying to limit the spread of the virus by quarantining and/or isolation. Once the virus is present in a population, it spreads despite precautions.
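For the curious, here’s a minimal sketch in Python that recomputes those rates directly from the counts quoted in the abstract (nothing beyond arithmetic, but it makes the comparison explicit; note that the control group was tested only on day 14, so the comparison is rough):

    # Recompute infection rates from the raw counts in the NEJM abstract.
    study_n = 1848        # recruits in supervised quarantine
    initial_pos = 16      # positive within 2 days of arrival
    later_pos = 35        # additional positives on day 7 or day 14
    control_n = 1554      # non-participants with available qPCR results
    control_pos = 26      # of those, positive on day 14

    total_rate = (initial_pos + later_pos) / study_n  # ~2.8%
    control_rate = control_pos / control_n            # ~1.7%
    print(f"quarantined: {total_rate:.1%} vs. control: {control_rate:.1%}")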

A mantra is circulating that we should “trust the science.” Are these results to be trusted? Can we call off all the lockdowns and closures? It’s been at least eight months that the virus has been raging throughout the U.S. Although there might be some instances of isolated populations with no infection, the wider population has by now been exposed. Moreover, some individuals who self-isolated effectively may not have been exposed, but in all likelihood, most of us have been. Accordingly, renewed lockdowns, school and business closures, and destruction of entire industries are a pretense of control we never really had. Their costs are enormous and ongoing. A stay-at-home order (advisory, if you prefer) just went into effect for the City of Chicago on November 16, 2020. My anecdotal observation is that most Chicagoans are ignoring it and going about their business much as they did in the summer and fall months. It’s nothing like the ghost-town effect of March and April 2020. I daresay they may well be correct to reject the received wisdom of our civic leaders.

I admit (again) to being bugged by things found on YouTube — a miserable proxy for the marketplace of ideas — many of which are either dumb, wrongheaded, or poorly framed. It’s not my goal to correct every mistake, but sometimes, inane utterances of intellectuals and specialists I might otherwise admire just stick in my craw. It’s hubris on my part to insist on my understandings, considering my utter lack of standing as an acknowledged authority, but I’m not without my own multiple areas of expertise (I assert immodestly).

The initial purpose for this blog was to explore the nature of consciousness. I’ve gotten badly sidetracked writing about collapse, media theory, epistemology, narrative, and cinema, so let me circle back around. This is gonna be long.

German philosopher Oswald Spengler takes a crack at defining consciousness:

Human consciousness is identical with the opposition between the soul and the world. There are gradations in consciousness, varying from a dim perception, sometimes suffused by an inner light, to an extreme sharpness of pure reason that we find in the thought of Kant, for whom soul and world have become subject and object. This elementary structure of consciousness is not capable of further analysis; both factors are always present together and appear as a unity.


/rant on

MAD (mutually assured destruction) is a term I haven’t thought about for a good long while. No illusions here regarding that particularly nasty genie having been stuffed back into its lamp. Nope, it lingers out there in some weird liminal space, routinely displaced by more pressing concerns. However, MAD came back into my thoughts because of saber-rattling by U.S. leadership suggesting resumed above-ground nuclear testing might be just the ticket to remind our putative enemies around the world what complete assholes we are. Leave it to Americans to be the very last — in the midst of a global pandemic (that’s redundant, right?) — to recognize that geopolitical squabbles (alert: reckless minimization of severity using that word squabble) pale in comparison to other looming threats. Strike that: we never learn; we lack the reflective capacity. Still, we ought to reorient in favor of mutual aid and assistance instead of our MAD, insane death pact.

The authoritative body that normally springs to mind when MAD is invoked is the Bulletin of the Atomic Scientists. Ironically, it appears to be an independent, nonprofit 501(c)(3) entity, a media organization, not an actual collection of atomic scientists. (I’ll continue to italicize Bulletin as though it’s a publication like the New York Times even though it’s arguably something else.) I’ve blogged repeatedly about its iconic Doomsday Clock. In an otherwise astute post against sloppy appeals to authority using the increasingly meaningless term expert, Alan Jacobs takes the Bulletin to task for straying out of its lane to consider threats that are political in nature rather than scientific. Reminded me of when Pope Francis in his encyclical deigned to acknowledge climate change, recognizing that Mother Earth is our “common home” and maybe we shouldn’t be raping her. (OK, that coarse bit at the end is mine.) What? He’s not a climatologist! How dare he opine on something outside his official capacity? Go back to saving souls!

At the same time we desperately need expertise to accomplish things like building bridges that don’t fall down (yet still do) or performing an appendectomy without killing the patient, it’s inevitable that people form opinions about myriad subjects without the benefit of complete authority or expertise, if such a thing even exists. As students, citizens, and voters, we’re enjoined to inform ourselves, discuss, and learn rather than forfeit all opinion-making to, oh I dunno, the chattering classes. That’s intellectual sovereignty, unless one is unfortunate enough to live in a totalitarian regime practicing thought control. Oh, wait … So it’s a sly form of credentialing to fence off or police opinion expressed from inexpert quarters as some sort of thought crime. Regarding MAD, maybe the era has passed when actual atomic scientists assessed our threat level. Now it’s a Science and Security Board made up of people few have ever heard of, and the scope of their concern, like the Pope’s, is wide enough to include all existential threats, not just the one assigned to them by pointy-headed categorists. Are politicians better qualified on such matters? Puhleeze! (OK, maybe Al Gore, but he appears to be busy monetizing climate change.)

As a self-described armchair social critic, I, too, recognized more than a decade ago the existential threat (extinction level, too) of climate change and have blogged about it continuously. Am I properly credentialed to see and state the, um, obvious? Maybe not. That’s why I don’t argue the science and peer-reviewed studies. But the dynamics, outlines, and essentials of climate change are eminently understandable by laypersons. That was true as well for Michael Ruppert, who was impeached by documentarians for lacking supposed credentialed expertise yet still having the temerity to state the obvious and sound the alarm. Indeed, considering our failure to act meaningfully to ameliorate even the worst case scenario, we’ve now got a second instance of mutually assured destruction, a suicide pact, and this one doesn’t rely on game-theoretical inevitability. It’s already happening all around us as we live and breathe … and die.

/rant off

Caveat: rather overlong for me, but I got rolling …

One of the better articles I’ve read about the pandemic is this one by Robert Skidelsky at Project Syndicate (a publication I’ve never heard of before). It reads as only slightly conspiratorial, purporting to reveal the true motivation for lockdowns and social distancing, namely, so-called herd immunity. If that’s the case, it’s basically a silent admission that no cure, vaccine, or inoculation is forthcoming and the spread of the virus can only be managed modestly until it has essentially raced through the population. Of course, the virus cannot be allowed to simply run its course unimpeded, but available impediments are limited. “Flattening the curve,” or distributing the infection and death rates over time, is the only attainable strategy and objective.
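As a rough reminder of what “herd immunity” implies (my gloss, not Skidelsky’s), the simplest epidemic models say spread stalls only once the immune fraction of the population exceeds a threshold set by the basic reproduction number R0:

    % Herd-immunity threshold in the basic SIR framework:
    % spread stalls once the susceptible fraction falls below 1/R_0.
    H = 1 - \frac{1}{R_0}
    % Early published estimates put R_0 for SARS-CoV-2 around 2.5-3,
    % implying H of roughly 60-67% infected (or eventually vaccinated).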

Wedding mathematical and biological insights, as well as the law of mass action in chemistry, into an epidemic model may seem obvious now, but it was novel roughly a century ago. We’re also now inclined, if scientifically oriented and informed, to understand the problem and its management in terms of engineering rather than medicine (or maybe in terms of triage and palliation). Global response has also made the pandemic into a political issue as governments obfuscate and conceal true motivations behind their handling (bumbling in the U.S.) of the pandemic. Curiously, the article also mentions financial contagion, which is shaping up to be worse in both severity and duration than the viral pandemic itself.
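The century-old model alluded to here is, presumably, the Kermack–McKendrick SIR model (1927), which applies the law of mass action to infection: new cases accrue in proportion to the product of the susceptible and infected fractions. A minimal sketch in Python (parameter values illustrative, not fitted to any real outbreak) shows how reducing the contact rate “flattens the curve”:

    # Minimal Kermack-McKendrick SIR model via forward-Euler integration.
    # beta (contact rate) and gamma (recovery rate) are illustrative only.
    def sir(beta, gamma, i0=0.001, days=300, dt=0.1):
        s, i, r = 1.0 - i0, i0, 0.0
        peak = i
        for _ in range(int(days / dt)):
            new_inf = beta * s * i * dt  # law of mass action: rate ~ S*I
            new_rec = gamma * i * dt     # constant per-capita recovery
            s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
            peak = max(peak, i)
        return peak, r  # peak infected fraction, final epidemic size

    # Halving beta lowers and delays the peak ("flattening the curve"),
    # spreading infections over a longer period.
    print(sir(beta=0.50, gamma=0.1))  # unmitigated: high, early peak
    print(sir(beta=0.25, gamma=0.1))  # distancing: lower, later peak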
