Buzzwords circulating heavily in the public sphere these days include equality, equity, inclusion, representation, diversity, pluralism, multiculturalism, and privilege. How they are defined, understood, and implemented are contentious issues that never seem to resolve. Indeed, looking back on decade after decade of activism undertaken to address various social scourges, limited progress has been made, which amounts to tinkering around the edges. The underlying bigotry (opinion, motivation, activity) has never really been eradicated, though it may be diminished somewhat through attrition. Sexism and racism in particular (classics in an expanding universe of -isms) continue to rage despite surface features taking on salutary aspects. Continued activism uses the buzzwords above (and others) as bludgeons to win rhetorical battles — frequently attacks on language itself through neologism, redefinition, and reclamation — without really addressing the stains on our souls that perpetuate problematic thinking (as though anyone ever had a lock on RightThink). Violence (application of actual force, not just mean or emphatic words) is the tool of choice deployed by those convinced their agenda is more important than general societal health and wellbeing. Is violence sometimes appropriate in pursuit of social justice? Yes, in some circumstances, probably so. Is this a call for violent protest? No.

Buzzwords stand in for what we may think we want as a society. But there’s built-in tension between competition and cooperation, or alternatively, individual and society (see the start of this multipart blog post) and all the social units that nest between. Each has its own desires (known in politics by the quaint term special interests), which don’t usually combine to form just societies despite mythology to that effect (e.g., the invisible hand). Rather, competition results in winners and losers even when the playing field is fair, which isn’t often. To address what is sometimes understood (or misunderstood, hard to know) as structural inequity, activists advocate privileging disenfranchised groups. Competence and merit are often sacrificed in the process. Various forms of criminal and sociopathic maneuvering also keep a sizeable portion of the population (the disenfranchised) in a perpetual and unnecessary state of desperation. That’s the class struggle.

So here’s my beef: if privilege (earned or unearned) is categorically bad because it’s been concentrated in a narrow class (who then position themselves to retain and/or grow it), why do activists seek to redistribute privilege by bestowing it on the downtrodden? Isn’t that a recipe for destroying ambition? If the game instead becomes about deploying one or more identifiers of oppression to claim privilege rather than working diligently to achieve a legitimate goal, such as acquiring skill or understanding, why bother to try hard? Shortcuts magically appear and inherent laziness is incentivized. Result: the emergent dynamic flattens valuable, nay necessary, competence hierarchies. In its communist formulation, social justice is achieved by making everyone equally precarious and miserable. Socialism fares somewhat better. Ideologues throughout history have wrecked societies (and their members) by redistributing or demolishing privilege forcibly while hypocritically retaining privilege for themselves. Ideology never seems to work out as theorized, though the current state of affairs (radical inequality) is arguably no more just.

More to unpack in further installments.

The Bulletin of the Atomic Scientists has reset its doomsday clock ten seconds closer to midnight (figuratively, nuclear Armageddon), bringing the world closer to catastrophe than at any point in its history. Bizarrely, its statement published today points squarely at Russia as the culprit but fails to mention the participation of the United States and other NATO countries in the conflict. Seems to me a rather unethical deployment of distinctly one-sided rhetoric. With the clock poised so close to midnight, there’s nearly nowhere left to reset the clock until the bombs fly. In contrast, two of the blogs I read that are openly critical of provocations and escalation against Russia, not by Russia, are Bracing Views (on my blogroll) and Caitlin Johnstone (not on my blogroll). Neither minces words, and both either suggest or say openly that U.S. leadership are the bad guys indulging in nuclear brinkmanship. Many more examples of that ongoing debate are available. Judge for yourself whose characterizations are more accurate and useful.

Although armed conflict at one scale or another, one time or another, is inescapable given mankind’s violent nature, it takes no subtle morality or highfalutin ethics to be antiwar by default, whereas U.S. leadership is uniformly prowar. Can’t know for sure what the motivations are, but the usual suspects include fear of appearing weak (um, it’s actually that fear that’s weak, dummies) and the profit motive (war is a racket). Neither convinces me in the slightest that squandering lives, energy, and lucre makes war worth contemplating except in extremis. Military action (and its logistical support such as the U.S. is providing to Ukraine) should only be undertaken with the gravest regret and reluctance. Instead, war is glorified and valorized. Fools rush in ….

No need to think hard on this subject, no matter where one gets information and reporting (alternately, disinformation and misreporting). Also doesn’t ultimately matter who are the good guys or the bad guys. What needs to happen in all directions and by all parties is deescalation and diplomacy. Name calling, scapegoating, and finger pointing might soothe some into believing they’re on the right side of history. The bad side of history will be when nuclear powers go MAD, at which point no one can stake any moral high ground.

My inquiries into media theory long ago led me to Alan Jacobs and his abandoned, reactivated, then reabandoned blog Text Patterns. Jacobs is a promiscuous thinker and even more promiscuous technologist in that he has adopted and abandoned quite a few computer apps and publishing venues over time, offering explanations each time. Always looking for better tools, perhaps, but this roving public intellectual requires persistent attention lest one lose track of him. His current blog (for now) is The Homebound Symphony (not on my ruthlessly short blogroll), which is updated roughly daily, sometimes with linkfests or simply an image, other times with thoughtful analysis. Since I’m not as available as most academics to spend all day reading and synthesizing what I’ve read to put into a blog post, college class, or book, I am not on any sort of schedule and only publish new blog posts when I’m ready. Discovered in my latest visit to The Homebound Symphony was a plethora of super-interesting subject matter, which I daresay is relevant to the more literate and literary among us. Let me draw out the one that most piqued my interest. (That was the long way of tipping my hat to Jacobs for the link.)

In an old (by Internet standards) yet fascinating book review by Michael Walzer of Siep Stuurman’s The Invention of Humanity: Equality and Cultural Difference in World History (2017), Walzer describes the four inequalities that have persisted throughout human history, adding a fifth identified by Stuurman:

  • geographic inequality
  • racial inequality
  • hierarchical inequality
  • economic inequality
  • temporal inequality

I won’t unpack what each means if they’re not apparent on their face. Read for yourself. Intersections and overlapping are common in taxonomies of this sort, so don’t expect categories to be completely separate and distinct. The question of equality (or its inverse inequality) is a fairly recent development, part of a stew of 18th-century thought in the West that was ultimately distilled to one famous phrase “all men are created equal.” Seems obvious, but the phrase is fraught, and we’ve never really been equal, have we? So is it equal before god? Equal before the law? Equal in all opportunities and outcomes as social justice warriors now insist? On a moment’s inspection, no one can possibly believe we’re all equal despite aspirations that everyone be treated fairly. The very existence of perennial inequalities puts the lie to any notion of equality trucked in with the invention of humanity during the Enlightenment.

To those inequalities I would add a sixth: genetic inequality. Again, overlap with the others is acknowledged, but it might be worth observing that divergent inherited characteristics (other than wealth) appear quite early in life among siblings and peers, before most others manifest. By that, I certainly don’t mean race or sex, though differences clearly exist there as well. Think instead of intelligence, height, beauty, athletic ability, charisma, health and constitution, and even longevity (life span). Each of us has a mixture of characteristics that are plainly different from those of others and which either provide springboards or produce disadvantages. Just as it’s unusual to find someone in possession of all positive characteristics at once — the equivalent of rolling an 18 for each attribute of a new D&D character — few possess all negatives (a series of 3s), either. Also, there’s probably no good way to rank best to worst, strongest to weakest, or most to least successful. Bean counters from one discipline or another might try, but that runs counter to the mythology “all men are created equal” and thus becomes a taboo to acknowledge, much less scrutinize.

What to do with the knowledge that all men are not in fact created equal and never will be? That some are stronger; more charming; smarter; taller with good teeth (or these days, dentists), hair, vision, and square jaws; luckier in the genetic lottery? Well, chalk it up, buddy. We all lack some things and possess others.

A bunch of unrelated things I have been reading and hearing suddenly and rather unexpectedly came together synthetically. The profusion is too great to provide a handy set of links, and backstage analytics indicate that almost no one follows the links I provide anyway, so this will be free form.

Hyperanalysis

As I have previously blogged, peering (or staring) too intently at a physical object or subject of inquiry tends to result in the object or subject being examined at the wrong resolution. An obtuse way of restating this is that one can’t study the cosmos under a microscope or cell biology through a telescope. The common mistake is to study minutia and fail to perceive the whole, rarely the reverse. Iain McGilchrist suggests that hyperanalysis is a hallmark of an imbalance in brain lateralization, where the left brain (the Emissary) takes primacy over the right brain (the Master). Others building on McGilchrist’s thesis have theorized that human culture and styles of cognition have swung in and out of proper balance periodically throughout history. One theory used portraiture to demonstrate how depiction of the human face can be either humanistic or clinical in orientation. What artists perceive then produces divergent aesthetics, where the eyes and mouth in particular suggest different ways of encountering the world. Using art to develop theories of psychohistory fits well with one of the preoccupations of this blog, namely, the nature of consciousness.

Intervention

Armed with a precise appreciation of some small aspect of a whole, often abstracted and idealized independently of easily observed facts, some make the leap that human activity can be acted upon confidently with entirely foreseeable outcomes. Thus, incautious decision-makers intervene to correct, adjust, or game areas within their concern and control. The applicable aphorism that restrains cautious people is “fools rush in where angels fear to tread.” Perhaps the Dunning-Kruger Effect helps to explain (at least in part) why actors plunge forward despite ample evidence that outcomes don’t adhere reliably to planning. Indeed, the rise and fall of many institutions demonstrate the fallibility of interventions in dynamic systems. Just ask the Federal Reserve. Periodic recourse to warfare is also characterized as a breakdown of peaceful stratagems accompanied by a fog where clarity of purpose is lost. Faced with deteriorating conditions, the demand that something be done, not nothing, also testifies to human frailty and frequent refusal to accept unknown contingency or inevitability.

Micromanagement

When ideology truly misfires, strenuous interventions take the form of tyranny in an attempt to force resistant outcomes into predetermined channels. Too many awful examples of abject failure on that account to mention. Small-scale tyranny might be better recognized as micromanagement. Control freaks adopt that behavior out of any number of motivations, such as distrust of others’ competence, inability to delegate or share responsibility, and hero syndrome. Ironically, the targets of control are often petty and do not contribute meaningfully to enhanced behavior or function. Rather, micromanagers fixate on a subsidiary metric as a proxy for overall health, wellbeing, and effectiveness.

As people encounter ideologies and learn to systematize, they are especially prone to using incomplete understandings of complex systems and then seizing upon one intervention or another to attempt to solve problems. This is not limited to students in the early years of college who discover a supposed Rosetta stone for fixing ills but rather includes any reform political candidate or newly minted CEO with a mandate to remove corruption, dead weight, or inefficiency. This is also the domain of specialists, experts, and academics who have concentrated their efforts in a narrow subfield and feel confident generalizing that acquired skill and applied knowledge outside the area of focus. Of course, it’s warranted to rely on brain surgeons for brain surgery and car mechanics for automotive repairs. But no one expects them to offer advice on intractable social problems, correct historical wrongs, or develop a centered philosophy of anything. Indeed, institutions and societies are so complex and inherently unsteerable that, despite many futile attempts, no one has ever developed a comprehensive understanding sufficient to engineer programs that lead anywhere remotely in the vicinity of utopia. Yet with an abundance of confidence, agitators and activists — sometimes quite young but unafraid to instruct their elders — seek to implement their ideological designs, by force if necessary, to enact desired change. Maybe those changes are considered humble baby steps toward improved social justice and greater equity, tinkering around the edges perhaps, but I doubt most ambitions are so constrained. While that energy is absolutely necessary in a pluralistic society to avoid cynicism and stagnation, it often becomes a victim of its own success when radicals gain power and impose orthodoxies that are ultimately self-defeating and thus short-lived. History is full of movements, civil wars, and revolutions that demonstrate the point. Dystopian fiction also forewarns how tyrannies develop out of misguided application of ideology and power.

I probably haven’t described any too well the power law that coalesced in my thinking. Nor do I pretend to have solutions for anything. As I’ve often written, given the myriad problems global civilization now faces, it’s well nigh impossible to know what to wish for with much clarity since deliverance from one set of problems often exacerbates others. World systems theorists try to piece together various dynamics into a coherent unified theory, and I admire the effort to understand how the world really works, but I still subscribe to the precautionary principle when it comes to implementing globe-spanning programs.

A few years ago, Knives Out (2019) unexpectedly solidified the revival of the whodunit and introduced its modern-day master sleuth: Benoit Blanc. The primary appeal of the whodunit has always been smartly constructed plots that unfold slowly and culminate in a final reveal or unmasking that invites readers to reread in search of missed clues. The two early masters of this category of genre fiction were Sir Arthur Conan Doyle and Agatha Christie, both succeeding in making their fictional detectives iconic. Others followed their examples, though the genre arguably shifted onto (into?) the TV with shows such as Perry Mason, Columbo, and Murder, She Wrote. No surprise, Hollywood transformed what might have been a one-and-done story into the beginnings of a franchise, following up Knives Out with Glass Onion: A Knives Out Mystery (subtitle displayed unnecessarily to ensure audiences make the connection — wouldn’t a better subtitle be A Benoit Blanc Mystery?). Both movies are entertaining enough to justify munching some popcorn in the dark but neither observes the conventions of the genre — novel, TV, or film — any too closely. Spoilers ahead.

I harbor a sneaking suspicion that Benoit Blanc is actually a bumbling fool the way poor, rumpled Columbo only pretended to be. Although I can’t blame Daniel Craig for taking roles that allow him to portray someone other than James Bond, Craig is badly miscast and adopts a silly Southern accent others complain sounds laughably close to Foghorn Leghorn. (Craig was similarly miscast in the American remake of The Girl with the Dragon Tattoo, but that’s an entirely different, unwritten review.) So long as Blanc is a nitwit, I suppose the jokey accent provides some weak characterization and enjoyment. Problem is, because the film is only superficially a whodunit, there is no apparent crime to solve after Blanc figures out the staged murder mystery (sorta like an escape room) just after the vacation weekend gets started but before the faux murder even occurs. Kinda ruins the momentum. As a result, the film digresses to a lengthy flashback to establish the real crime that Blanc is there to solve. Maybe good mystery novels have partial reveals in the middle, reframing the entire mystery. I dunno but rather doubt it.

The plot is by no means tightly knit or clever as a whodunit normally demands. Rather, it employs lazy, pedestrian devices that irritate as much as entertain. Such as one of the characters (the real murdered character) having an identical twin who substitutes herself for the dead one; such as trapping attendees on a remote island without servants or transportation but largely ignoring their suggested captivity; such as uncovering an orgy of evidence better suited to misdirection and framing of an innocent; such as mixing faux violence with real violence, though none of the characters appears even modestly afraid at any point; such as bullets being fortuitously stopped by items in a breast pocket; such as sleuthing and detecting — done by the twin, not Blanc! — being presented in a montage of coinkidinks that demonstrate more luck than skill. I could go on. The worst cinematic trick is reprising scenes in flashback but altered to insert clues viewers would have noticed initially. Those aren’t reveals; they’re revisions. Moreover, instead of inviting viewers to rewatch, this gimmick jams supposedly unnoticed clues down their throats. How insulting. If Benoit Blanc is really an overconfident, dandified nincompoop, I suppose it’s better and more convenient (for bad storytelling) to be lucky than good. He doesn’t solve anything; he’s just there to monologue incessantly.

The weekend party is hosted by a character patterned after … oh never mind, you know who. I decline to provide the name of that real-life narcissist. Members of the entourage are mostly sycophants, originally good friends but later ruined in different ways by proximity to a hyper-successful fraud. As a group, they’re known as The Shitheads, which just about sums it up. Critics have observed a shift in entertainment toward depicting super-wealthy pretty people as heels of the highest order. Not sure what makes that entertaining exactly. I enjoy no Schadenfreude witnessing the high and mighty brought low, much as they may deserve it. It’s just another lazy cliché (like its inverse: the dignity of the downtrodden everyman a/k/a the noble savage) trotted out in the absence of better ideas.

/rant on

New Year’s Day (or just prior) is the annual cue for fools full of loose talk to provide unasked their year-in-review and “best of” articles summarizing the previous calendar year. I don’t go in for such clichéd forms of curation but certainly recognize an appetite among Web denizens for predigested content that tells them where to park their attention and what or how to think rather than thinking for themselves. Considering how mis- and under-educated the public has grown to be since the steady slippage and destruction of educational standards and curricula began in the 1970s (says me), I suppose that appetite might be better characterized as need in much the same way children need guidance and rules enforced by wizened authorities beginning with parents yet never truly ending, only shifting over to various institutions that inform and restrain society as a whole. I continue to be flabbergasted by the failure of parents (and teachers) to curb the awful effects of electronic media. I also find it impossible not to characterize social media and other hyperstimuli as gateways into the minds of impressionable youth (and permanent adult children) much as certain drugs (e.g., nicotine, alcohol, and cannabis) are characterized as gateways to even worse drugs. No doubt everyone must work out a relationship with these unavoidable, ubiquitous influences, but that’s not equivalent to throwing wide open the gate for everything imaginable to parade right in, as many do.

Hard to assess whether foundations below American institutions (to limit my focus) were allowed to decay through neglect and inattention or were actively undermined. Either way, their corruption and now inter-generational inability to function effectively put everyone in a wildly precarious position. The know-how, ambition, and moral focus needed to do anything other than game sclerotic systems for personal profit and acquisition of power are eroding so quickly that operations requiring widespread subscription by the public (such as English literacy) or taking more than the push of a button or click of a mouse to initiate preprogrammed commands are entering failure mode. As in the accidental horror film Idiocracy, the point will come when too few possess the knowledge and skills anymore to get things done but can only indulge in crass spectacle with their undeveloped minds. Because this is a date-related blog post, I point out that Idiocracy depicts results of cultural decay 500 years hence. It won’t take nearly that long. Just one miserable example is the fascist, censorious mood — a style of curation — that has swept through government agencies and Silicon Valley offices intent on installing unchallenged orthodoxies, or for that matter, news junkies and social media platform users content to accept coerced thinking. Religions of old played that gambit, but no need to wait for a new Inquisition to arise. Heretics are already persecuted via cancel culture, which includes excommunication (social expulsion), suspension and/or cancellation of media accounts, and confiscation of bank deposits.

A similar point can be made about the climate emergency. Fools point to weather rather than climate to dispel urgency. Reports extrapolating trends often focus on the year 2100, well after almost all of us now alive will have departed this Earth, as a bogus target date for eventualities like disappearance of sea and glacial ice, sea level rise, unrecoverable greenhouse gas concentrations in the atmosphere, pH imbalance in the oceans, and other runaway, self-reinforcing consequences of roughly 300 years of industrial activity that succeeded unwittingly in terraforming the planet, along the way making it fundamentally uninhabitable for most species. The masses labor in 2023 under the false impression that everyone is safely distanced from those outcomes or indeed any of the consequences of institutional failure that don’t take geological time to manifest fully. Such notions are like assurances offered to children who seek to understand their own mortality: no need to worry about that now, that’s a long, long way off. Besides, right now there are hangovers to nurse, gifts to return for cash, snow to shovel, and Super Bowl parties to plan. Those are right now or at least imminent. Sorry to say, so is the full-on collapse of institutions that sustain and protect everyone. The past three years have already demonstrated just how precarious modern living arrangements are, yet most mental models can’t or won’t contemplate the wholesale disappearance of this way of life, and if one has learned of others pointing to this understanding, well, no need to worry about that just yet, that’s a long, long way off. However, the slide down the opposite side of all those energy, population, and wealth curves won’t take nearly as long as it took to climb up them.

/rant off

/rant on

The previous time I was prompted to blog under this title was regarding the deplorable state of public education in the U.S., handily summarized at Gin and Tacos (formerly on my blogroll). The blogger there is admirable in many respects, but he has turned his attention away from blogging toward podcasting and professional writing with the ambition of becoming a political pundit. (I have disclaimed any desire on my part to be a pundit. Gawd … kill me first.) I check in at Gin and Tacos rarely anymore, politics not really being my focus. However, going back to reread the linked blog post, his excoriation of U.S. public education holds up. Systemic rot has since graduated into institutions of higher learning. Their mission statements, crafted in fine, unvarying academese, may exhibit unchanged idealism but the open secret is that the academy has become a network of brainwashing centers for vulnerable young adults. See this blog post on that subject. What prompts this new reality check is the ongoing buildup of truly awful news, but especially James Howard Kunstler’s recent blog post “The Four Fuckeries” over at Clusterfuck Nation, published somewhat in advance of his annual year-end-summary-and-predictions post. Kunstler pulls no punches, delivering assessments of activities in the public interest that have gone so abysmally wrong it beggars the imagination. I won’t summarize; go read for yourself.

At some point, I realized when linking to my own past blog posts that perhaps too many include the word wrong in the title. By that, I don’t mean merely incorrect or bad or unfortunate but rather purpose-built for comprehensive damage that mere incompetence could not accomplish or explain. Some may believe the severity of damage is the simple product of lies compounding lies, coverups compounding coverups, and crimes compounding crimes. That may well be true in part. But there is far too much evidence of Manichean manipulation and heedless damn-the-torpedoes-full-steam-ahead garbage decision-making to wave off widespread institutional corruption as mere conspiracy. Thus, Kunstler’s choice of the term fuckeries. Having already reviewed the unmitigated disaster of public education, let me instead turn to other examples.


This blog has never been obliged to observe every passing holiday or comment on celebrity deaths or public events via press release, public statement, command performance, ritual oversharing, or other characterization more closely associated with institutions and public figures who cannot keep from thrusting themselves wantonly onto the public despite having nothing of value to say. The chattering class maintains noise levels handily, so no need to add my voice to that cacophonous chorus. To wit, the recent Thanksgiving holiday prompts each of us every year to take stock anew and identify some area(s) of contentedness and gratitude, which can be challenging considering many Americans feel abandoned, betrayed, or worse as human history and civilization lurch despotically toward their end states. However, one overheard statement of gratitude this year made a strong impression on me, and as is my wont, I couldn’t help but connect a variety of disparate ideas. Let me digress, starting with music.

Decades ago, the London Philharmonic under Jorge Mester recorded a collection of fanfares commissioned during WWII. American composers represented include (in no particular order) Henry Cowell, Howard Hanson, Roy Harris, Morton Gould, Leonard Bernstein, Virgil Thomson, and Walter Piston. None of their respective fanfares has entered the standard repertoire. However, the sole composer whose stirring fanfare has become legitimate and instantly recognizable Americana is Aaron Copland. His fanfare celebrates no famous figure or fighting force but rather the common man. Copland’s choice to valorize the common man was a masterstroke, and the music possesses appealing directness and simplicity that are unexpectedly difficult to achieve. Far more, um, common is elaborate, noisy, surface detail that fails to please the ear nearly so well as Copland’s stately fanfare. Indeed, the album is called Twenty Fanfares for the Common Man even though that title only applies to Copland’s entry.

The holiday comment that stuck with me was a son’s gratitude for the enduring example set by his father, a common man. Whether one is disposed to embrace or repudiate the patriarchy, there can be no doubt that a father’s role within a family and community is unique. (So, too, is the mother’s. Relax, it’s not a competition; both are important and necessary.) The father-protector and father-knows-best phase of early childhood is echoed in the humorous observation that a dog sees its master as a god. Sadly, the my-dad-can-beat-up-your-dad taunt lives on, transmuted into … superhero flicks. As most of us enter adulthood, coming to terms with the shortcomings of one or both parents (nobody’s perfect …) is part of the maturation process: establishing one’s own life and identity independent of yet somehow continuous with those of one’s parents. So it’s not unusual to find young men in particular striking out on their own, distancing from and disapproving of their fathers (sometimes sharply) but later circling back to reflect and reconcile. How many of us can honestly express unalloyed admiration for our fathers and their character examples? I suspect frustration when feet of clay are revealed is more typical.


Didn’t expect to come back to this one. Five years after having blogged on this topic, I was delighted to see Graham Hancock get full Netflix documentary treatment under the title Ancient Apocalypse. No doubt streaming video is shaped in both tone and content to fit modern audiences, not modern readers. We are no longer people of the book but instead people of the screen. (An even earlier mode, displaced by the onset of the Gutenberg Era, was the oral tradition, but that was a different blog.) As a result, the eight episodes come across as tabloid-style potboilers, which regrettably undermines Hancock’s authority. Having read two of Hancock’s books exploring the subject, I was already familiar with many of the ancient sites discussed and depicted, though some reports are updated from his books. The main thesis is that archeological structures and cultural origin stories all around the world point to a major human civilization now lost but being gradually rediscovered. The phase of destruction is unaccountably saved until episode eight, namely, a roughly twelve-hundred-year period known as the Younger Dryas marked by repeated, severe climatic events, most notably the Great Flood that raised sea level by more than 400 ft. Suspected causes of these events range from the breaking of ice dams and subsequent breakup of the continental ice sheets to multiple meteor impacts to a coronal mass ejection. Could be more than one.

Several YouTube reviews have already weighed in on strengths and weaknesses of the documentary. Learning that others have been completely absorbed by Hancock’s books is a little like discovering a lost sibling. Intellectual brethren focused on decidedly arcane subject matter is quite different from mass market fandom (or as I once heard someone joke, “You like pizza? I like pizza! BFF!”). Of course, beyond enthusiasts and aficionados are scoffers, the latter of whom come under specific attack by Hancock for refusing to examine new evidence, instead adhering blindly to established, status quo, academic consensus. Although some would argue the principal takeaway of Ancient Apocalypse is filling in gaps in the story of human development (a cosmology or better origin story), my assessment, perhaps a result of prior familiarity with Hancock’s work, is that officialdom as instantiated in various institutions is an abject and unremitting failure. The Catholic Church’s persecution of numerous proto-scientists as heretics during the Middle Ages, or similarly, what has recently become known derisively as “YouTube science” (where heterodox discussion is summarily demonetized in a pointless attempt to shut down dissent) should be concerning to anyone who supports the scientific method or wants to think for themselves. Whether refusals to even consider alternatives to cherished beliefs are a result of human frailty, power struggles, careerism, or sheer stupidity someone else can decide. Could be more than one.

A couple wild suggestions came up in the reviews I caught. For instance, lost knowledge of how to work stone into megaliths used to construct giant monuments is said to be related to either activating resonance in the stone or indeed a completely different form of energy from anything now known. A similar suggestion was made about how the World Trade Center and other nearby structures were demolished when 9/11 occurred. Specifically, purported “directed free-energy technology” was deployed to weaken the molecular coherence of solid metal and concrete to collapse the buildings. (Video demonstrations of iron bars/beams being bent are available on YouTube.) For megaliths, the suggestion is that they are temporarily made into a softer, lighter (?) marshmallow-like substance to be positioned, reformed, and rehardened in situ. Indeed, material phase changes under extremes of pressure and temperature are both obvious and ubiquitous. However, to novices and the scientifically illiterate, this is the stuff of magic and alchemy or straight-up conspiracy (if one prefers). I’m largely agnostic when it comes to such assertions about megalithic structures but those theories are at least as tantalizing as evidence of existence of a lost civilization — especially when officialdom instructs everyone not to look there, or if one does anyway, not to believe one’s lying eyes.

As observed in my earlier blog on this subject, the possibility, nay inevitability, of destruction of our present civilization, whether from forces external or internal, would make putting aside petty squabbles and getting going on preparations (i.e., prepping for human survival) paramount. Good luck getting humanity all together on that project. Are there secret underground bunkers into which the financial and political elite can flee at the propitious moment, abandoning the masses to their fate? Again, conspiratorial types say yes, both now and in the ancient past. Good luck to any survivors, I guess, in the hellscape that awaits. I don’t want to be around after the first major catastrophe.

According to some estimates, historical trends bring us to 8 Billion Day (human population) today (November 15, 2022), despite a slowing birthrate. Took only 11 years to add the next billion from 7 Billion Day and only 4 years to add the half billion from 7.5 Billion Day. That doesn’t look to me like deceleration; perhaps the last 3 years of the Covid pandemic are the hinge of the trend reversal. Previous milestones are 1 billion in 1804, 2 billion in 1930, 3 billion in 1960, 4 billion in 1974, 5 billion in 1987, and 6 billion in 1998. Projections are 9 billion in 2037 and 10 billion in 2058. Whereas past numbers are fixed, the future is IMO quite unlikely to produce those numbers on schedule if at all. Factors are many and unpredictable, such as the rise in excess deaths a/k/a all-cause mortality already being reported (but quietly lest panic ensue).

Various economists, demographers, and business leaders bemoan that many countries have already fallen below replacement rate, which portends a dramatic reduction in skilled, experienced labor as members of the Baby Boom retire and die off. Worse than that, however, is the recognition that in growth economies (now ubiquitous across the globe), the only way forward is to have a growing population, young people at the bottom supporting old people at the top. It’s a perfect Ponzi setup, replicated many times over in various institutions and destined to fail spectacularly as more women (in particular) are educated and opt out of motherhood entirely in favor of careers. Given that the Covid era has proven to be a baby bust, one can only wonder whether birth rates will spike as fears subside (which produced the Baby Boom after WWII) or population decline will be a permanent feature of society. I offer no predictions. Further, with myriad variables competing for primacy among doomers who forecast dire consequences of human behavior accumulated over several centuries, I admit being at a loss to know what to hope for. More people (and thus, more subsequent suffering) or fewer?