Archive for the ‘Culture’ Category

I have just one previous blog post referencing Daniel Siegel’s book Mind, in which I threatened to put the book aside owing to how badly it’s written. I haven’t yet turned in my library copy and have made only modest additional progress reading the book. However, Siegel came up over at How to Save the World, where at least one commentator was quite enthusiastic about Siegel’s work. In my comment there, I mentioned the book only to suggest that Siegel’s appreciation of the relational nature of the mind (and cognition) reinforces my long-held intuition that the self doesn’t exist in an idealized vacuum, capable of being modeled and eventually downloaded to a computer or some other Transhumanist nonsense, but is instead situated as much between us as within us. So despite Siegel’s clumsy writing, this worthwhile concept deserves support.

Siegel goes on to wonder (without saying he believes it to be true — a disingenuous gambit) whether perhaps there exists an information field, not unlike the magnetic field or portions of the light spectrum, that affects us yet falls outside the scope of our direct perception or awareness. Credulous readers might leap to the conclusion that the storied collective consciousness is real. Some fairly trippy theories of consciousness propose that the mind is actually more like an antenna receiving signals from some noncorporeal realm (e.g., a quantum dimension) we cannot identify yet tap into constantly, measuring against and aligning with the wider milieu in which we function. Even without expertise in zoology, one must admit that humans are social creatures operating at various levels of hierarchy, including individual, family, clan, pack, tribe, nation-state, etc. We’re less like mindless drones in a hive (well, some of us) and more like voluntary and involuntary members of gangs or communities formed along various familial, ethnic, regional, national, language-group, and ideological lines. Unlike Siegel, I’m perfectly content with existing terminology and feel no compulsion to coin new lingo or adopt unwieldy acronyms to mark my territory.

What Siegel hasn’t offered is an observation on how our reliance on and indebtedness to the public sphere (via socialization) have changed with time as our mode of social organization has morphed from a predominantly localized, agrarian existence prior to the 20th century to a networked, high-density, information-saturated urban and suburban existence in the 21st century. The public sphere was always out there, of course, especially as embodied in books, periodicals, pamphlets, and broadsides (if one was literate and had reliable access to them), but the unparalleled access we now enjoy through various electronic devices has not only reoriented but disoriented us. Formerly slow, isolated information flow has become a veritable torrent or deluge. It’s not called the Information Age fer nuthin’. Furthermore, the bar to publication  — or insertion into the public sphere — has been lowered to practical nonexistence as the democratization of production has placed the tools of widely distributed exposure into the hands of everyone with a blog (like mine) or Facebook/Instagram/Twitter/Pinterest/LinkedIn account. As a result, a deep erosion of authority has occurred, since any yahoo can promulgate the most reckless, uninformed (and disinformed) opinions. The public’s attention riveted on celebrity gossip and House of Cards-style political wrangling, false narratives, fake news, alternative facts, and disinformation also make navigating the public sphere with much integrity impossible for most. For instance, the MSN and alternative media alike are busy selling a bizarre pageant of Russian collusion and interference with recent U.S. elections as though the U.S. were somehow innocent of even worse meddling abroad. Moreover, it’s naïve to think that the public sphere in the U.S. isn’t already completely contaminated from within by hucksters, corporations (including news media), and government entities with agendas ranging from mere profit seeking to nefarious deployment and consolidation of state power. For example, the oil and tobacco industries and the Bush Administration all succeeded in suppressing truth and selling rank lies that have landed us in various morasses from which there appears to be no escape.

If one recognizes one’s vulnerability to the depredations of info scammers of all types and wishes to protect oneself, there are two competing strategies: insulation and inoculation. Insulation means avoiding exposure, typically by virtue of mind-cleansing behaviors, whereas inoculation means seeking exposure in small, harmless doses so that one can withstand a larger infectious attack. It’s a medical metaphor that springs from meme theory, where ideas propagate like viruses, hence the notion of a meme “going viral.” Neither approach is foolproof. Insulation means plugging one’s ears or burying one’s head in the sand at some level. Inoculation risks spreading the infection. If one regards education as an inoculation of sorts, seeking more information of the right types from authoritative sources should provide the means to combat the noise in the information signals received. However, as much as I love the idea of an educated, informed public, I’ve never regarded education as a panacea. It’s probably a precondition for sound thinking, but higher education in particular has sent an entire generation scrambling down the path of identity politics, which sounds like a good idea but leads inevitably to corruption via abstraction. That’s all wishful thinking, though; the public sphere we actually witness has gone haywire, a condition of late modernism and late-stage capitalism that has no known antidote. Enjoy the ride!

For a variety of reasons, I go to see movies in the theater only a handful of times in any given year. The reasons are unimportant (and obvious), and I recognize that, by eschewing the theater, I’m giving up the crowd experience. Still, I relented recently and went to see a movie at a new AMC Dolby Cinema, which I didn’t even know existed. The first thing to appreciate was that it was a pretty big room, which used to be standard when cinema was first getting established in the 1920s but gave way sometime in the 1970s to multiplex theaters able to show more than one title at a time in little shoebox compartments with limited seating. Spaciousness was a welcome throwback. The theater also had oversized, powered, leather recliners rather than cloth, fold-down seats with shared armrests. The recliners were quite comfortable but also quite unnecessary (except for now-typical Americans unable to fit their fat asses in what used to be a standard seat). These characteristics are shared with AMC Prime theaters, which dress up the movie-going experience and charge accordingly. Indeed, AMC now offers several types of premium cinema, including RealD 3D, Imax, Dine-In, and BigD.

Aside I: A friend only just reported on her recent trip to the drive-in theater, a dated cinema experience that is somewhat degraded (or rather, unenhanced) yet retains its nostalgic charm for those of us old enough to remember as kids the shabby chic of bringing one’s own pillows, blankets, popcorn, and drinks to a double feature and sprawling out on the hood and/or roof of the car (e.g., the family station wagon). My friend actually brought her dog to the drive-in and said she remembered and sorta missed the last call on dollar hot dogs at 11 PM that used to find all the kids madly, gleefully rushing the concession stand before food ran out.

What really surprised me, however, was how the Dolby Cinema experience turned into a visual, auditory, and kinesthetic assault. True, I was watching Wonder Woman (sorry, no review), which is set in WWI and features lots of gunfire and munitions explosions in addition to the usual invincible-superhero punchfest, so I suppose the point is partly to be immersed in the environment, a cinematic stab at verisimilitude. But the immediacy of all the wham-bam, rock ’em-sock ’em action made me feel more like a participant in a theater of war than a viewer. The term shell shock (a/k/a battle fatigue a/k/a combat neurosis) refers to the traumatized disorientation one experiences in moments of high stress and overwhelming sensory input; it applies here. Even the promo before the trailers and feature, offered to demonstrate the theater’s capabilities, was off-putting because of its unnecessary and overweening volume and impact. Unless I’m mistaken, the seats even have built-in subwoofers to rattle theatergoers from below when loud, concussive events occur, which is often, because, well, filmmakers love their spectacle as much as audiences do.

Aside II: One real-life lesson to be gleaned from WWI, or the Great War as it was called before WWII, went well beyond the simplistic truism that war is hell. It was that civility (read: civilization) had failed and human progress was a chimera. Technical progress, however, had made WWI uglier in many respects than previous warfare. It was an entirely new sort of horror. Fun fact: there are numerous districts in France, known collectively as La Zone Rouge, where no one is allowed to live because of all the unexploded ordnance (100 years later!). Wonder Woman ends up having it both ways: acknowledging the horrific nature of war on the one hand yet valorizing and romanticizing personal sacrifice and eventual victory on the other. Worse, perhaps, it establishes that there’s always another enemy in the wings (otherwise, how could there be sequels?), so keep fighting. And for the average viewer, uniformed German antagonists are easily mistakable for the Nazis of the subsequent world war, a historical gloss I’m guessing no one minds … because … Nazis.

So here’s my problem with AMC’s Dolby Cinema: why settle for a routine or standard theater experience when it can be amped up to the point of offense? Similarly, why be content with the tame and fleeting though reliable beauty of a sunset when one can enjoy a widescreen, hyperreal view of cinematic worlds that don’t actually exist? Why settle for the subtle, old-timey charm of the carousel (painted horses, dizzying twirling, and calliope music) when instead one can strap in and get knocked sideways by roller coasters so extreme that riders leave wobbly and crying at the end? (Never mind the risk of being stranded on the tracks for hours, injured, or even killed by a malfunction.) Or why bother attending a quaint symphonic band concert in the park or an orchestral performance in the concert hall when instead one can go to Lollapalooza and see/hear/experience six bands in the same cacophonous space grinding it out at ear-splitting volume, along with laser light shows and flash-pot explosions for the sheer sake of goosing one’s senses? Coming soon are VR goggles that trick the wearer’s nervous system into accepting that he or she is actually in the virtual game space, often a first-person shooter in which one kills bugs or aliens or criminals without compunction. Our arts and entertainments have truly gotten out of hand.

If those criticisms don’t register, consider my post from more than a decade ago on the Paradox of the Sybarite and Catatonic, which argues that our senses are so overwhelmed by modern life that we’re essentially numb from overstimulation. Similarly, let me reuse this Nietzsche quote (used before here) to suggest that, on an aesthetic level, we’re not being served well in the display and execution of refined taste so much as being whomped over the head and dragged (willingly?) through ordeals:

… our ears have become increasingly intellectual. Thus we can now endure much greater volume, much greater ‘noise’, because we are much better trained than our forefathers were to listen for the reason in it. All our senses have in fact become somewhat dulled because we always inquire after the reason, what ‘it means’, and no longer for what ‘it is’ … our ear has become coarsened. Furthermore, the ugly side of the world, originally inimical to the senses, has been won over for music … Similarly, some painters have made the eye more intellectual, and have gone far beyond what was previously called a joy in form and colour. Here, too, that side of the world originally considered ugly has been conquered by artistic understanding. What is the consequence of this? The more the eye and ear are capable of thought, the more they reach that boundary line where they become asensual. Joy is transferred to the brain; the sense organs themselves become dull and weak. More and more, the symbolic replaces that which exists … the vast majority, which each year is becoming ever more incapable of understanding meaning, even in the sensual form of ugliness … is therefore learning to reach out with increasing pleasure for that which is intrinsically ugly and repulsive, that is, the basely sensual. [italics not in original]

When I first wrote about this topic back in July 2007, I had only just learned of the Great Pacific Garbage Patch (and similar garbage gyres in other oceans). Though I’d like to report simply that nothing has changed, the truth is that conditions have worsened. Some commentators have rationalized (or rather, contextualized) the issue by observing that the Earth, the environment, the ecosphere, the biosphere, Gaia, or whatever one wishes to call the natural world has always been under assault by humans, that we’ve never truly lived in balance with nature. While that perspective may be true in a literal sense, I can’t help gnashing my teeth over the sheer scale of the assault in the modern industrial age (extending back 250+ years but really getting going once the steam engine was utilized widely). At that point, production and population curves angled steeply upwards, where they continue to point as though there were no biophysical limits to growth or to the amount and degree of destruction that can be absorbed by the biosphere. Thus, at some undetermined point, industrial scale became planetary scale and humans became terraformers.

News reports came in earlier this month that the remote and uninhabited (by humans) Henderson Island in the Pacific is now an inadvertent garbage dump, with estimates of over 17 tons of debris littering its once-pristine shores.


This despoliation is a collateral effect of human activity, not the predictable result of direct action, such as with the Alberta Tar Sands, another ecological disaster (among many, many others). In the U.S., the Environmental Protection Agency (EPA) describes its mission as protecting human health and the environment and has established a Superfund to clean up contaminated sites. Think of this as a corporate subsidy, since the principal contaminators typically inflict damage in the course of doing business and extracting profit, then either move on or cease to exist. Standard Oil is one such notorious entity. Now that the EPA is in the process of being defunded (and presumably on its way to being deauthorized) by the current administration of maniacs, the ongoing death-by-a-thousand-cuts suffered by the natural world will likely need to be revised to death-by-millions-of-cuts, a heedless acceleration of the death sentence humans have set in motion. In the meantime, industry is being given a freer hand to pollute and destroy. What could possibly go wrong?

If all this weren’t enough, another development darkened my brow recently: the horrific amount of space debris from decades of missions to put men, communications and surveillance satellites, and (one would presume) weapons in orbit. (Maybe the evil brainchild of inveterate cold warriors known unironically as “Star Wars” never actually came into being, but I wouldn’t place any bets on that.) This video from the Discovery Network gives one pause, no?

Admittedly, the dots are not actual size and so would not be as dense or even visible from the point of view of the visualization, but the number of items (20,000+ pieces) is pretty astonishing. (See this link as well.) This report describes some exotic technologies being bandied about to address the problem of space junk. Of course, that’s just so that more satellites and spacecraft can be launched into orbit as private industry takes on the mantle once enjoyed exclusively by NASA and the Soviet space program. I suppose the explorer’s mindset never diminishes, even as the most remote places on and now around Earth are no longer untouched but littered with human refuse.

An old Star Trek episode called “A Taste of Armageddon” depicts Capt. Kirk and crew confronting a planetary culture that has adopted purely administrative warfare with a nearby planet, where computer simulations determine the outcomes of battles and citizens/inhabitants are notified to report for their destruction in disintegration chambers to comply with those outcomes. Narrative resolution is tidied up within the roughly 1-hour span of the episode, of course, but it was and is nonetheless a thought-provoking scenario. The episode, now 50 years old, prophesies a hyper-rational approach to conflict. (I was 4 years old at the time it aired on broadcast television, and I don’t recall having seen it since. Goes to show how influential high-concept storytelling can be even on someone quite young.) The episode came to mind as I happened across a video showing how robot soldiers are being developed to supplement and eventually replace human combatants. See, for example, this:

The robot in the video above is not overtly militarized, but there is no doubt that it could be. Why the robot takes bipedal, humanoid form with an awkwardly high center of gravity is unclear to me beyond our obvious self-infatuation. Additional videos of two-wheeled, quadruped, and even insect-like multilegged designs having much improved movement and flexibility can be found with a simple search. Any of them can be transformed into ground-based killing machines, as suggested more manifestly in the video below highlighting various walking, rolling, flying, floating, and swimming machines developed to do our dirty work:


I picked up a copy of Daniel Siegel’s book Mind: A Journey to the Heart of Being Human (2017) to read and supplement my ongoing preoccupation with human consciousness. Siegel’s writing is the source of considerable frustration. Now about 90 pp. into the book (and considering putting it aside), I’ve watched him commit several grammatical errors (where are book editors these days?), demonstrate that he doesn’t really know how to use a comma properly, and fail to write in recognizable paragraph form. He has a bad habit of posing questions to suggest the answers he wants to give and drops constant hints of something soon to be explored, like news broadcasts that tease the next segment. He also deploys a tired, worn metaphor that readers are on a journey of discovery with him, embarked on a path, exploring a subject, etc. Yecch. (A couple Amazon reviews also note that grayish type on parchment (cream) paper poses a legibility problem due to poor contrast even in good light — undoubtedly not really Siegel’s fault.)

Siegel’s writing is also irritatingly circular, casting and recasting the same sentences in repetitious series of assertions that have me wondering frequently, “Haven’t I already read this?” Here are a couple examples:

When energy flows inside your body, can you sense its movement, how it changes moment by moment?

then only three sentences later

Energy, and energy-as-information, can be felt in your mental experience as it emerges moment by moment. [p. 52]

Another example:

Seeing these many facets of mind as emergent properties of energy and information flow helps link the inner and inter aspect of mind seamlessly.

then later in the same paragraph

In other words, mind seen this way could be in what seems like two places at once as inner and inter are part of one interconnected, undivided system. [p. 53]

This is definitely a bug, not a feature. I suspect the book could easily be condensed from 330 pp. to fewer than 200 pp. if the writing weren’t so self-indulgent. Indeed, while I recognize that a healthy dose of repetition is an integral part of narrative form (especially in music), Siegel’s relentless repetition feels like propaganda 101, where guileless insistence (on lies or merely on the preferred story one seeks to plant in the public sphere) wears down the reader rather than convinces him or her. This is also marketing 101 (e.g., Coca-Cola, McDonald’s, Budweiser, etc. continuing to advertise what are by now exceedingly well-established brands).


What’s Missing

Posted: May 9, 2017 in Artistry, Cinema, Consumerism, Culture

Pessimists, misanthropes, fatalists, and doomers (I’m all four) often find themselves defending the position that maybe we don’t live at the very best, super-duper, tippy-top time in history, that although technology in particular has admittedly delivered some amazing innovations and improved our material conditions considerably even over our fairly recent past, we nonetheless lack spirituality and meaning — unless of course one’s spirituality and meaning are mistakenly found in technology. The technological sublime is so deceptively glamorous and ubiquitous that it obscures the idea that maybe another time and place made for a more ethical, moral, and noble life. (Experience varied widely, obviously.) In addition, the personal freedoms enjoyed in liberal democracies are difficult to argue against, though they’re more characteristic of those at the very top of the socioeconomic heap than of those of us below, who know to keep things buttoned up lest we discomfit our betters. Perhaps the most broadly enjoyed modern development is increased lifespan born of improved healthcare. And quite recently, the grossly expanded communications age (the information age is nested inside) reputedly makes learning and keeping in touch far easier than ever before. I won’t dispute any of these. However, I have two complaints against modernity, as evidenced by a body of posts extending back through the life of this blog: life lacks many salutary aspects delivered passively by bygone social structures, and technology smuggles in a host of problems with its bounty.

Among the failed attributes are (1) lack of a cohesive narrative for what life ought to mean beyond grubbing for money, chasing fame and social cachet, and overpopulating the planet with progeny (many of whom suffer neglect while parents are away grubbing for money), (2) lack of true community and social conditions necessary to be properly situated within a wholesome context, (3) spiritual and emotional vacuity, (4) destructive social presences and bullying, especially online (as exemplified by the Bully-in-Chief) and by civil authorities, (5) unrelenting, disorienting technological and social change, including slang and memes that are impossible to track fully without being in the thrall of celebrities, pundits, and media in general, (6) little prospect of things improving near term, and (7) the still-dawning realization that, like the existential angst from the so-called Atomic Age and its threat of complete nuclear annihilation, we possess tools sufficient to destroy ourselves many times over (including rotting ourselves out from the inside in a fevered race to the cultural bottom) and have unwittingly fired the slo-mo suicide gun. We’re only just waiting for the bullet to strike its target. Since the gun is industrial civilization, the target is all of us. Dead men walking, or perhaps more accurately, the zombie shuffle.

A significant minority has replied to this set of miserable circumstances by voting into office our current president, 45. According to one astute commentator, 45 was never understood as a solution but rather as a murder weapon used by the disenfranchised to kill the host, namely, our sick society run by plutocrats maniacally hellbent on destroying everyone outside their immediate concern, which is just about everybody.


I pull in my share of information about current events and geopolitics despite a practiced inattention to mainstream media and its noisome nonsense. (See here for another who turned off the MSM.) I read or heard somewhere (can’t remember where) that most news outlets, and indeed most other media, now function as outrage engines to drive traffic, generating no small amount of righteousness, indignation, anger, and frustration at all the things so egregiously wrong in our neighborhoods, communities, regions, and across the world. These are all negative emotions, though legitimate responses to the various scourges plaguing us currently, many of which are self-inflicted. It’s enough aggregate awfulness to draw people into the street again in principled protest, dissent, and resistance; it’s not yet enough to effect change. Alan Jacobs comments about outrage engines, noting that sharing via retweets is not the same as caring. In the Age of Irony, a decontextualized “yo, check this out!” is nearly as likely to be interpreted as support as condemnation (or mere gawking for entertainment value). Moreover, pointing, linking, and retweeting are each costless versions of virtue signaling. True virtue makes no object of publicity.

So where do I get my outrage quotient satisfied? Here is a modest linkfest, in no particular order, of sites not already on my blogroll. I don’t frequent these sites daily, but I drop in, often just skimming, enough to keep abreast of themes and events of importance.

The Internet is now a little more than two decades old (far more, actually, but I’m thinking of its widespread adoption). Of late, it’s abundantly clear that, in addition to being a wholesale change in the way we disseminate and gather information and conduct business, we’re running live social experiments bearing psychological influence, some subtle, some invasive, much like the introduction of other media such as radio, cinema, and TV back in the day. About six years ago, psychologists coined the term digital crowding, which I just discovered; it refers to an oppressive sense of knowing too much about people, which in turn provokes antisocial reactions. In effect, it’s part of the Dark Side of social media (trolling and comments sections being other examples), one of numerous live social experiments.

I’ve given voice to this oppressive knowing-too-much on occasion by wondering why, for instance, I know anything — largely against my will, mind you — about the Kardashians and Jenners. This is not the sole domain of celebrities and reality-TV folks but indeed of anyone who tends to overshare online, typically via social media such as Facebook, less typically in the celebrity news media. Think of digital crowding as the equivalent of seeing something you would really prefer not to have seen, something no amount of figurative eye bleach can erase, something that now simply resides in your mind forever. It’s the bell that can’t be unrung. The crowding aspect is that now everyone’s dirty laundry is being aired simultaneously, creating pushback and defensive postures.

One might recognize in this the familiar complaint of Too Much Information (TMI), except that the information in question is not the discomfiting stuff such as personal hygiene or sexual behaviors. Rather, it’s an unexpected over-awareness of everyone’s daily minutiae as news of it presses for attention and penetrates our defenses. Add it to the deluge that is causing some of us to adopt information avoidance.

Even before I begin, you must know what the title means. It’s the proliferation of options that induces dread in the toothpaste aisle of the store. Paste or gel? Tartar control or extra whitening? Plain, mint, cinnamon, or bubble gum? The matrix of combinations is enough to reduce the typical shopper to a quivering state of high anxiety lest the wrong toothpaste be bought. Oh, how I long for the days when choices ran solely between plain Crest and Colgate. I can’t say whether the toothpaste effect originated with oral hygiene. A similarly bewildering host of choices confronts shoppers in the soft drink aisle. Foodstuffs seem especially prone to brand fragmentation. Woe betide the retailer forced to shelve all 38 Heinz products on this page. (True, some are just different packaging of the same basic item, but still.)
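
(A quick bit of arithmetic, assuming the three questions above combine freely: 2 forms times 2 treatments times 4 flavors already yields 2 × 2 × 4 = 16 distinct tubes, and that’s before brands and sizes multiply the matrix further.)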

Purveyors of alcoholic beverages are on the bandwagon, too. I rather like the bygone cliché of the cowboy/gunslinger riding off the range, swinging into the saloon, and ordering simply “whisky.” Nowadays, even a poorly stocked bar is certain to have a dozen or so whiskies (see this big brand list, which doesn’t include sub-brands or craft distillers). Then come all the varieties of schnapps, rum, and vodka, each brand further fragmented with infusions and flavorings of every imaginable type. Some truly weird ones are found here. Who knew that these spirits were simply blank canvases awaiting the master distiller’s crazy inventiveness?

/rant on

What really gets my bile flowing on this issue, however, is the venerable Lay’s potato chip. Seriously, Frito-Lay, what are you thinking? You arguably perfected the potato chip, much like McDonald’s perfected the French fry. (Both are fried potatoes, interestingly.) Further, you have a timeless, unbeatable slogan: “betcha can’t eat just one.” The plain, salted chip, the “Classic” of the Lay’s brand, cannot be improved upon and is a staple comfort food. Yet you have succumbed to the toothpaste effect and gone haywire with flavorings (I won’t even countenance the Wavy, Poppables, Kettle-Cooked, Ruffles, and STAX varieties). For variety’s sake, I’d be content with a barbecue chip, maybe even salt & vinegar, but you’ve piled on past the point of ridiculousness:

  • cheddar & sour cream (a favorite of mine)
  • Chile limón
  • deli style
  • dill pickle
  • flamin’ hot
  • honey barbecue
  • limón
  • pico de gallo
  • salt & vinegar (not to my taste)
  • sour cream & onion (a good alternative)
  • sweet Southern heat barbecue
  • Southern biscuits & gravy
  • Tapatío (salsa picante)


I finished Graham Hancock’s Fingerprints of the Gods (1995). He saved the best part of the book, an examination of Egyptian megalithic sites, for the final chapters and held back his final conclusion — more conjecture, really — for the tail end. The possible explanation Hancock offers for the destruction and/or disappearance of a supposed civilization long predating the Egyptian dynasties, the subject of the entire book, is earth-crust displacement, a theory developed by Charles Hapgood relating to polar shifts. Long story short, evidence demonstrates that the Antarctic continent used to be 2,000 miles away from the South Pole (about 30° from the pole) in a temperate zone and may have been, according to Hancock, the home of a seafaring civilization that had traveled and mapped the Earth. It’s now buried under ice. I find the explanation plausible, but I wonder how much the science and research have progressed since the publication of Fingerprints. I have not yet picked up Magicians of the Gods (2015) to read Hancock’s update but will get to it eventually.
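
(That 30° figure checks out arithmetically, by the way: with the Earth’s circumference at roughly 24,900 miles, one degree of arc along a great circle spans about 69 miles, so a 2,000-mile displacement works out to about 29°.)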

Without having studied the science, I gather that several competing scenarios exist regarding how the Earth’s crust, the lithosphere, might drift, shift, or move over the asthenosphere. First, it’s worth recognizing that the Earth’s rotational axis defines the two poles, which are near but not coincident with magnetic north and south. Axial shifts are figured in relation to the crust, not the entire planet (crust and interior). From a purely geometric perspective, I could well imagine the crust and interior rotating at different speeds, but since we lack more than theoretical knowledge of the Earth’s liquid interior (the inner core is reputedly solid), only the solid portions at the surface of the sphere offer a useful frame of reference. The liquid surfaces (oceans, seas) obviously flow, too, but are also understood primarily in relation to the solid crust both below and above sea level.

The crust could wander slowly and continuously, shift all at once, or some combination of both. If all at once, the inciting event might be a sudden change in magnetic stresses that breaks the entire lithosphere loose or perhaps a gigantic meteor hit that knocks the planet as a whole off its rotational axis. Either would be catastrophic for living things suddenly moved into a substantially different climate. Although the spacing of such events is unpredictable and irregular, occurring in geological time, Hancock assembles considerable evidence to conclude that the most recent such occurrence was probably about 12,000 BCE, at the conclusion of the last glacial maximum or ice age. This would have been well within the time humans existed on Earth but long enough ago in our prehistory that human memory systems record events only as unreliable myth and legend. They are also recorded in stone, but we have yet to decipher their messages fully other than to demonstrate that significant scientific knowledge of astronomy and engineering was once possessed by mankind but was lost until redeveloped during the last couple of centuries.