Posts Tagged ‘Book Blogging’

An ongoing conflict in sociology and anthropology exists between those who believe that human nature is competitive and brutal to the bitter end and those who believe human nature is more cooperative and sociable, sharing resources of all types to secure the greater good. This might be recognizable to some as the perennial friction between citizen and society (alternatively, individualism and collectivism). Convincing evidence from human prehistory is difficult to uncover. Accordingly, much of the argument for competition comes from evolutionary biology, where concepts such as genetic fitness and reproductive success (and by inference, reproductive failure) are believed to motivate and justify behavior across the board. As the typical argument goes, inferior genes, and males in particular who lack sexual access or otherwise fail to secure mates, don’t survive into the next generation. Attributes passed on to each subsequent generation thus favor fitter, Type A brutes who out-compete weaker (read: more cooperative) candidates in an endless self-reinforcing and narrowing cycle. The alternative offered by others points to a wider gene pool based on collaboration and sharing of resources (including mates) that enables populations to thrive together better than individuals who attempt to go it alone or dominate.

Not having undertaken a formal study of anthropology (or more broadly, primatology), I can’t say how well this issue is settled in the professional, academic literature. Online, I often see explanations that are really just-so stories based on logic. What that means is that an ideal or guiding principle is described, something that just “makes sense,” and supporting evidence is then assumed or projected. For instance, we now know many of the mechanisms that function at the cellular level with respect to reproduction and genetic evolution. Those mechanisms are typically spun up to the level of the organism through pure argumentation and presumed to manifest in individual behaviors. Any discontinuity between aggregate characteristics and particular instances is ignored. Questions are solved through ideation (i.e., thought experiments). However, a series of if-then statements that seems plausible when first encountered often turns out to be pure conjecture rather than evidence. That’s a just-so story.

One of the reasons we look into prehistory for evidence of our true nature (understood as biology, not sociology, handily sweeping aside the nature/nurture question) is that hunter-gatherers (HGs) lived at subsistence level for a far longer period of our evolutionary history than our comparatively brief time within the bounty of civilization. It’s only when surpluses and excesses provide something worth hoarding, monopolizing, and protecting that hierarchies arise and/or leveling mechanisms are relaxed. Leaving Babylon has a discussion of this here. Some few HG cultures survive into the 21st century, but for most of us, The Agricultural Revolution is the branching point when competition began to assert itself, displacing sharing and other egalitarian impulses. Accordingly, the dog-eat-dog competition and inequality characteristic of the modern world is regarded by many as an exaptation, not our underlying nature.



I’m currently reading Go Wild by John Ratey and Richard Manning. It has some rather astounding findings on offer. One I’ll draw out is that the human brain evolved not for thinking, as one might imagine, but for coordinating complex physiological movements:

… even the simplest of motions — a flick of a finger or a turn of the hand to pick up a pencil — is maddeningly complex and requires coordination and computational power beyond electronics’ abilities. For this you need a brain. One of our favorite quotes on this matter comes from the neuroscientist Rodolfo Llinás: “That which we call thinking is the evolutionary internalization of movement.” [p. 100]

Almost all of this computation is unconscious, or maybe preconscious, and it’s learned over a period of years in infancy and early childhood (for basic locomotion) and then supplemented throughout life (for skilled motions, e.g., writing cursive or typing). Moreover, those able to move with exceptional speed, endurance, power, accuracy, and/or grace are admired and sometimes rewarded in our culture. The obvious example is sports. Whether league sports with wildly overcompensated athletes, Olympic sports with undercompensated athletes, or combat sports with a mixture of both, thrill attaches to watching someone move effectively within the rule-bound context of the sport. Other examples include dancers, musicians, circus artists, and actors who specialize in physical comedy and action. Each develops specialized movements that are graceful and beautiful, which Ratey and Manning write may also account for nonsexual appreciation and fetishization of the human body, e.g., fashion models, glammed-up actors, and nude photography.

I’m being silly saying that jocks figgered it first, of course. A stronger case could probably be made for warriors in battle, such as a skilled swordsman. But it’s jocks who are frequently rewarded all out of proportion with others who specialize in movement. True, their genetics and training enable a relatively brief career (compared to, say, surgeons or pianists) before abilities ebb away and a younger athlete eclipses them. But the lack of equivalence with artisans and artists is clear: their value lies less with their bodies than with the outputs their movements produce.

Regarding computational burdens, consider the various mechanical arms built for grasping and moving objects, some of them quite large. The mechanisms themselves (frame and hydraulics substituting for bone and muscle) are quite complex, but they’re typically controlled by a human operator rather than automated. (Exceptions abound, but they’re highly specialized, such as circuit board manufacture or textile production.) More recently, robots have demonstrated considerable advancement in locomotion without a human operator, but they’re also narrowly focused in comparison with the flexibility of motion a human body readily possesses. Further, flying drones operate in wide-open space, while those designed to move like dogs or insects use four or more legs for stability; the latter are typically built to withstand quite a lot of bumping and jostling. Upright bipedal motion is still quite clumsy in comparison with humans, excepting perhaps wheeled robots that obviously don’t move like humans do.

Curiously, the movie Pacific Rim (sequel just out) takes notice of the computational or cognitive difficulty of coordinated movement. To operate the giant battle robots needed to fight Godzilla-like interdimensional monsters, two mind-linked human pilots are required. Maybe it’s a simple coincidence — a plot device to position humans in the middle of the action (and the robot) rather than killing from a distance, such as via drone or clone — or maybe not. Hollywood screenwriters are quite clever at exploiting all sorts of material without necessarily divulging the source of inspiration. It’s art imitating life, knowingly or not.

Be forewarned: this is long and self-indulgent. Kinda threw everything and the kitchen sink at it.

In the August 2017 issue of Harper’s Magazine, Walter Kirn’s “Easy Chair” column called “Apocalypse Always” revealed his brief, boyhood fascination with dystopian fiction. This genre has been around for a very long time, to which the Cassandra myth attests. Kirn’s column is more concerned with “high mid-twentieth-century dystopian fiction,” which in his view is now classic and canonical, an entire generation of Baby Boomers having been educated in such patterned thought. A new wave of dystopian fiction appeared in the 1990s and yet another more recently in the form of Young Adult novels (and films) that arguably serve better as triumphal coming-of-age stories albeit under dystopian circumstances. Kirn observes a perennial theme present in the genre: the twin disappearances of freedom and information:

In the classic dystopias, which concern themselves with the lack of freedom and not with surplus freedom run amok (the current and unforeseen predicament of many), society is superbly well organized, resembling a kind of hive or factory. People are sorted, classified, and ranked, their individuality suppressed through goon squads, potent narcotics, or breeding programs. Quite often, they wear uniforms, and express themselves, or fail to, in ritual utterance and gestures.

Whether Americans in 2018 resemble hollowed-out zombies suffering under either boot-heel or soft-serve oppression is a good question. Some would argue just that in homage to classic dystopias. Kirn suggests briefly that we might instead suffer from runaway anarchy, where too much freedom and licentiousness have led instead to a chaotic and disorganized society populated by citizens who can neither govern nor restrain themselves.

Disappearance of information might be understood in at least three familiar aspects of narrative framing: what happened to get us to this point (past as exposition, sometimes only hinted at), what the hell is going on (present as conflict and action), and how it gets fixed (future as resolution and denouement). Strict control over information exercised by classic dystopian despots doesn’t track with conditions under which we now find ourselves, where more disorganized, fraudulent, and degraded information than ever is available alongside small caches of wisdom and understanding buried somewhere in the heap and discoverable only with the benefit of critical thinking flatly lost on at least a couple generations of miseducated graduates. However, a coherent narrative of who and what we are and what realistic prospects the future may hold has not emerged since the stifling version of the 1950s nuclear family and middle-class consumer contentment. Kirn makes this comparison directly, where classic dystopian fiction

focus[es] on bureaucracy, coercion, propaganda, and depersonalization, overstates both the prowess of the hierarchs and the submissiveness of the masses, whom it still thinks of as the masses. It does not contemplate Trump-style charlatanism at the top, or a narcissistic populace that prizes attention over privacy. The threats to individualism are paramount; the scourge of surplus individualism, with everyone playing his own dunce king and slurping up resources until he bursts, goes unexplored.

Kirn’s further observations are worth a look. Go read for yourself.


Here’s the last interesting bit I am lifting from Anthony Giddens’ The Consequences of Modernity. Then I will be done with this particular book-blogging project. As part of Giddens’ discussion of the risk profile of modernity, he characterizes risk as either objective or perceived and further divides it into seven categories:

  1. globalization of risk (intensity)
  2. globalization of risk (frequency)
  3. environmental risk
  4. institutionalized risk
  5. knowledge gaps and uncertainty
  6. collective or shared risk
  7. limitations of expertise

Some overlap exists, and I will not distinguish them further. The first two are of primary significance today for obvious reasons. Although the specter of doomsday resulting from a nuclear exchange has been present since the 1950s, Giddens (writing in 1988) provides this snapshot of today’s issues:

The sheer number of serious risks in respect of socialised nature is quite daunting: radiation from major accidents at nuclear power-stations or from nuclear waste; chemical pollution of the seas sufficient to destroy the phytoplankton that renews much of the oxygen in the atmosphere; a “greenhouse effect” deriving from atmospheric pollutants which attack the ozone layer, melting part of the ice caps and flooding vast areas; the destruction of large areas of rain forest which are a basic source of renewable oxygen; and the exhaustion of millions of acres of topsoil as a result of widespread use of artificial fertilisers. [p. 127]

As I often point out, these dangers were known 30–40 years ago (in truth, much longer), but they have only worsened with time through political inaction and/or social inertia. After I began to investigate and better understand the issues roughly a decade ago, I came to the conclusion that the window of opportunity to address these risks and their delayed effects had already closed. In short, we’re doomed and living on borrowed time as the inevitable consequences of our actions slowly but steadily manifest in the world.

So here’s the really interesting part. The modern worldview bestows confidence borne out of expanding mastery of the built environment, where risk is managed and reduced through expert systems. Mechanical and engineering knowledge figure prominently and support a cause-and-effect mentality that has grown ubiquitous in the computing era, with its push-button inputs and outputs. However, the high modern outlook is marred by overconfidence in our competence to avoid disaster, often of our own making. Consider the abject failure of 20th-century institutions to handle geopolitical conflict without devolving into world war and multiple genocides. Or witness periodic crashes of financial markets, two major nuclear accidents, and numerous space shuttles and rockets destroyed. Though all entail risk, high-profile failures showcase our overconfidence. Right now, engineers (software and hardware) are confident they can deliver safe self-driving vehicles yet are blithely ignoring (says me, maybe not) major ethical dilemmas regarding liability and technological unemployment. Those are apparently problems for someone else to solve.

Since the start of the Industrial Revolution, we’ve barrelled headlong into one sort of risk after another, some recognized at the time, others only apparent after the fact. Nuclear weapons are the best example, but many others exist. The one I raise frequently is the live social experiment undertaken with each new communications technology (radio, cinema, telephone, television, computer, social networks) that upsets and destabilizes social dynamics. The current ruckus fomented by the radical left (especially in the academy but now infecting other environments) regarding silencing of free speech (thus, thought policing) is arguably one concomitant.

According to Giddens, the character of modern risk contrasts with that of the premodern. The scale of risk prior to the 17th century was contained and expectation of social continuity was strong. Risk was also transmuted through magical thinking (superstition, religion, ignorance, wishfulness) into providential fortuna or mere bad luck, which led to feelings of relative security rather than despair. Modern risk has now grown so widespread, consequential, and soul-destroying, situated at considerable remove and leading to feelings of helplessness and hopelessness, that those not numbed by the litany of potential worries afflicting daily life (existential angst or ontological insecurity) often develop depression and other psychological compulsions and disturbances. Most of us, if aware of globalized risk, set it aside so that we can function and move forward in life. Giddens says that this conjures up anew a sense of fortuna, that our fate is no longer within our control. This

relieves the individual of the burden of engagement with an existential situation which might otherwise be chronically disturbing. Fate, a feeling that things will take their own course anyway, thus reappears at the core of a world which is supposedly taking rational control of its own affairs. Moreover, this surely exacts a price on the level of the unconscious, since it essentially presumes the repression of anxiety. The sense of dread which is the antithesis of basic trust is likely to infuse unconscious sentiments about the uncertainties faced by humanity as a whole. [p. 133]

In effect, the nature of risk has come full circle (completed a revolution, thus, revolutionized risk) from fate to confidence in expert control and back to fate. Of course, a flexibility of perspective is typical as the situation demands — it’s not all or nothing — but the overarching character is clear. Giddens also provides this quote by Susan Sontag that captures what he calls the low-probability, high-consequence character of modern risk:

A permanent modern scenario: apocalypse looms — and it doesn’t occur. And still it looms … Apocalypse is now a long-running serial: not ‘Apocalypse Now,’ but ‘Apocalypse from now on.’ [p. 134]

Here’s another interesting tidbit from Anthony Giddens’ book The Consequences of Modernity, which is the subject of a series of book blogs I’ve been writing. In his discussion of disembedding mechanisms, he introduces the idea of civil inattention (from Goffman, actually). This has partly to do with presence or absence (including inattention) in both public and private settings where face-to-face contact used to be the only option but modern technologies have opened up the possibility of faceless interactions over distance, such as with the telegraph and telephone. More recently, the face has been reintroduced with videoconferencing, but nonverbal cues such as body language are largely missing; the fullness of communication remains attenuated. All manner of virtual presence and telepresence are in fact cheap facsimiles of true presence and the social cohesion and trust enabled by what Giddens calls facework commitments. Of course, we delude ourselves that interconnectivity mediated by electronics is a reasonable substitute for presence and attention, which fellow blogger The South Roane Agrarian bemoans with this post.

Giddens’ meaning is more specific than this, though. The inattention of which Giddens writes is not the casual distraction of others with which we are all increasingly familiar. Rather, Giddens takes note of social behaviors embedded in deep culture having to do with signalling trust.

Two people approach and pass one another on a city sidewalk. What could be more trivial and uninteresting? … Yet something is going on here which links apparently minor aspects of bodily management to some of the most pervasive features of modernity. The “inattention” displayed is not indifference. Rather it is a carefully monitored demonstration of what might be called polite estrangement. As the two people approach one another, each rapidly scans the face of the other, looking away as they pass … The glance accords recognition of the other as an agent and as a potential acquaintance. Holding the gaze of the other only briefly, then looking ahead as each passes the other couples such an attitude with an implicit reassurance of lack of hostile intent. [p. 81]

It’s a remarkably subtle interaction: making eye contact to confirm awareness of another but then averting one’s eyes to establish that copresence poses no particular threat in either direction. Staring too fixedly at another communicates something else entirely, maybe fear or threat or disapprobation. By denying eye contact — by keeping one’s eyes buried in a handheld device, for instance — the opportunity to establish a modicum of trust between strangers is missed. Intent (or lack thereof) is a mystery. In practice, such modern-day inattention is mere distraction, not a sign of malevolence, but the ingrained social cue is obviated and otherwise banal happenstances become sources of irritation, discomfort, and/or unease, as with someone who doesn’t shake hands or perform other types of greeting properly.

I wrote before about my irritation with others face-planted in their phones. It is not a matter of outright offense but rather a quiet sense of affront at failure to adopt accepted social behaviors (as I once did). Giddens puts it this way:

Tact and rituals of politeness are mutual protective devices, which strangers or acquaintances knowingly use (mostly on the level of practical consciousness) as a kind of implicit social contact. Differential power, particularly where it is very marked, can breach or skew norms …. [pp. 82–83]

That those social behaviors have adapted to omnipresent mobile media, everyone pacified or hypnotized within their individual bubbles, is certainly not a salutary development. It is, however, a clear consequence of modernity.

Another modest surprise (to me at least) offered by Anthony Giddens (from The Consequences of Modernity) follows a discussion of reflexivity (what I call recursion when discussing consciousness), which is the dynamic of information and/or knowledge feeding back to influence later behavior and information/knowledge. His handy example is the populace knowing divorce rates, which has an obvious influence on those about to get married (who may decide to defer or abjure entirely). The surprise is this:

The discourse of sociology and the concepts, theories, and findings of the other social sciences continually “circulate in and out” of what it is that they are about. In so doing they reflexively restructure their subject matter, which itself has learned to think sociologically … Much that is problematic in the position of the professional sociologist, as the purveyor of expert knowledge about social life, derives from the fact that she or he is at most one step ahead of enlightened lay practitioners of the discipline. [p. 43]

I suppose “enlightened lay practitioners” are not the same as the general public, which I hold in rather low esteem as knowing (much less understanding) anything of value. Just consider electoral politics. Still, the idea that an expert in an academic field admits he is barely ahead of wannabes (like me) seems awfully damning. Whereas curious types will wade in just about anywhere, and in some cases, amateurs will indulge themselves enthusiastically in endeavors also practiced by experts (sports and music are the two principal examples that spring to mind), the distance (in both knowledge and skill) between experts and laypersons is typically quite large. I suspect those with high intellect and/or genetic gifts often bridge that gap, but then they join the ranks of the experts, so the exception leads nowhere.

I revisit my old blog posts when I see some reader activity in the WordPress backstage and was curious to recall a long quote of Iain McGilchrist summarizing arguments put forth by Anthony Giddens in his book Modernity and Self-identity (1991). Giddens had presaged recent cultural developments, namely, the radicalization of nativists, supremacists, Social Justice Warriors (SJWs), and others distorted by absorption in identity politics. So I traipsed off to the Chicago Public Library (CPL) and sought out the book to read. Regrettably, CPL didn’t have a copy, so I settled on a slightly earlier book, The Consequences of Modernity (1990), which is based on a series of lectures delivered at Stanford University in 1988.

Straight away, the introduction provides a passage that goes to the heart of matters with which I’ve been preoccupied:

Today, in the late twentieth century, it is argued by many, we stand at the opening of a new era … which is taking us beyond modernity itself. A dazzling variety of terms has been suggested to refer to this transition, a few of which refer positively to the emergence of a new type of social system (such as the “information society” or the “consumer society”) but most of which suggest rather that a preceding state of affairs is drawing to a close … Some of the debates about these matters concentrate mainly upon institutional transformations, particularly those which propose that we are moving from a system based upon the manufacture of material goods to one concerned more centrally with information. More commonly, however, those controversies are focused largely upon issues of philosophy and epistemology. This is the characteristic outlook, for example, of the author who has been primarily responsible for popularising the notion of post-modernity, Jean-François Lyotard. As he represents it, post-modernity refers to a shift away from attempts to ground epistemology and from faith in humanly engineered progress. The condition of post-modernity is distinguished by an evaporating of the “grand narrative” — the overarching “story line” by means of which we are placed in history as beings having a definite past and a predictable future. The post-modern outlook sees a plurality of heterogeneous claims to knowledge, in which science does not have a privileged place. [pp. 1–2, emphasis added]

That’s a lot to unpack all at once, but the fascinating thing is that notions now manifesting darkly in the marketplace of ideas were already in the air in the late 1980s. Significantly, this was several years still before the Internet brought the so-called Information Highway to computer users, before the cell phone and smart phone were developed, and before social media displaced traditional media (TV was only 30–40 years old but had previously transformed our information environment) as the principal way people gather news. I suspect that Giddens has more recent work that accounts for the catalyzing effect of the digital era (including mobile media) on culture, but for the moment, I’m interested in the book in hand.

Regular readers of this blog (I know of one or two) already know my armchair social criticism directed to our developing epistemological crisis (challenges to authority and expertise, psychotic knowledge, fake news, alternative facts, dissolving reality, and science denial) as well as the Transhumanist fantasy of becoming pure thought (once we evolve beyond our bodies). Until that’s accomplished with imagined technology, we increasingly live in our heads, in the abstract, disoriented and adrift on a bewildering sea of competing narratives. Moreover, I’ve stated repeatedly that highly mutable story (or narrative) underlies human cognition and consciousness, making most of us easy marks for charismatic thought leaders and storytellers. Giddens was there nearly 30 years ago with these same ideas, though his terms differ.

Giddens dispels the idea of post-modernity and insists that, from a sociological perspective, the current period is better described as high modernism. This reminds me of Oswald Spengler and my abandoned book blogging of The Decline of the West. It’s unimportant to me who got it more correct but note that the term Postmodernism has been adopted widely despite its inaccuracy (at least according to Giddens). As I get further into the book, I’ll have plenty more to say.

I picked up a copy of Daniel Siegel’s book Mind: A Journey to the Heart of Being Human (2017) to read and supplement my ongoing preoccupation with human consciousness. Siegel’s writing is the source of considerable frustration. Now about 90 pp. into the book (I am considering putting it aside), he has committed several grammatical errors (where are book editors these days?), doesn’t really know how to use a comma properly, and doesn’t write in recognizable paragraph form. He has a bad habit of posing questions to suggest the answers he wants to give and drops constant hints of something soon to be explored like news broadcasts that tease the next segment. He also deploys a tired, worn metaphor that readers are on a journey of discovery with him, embarked on a path, exploring a subject, etc. Yecch. (A couple Amazon reviews also note that grayish type on parchment (cream) paper poses a legibility problem due to poor contrast even in good light — undoubtedly not really Siegel’s fault.)

Siegel’s writing is also irritatingly circular, casting and recasting the same sentences in repetitious series of assertions that have me wondering frequently, “Haven’t I already read this?” Here are a couple examples:

When energy flows inside your body, can you sense its movement, how it changes moment by moment?

then only three sentences later

Energy, and energy-as-information, can be felt in your mental experience as it emerges moment by moment. [p. 52]

Another example:

Seeing these many facets of mind as emergent properties of energy and information flow helps link the inner and inter aspect of mind seamlessly.

then later in the same paragraph

In other words, mind seen this way could be in what seems like two places at once as inner and inter are part of one interconnected, undivided system. [p. 53]

This is definitely a bug, not a feature. I suspect the book could easily be condensed from 330 pp. to less than 200 pp. if the writing weren’t so self-indulgent. Indeed, while I recognize a healthy dose of repetition is an integral part of narrative form (especially in music), Siegel’s relentless repetition feels like propaganda 101, where guileless insistence (of lies or merely the preferred story one seeks to plant in the public sphere) wears down the reader rather than convinces him or her. This is also marketing 101 (e.g., Coca-Cola, McDonald’s, Budweiser, etc. continuing to advertise what are by now exceedingly well-established brands).


I finished Graham Hancock’s Fingerprints of the Gods (1995). He saved the best part of the book, an examination of Egyptian megalithic sites, for the final chapters and held back his final conclusion — more conjecture, really — for the tail end. The possible explanation Hancock offers for the destruction and/or disappearance of a supposed civilization long predating the Egyptian dynasties, the subject of the entire book, is earth-crust displacement, a theory developed by Charles Hapgood relating to polar shifts. Long story short, evidence demonstrates that the Antarctic continent used to be 2,000 miles away from the South Pole (about 30° from the pole) in a temperate zone and may have been, according to Hancock, the home of a seafaring civilization that had traveled and mapped the Earth. It’s now buried under ice. I find the explanation plausible, but I wonder how much the science and research has progressed since the publication of Fingerprints. I have not yet picked up Magicians of the Gods (2015) to read Hancock’s update but will get to it eventually.

Not having studied the science, I can only note that several competing scenarios exist regarding how the Earth’s crust, the lithosphere, might drift, shift, or move over the asthenosphere. First, it’s worth recognizing that the Earth’s rotational axis defines the two poles, which are near but not coincident with magnetic north and south. Axial shifts are figured in relation to the crust, not the entire planet (crust and interior). From a purely geometric perspective, I could well imagine the crust and interior rotating at different speeds, but since we lack more than theoretical knowledge of the Earth’s liquid interior (the inner core is reputedly solid), only the solid portions at the surface of the sphere offer a useful frame of reference. The liquid surfaces (oceans, seas) obviously flow, too, but are also understood primarily in relation to the solid crust both below and above sea level.

The crust could wander slowly and continuously, shift all at once, or some combination of both. If all at once, the inciting event might be a sudden change in magnetic stresses that breaks the entire lithosphere loose or perhaps a gigantic meteor hit that knocks the planet as a whole off its rotational axis. Either would be catastrophic for living things that are suddenly moved into a substantially different climate. Although spacing of such events is unpredictable and irregular, occurring in geological time, Hancock assembles considerable evidence to conclude that the most recent such occurrence was probably about 12,000 BCE at the conclusion of the last glacial maximum or ice age. This would have been well within the time humans existed on Earth but long enough ago in our prehistory that human memory systems record events only as unreliable myth and legend. They are also recorded in stone, but we have yet to decipher those messages fully other than to demonstrate that significant scientific knowledge of astronomy and engineering was once possessed by mankind but lost until redeveloped during the last couple of centuries.

As I read into Fingerprints of the Gods by Graham Hancock and learn more about antiquity, it becomes clear that weather conditions on Earth were far more hostile then (say, 15,000 years ago) than now. Looking way, way back, across millions of years, scientists have plotted global average temperature and atmospheric carbon, mostly using ice cores as I understand it, yielding this graph:

[Figure: atmospheric CO2 levels and global average temperature over geological time]

I’ve seen this graph before, which is often used by climate change deniers to show a lack of correlation between carbon and temperature. That’s not what concerns me. Instead, the amazing thing is how temperature careens up and down quickly (in geological time) between two limits, 12°C and 22°C, and forms steady states known as Ice Age Earth and Hot House Earth. According to the graph, we’re close to the lower limit. It’s worth noting that because of the extremely long timescale, the graph is considerably smoothed.
