Posts Tagged ‘Futurism’

I started reading Yuval Harari’s book Homo Deus: A Brief History of Tomorrow (2017). Had expected to read Sapiens (2014) first, but its follow-up came into my possession instead. My familiarity with Harari’s theses and arguments stems from his gadfly presence on YouTube being interviewed or giving speeches promoting his books. He’s a compelling yet confounding thinker, and his distinctive voice in my mind’s ear lent my reading the quality of an audiobook. I’ve only read the introductory chapter (“A New Human Agenda”) so far, the main argument being this:

We have managed to bring famine, plague and war under control thanks largely to our phenomenal economic growth, which provides us with abundant food, medicine, energy and raw materials. Yet this same growth destabilises the ecological equilibrium of the planet in myriad ways, which we have only begun to explore … Despite all the talk of pollution, global warming and climate change, most countries have yet to make any serious economic or political sacrifices to improve the situation … In the twenty-first century, we shall have to do better if we are to avoid catastrophe. [p. 20]

“Do better”? Harari’s bland understatement of the catastrophic implications of our historical moment is risible. Yet as a consequence of having (at least temporarily) brought three major historical pestilences (no direct mention of the fabled Four Horsemen of the Apocalypse) under administrative, managerial, and technical control (I leave that contention unchallenged), Harari states rather over-confidently — forcefully even — that humankind is now turning its attention and ambitions toward different problems, namely, mortality (the fourth of the Four Horsemen and one of the defining features of the human condition), misery, and divinity.



“Come with me if you want to live.” That’s among the quotable lines from the latest movie in the Terminator franchise, though it’s not nearly so succinct or iconic as “I’ll be back” from the first Terminator. Whereas the latter has the quality (in hindsight) of slow, implacable inevitability (considering the Terminator is literally a death-bringer), the former occurs within the context of a character having only just traveled back in time, not yet adequately reoriented, and forced to make a snap decision under duress. “I’ll be back” might be easy to brush off as harmless (temporary denial) since the threat recedes — except that it doesn’t, it’s merely delayed. “Come with me …” demands a leap of faith (or trust) because the danger is very real at that instant.

Which quote, I must ask, better characterizes the threat of climate change? My answer: both, but at different times. Three to four decades ago, it was the “I’ll be back” type: building slowly but inevitable given the underlying structure of industrial civilization. That structure was known even then by a narrow circle of experts (e.g., engineers for Big Oil and at the Dept. of Energy) to be a heat engine, meaning that we would ultimately cook our own goose by warming the planet, altering the climatic steady state under which our most recent civilization has flourished and producing a steady loss of biodiversity and biomass until our own human habitat (the entirety of the planet by now) becomes a hostile environment unable (unwilling if one anthropomorphizes Mother Nature) to support our swollen population. All that was if we stayed on course and took no corrective action. Despite foreknowledge and ample warning, that’s precisely what occurred (and continues today).

With the Intergovernmental Panel on Climate Change (IPCC) in particular, the threat has for roughly a decade shifted over to “Come with me ….” It’s no longer possible to put things off, yet we continue to dither well past the tipping point beyond which we can no longer save ourselves from self-annihilation. Although scientists have been gathering data and evidence, forming an overwhelming consensus, and sounding the alarm, scientific illiteracy, realpolitik, journalistic malpractice, and corporate greed have all conspired to grant the illusion that we still have time to react, time we simply don’t have anymore (and truth be told, probably didn’t as of the early 1980s).

I’m aware of at least three journalists (relying on the far more authoritative work of scientific consensus) who have embraced the message: Dahr Jamail, Thom Hartmann, and David Wallace-Wells. None to my knowledge has been able to bring himself to admit that humanity is now a collection of dead men walking. They can’t muster the courage to give up hope (or to report truthfully), clinging to the possibility we may still have a fleeting chance to avert disaster. I heard Ralph Nader on his webcast say something to the same effect, namely, what good is it to rob others of hope? My personal values adhere to unstinting truth rather than illusion or self-deception, so I subscribe to Guy McPherson’s assessment that we face near-term human extinction (precise date unknown but soon if, for example, this is the year we get a blue ocean event). Simply put, McPherson is professor emeritus of natural resources and ecology and evolutionary biology at the University of Arizona [note my emphasis]. I trust his scholarship (summarizing the work of other scientists and drawing necessary though unpalatable conclusions) more than I trust journalistic shaping of the story for public consumption.

The obvious metaphor for what we face is a terminal medical diagnosis, or if one has hope, perhaps a death sentence about to be carried out but with the possibility of a last-minute stay of execution via phone call from the governor. Opinions vary whether one should hope/resist up to the final moment or make peace with one’s fate. By using the “I’ll be back” characterization when it’s really “Come with me …,” the MSM has, I daresay, withheld the truth and denied the public that second option. Various authors on the Web offer a better approximation of the truth (such as it can be known) and form a loose doomer network (a/k/a collapsniks). This blog is an (admittedly tiny) part of that doomersphere, which gives me no pleasure.

This is about to get weird.

I caught a good portion of a recent Joe Rogan podcast (sorry, no link or embedded video) with Alex Jones and Eddie Bravo (nearly 5 hours long instead of the usual 2 to 3) where the trio indulged in speculation about a purported grand conspiracy to destroy civilization and establish a new post-human one. The more Jones rants, er, speaks (which is quite a lot), the more he sounds like a madman. But he insists he does so to serve the public. He sincerely wants people to know things he’s figured out about an evil cabal of New World Order types. So let me say at least this: “Alex Jones, I hear you.” But I’m unconvinced. Apologies to Alex Jones et al. if I got any details wrong. For instance, it’s not clear to me whether Jones believes this stuff himself or is merely reporting what others may believe.

The grand conspiracy is supposedly interdimensional beings operating at a subliminal range below or beyond normal human perception. Perhaps they revealed themselves to a few individuals (to the cognoscenti, ya know, or is that shared revelation how one is inducted into the cognoscenti?). Rogan believes that ecstatic states induced by drugs provide access to revelation, like tuning a radio to the correct (but secret) frequency. Whatever exists in that altered cognitive state appears like a dream and is difficult to understand or remember. The overwhelming impression Rogan reports as lasting is of a distinct nonhuman presence.

Maybe I’m not quite as barking mad as Jones or as credulous as Rogan and Bravo, but I have to point out that humans are interdimensional beings. We move through three dimensions of space and one unidirectional dimension of time. If that doesn’t quite make sense, then I refer readers to Edwin Abbott’s well-known book Flatland. Abbott describes what it might be like for conscious beings in only two dimensions of space (or one). Similarly, for most of nature outside of vertebrates, it’s understood that consciousness, if it exists at all (e.g., not in plants), is so rudimentary that there is no durable sense of time. Beings exist in an eternal now (could be several seconds long/wide/tall — enough to function) without memory or anticipation. With that in mind, the possibility of multidimensional beings in 5+ dimensions completely imperceptible to us doesn’t bother me in the least. The same is true of the multiverse or many-worlds interpretation. What bothers me is that such beings would bother with us, especially with a conspiracy to crash civilization.

The other possibility at which I roll my eyes is a post-human future: specifically, a future when one’s consciousness escapes its biological boundaries. The common trope is that one’s mind is uploaded to a computer to exist in the ether. Another is that one transcends death somehow with intention and purpose instead of simply ceasing to be (as atheists believe) or some variation of the far more common religious heaven/hell/purgatory myth. This relates as well to the supposition of strong AI about to spark (the Singularity): self-awareness and intelligent thought that can exist on some substrate other than human biology (the nervous system, really, including the brain). Sure, cognition can be simulated for some specific tasks like playing chess or go, and we humans can be fooled easily into believing we are communicating with a thought machine à la the Turing Test. But the rather shocking sophistication, range, utility, and adaptability of even routine human consciousness is so far beyond any current simulation that the usual solution to get engineers from where they are now to real, true, strong AI is always “and then a miracle happened.” The easy, obvious route/accident is typically a power surge (e.g., a lightning strike).

Why bother with mere humans is a good question if one is post-human or an interdimensional being. It could well be that existence in such a realm would make watching human interactions either impenetrable (news flash, they are already) or akin to watching through a dim screen. That familiar trope is the lost soul imprisoned in the spirit world, a parallel dimension that permits viewing from one side only but prohibits contact except perhaps through psychic mediums (if you believe in such folks — Rogan for one doesn’t).

The one idea worth repeating from the podcast is the warning not to discount all conspiracy theories out of hand as bunk. At least a few have been demonstrated to be true. Whether any of the sites behind that link are to be believed I leave you readers to judge.

Addendum: Although a couple comments came in, no one puzzled over the primary piece I had to add, namely, that we humans are interdimensional beings. The YouTube video below depicts a portion of the math/science behind my statement, showing how at least two topological surfaces behave paradoxically when limited to 2 or 3 dimensions but theoretically cohere in 4+ dimensions imperceptible to us.
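For the mathematically inclined, the textbook example of such a surface (my own illustration; I won’t presume it’s the one shown in the video) is the Klein bottle, which cannot sit in three-dimensional space without passing through itself yet embeds with no self-intersection at all in four. One standard parametrization, with radii $R > r > 0$ and angles $u, v \in [0, 2\pi)$:

$$
\begin{aligned}
x &= (R + r\cos v)\cos u, \\
y &= (R + r\cos v)\sin u, \\
z &= r\sin v\,\cos(u/2), \\
w &= r\sin v\,\sin(u/2).
\end{aligned}
$$

Because $\cos(u/2)$ and $\sin(u/2)$ reverse sign as $u$ completes a full turn, the tube glues back onto itself with a mirror twist. Discard the fourth coordinate $w$ and that twist is forced to pass through the rest of the surface, which is precisely the paradox a three-dimensional viewer perceives.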

For a time after the 2008 financial collapse, skyscraper projects in Chicago came to a dead halt, mostly due to dried-up financing. My guess (since I don’t know with any reliability) is that much the same obtained worldwide. However, the game appears to be back on, especially in New York City, one of the few cities around the globe where so-called “real money” tends to pool and collect. Visual Capitalist has an interesting infographic depicting changes to the NYC skyline every 20 years. The number of supertalls topping 1,000 feet expected by 2020 is quite striking.

Courtesy of Visual Capitalist

The accompanying text admits that NYC is left in the dust by China, specifically, the Pearl River Delta Megacity, which includes Hong Kong, Shenzhen, Macau, and others. As I’ve written before, the mad rush to build (earning ridiculous, absurd, imaginary prestige points awarded by and to exactly no one) takes no apparent notice of a slo-mo crack-up in the way modern societies organize and fund themselves. The new bear market might give one … um, pause.

Also left in the dust is Chicago, home of the original skyscraper. Since the 2008 collapse, Chicago’s most ambitious project, the ill-fated Chicago Spire (a/k/a the Fordham Spire), was abandoned despite a big hole dug in the ground and some foundation work completed. An absence of completed prestige projects since 2008 means Chicago has been lapped several times over by NYC, not that anyone is counting. The proposed site of the Chicago Spire is too enticing, however — just inside Lake Shore Drive at the mouth of the Chicago River — for it to be dormant for long. Indeed, a press release last year (it escaped my attention at the time) announced redevelopment of the site, and a slick website is operating for now (I’ve linked in the past to similar sites that were abandoned along with their subject projects). Also reported late last year, Chicago appears to have rejoined the game in earnest, with multiple projects already under construction and others in the planning/approval phases.

So if a hiatus was called the last time we crashed financially (a regular occurrence, I note), it seems we’ve called a hiatus on the hiatus and are back in a mad, futile race to remake modernity into gleaming vertical cities dotting the globe. Such hubris and exuberance might be intoxicating to technophiles, but I’m reminded of an observation (can’t locate a quote, sorry) to the effect that civilizations’ most extravagant projects are undertaken just before their collapses. Our global civilization is no different.

From time to time, I admit that I’m in no position to referee disputes, usually because I lack technical expertise in the hard sciences. I also avoid the impossible task of policing the Internet, assiduously pointing out error wherever it occurs. Others concern themselves with correcting the record and/or reinterpreting arguments with improved context and accuracy. However, once in a while, something crosses my desk that gets under my skin. An article by James Ostrowski entitled “What America Has Done To its Young People is Appalling,” published at LewRockwell.com, is such a case. It’s undoubtedly a coincidence that the most famous Rockwell is arguably Norman Rockwell, whose celebrated illustrations for the Saturday Evening Post in particular helped reinforce a charming midcentury American mythology. Lew Rockwell, OTOH, is described briefly in the website’s About blurb:

The daily news and opinion site LewRockwell.com was founded in 1999 by anarcho-capitalists Lew Rockwell … and Burt Blumert to help carry on the anti-war, anti-state, pro-market work of Murray N. Rothbard.

Those political buzzwords probably deserve some unpacking. However, that project falls outside my scope. In short, they handily foist blame for what ails us in American culture on government planning, as distinguished from the comparative freedom of libertarianism. Government earns its share of blame, no doubt, especially with its enthusiastic prosecution of war (now a forever war); but as snapshots of competing political philosophies, these buzzwords are reductive almost to the point of meaninglessness. Ostrowski lays blame more specifically on feminism and progressive big government and harkens back to an idyllic 1950s nuclear family fully consonant with Norman Rockwell’s illustrations, thus invoking the nostalgic frame.

… the idyllic norm of the 1950’s, where the mother typically stayed home to take care of the kids until they reached school age and perhaps even long afterwards, has been destroyed.  These days, in the typical American family, both parents work fulltime which means that a very large percentage of children are consigned to daycare … in the critical first five years of life, the vast majority of Americans are deprived of the obvious benefits of growing up in an intact family with the mother at home in the pre-school years. We baby boomers took this for granted. That world is gone with the wind. Why? Two main reasons: feminism and progressive big government. Feminism encouraged women to get out of the home and out from under the alleged control of husbands who allegedly controlled the family finances.

Problem is, 1950s social configurations in the U.S. were the product of a convergence of historical forces, not least of which were the end of WWII and newfound American geopolitical and economic prominence. More pointedly, an entire generation of young men and women who had deferred family life during perilous wartime were then able to marry, start families, and provide for them on a single income — typically that of the husband/father. That was the baby boom. Yet to enjoy the benefits of the era fully, one probably needed to be a WASPy middle-class male or the child of one. Women and people of color fared … differently. After all, the 1950s yielded to the sexual revolution and civil rights era one decade later, both of which aimed specifically to improve the lived experience of, well, women and people of color.

Since the 1950s were only roughly 60 years ago, it might be instructive to consider how life was another 60 years before then, in the 1890s. If one lived in an eastern American city, life was often a Dickensian dystopia, complete with child labor, poorhouses, orphanages, asylums, and unhygienic conditions. If one lived in an agrarian setting, which was far more prevalent before the great 20th-century migration to cities, then life was frequently dirt-poor subsistence and/or pioneer homesteading requiring dawn-to-dusk labor. Neither mode yet enjoyed social planning and progressive support, including, for example, sewers and other modern infrastructure, public education, and economic protections such as unionism and trust busting. Thus, 19th-century America might be characterized fairly as being closer to anarcho-capitalism than at any time since. One of its principal legacies, one must be reminded, was pretty brutal exploitation of (and violence against) labor, as evidenced by the emergence of political parties that sought to redress its worst scourges. Hindsight informs us now that reforms were slow, partial, and impermanent, leading to the observation that among all tried forms of self-governance, democratic capitalism can be characterized as perhaps the least awful.

So yeah, the U.S. came a long way from 1890 to 1950, especially in terms of standard of living, but may well be backsliding as the 21st-century middle class is hollowed out (a typical income — now termed household income — being rather challenging for a family), aspirations to rise economically above one’s parents’ level no longer function, and the culture disintegrates into tribal resentments and unrealistic fantasies about nearly everything. Ostrowski marshals a variety of demographic facts and figures to support his argument (with which I agree in large measure), but he fails to make a satisfactory causal connection with feminism and progressivism. Instead, he sounds like 45 selling his slogan Make America Great Again (MAGA), meaning let’s turn back the clock to those nostalgic 1950s happy days. Interpretations of that sentiment run in all directions from innocent to virulent (but coded). With blame placed on feminism and progressivism, it’s not difficult to hear the accusation that, if only those feminists and progressives (and others) had stayed in their assigned lanes, we wouldn’t now be dealing with cultural crises that threaten to undo us. What Ostrowski fails to acknowledge is that despite all sorts of government activity over the decades, no one in the U.S. is steering the culture nearly as actively as in centrally planned economies and cultures, current and historical, which in their worst instances are fascist and/or totalitarian. One point I’ll agree on, however, just to be charitable, is that the mess we’ve made and will leave to youngsters is truly appalling.

Caveat: Rather uncharacteristically long for me. Kudos if you have the patience for all of this.

Caught the first season of HBO’s series Westworld on DVD. I have a boyhood memory of the original film (1973) with Yul Brynner and a dim memory of its sequel Futureworld (1976). The sheer charisma of Yul Brynner in the role of the gunslinger casts a long shadow over the new production, not that most of today’s audiences have seen the original. No doubt, 45 years of technological development in film production lends the new version some distinct advantages. Visual effects are quite stunning, and Utah landscapes have never been used more appealingly in terms of cinematography. Moreover, storytelling styles have changed, though it’s difficult to argue convincingly that they’re necessarily better now than then. Competing styles only appear dated. For instance, the new series has immensely more time to develop its themes; but the ancient parables of hubris and loss of control over our own creations run amok (e.g., Shelley’s Frankenstein, or more contemporaneously, the surprisingly good new movie Upgrade) have compact, appealing narrative arcs quite different from constant teasing and foreshadowing of plot developments while actual plotting proceeds glacially. Viewers wait an awful lot longer in the HBO series for resolution of tensions and emotional payoffs, by which time investment in the story lines has dissipated. There is also no terrifying crescendo of violence and chaos demanding rescue or resolution. HBO’s Westworld often simply plods on. To wit, a not insignificant portion of the story (um, side story) is devoted to boardroom politics (yawn) regarding who actually controls the Westworld theme park. Plot twists and reveals, while mildly interesting (typically guessed by today’s cynical audiences), do not tie the narrative together successfully.

Still, Westworld provokes considerable interest from me due to my fascination with human consciousness. The initial episode builds out the fictional future world with characters speaking exposition clearly owing its inspiration to Julian Jaynes’s book The Origin of Consciousness in the Breakdown of the Bicameral Mind (another reference audiences are quite unlikely to know or recognize). I’ve had the Julian Jaynes Society’s website bookmarked for years and read the book some while back; never imagined it would be captured in modern fiction. Jaynes’s thesis (if I may be so bold as to summarize radically) is that modern consciousness coalesced around the collapse of multiple voices in the head — ideas, impulses, choices, decisions — into a single stream of consciousness perhaps better understood (probably not) as the narrative self. (Aside: the multiple voices of antiquity correspond to polytheism, whereas the modern singular voice corresponds to monotheism.) Thus, modern human consciousness arose over several millennia as the bicameral mind (the divided brain having two camerae, or chambers or halves) functionally collapsed. The underlying story of the new Westworld is the emergence of machine consciousness, a/k/a strong AI, a/k/a The Singularity, while the old Westworld was about a mere software glitch. Exploration of machine consciousness modeling (e.g., improvisation builds on memory to create awareness) as a proxy for better understanding human consciousness might not be the purpose of the show, but it’s clearly implied. And although conjectural, the slow emergence of human consciousness contrasts sharply with the abrupt ON switch of theorized machine consciousness. Westworld treats them as roughly equivalent, though in fairness, 35 years or so in Westworld is in fact abrupt compared to several millennia. (Indeed, the story asserts that machine consciousness sparked alive repeatedly (which I suggested here) over those 35 years but was dialed back each time. Never mind all the unexplored implications.) Additionally, the fashion in which Westworld uses the term bicameral ranges from sloppy to meaningless, like the infamous technobabble of Star Trek.


I caught the presentation embedded below with Thomas L. Friedman and Yuval Noah Harari, nominally hosted by the New York Times. It’s a very interesting discussion but not a debate. For this now standard format (two or more people sitting across from each other with a moderator and an audience), I’m pleased to observe that Friedman and Harari truly engaged each other’s ideas and behaved with admirable restraint while the other was speaking. Most of these talks are rude and combative, marred by constant interruptions and gotchas. Such bad behavior might succeed in debate club but makes for a frustratingly poor presentation. My further comments follow below.

With a topic as open-ended as The Future of Humanity, arguments and support are extremely conjectural and wildly divergent depending on the speaker’s perspective. Both speakers here admit their unique perspectives are informed by their professions, which boils down to biases born of methodology and, to a lesser degree perhaps, personality. Fair enough. In my estimation, Harari does a much better job adopting a pose of objectivity. Friedman comes across as both a salesman and a cheerleader for human potential.

Both speakers cite a trio of threats to human civilization and wellbeing going forward. For Harari, they’re nuclear war, climate change, and technological disruption. For Friedman, they’re the market (globalization), Mother Nature (climate change alongside population growth and loss of diversity), and Moore’s Law. Friedman argues that all three are accelerating beyond control but speaks of each metaphorically, such as when he refers to changes in market conditions (e.g., from independent to interdependent) as “climate change.” The biggest issue from my perspective — climate change — was largely passed over in favor of more tractable problems.

Climate change has been in the public sphere as the subject of considerable debate and confusion for at least a couple decades now. I daresay it’s virtually impossible not to be aware of the horrific scenarios surrounding what is shaping up to be the end of the world as we know it (TEOTWAWKI). Yet as a global civilization, we’ve barely reacted except with rhetoric flowing in all directions and some greenwashing. Difficult to assess, but perhaps the appearance of more articles about surviving climate change (such as this one in Bloomberg Businessweek) demonstrates that more folks recognize we can no longer stem or stop climate change from rocking the world. This blog has had lots to say about the collapse of industrial civilization being part of a mass extinction event (not aimed at but triggered by and including humans), so for these two speakers to cite but then minimize the peril we face is, well, facile at the least.

Toward the end, the moderator finally spoke up and directed the conversation towards uplift (a/k/a the happy chapter), which almost immediately resulted in posturing on the optimism/pessimism continuum with Friedman staking his position on the positive side. Curiously, Harari invalidated the question and refused to be pigeonholed on the negative side. Attempts to shoehorn discussions into familiar if inapplicable narratives or false dichotomies are commonplace. I was glad to see Harari calling bullshit on it, though others (e.g., YouTube commenters) were easily led astray.

The entire discussion is dense with ideas, most of them already quite familiar to me. I agree wholeheartedly with one of Friedman’s remarks: if something can be done, it will be done. Here, he refers to technological innovation and development. Throughout history, plenty of prohibitions against making disruptive technologies available have gone unheeded. The atomic era is the handy example (among many others), as both weaponry and power plants stemming from cracking the atom come with huge existential risks and collateral psychological effects. Yet we prance forward headlong and hurriedly, hoping to exploit profitable opportunities without concern for collateral costs. Harari’s response was to recommend caution until true cause-effect relationships can be teased out. Without saying so explicitly, Harari is citing the precautionary principle. Harari also observed that some of those effects can be displaced by hundreds or thousands of years.

Displacements resulting from the Agrarian Revolution, the Scientific Revolution, and the Industrial Revolution in particular (all significant historical “turnings” in human development) are converging on the early 21st century (the part we can see at least somewhat clearly so far). Neither speaker would come straight out and condemn humanity to the dustbin of history, but at least Harari noted that Mother Nature is quite keen on extinction (which elicited a nervous? uncomfortable? ironic? laugh from the audience) and wouldn’t care if humans were left behind. For his part, Friedman admits our destructive capacity but holds fast to our cleverness and adaptability winning out in the end. And although Harari notes that the future could bring highly divergent experiences for subsets of humanity, including the creation of enhanced humans and reckless dabbling with genetic engineering, I believe cumulative and aggregate consequences of our behavior will deposit all of us into a grim future no sane person should wish to survive.

See this post on Seven Billion Day only a few years ago as a launching point. We’re now closing in on 7.5 billion people worldwide according to the U.S. Census Bureau. At least one other counter indicates we’ve already crossed that threshold. What used to be called the population explosion or the population bomb has lost its urgency and become generically population growth. By now, application of euphemism to mask intractable problems should be familiar to everyone. I daresay few are fooled, though plenty are calmed enough to stop paying attention. If there is anything to be done to restrain ourselves from proceeding down this easily recognized path to self-destruction, I don’t know what it is. The unwillingness to accept restraints in other aspects of human behavior demonstrates pretty well that consequences be damned — especially if they’re far enough delayed in time that we get to enjoy the here and now.

Two additional links (here and here) provide abundant further information on population growth if one desires to delve more deeply into the topic. The tone of these sites is sober, measured, and academic. As with climate change, hysterical and panic-provoking alarmism is avoided, but dangers known decades and centuries ago have persisted without serious redress. While it’s true that the growth rate has decreased considerably since its peak in 1960 or so (the height of the postwar baby boom), absolute numbers continue to climb. The lack of immediate concern reminds me of Al Bartlett’s articles and lectures on the failure to understand the exponential function in math (mentioned in my prior post). Sure, boring old math about which few care. The metaphor that applies is yeast growing in a culture with a doubling factor that makes everything look just peachy until the final doubling that kills everything. In this metaphor, people are the unthinking yeast who believe there’s plenty of room and food and other resources in the culture (i.e., on the planet) and keep consuming and reproducing until everyone dies en masse. How far away in time that final human doubling is no one really knows.
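Bartlett’s basic arithmetic deserves a moment; a minimal sketch in Python (the growth rates are illustrative round numbers, not authoritative figures):

import math

def doubling_time(growth_pct_per_year):
    """Years needed to double at a constant percentage growth rate."""
    return math.log(2) / math.log(1 + growth_pct_per_year / 100)

# Bartlett's "rule of 70" shortcut: doubling time is roughly 70 / growth rate in percent.
print(doubling_time(2.1))   # about 33 years, near the oft-cited 1960s peak rate
print(doubling_time(1.1))   # about 63 years, nearer the rate as of this writing
print(70 / 1.1)             # about 64, the mental-arithmetic version

# The yeast punchline: one doubling before the culture is full, it is half empty.
population, capacity = 0.5, 1.0
print(population / capacity)        # 0.5, and everything looks just peachy
print(2 * population >= capacity)   # True: the next doubling is the last

Even at a seemingly modest 1 percent annual growth, another doubling arrives in roughly 70 years, well within the lifetimes of children already born.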

Which brings me to something rather ugly: hearings to confirm Brett Kavanaugh’s appointment to the U.S. Supreme Court. No doubt conservative Republican presidents nominate similarly conservative judges, just as Democratic presidents nominate progressive, er, centrist judges. That’s to be expected. However, Kavanaugh is being asked pointed questions about settled law and legal precedents perpetually under attack by more extreme elements of the right wing, including Roe v. Wade from 1973. Were we (in the U.S.) to revisit that decision and remove legal abortion (already heavily restricted), public outcry would be horrific, to say nothing of the return of so-called back-alley abortions. Almost no one undertakes such actions lightly. A look back through history, however, reveals a wide range of methods to forestall pregnancy, end pregnancies early, and/or end newborn life quickly (infanticide). Although repugnant to almost everyone, attempts to legislate abortion out of existence and/or punish lawbreakers will succeed no better than did Prohibition or the War on Drugs. (The same can be said of premarital and underage sex.) Certain aspects of human behavior are frankly indelible despite the moral indignation of one or another political wing. Whether Kavanaugh truly represents the linchpin that will bring new upheavals is impossible to know with certainty. Stay tuned, I guess.

Abortion rights matter quite a lot when placed in context with population growth. Aggregate human behaviors routinely drive out of existence all sorts of plant and animal populations. This includes human populations (domestic and foreign) reduced to abject poverty and mad, often criminal scrambles for survival. The view from on high is that those whose lives fall below some measure of worthwhile contribution are useless eaters. (I don’t recommend delving deeper into that term; it’s a particularly ugly ideology with a long, tawdry history.) Yet removing abortion rights would almost certainly swell those ranks. Add this topic to the growing list of things I just don’t get.

Haven’t purged my bookmarks in a long time. I’ve been collecting material about technological dystopia already now operating but expected to worsen. Lots of treatments out there and lots of jargon. My comments are limited.

Commandeering attention. James Williams discusses his recognition that interference media (all modern media now) keep people attuned to their feeds and erode free will, ultimately threatening democratic ideals by estranging people from reality. An inversion has occurred: information scarcity and attention abundance have become information abundance and attention scarcity.

Outrage against the machines. Ran Prieur (no link) takes a bit of the discussion above (probably where I got it) to illustrate how personal responsibility about media habits is confused, specifically, the idea that it’s okay for technology to be adversarial.

In the Terminator movies, Skynet is a global networked AI hostile to humanity. Now imagine if a human said, “It’s okay for Skynet to try to kill us; we just have to try harder to not be killed, and if you fail, it’s your own fault.” But that’s exactly what people are saying about an actual global computer network that seeks to control human behavior, on levels we’re not aware of, for its own benefit. Not only has the hostile AI taken over — a lot of people are taking its side against their fellow humans. And their advice is to suppress your biological impulses and maximize future utility like a machine algorithm.

Big Data is Big Brother. Here’s a good TED Talk by Zeynep Tufekci on how proprietary machine-learning algorithms we no longer control or understand, ostensibly used to serve targeted advertising, possess the power to influence elections and radicalize people. I call the latter down-the-rabbit-hole syndrome, where one innocuous video or news story is followed by another of increasing extremity until the viewer or reader reaches a level of outrage and indignation activating an irrational response.


In the sense that a picture is worth a thousand words, this cartoon caught my immediate attention (for attribution, taken from here):

[Cartoon: comforting lies vs. unpleasant truths]

Search engines reveal quite a few treatments of the central conflict depicted here, including other versions of essentially the same cartoon. Doubtful anything I could say would add much to the body of analysis and advice already out there. Still, the image called up a whole series of memories for me rather quickly, the primary one being the (only) time I vacationed in Las Vegas about a decade ago.
