“Any sufficiently advanced technology is indistinguishable from magic.” –Arthur C. Clarke

/rant on

Jon Evans at TechCrunch has an idiot opinion article titled “Technology Is Magic, Just Ask The Washington Post” that has gotten under my skin. His risible assertion that the WaPo editorial board uses magical thinking misframes the issue of whether police and other security agencies ought to have backdoor or golden-key access to end-users’ communications carried over electronic networks. He marshals a few experts in the field of encryption and information security (shortened to “infosec” — my, how hep) who insist that even if such a thing (security that is porous to select people or agencies only) were possible, the demand is incompatible with the whole idea of security, and indeed privacy. The whole business strikes me as a straw man argument. Here is Evans’ final paragraph:

If you don’t understand how technology works — especially a technical subgenre as complex and dense as encryption and information security — then don’t write about it. Don’t even have an opinion about what is and isn’t possible; just accept that you don’t know. But if you must opine, then please, at least don’t pretend technology is magic. That attitude isn’t just wrong, it’s actually dangerous.

Evans is pushing on a string, making the issue seem as though agencies that simply want what they want believe in turn that those things come into existence at the snap of one’s fingers, or magically. But in reality beyond hyperbole, absolutely no one believes that science and technology are magic. Rather, software and human-engineered tools are plainly understood as mechanisms we design and fabricate through our own effort even if we don’t understand the complexity of the mechanism under the hood. Further, everyone beyond the age of 5 or 6 loses faith in magical entities such as the Tooth Fairy, unicorns, fairy godmothers, etc. at about the same time that Santa Claus is revealed to be a cruel hoax. A sizable segment of the population for whom the Reality Principle takes firm root goes on to lose faith in progress, humanity, religion, and god (which version becomes irrelevant at that point). Ironically, the typically unchallenged thinking that technology delivers, among other things, knowledge, productivity, leisure, and other wholly salutary effects — the very thinking a writer for TechCrunch might exhibit — is itself a species of magical thinking.

Who are these magical creatures who believe their smartphones, laptops, TVs, vehicles, etc. are themselves magical simply because their now routine operations lie beyond the typical end-user’s technical knowledge? And who, besides Arthur C. Clarke and the occasional ideologue, is prone to calling out the bogus substitution of magic for mechanism? No one, really. Jon Evans does no one any favors by raising this argument — presumably just to puncture it.

If one were to observe how people actually use the technology now available in, say, handheld devices with 24/7/365 connection to the Internet (so long as the batteries hold out, anyway), it’s not the device that seems magical but the feeling of being connected, knowledgeable, and at the center of activity, with a constant barrage of information (noise, mostly) barreling at them and defying them to turn attention away lest something important be missed. People are so dialed into their devices, they often lose touch with reality, much like the politicians who no longer relate to or empathize with voters, preferring to live in their heads with all the chatter, noise, news, and entertainment fed to them like an endorphin drip. Who cares how the mechanism delivers, so long as supply is maintained? Similarly, who cares how vaporware delivers unjust access? Just give us what we want! Evans would do better to argue against the unjust desire for circumvention of security rather than its presumed magical mechanism. But I guess that idea wouldn’t occur to a technophiliac.

/rant off

The phrase enlightened self-interest has been used to describe and justify supposed positive results arising over time from individuals acting competitively, as opposed to cooperatively, using the best information and strategies available. One of the most enduring examples is the prisoner’s dilemma. Several others have dominated news cycles lately.
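For readers who want the bare mechanics of the prisoner’s dilemma, here is a minimal sketch in Python. The payoff numbers (years in prison, lower is better) are standard illustrative values I have assumed, not drawn from any particular source; the point is only that defection is each player’s best response no matter what the other does, even though mutual defection leaves both worse off than mutual cooperation.

```python
# Prisoner's dilemma sketch with assumed illustrative payoffs (years in prison;
# lower is better). Each tuple is (my sentence, other's sentence).
PAYOFFS = {
    ("cooperate", "cooperate"): (1, 1),   # both stay silent: light sentences
    ("cooperate", "defect"):    (3, 0),   # the silent one takes the fall
    ("defect",    "cooperate"): (0, 3),
    ("defect",    "defect"):    (2, 2),   # both betray: worse than mutual silence
}

def best_response(opponent_move):
    """Return the move that minimizes my sentence against a fixed opponent move."""
    return min(("cooperate", "defect"),
               key=lambda my_move: PAYOFFS[(my_move, opponent_move)][0])

for opp in ("cooperate", "defect"):
    print(f"If the other prisoner plays {opp!r}, my best response is {best_response(opp)!r}")
# Defection wins either way, yet mutual defection (2, 2) is worse for both
# than mutual cooperation (1, 1): the gap between individual and collective interest.
```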

Something for Nothing

At the Univ. of Maryland, a psychology professor has been offering extra credit on exams of either 2 or 6 points if no more than 10 percent of students elect to receive the higher amount. If more than 10% go for the higher amount, no one gets anything. The final test question, which fails as a marker of student learning or achievement and doesn’t really function so well as a psychology or object lesson, either, went viral when a student tweeted out the question, perplexed by the prof’s apparent cruelty. Journalists then polled average people and found divergence (duh!) between those who believe the obvious choice is 6 pts (or reluctantly, none) and those who righteously characterize 2 pts as “the right thing to do.” It’s unclear what conclusion to draw, but the prof reports that since 2008, only one class got any extra credit by not exceeding the 10% cutoff.
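To make the dynamics concrete, here is a rough simulation of the rule as described. It is a sketch under assumed parameters (class size, the share of students opting for 6 points, independent choices), not the professor’s actual data.

```python
import random

# Each student independently picks 2 or 6 points; if more than 10% pick 6,
# nobody gets anything. Parameters are illustrative assumptions.
def class_outcome(n_students=100, p_greedy=0.2, threshold=0.10):
    choices = [6 if random.random() < p_greedy else 2 for _ in range(n_students)]
    if choices.count(6) / n_students > threshold:
        return [0] * n_students            # everyone forfeits the extra credit
    return choices

random.seed(1)
for p in (0.05, 0.10, 0.20):
    totals = [sum(class_outcome(p_greedy=p)) for _ in range(1000)]
    print(f"p(choose 6) = {p:.2f}: average total points awarded = {sum(totals) / len(totals):.1f}")
# Once roughly a fifth of the class goes for 6 points, the 10% cutoff is breached
# in nearly every run, consistent with only one class clearing it since 2008.
```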

Roping One’s Eyeballs

This overlong opinion article found in the Religion and Ethics section of the Australian Broadcasting Corporation (ABC) website argues that advertising is a modern-day illustration of the tragedy of the commons:

Expensively trained human attention is the fuel of twenty-first century capitalism. We are allowing a single industry to slash and burn vast amounts of this productive resource in search of a quick buck.

I practice my own highly restrictive media ecology, defending against the fire hose of information and marketing aimed at me (and everyone else) constantly, machine-gun style. So in a sense, I treat my own limited time and attention as a resource not to be squandered on nonsense, but when the issue is scaled up to the level of society, the metaphor is inapt and breaks down. I assert that attention as an exploitable resource functions very differently when considering an individual vs. the masses, which have unique behavioral properties. Still, it’s an interesting idea to consider.

No One’s Bank Run

My last example is the entirely predictable bank runs in Greece that were forestalled when banks closed for three weeks and placed withdrawal limits (euphemism: capital controls) on the cash deposits actually held in the vaults. Greek banks have appealed to depositors to trust them — that their deposits are guaranteed and there will be no “haircut” such as occurred in Cyprus — but the appeals were met with laughter and derision. Intolerance of further risk is an entirely prudent response, and a complete and rapid flight of capital would no doubt have ensued had it not been disallowed.

What these three examples have in common is simple: it matters little what any individual may do, but it matters considerably what everyone does. Institutions and systems typically have enough resilience to weather a few outliers who exceed boundaries (opting for 6 pts, pushing media campaigns to the idiotic point of saturation, or withdrawing all of one’s money from a faltering bank), but when everyone acts according to enlightened self-interest, well, it’s obvious that something’s gotta give. In the examples above, no one gets extra points, no one pays heed to much of anything anymore (or perhaps more accurately, attention is debased and diluted to the point of worthlessness), and banks fail. In the professor’s example, the threshold for negative results is only 10%. The threshold probably varies across operating environments, but the modesty of that number is instructive.

More than a few writers have interpreted the tragedy of the commons on a global level. As a power law, it probably functions better at a feudal level, where resources are predominantly local and society is centered around villages rather than megalopolises and/or nation-states. However, it’s plain to observe, if one pays any attention (good luck with that in our new age of distraction, where everyone is trained to hear only what our own culture instructs, ignoring what nature tells us), that interlocking biological systems worldwide are straining and failing under the impacts of anthropogenic climate change. Heating at the poles and deep in the oceans is where the worst effects are currently being felt, but climate chaos is certainly not limited to out-of-sight, out-of-mind locations. What’s happening in the natural world, however, is absolutely and scarily for real, unlike bogus stress tests banks undergo to buttress consumer sentiment (euphemism: keep calm and carry on). Our failure to listen to the right messages and heed warnings properly will be our downfall, but we will have lots of company because everyone is doing it.

Among numerous elephants in the room, trampling everything in sight and leaving behind giant, steaming piles of shit, the one that galls me the most is the time, effort, expense, and lives we Americans sacrifice to the Dept. of Defense, Dept. of Homeland Security, and various other government agencies. The gargantuan corporate-military-industrial complex they have grown into over the past 65 years diverts our attention away from other honorable and worthwhile endeavors we might undertake if we weren’t instead so consumed with blowing people up and taking their stuff while playing bully-as-victim. I’m a little too young to have been scarred the way many of my elders were, ducking, covering, and cowering under schoolroom desks, so I never formed a worldview based on bogeymen. Yet that is the prevailing view, and we currently have the capacity to interfere and cause mischief globally. Impunity for doing so cannot be expected to last. Indeed, many of the current crop of clown presidential candidates see use of force to redistribute their (furriners) resources to us (Murricans) as the best option as eroding wealth and increasing scarcity make it harder to maintain the vaunted American way of life. Blowhard candidate Donald Trump is probably most honest about it, promising that as president he would basically forgo diplomacy in favor of smash-and-grab escalation. Pretty fucking scary, if you ask me.

One of my favorite films is The Hunt for Red October, a taut thriller balancing on the edge of nuclear Armageddon. That clever analysts might assess situations for what they truly are and steer geopolitics away from unnecessary bombing (and concomitant self-annihilation) is especially appealing to me. However, if those people exist beyond fiction, they are below my radar. Instead, in the marketplace of ideas, we have unsubtle thinkers committed to the same useless conventions (bombing didn’t work? then we need more bombing!) as Robert McNamara famously finally(!) recognized and admitted to late in life and as described in the documentary film The Fog of War. Yet as much as unconventional thinking is admired (some bloggers have made themselves into clichés with their predictable topsy-turvy argumentation), operationally, we’re stuck with Cold War strategizing, not least because minor powers threaten to become irrational, world-ending demons should any acquire a nuclear bomb. Current negotiations with Iran to limit its nuclear ambitions are of just that sort, and America never fails to rise to the bait. However, as attractive as nuclear capability must seem to those not yet in the club, weaponized versions offer little or no practical utility, even as deterrents, in an age of mutually assured destruction (a MAD world, quite literally) should that genie be let back out of the bottle. Any analyst can recognize that.

One striking act of unconventional thinking is Pres. Obama’s recent step toward ending the U.S. embargo of Cuba. Thus far, economic sanctions are still in place, and travel restrictions have been relaxed only in the case of missionary or educational work. Still, even minor revisions to this Cold War relic suggest further changes may be in store. I’m of mixed opinion about it; I expect Cuba to be ruined if overrun by American tourists and capital. It would be a different kind of bomb exploded on foreign soil but no less destructive.

Lastly, Greece is the current trial balloon (one that bursts) for exit from the European Union and its currency. The trope about the historical seat of modern democracy being the first to fail is a red herring; pay it no attention. We’re all failing in this best of all possible worlds. Thus far, events have been relatively orderly, at least so far as media reports portray them. Who can know just how disruptive, violent, and ghastly things will get when the gears of industrial machinery seize up and stop providing everything we have come to expect as normal? Some countries are better equipped psychologically to handle such setbacks. Least among them is the U.S. Having just passed Bastille Day on the calendar, I was reminded that it has been many generations since the U.S. has seen blood flowing in the streets (not counting a spate of massacres and police murders of civilians, which show no signs of abating), but considering how we are armed to the teeth and have the impulse control of a typical three-year-old, going positively apeshit is pretty much guaranteed when, say, food supplies dwindle. I’m hardly alone in saying such things, and it seems equally obvious that over the past decade or more, the federal government has been not-so-quietly preparing for that eventuality. How the mob is managed will be ugly, and one has to pause and wonder how far things will go before a complete crack-up occurs.

The English language has words for everything, and whenever something new comes along, we coin a new word. The latest neologism I heard is bolthole, which refers to the location one bolts to when collapse and civil unrest reach intolerable proportions. At present, New Zealand is reputed to be the location of boltholes purchased and kept by the ultrarich; it has the advantage of being located in the Southern Hemisphere, remote from the hoi polloi yet reachable by private plane or oceangoing yacht. Actually, bolthole is an older term now being repurposed, but it seems hip and current enough to be new coin.

Banned words are the inverse of neologisms, not in the normal sense that they simply fall out of use but in their use being actively discouraged. Every kid learns this early on when a parent or older sibling slips and lets an “adult” word pass his or her lips that the kid isn’t (yet) allowed to use. (“Mom, you said fuck!”) George Carlin made a whole routine out of dirty words (formerly) banned from TV. Standards have been liberalized since the 1970s, and now people routinely swear or refer to genitalia on TV and in public. Sit in a restaurant or ride public transportation (as I do), eavesdrop a little on the speech within easy earshot (especially private cellphone conversations), and just count the casual F-bombs.

The worst field of banned-words nonsense is political correctness, which is intertwined with identity politics. All the slurs and epithets directed at, say, racial groups ought to be disused, no doubt, but we overcompensate by renaming everyone (“____-American”) to avoid terms that have little or no derogation. Even more ridiculous, at least one egregiously insulting term has been reclaimed as a badge of honor (an unbanned banned word) by the very group it oppresses. It takes Orwellian doublethink to hear that term — you all know what it is — used legitimately and exclusively by those allowed to use it. (I find it wholly bizarre yet fear to wade in with my own prescriptions.) Self-disparaging language, typically in a comedic context, gets an unwholesome pass, but only if one is within the identity group. (Women disparage women, gays trade on gay stereotypes, Jews indulge in jokey anti-Semitism, etc.) We all laugh and accept it as safe, harmless, and normal. President Obama is continually mixed up in controversies over appearances (“optics”), or what to call things — or not call them, as the case may be. For instance, his apparent refusal to call terrorism originating in the Middle East “Muslim terrorism” has been met with controversy.

I’m all for calling a thing what it is, but the term terrorism is too loosely applied to any violent act committed against (gasp!) innocent Americans. Recent events in Charleston, SC, garnered the terrorism label, though other terms would be more apt. Further, there is nothing intrinsically Muslim about violence and terrorism. Yeah, sure, Muslims have a word or doctrine — jihad — but it doesn’t mean what most think or are led to believe it means. Every religion across human history has some convenient justification for the use of force, mayhem, and nastiness to promulgate its agenda. Sometimes it’s softer and inviting, other times harder and more militant. Unlike Bill Maher, however, circumspect thinkers recognize that violence used to advance an agenda, like words used to shape narratives, is not the province of any particular hateful or hate-filled group. Literally everyone does it to some extent. Indeed, the passion with which anyone pursues an agenda is paradoxically celebrated and reviled depending on content and context, and it’s a long, slow, ugly process of sorting to arrive at some sort of Rightthink®, which then becomes conventional wisdom before crossing over into political correctness.

If I were to get twisted and strained over every example of idiocy on parade, I’d be permanently distorted. Still, a few issues have crossed my path that might be worth bringing forward.

Fealty to the Flag

An Illinois teacher disrespected the American flag during a classroom lesson on free speech. Context provided in this article is pretty slim, but it would seem to me that a lesson on free speech might be precisely the opportunity to demonstrate that tolerance of discomfiting counter-opinion is preferable to the alternative: squelching it. Yet in response to complaints, the local school board voted unanimously to fire the teacher of the offending lesson. The ACLU ought to have a field day with this one, though I must admit there is no convincing some people that desecrating the flag is protected free speech. Some will remember going round and round on this issue a few years ago over a proposed Constitutional amendment. Patriots stupidly insist on carving out an exception to free speech protections when it comes to the American flag, which shows quite clearly that they are immune to the concept behind the 1st Amendment, which says this:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances. [emphasis added]

Naturally, interpretations of the Bill of Rights vary widely, but it doesn’t take a Constitution scholar to parse the absolute character of these rights. Rights are trampled all the time, of course, as the fired Illinois teacher just found out.

Fealty to the Wrong Flag

The Confederate battle flag has come back into the national spotlight following racially inspired events in Charleston, SC. (Was it ever merely a quaint, anachronistic, cultural artifact of the American South?) CNN has a useful article separating fact from fiction, yet some Southerners steadfastly defend the flag. As a private issue of astonishingly poor taste, idiocy, and free speech, individuals should be allowed to say what they want and fly their flags at will, but as a public issue for states and/or institutions that still fly the flag or emblazon it on websites, letterhead, etc., it’s undoubtedly better to give up this symbol and move on.

I get exasperated when I read someone insisting dogmatically upon ideological purity. No such purity exists, as we are all participants, in varying degrees, in the characteristics of global civilization. One of those characteristics is the thermodynamic cycle of energy use and consumption that gradually depletes available energy. The Second Law guarantees depletion, typically over cosmological time frames, but we are seeing it manifest over human history as EROI (energy returned on energy invested) has decreased dramatically since the start of the fossil fuel era. So playing gotcha by arguing, for instance, “You use electricity, too, right? Therefore, you have no right to tell me what I can and can’t do with electricity!” is laughably childish. Or put another way, if even an inkling of agreement exists that maybe we should conserve, forgo needless waste, and accept some discomfort and hardship, then it’s typically “you first” whenever the issue is raised in the public sphere.
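For readers unfamiliar with the metric, here is a minimal sketch of what a falling EROI means in practice. The ratios below are rough ballpark figures I am assuming purely for illustration (of the kind often cited for oil), not measurements from any particular study.

```python
# EROI = usable energy delivered / energy expended to obtain it.
# The figures below are illustrative assumptions, included only to show
# the direction of the trend described above.
def eroi(energy_delivered, energy_invested):
    return energy_delivered / energy_invested

illustrative_sources = {
    "early 20th-century conventional oil": (100.0, 1.0),
    "conventional oil today":              (20.0, 1.0),
    "tight oil / tar sands":               (5.0, 1.0),
}

for source, (out_e, in_e) in illustrative_sources.items():
    print(f"{source}: EROI ~ {eroi(out_e, in_e):.0f}:1")
# As the ratio falls toward 1:1, an ever-larger share of gross production goes
# back into producing energy, leaving less surplus for everything else.
```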

In a risible article published at Townhall.com, Michelle Malkin calls the Pope a hypocrite for having added his authority to what scientists and environmentalists have been saying: we face civilization-ending dangers from having fouled our own nest, or “our common home” as the Pope calls it. As though that disrespect were not yet enough, Malkin also tells the Pope essentially to shut it:

If the pontiff truly believes “excessive consumption” of modern conveniences is causing evil “climate change,” will he be shutting down and returning the multi-million-dollar system Carrier generously gifted to the Vatican Museums?

If not, I suggest, with all due respect, that Pope Francis do humanity a favor and refrain from blowing any more hot air unless he’s willing to stew in his own.

The disclaimer “with all due respect” does nothing to ease the audacity of a notorious ideologue columnist picking a fight over bogus principles with the leader of the world’s largest church, who (I might add) is slowly regaining some of the respect the Catholic Church lost over the past few scandalous decades. I suspect Malkin is guilelessly earnest in the things she writes and found a handy opportunity to promote the techno-triumphalist book she researched and wrote for Mercury Ink (owned by Glenn Beck). However, I have no trouble ignoring her completely, since she clearly can’t think straight.

Plenty of other controversy followed in the wake of the latest papal encyclical, Laudato Si’. That’s to be expected, I suppose, but religious considerations and gotcha arguments aside, the Pope is well within the scope of his official concern to sound the alarm alongside the scientific community that was once synonymous with the Church before they separated. If indeed Pope Francis has concluded that we really are in the midst of both an environmental disaster and a mass extinction (again, more process than event), it’s a good thing that he’s bearing witness. Doomers like me believe it’s too little, too late, and that our fate is already sealed, but there will be lots of ministry needed when human die-offs get rolling. Don’t bother seeking any sort of grace from Michelle Malkin.

I’m not a serious cineaste, but I have offered a few reviews on The Spiral Staircase. There are many, many cineastes out there, though, and although cinema is now an old medium (roughly 100 years old), cineastes tend to be on the younger side of 35 years. Sure, lots of established film critics are decidedly older, typically acting under the aegis of major media outlets, but I’m thinking specifically of the cohort who use new, democratized media (e.g., cheap-to-produce and -distribute YouTube channels) to indulge in their predilections. For example, New Media Rockstars has a list of their top 100 YouTube channels (NMR No. 1 contains links to the rest). I have heard of almost none of them, since I don’t live online like so many born after the advent of the Information/Communications Age. The one I pay particular attention to is Screen Junkies (which includes Honest Trailers, the Screen Junkies Show, and Movie Fights), and I find their tastes run toward childhood enthusiasms that mire their criticism in a state of permanent adolescence and self-mocking geekdom. The preoccupation with cartoons, comic books, action figures, superheroes, and popcorn films couldn’t be more clear. Movie Fights presumes to award points on the passion, wit, and rhetoric of the fighters rather than the quality of the films they choose to defend. However, adjudication is rarely neutral, since trump cards tend to get played when a superior film or actor is cited against an inferior one.

So I happened to catch three recent flicks that are central to the Screen Junkies canon: Captain America: The Winter Soldier, Avengers: Age of Ultron, and Transformers: Age of Extinction (links unnecessary). They all qualify as CGI festivals — films centered on hyperkinetic action rather than story or character (opinions differ, naturally). The first two originate from the MCU (acronym alert: MCU = Marvel Cinematic Universe, which is lousy with comic book superheroes) and the last is based on a Saturday-morning children’s cartoon. Watching grown men and a few women on Screen Junkies getting overexcited about content originally aimed at children gives me pause, yet I watch them to see what fighters say, knowing full well that thoughtful remarks are infrequent.

Were I among the fighters (no chance, since I don’t have my own media fiefdom), I would likely be stumped when a question needs immediate recall (by number, as in M:I:3 for the third Mission Impossible film) of a specific entry from any of the numerous franchises, like those named above, that pump out films regularly. Similarly, my choices would not be so limited to films released after 1990 as theirs are, those years spanning the childhoods of most of the fighters who appear. Nor would my analysis be so embarrassingly visual in orientation, since I understand good cinema to be more about story and character than whiz-bang effects.

Despite the visual feast fanboys adore (what mindless fun!), lazy CGI festivals suffer worst from overkill, far outstripping the eye’s ability to absorb onscreen action fully or effectively. Why bother with repeat viewing of films with little payoff in the first place? CGI characters were interesting in and of themselves the first few times they appeared in movies without breaking the suspension of disbelief, but now they’re so commonplace that they feel like cheating. Worse, moviegoers are now faced with so many CGI crowds, clone and robot armies, zombie swarms, human-animal hybrids, et cetera ad nauseam, that little holds the interest of jaded viewers. Thus, because so few scenes resonate emotionally, sheer novelty substitutes (ineffectively) for meaning, not that most chases or slugfests in the movies offer much that is truly original. The complaint is heard all the time: we’ve seen it before.

Here’s my basic problem with the three CGI-laden franchise installments I saw recently: their overt hypermilitarism. When better storytellers such as Kubrick or Coppola make films depicting the horrors of war (or other existential threats, such as the ever-popular alien invasion), their perspective is indeed that war is horrible, and obvious moral and ethical dilemmas flow from there. When hack filmmakers pile up frenzied depictions of death and destruction, typically with secondary or tertiary characters whose dispatch means and feels like nothing, and with cities destroyed eliciting no emotional response because it’s pure visual titillation, they have no useful, responsible, or respectable commentary. Even the Screen Junkies recognize that, unlike, say, Game of Thrones, none of their putative superheroes really face much more than momentary distress before saving the day in the third act, and certainly no lasting injury (a little make-up blood doesn’t convince me). Dramatic tension simply drains away, since happy resolutions are never in doubt. Now, characters taking fake beatdowns are laughter-inducing, sorta like professional wrestling after the sheepish admission that they’ve been acting all along. Frankly, pretend drama with nothing at stake is a waste of effort and the audience’s time and trust. That so many fanboys enjoy being goosed or that some films make lots of money is no justification. The latter is one reason why cinema so often fails to rise to the aspiration of art: it’s too bound up in grubbing for money.

Earlier this month, I offered a comment at Text Patterns that observed (in part) the following:

History demonstrates pretty clearly that the focus of innovation and investment has shifted considerably over the past 150 years or so from (1) mechanical contraptions and processes to (2) large infrastructure projects to (3) space exploration and now to (4) computers, communications, and networking.

I would amend (3) to include nuclear technologies on top of space exploration. Thus, various categories of innovation attract attention and glamor, then fade away. I also suggested that correlating these orientations with styles of warfare might be an interesting blog post, but I won’t perform that analysis, leaving the idea available for others to develop.

With respect to (2), it seems to me that although large infrastructure projects enjoyed a period of preeminence from 1900 to 1950 or so, concluding roughly with the construction of the interstate highway system, and then yielded in the popular imagination to nuclear power (and nuclear angst) and numerous NASA projects, infrastructure projects have never really stopped despite many reports of decaying and decrepit infrastructure that reduce quality of life and economic competitiveness. The development of high-speed rail, standard in Europe and Japan, is a good example of infrastructure never developed in the U.S. because we have chosen instead to support automobiles and over-the-road trucking. Rosabeth Moss Kanter’s new book, Move: Putting America’s Infrastructure Back in the Lead, argues that we need to turn attention back to infrastructure posthaste.

If infrastructure projects lack the faux glamor and sexiness of more contemporary categories of innovation and investment, they have nonetheless been ongoing, though perhaps not diligently enough to avoid periodic bridge collapses, electrical grid failures, and train derailments. Surveying the scene in my hometown (Chicago), I can recall that in the last 15 years, major renovations have been done to all the interstates (they are under construction almost continuously yet remain choked with traffic), and two massive reconstruction projects have rebuilt different stretches of Upper and Lower Wacker Drive. The Chicago Transit Authority (CTA) has also had major capital projects to update its rail lines and stations, the newest phase being Red and Purple Line modernization, including a planned flyover for the Brown Line just north of the Clark Junction. This list of projects omits regular neighborhood street and intersection repairs, redesigns, and reconstruction, as well as water, sewer, and other utility repairs, all of which lack a significant wow factor.

One project that just opened, though it’s not yet quite complete, is converting the disused rail line known as the Bloomingdale Trail (now renamed The 606 after the stem of Chicago zip codes) into a walking/running/biking path and park system extending across several NW Chicago neighborhoods. Although the Bloomingdale Trail has been open less than a week, I ventured onto it and found that it has already been widely embraced by the public. Indeed, riding the Bloomingdale Trail on my bicycle was frustrating and tense because there was no possibility of going at a normal bike speed plus lots of maneuvering to avoid collisions with others on the path. If I were a runner or walker on the path instead, I would undoubtedly have been irritated by bicyclists zipping by me at close quarters. Although the path is designed for mixed use, it fails to satisfy (IMO) any of those intended uses because of sheer congestion. Instant popularity has made the trail a victim of its own success, not unlike the overused Lakeshore Bike Path and Forest Preserve picnic shelters. And therein lies the problem: if you build it, they will come — and they often come in droves that overwhelm and ruin the entire experience.

This seems to be an endemic feature of modern infrastructure. We build spaces and places to accommodate transit, transportation, and events of all sorts, but then they stagger under the masses of people and vehicles that descend upon them. I have had similarly poor experiences at events such as the Chase Corporate Challenge (a 5K foot race and walk in Grant Park) and the Late Ride (a 25-mile bike ride — not race — through the city occurring well after midnight), where the events are so oversubscribed that constant jostling for position on the pavement among the other participants ruins the experience. Difficult entry and egress from sporting events at Wrigley Field, Soldier Field, and the United Center have also been cause for consternation. The annual lakefront Air and Water Show, the Taste of Chicago, and July 4 fireworks displays (before the city cancelled its fireworks) are further examples of places I avoid because they’re simply too mobbed with people.

Some of the cause is population density, even though the City of Chicago proper had higher density in the first half of the 20th century, and some of the cause is overpopulation, with events drawing people from the surrounding suburbs and exurbs. Some of the cause, too, is the abandonment of an historically rural, agrarian way of life for modern, technological social structures organized around city centers. The resulting depopulation of the countryside, relatively speaking, is a curiosity to me, which I have prophesied may give way to repopulation (which I called repatriation) when agribusiness fails to deliver enough food to grocery shelves. But until then, we’re stuck congregating in mostly the same places.

I have read a number of exhortations by gurus of one sort or another, usually plumbing the depths of self-delusion, to “imagine the absurd” as a means of unlocking one’s latent creativity blocked by hewing too closely to convention and, dare I say it, reality. Invoking absurdity seems to me redundant: we (well, some) already live with absurd comfort and wealth, purchased by the sweat and suffering of many, not least of which is the Earth itself (or herself, if you prefer). Slights and insults absorbed by the biosphere in the last decade may be individually insignificant, but over time and in aggregate, they constitute a proverbial death by a thousand cuts. But you don’t have to take my word for it: investigate for yourself how many plant and animal species have suffered significant die-offs. Other absurdities are piling up, too, especially in the area of politics, which is looking more than ever (how is that even possible?) like a clown circus as, for example, more candidates from both major political parties enter the 2016 presidential race. We haven’t even gotten to loony third-party candidates yet.

These are familiar ideas to readers of this blog, and although they bear repeating, they are not really what I want to discuss. Rather, it has become increasingly clear that in an age of excess and entitlement — call it the Age of Absurdity — the awful truth can only be told through comedy, just like Michael Moore’s short-lived comic documentary TV show of the same name. Sure, there are a growing number of Cassandras like me prophesying doom, but our claim on public dialogue is thus far negligible. Further, serious documentaries exposing absurd levels of corruption, mendacity, idiocy, and cruelty are currently enjoying a golden age. But compare these to any number of TV shows and movies — all offered for comedic entertainment purposes — that are now functioning as de facto news outlets (The Daily Show and Real Time with Bill Maher have been particularly manifest about this), and it’s easy to see that the public prefers its truth sugar-coated, even if it’s given a reverse twist such as with The Colbert Report. (I can’t watch Colbert for the same reason I can’t watch South Park or The Simpsons: they’re too accurate, too on the nose, even as jokey reflections of or on the Age of Absurdity.) The only thing one needs to reveal truth inside a comedy show (not just the fake news shows) is to ignore the laugh track and turn off one’s sense of humor, treating each comedy bit earnestly, the way a child would. That’s how awful, accurate, and absurd things have become.

Take, for instance, this article in The New Yorker, which is satire on its face but quite literally tells the truth when considered soberly. The last line, “Our research is very preliminary, but it’s possible that they [denialists of all stripes] will become more receptive to facts once they are in an environment without food, water, or oxygen,” is pretty macabre but tells precisely the thing to be expected when supplies falter.

Take, for another instance, the celebrity roasts that Comedy Central has revived. I’ve watched only a few clips, but roasters typically say egregiously insulting things that are quite literally true about the roastee, who laughs and smiles through the humiliation. Insult comedy may perhaps be embellished or exaggerated for effect, but it scarcely needs to be. To call someone a hack or comment on his/her unwarranted overexposure (commonplace now in the era of omnimedia and leaked sex tapes) takes a little comedic shaping, but there is always a sizable kernel of truth behind the jokes. That’s what makes comedy funny, frankly. This might be best exemplified when a joke is made “too soon.” The subject matter will become funny in time, after the shocking truth has worn off some, but when it’s too soon, the insult is just too much to take in good taste and no enjoyment can be had from exposing that particular truth.

Is there a conclusion to be drawn? I dunno. The culture has room for both seriousness and humor, gallows and otherwise. I wonder sometimes whether acting with seriousness of purpose to forestall the worst is even possible anymore. Instead, we’re absorbed by distractions and cheap entertainments that divert our attention to trifles. (Why am I aware in even the slightest way of the Kardashians?) A true expression of the Zeitgeist perhaps, we know deep down that the dominoes are tipping over and we’re lined up to take hit after hit until everything has crumbled around us. So why not laugh and party right up to the bitter end?

I’m not quite yet done with the idea behind this post, namely, that certain insidious ideas permit problems that wiser thinking might avoid. If I were less judicious, I might say that lousy ideas generate many of our problems, but cause-and-effect and correlation links are too subtle to draw unambiguous conclusions. Along those lines, I’ve been puzzling the last few weeks over the Middle East, including (of course) Israel and North Africa. Everyone seems to have a pet theory about how to put an end to endless violence and mayhem in the region. Most theories call for (further) bombing, strategic or otherwise, of one faction or another. Clearly, that’s not really a solution, since wreaking even more havoc and violence solves nothing, and it’s equally obvious that no pat solution exists. The situation has become a multigenerational, multinational conflict that perpetuates itself, the original provocation(s) having been long forgotten or subsumed into more recent events. Such events include no small amount of meddling and destabilization by the United States and its allies, plus economic difficulties that have people in the streets agitating for a reasonable share of what’s available, which is diminishing rapidly as overpopulation and industrial collapse ramp up in the region.

Reasons why conflict arises are many, but let’s not lose sight of our response. Statesmen of an earlier era might have been predisposed toward diplomatic and economic responses. Indeed, foreign aid and restructuring plans such as those that followed WWII might be examples of a better way to deploy our resources now to achieve desirable results for everyone (here and there). So why do today’s government policy- and decision-makers with their fingers on the buttons — those holding the presumed monopoly on the use of force — now so frequently resort to bombing and decades-long armed response, entailing boots on the ground, air strikes from carriers positioned in the region, and now drone warfare? Destroying people, infrastructure, industrial capacity, and with them means of living peaceably does not make us safer at home, unless there is something they know that I don’t. Rather, considering the apparently unlimited availability of arms to various factions (in high contrast with, um, er, well, food and jobs), it seems obvious that we’re seeding revolution while radicalizing populations that might prefer to be left alone to work out their own problems, which frankly would probably involve armed conflict. So in effect, we’re painting the bullseye on our own backs (and have been for a long time as the self-appointed World Police with strategic interests extending quite literally across the globe), uniting disparate factions against a common enemy — us.

So let me ask again: what makes this possible? In an era of psychotic knowledge and managed perception (and to a far lesser extent, managed consent), many leaders have developed a bunker mentality, where everyone is a threat (even from within) and they (whoever they are, it hardly matters) are all always poised to come for us and take away our vaunted freedoms (rhetoric alert). Never mind that the reverse is actually more true. I’ve argued before that bunker mentality goes hand-in-hand with Cold War paranoia drummed into the heads of folks who were children in the 1950s and 60s. Too many duck-and-cover air raid drills during primary school left indelible marks on their souls. Younger G-men and -women are undoubtedly infected by the meme now, too, by frequent security briefings that make the world look far more dangerous (to us) than it actually is, not unlike so many police shows on TV that overstate by a large margin the frequency of, say, street shootouts. (Fatalities from automobile accidents and obesity far outstrip losses from terrorism and other existential threats. Go look it up.) Fruit of that propaganda is our current fight-or-flight response: always, always fight; never, ever take flight. The mouth-breathing public is on board with this, too, always ready to throw down with reckless, half-wit commentary such as “bomb them back to the Stone Age!” Yet a few noisy pundits are beginning to suggest that the U.S. transition back to a more isolationist policy, perhaps sitting out a conflict or two rather than engaging reflexively, thoughtlessly, and pointlessly. Isolationism was our stance prior to WWII, having learned in the American Civil War and WWI that warfare absolutely sucks and should be avoided instead of relished. Living memory of those conflagrations is now gone, and we’re left instead with bullshit jingoism about the Greatest Generation having won WWII, quietly skipping over the wars we lost or gave up on in Korea and Vietnam.

For a long time, people have tried to draw connections between TV and videogame violence and actual crime. The same is true of pornography and rape. No direct links have been demonstrated convincingly using the tools of psychometrics, much to the chagrin of crusaders and moralists everywhere. Yet the commonsense connection has never really been dispelled: if the culture is positively saturated with images of violence and sexuality (as it is), whether actual, fabricated, or fictional (for the purpose of dramatic license and entertainment), then why wouldn’t vulnerable thinkers’ attitudes be shaped by irrational fear and lust? That’s nearly everyone, considering how few can truly think for themselves, resisting the dominant paradigm. Imagery and rhetoric deployed against us throughout the mainstream media are undoubtedly hyperviolent and hypersexual, but we’re smarter as a people than to succumb to such lures and lies? Sorry, even without peer-reviewed studies to show direct causation, that just doesn’t pass the straight-face test.