Archive for the ‘Idle Nonsense’ Category

The phrase fight or flight is often invoked to describe an instinctual response to a threat to survival or wellbeing, especially physical attack. The response is typically accompanied by a rush of adrenaline that overwhelms the rational mind and renders preplanning moot. The phrase is among the most ubiquitous examples of a false binary: a limiting choice between two options. It’s false precisely because other options exist, and it’s further complicated by actual responses to threat arguably falling within more than one category. Other examples of false binaries include with/against us, Republican/Democrat, tradition/progress, and religious/secular. Some would include male/female, but that’s a can of worms these days, so I’d prefer to leave it alone. With respect to fight/flight, options might be better characterized as fight/flight/freeze/feign/fail, with acknowledgment that category boundaries are unclear. Let me characterize each in turn.

Fight. Aggressive types may default to fighting in response to provocation. With combat training, otherwise submissive types may become more confident and thus willing to fight. Of course, the level of threat and likelihood of success and/or survival figure into when one decides to engage, even with snap judgments. Some situations also admit no other response: gotta fight.

Flight. When available, evading direct confrontation may be preferable to risking bodily harm. High threat level often makes flight a better strategy than fighting, meaning flight is not always a mark of cowardice. Flight is sometimes moot, as well. For instance, humans can’t outrun bears (or wolves, or dogs, pick your predator), so if one retains one’s wits in the face of a bear charge, another response might be better, though reason may have already departed the scene.

Freeze. Freezing in place might be one of two (or more) things: paralysis in the face of threat or psychological denial of what’s happening. Both amount to something to the effect of “this can’t possibly be happening, so I won’t even respond.” An event so far outside of normal human experience, whether a fast-moving natural disaster (e.g., a tsunami) or the slow-moving disaster of ecocide perpetrated by humans, may fail to provoke any active response.

Feign. Some animals are known to fake death or bluff a stronger ability to fight than is true. Feigning death, or playing possum, might work in some instances, such as a mass shooting where perpetrators are trained on live targets. Bluffing at a charging bear might just intimidate it enough to turn its attentions elsewhere. Probably doesn’t work at all with reptiles.

Fail. If the threat is plainly insurmountable, especially with natural disasters and animal attacks, one response may be to simply succumb without resistance. Victims of near-drowning often report being overtaken with bliss in the moment of acceptance. During periods of war and genocide, I suspect that many victims also recognized that, in those immortal words, resistance is futile. Giving up may be a better way to face death than experiencing desperation until one’s dying breath.

Bullying is one example of threat most are forced to confront in childhood, and responses are frequently based on the physical size of the bully vs. the one being bullied. Also, the severity of bullying may not be so dire that only instinctive responses are available; one can strategize a bit. Similarly, with sexual assault, much in the news these days and typically committed by men against women (but not always — Catholic priest pederasts are the obvious counterexample), the response of a surprising number of women is to succumb rather than face what might be even worse outcomes. One can debate whether that is freezing, feigning, or failing. Doesn’t have to be only one.


Twice in the last month I stumbled across David Benatar, an anti-natalist philosopher, first in a podcast with Sam Harris and again in a profile of him in The New Yorker. Benatar is certainly an interesting fellow, and I suspect earnest in his beliefs and academic work, but I couldn’t avoid shrugging as he gets caught in the sort of logical traps that plague hyperintellectual folks. (Sam Harris is prone to the same problem.) The anti-natalist philosophy in a nutshell is finding, after tallying the pros and cons of living (sometimes understood as happiness or enjoyment versus suffering), that on balance, it would probably be better never to have lived. Benatar doesn’t apply the finding retroactively by suggesting folks end their lives sooner rather than later, but he does recommend that new life should not be brought into the world — an interdiction almost no parent would consider for more than a moment.

The idea that we are born against our will, never asked whether we wanted life in the first place, is an obvious conundrum but treated as a legitimate line of inquiry in Benatar’s philosophy. The kid who throws the taunt “I never asked to be born!” to a parent in the midst of an argument might score an emotional hit, but there is no logic to the assertion. Language is full of logic traps like this, such as “an infinity of infinities” (or multiverse), “what came before the beginning?” or “what happens after the end?” Most know to disregard the former, but entire religions are based on seeking the path to the (good) afterlife as if conjuring such a proposition manifests it in reality.

Fan Service

Posted: December 27, 2017 in Artistry, Cinema, Culture, Idle Nonsense, Media, Taste

Having just seen the latest installment of the supermegahit Star Wars franchise, my thinking drifted ineluctably to the issue of fan service. There is probably no greater example of the public claiming ownership of popular culture than with Star Wars, which has been a uniquely American phenomenon for 40 years and risen to the level of a new mythology. Never mind that it was invented out of whole cloth. (Some argue that the major religions are also invented, but that’s a different subject of debate.) Other invented, segmented mythologies include Rowling’s Harry Potter series (books before movies), Tolkien’s Lord of the Rings (books before movies), Martin’s Game of Thrones (books before TV show), and Wagner’s Ring of the Nibelung (operas). It’s little surprise (to me, at least) that the new American mythology stems from cinema rather than literature or music.

Given the general public’s deep knowledge of the Star Wars canon, it’s inevitable that some portion of each installment of the franchise must cite and rhyme recognizable plots, dialogue, and thematic elements, which is roughly analogous to one’s favorite band playing its hits rather than offering newly composed music at every concert. With James Bond (probably the first movie franchise, though book series written by Sir Arthur Conan Doyle and Agatha Christie long ago established the model for recurring characters), story elements were formalized rather early in its history and form the foundation of each later story. Some regard the so-called formula as a straitjacket, whereas others derive considerable enjoyment out of familiar elements. So, too, with Star Wars. The light sabers, the spaceships, the light and dark sides of the force, the plucky rebels, the storm troopers, the disfigured villains, and the reluctant hero all make their appearances and reappearances in different guises. What surprised me most about The Last Jedi is how frequently and skillfully fan service was handled, typically undercutting each bit to simultaneously satisfy and taunt viewers. Some indignant fanboys (and -girls) have actually petitioned to have The Last Jedi struck from the Star Wars canon for defying franchise conventions so flagrantly.

New media have enabled regular folks to indulge their pet theories of the Star Wars universe in public fora, and accordingly, no shortage of overexcited analysis exists regarding plots, family relationships, cat-and-mouse strategics, and of course, possible stories to be told in an ever-expanding cinematic universe promising new films with nauseating regularity for the foreseeable future, or at least so long as the intellectual property owners can wring giant profits out of the series. This is what cinematic storytelling has become: setting up a series and wringing every last bit of value out of it before leaving it fallow and untended for a decade or more and then rebooting the entire stinking mess. The familiar criticism is Hollywood Out of Ideas, which often rings true except when one considers that only a few basic narrative structures exist in the first place. All the different manifestations are merely variations upon familiar themes, another form of fan service.

As time wears on and I add years to this mostly ignored blog, I keep running across ideas expressed herein, sometimes long ago, recapitulated in remarks and comments elsewhere. Absolutely disparate people can develop the same ideas independently, so I’m not claiming that my ideas are stolen. Maybe I’m merely in touch with the Zeitgeist and express it here only then to see or hear it again someplace else. I can’t judge objectively.

The latest coincidence is the growing dread with which I wake up every day, wondering what fresh new hell awaits with the morning news. The times in which we live are both an extension of our received culture and yet unprecedented in their novelty. Not only are there many more people in existence than 100 years ago and thus radical opinions and events occurring with extraordinary frequency, the speed of transmission is also faster than in the past. Indeed, the rush to publication has many news organs reporting before any solid information is available. The first instance of blanket crisis coverage I remember was the Challenger Disaster in 1986. It’s unknown to me how quickly news of various U.S. political assassinations in the 1960s spread, but I suspect reporting took more time than today and imparted to those events gravity and composure. Today is more like a renewed Wild West where anything goes, which has been the preferred characterization of the Internet since its creation. We’ll see if the recent vote to remove Net Neutrality has the effect of restraining things. I suspect that particular move is more about a money grab (selling premium open access vs. basic limited access) than thought control, but I can only guess as to true motivations.

I happened to be traveling when the news broke of a mass shooting in Las Vegas. Happily, what news I got was delayed until actual news-gathering had already sorted basic fact from confabulation. Paradoxically, after the first wave of “what the hell just happened?” there formed a second wave of “here’s what happened,” and later a third wave of “what the hell really happened?” appeared as some rather creative interpretations were offered up for consideration. That third wave is by now quite familiar to everyone as the conspiracy wave, and surfing it feels inevitable because the second wave is often so starkly unbelievable. Various websites and shows such as MythBusters and Penn & Teller: Bullshit! (probably others, too) presume to settle debates. While I’m inclined to believe scientific and documentary evidence, mere argument often fails to convince me, which is troubling, to say the least.

Fending off all the mis- and disinformation, or separating signal from noise, is a full-time job if one is willing to undertake it. That used to be the mandate of the journalistic news media, at least in principle. Lots of failures on that account stack up throughout history. However, since we’re in the midst of a cultural phase dominated by competing claims to authority and the public’s retreat into ideation, the substitute worlds of extended and virtual reality become attractive alternatives to the fresh new hell we now face every morning. Tune in and check in might be what we think we’re doing, but more accurately, we tune out and check out of responsible engagement with the real world. That’s the domain of incessantly chipper morning TV shows. Moreover, we like to believe in the mythical stories we tell ourselves about ourselves, such as, for example, how privacy doesn’t matter, or that the U.S. is a free, democratic, liberal beacon of hope, or that economic value inheres in made-up currencies. It’s a battle for your attention and subscription in the marketplace of ideas. Caveat emptor.

What is more tantalizing and enticing than a secret? OK, probably sex appeal, but never mind that for now. Secrets confer status on the keeper and bring those on whom the secret is bestowed into an intimate (nonsexual, for you dirty thinkers) relationship with the secret sharer. I remember the sense of relief and quiet exhilaration when the Santa Claus story was finally admitted by my parents to be untrue. I had already ceased to really believe in it/him but wasn’t yet secure enough as a 6- or 7-year-old (or whenever it was) to assert it without my parents’ confirmation. And it was a secret I withheld from my younger siblings, perhaps my first instruction on when lying was acceptable, even looked upon approvingly. Similarly, I remember how it felt to be told about sex for the first time by older kids (now you can go there, you cretins) and thus realize that my parents (and everyone else’s) had done the dirty — multiple times even for families with more than one kid. I was the possessor of secret knowledge, and everyone figured out quickly that it was best to be discreet about it. It may have been the first open secret. Powerful stuff, as we were to learn later in our hormone-addled adolescence. In early adulthood, I also began to assert my atheism, which isn’t really a secret but still took time to root fully. From my mature perspective, others who believe in one sky-god or another look like the kids who at a tender age still believe in Santa Claus and the Easter Bunny. I don’t go out of my way to dispel anyone’s faith.

Even as adults, those of us who enjoy secret knowledge feel a bit of exhilaration. We know what goes on (a little or a lot) behind the scenes, behind the curtain, in the backrooms and dark places. It may also mean that we know how the proverbial sausage is made, which is far less special. National security clearance, operating at many levels of access, may be the most obvious example, or maybe it’s just being a bug on the wall in the dugout or locker room during a pro sports contest. Being within the circle of intimates is intoxicating, though the circumstances that get one into the circle may be rather mundane, and those on the outside may look oddly pathetic.

The psychology behind secret knowledge functions prominently with conspiracy theories. Whether the subject is political assassinations, Bigfoot or the Loch Ness Monster, the moon landings, Area 51 and alien abduction, chemtrails/contrails, or 9/11, one’s personal belief and pet theory inescapably confers special status, especially as unacknowledged or unaccepted truth. Often, as others seek to set the record straight, one digs in to defend cherished beliefs. It’s an elixir, a dangerous cycle that traps people in contrafactual cliques. So we have flat Earthers, birthers, 9/11 truthers, creationists, climate change deniers, etc. (I count myself in one of those groups, BTW. Figure it out for yourself.) The range of interpretations floated in the political realm with respect to the machinations of the two parties and the White House boggles my mind with possibilities. However, I’m squarely outside those circles and feel no compulsion to decide what I believe when someone asserts secret knowledge from inside the circle. I float comfortably above the fray. Similarly, with so much fake news pressing for my attention, I consciously hold quite a lot of it in abeyance until time sorts it out for me.

The scandal surrounding Harvey Weinstein and all the people he harassed, bullied, assaulted, molested, and raped has provided occasion for many who had dealings with him to revisit their experiences and wonder what might have been (or not been) had things gone differently, had they acted otherwise in response to his oafish predations. I judge it’s nearly impossible for those outside the Hollywood scene to understand fully the stakes involved (and thus the distorted psychology), but on the other hand, nearly everyone has experience with power imbalances that enable some to get away with exploiting and victimizing others. And because American culture responds to tragedies like a bunch of rubberneckers, the witch hunt has likely only just begun. There’s a better than average chance that, as with icebergs, the significantly larger portion of the problem lies hidden below the surface, as yet undisclosed. Clamor won’t alter much in the end; the dynamics are too ingrained. Still, expect accusations to fly all over the industry, including victim blaming. My strong suspicion is that some folks dodged (actively or passively) becoming victims and paid a price in terms of career success, whereas others fell prey or simply went along (and then stayed largely silent until now) and got some extra consideration out of it. Either way, it undermines one’s sense of self-worth, messing with one’s head for years afterwards. Sometimes there’s no escaping awful circumstance.

Life is messy, right? We all have episodes from our past that we wish we could undo. Hindsight makes the optimal path far more clear than in the moment. Fortunately, I have no crimes among my regrets, but with certain losses, I certainly wish I had known then what I know now (a logical fallacy). Strange that the news cycle has me revisiting my own critical turning points in sympathy with others undoubtedly doing the same.

As I generalize this thought process, I can’t help but to wonder as well what might have been had we not, say, (1) split the atom and immediately weaponized the technology, (2) succumbed to various Red Scares scattered around 20th- and 21st-century calendars but instead developed a progressive society worthy of the promise our institutions once embodied, (3) plunged forward out of avarice and shortsightedness by plundering the Earth, and (4) failed to reverse course once the logical conclusion to our aggregate effects on the biosphere was recognized. No utopia would have arisen had we dodged these bullets, of course, but the affairs of men would have been marginally improved, and we might even have survived the 21st century. Such thinking is purely hypothetical and invites a fatalist like me to wonder whether — given our frailty, weakness, and corruption (the human condition being akin to original sin) — we don’t already inhabit the best of all possible worlds.

Isn’t that a horrible thought? A world full of suffering and hardship, serial rapists and murderers, incompetent and venal political leaders, and torture and genocides is the best we can do? We can’t avoid our own worst tendencies? Over long spans of time, cataclysmic earthquakes, volcanic eruptions, superstorms, and meteor strikes already make life on Earth rather precarious, considering that over 99% of all species that once existed are now gone. On balance, we have some remarkable accomplishments, though often purchased with sizeable trade-offs (e.g., slave labor, patriarchal suppression). Still, into the dustbin of history is where we are headed rather sooner than later, having enjoyed only a brief moment in the sun.

Reading further into Anthony Giddens’ book The Consequences of Modernity, I got a fuller (though still incomplete) sense of what is meant by his terms disembedding mechanisms, expert systems, and symbolic tokens, all of which disrupt time and space as formerly understood in traditional societies that enjoyed the benefit of centuries of continuity. I’ve been aware of analyses regarding, for instance, the sociology of money and the widespread effects of the introduction and adoption of mechanical clocks and timepieces. While most understand these developments superficially as unalloyed progress, Giddens argues that they do in fact reorder our experience in the world away from an organic, immediate orientation toward an intellectualized adherence to distant, abstract, self-reinforcing (reflexive) mechanisms.

But those matters are not really what this blog post is about. Rather, this passage sparked my interest:

… when the claims of reason replaced those of tradition, they appeared to offer a sense of certitude greater than that provided by preexisting dogma. But this idea only appears persuasive so long as we do not see that the reflexivity of modernity actually subverts reason, at any rate where reason is understood as the gaining of certain knowledge … We are abroad in a world which is thoroughly constituted through reflexively applied knowledge, but where at the same time we can never be sure that any given element of that knowledge will not be revised. [p. 39]

Put another way, science and reason are axiomatically open to examination, challenge, and revision and often undergo disruptive change. That’s what is meant by Karl Popper’s phrase “all science rests upon shifting sand” and informs the central thesis of Thomas Kuhn’s well-known book The Structure of Scientific Revolutions. It’s not the narrow details that shift so much (hard sciences lead pretty reliably to applied engineering) as the overarching narrative, e.g., the story of the Earth, the cosmos, and ourselves as revealed through scientific inquiry and close examination. Historically, the absolute certainty of the medieval church, while not especially accurate in either details or narrative, yielded considerable authority to post-Enlightenment science and reason, which themselves continue to shift periodically.

Some of those paradigm shifts are so boggling and beyond the ken of the average thinker (including many college-educated folks) that our epistemology is now in crisis. Even the hard facts — like the age and shape of the Earth or its orbital relationship to other solar bodies — are hotly contested by some and blithely misunderstood by others. One doesn’t have to get bogged down in the vagaries of relativity, nuclear power and weapons, or quantum theory to lose the thread of what it means to live in the 21st century. Softer sciences such as psychology, anthropology, economics, and even history now deliver new discoveries and (re-)interpretations of facts so rapidly, like the dizzying pace of technological change, that philosophical systems are unmoored and struggling for legitimacy. For instance, earlier this year, a human fossil was found in Morocco that upended our previous knowledge of human evolution (redating the first appearance of biologically modern humans about 100,000 years earlier). More popularly, dieticians still disagree on what sorts of foods are healthy for most of us (though we can probably all agree that excess sugar is bad). Other recent developments include the misguided insistence among some neurobiologists and theorists that consciousness, free will, and the self do not exist (I’ll have a new post regarding that topic as time allows) and outright attacks on religion not just for being in error but for being the source of evil.

I have a hard time imagining other developments in 21st-century intellectual thought that would shake the foundations of our cosmology any more furiously than what we’re now experiencing. Even the dawning realization that we’ve essentially killed ourselves (with delayed effect) by gradually though consistently laying waste to our own habitat is more of an “oops” than the mind-blowing moment of waking up from The Matrix to discover the unreality of everything once believed. Of course, for fervent believers especially, the true facts (best as we can know them, since knowledge is forever provisional) are largely irrelevant in light of desire (what one wants to believe), and that’s true for people on both sides of the schism between church and science/reason.

As Shakespeare wrote in Hamlet, “There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy.” So it’s probably wrong to introduce a false dualism, though it has plenty of historical precedent. I’ll suggest instead that there are more facets and worldviews at play in the world than the two that have been warring in the West for the last 600 years.

Some phrases have a wide range of applicability, such as the book title “______ for Dummies.” The popular Netflix show Orange is the New Black is another, claiming cachet in criminality. Let me jump on the bandwagon and observe how Transgression is the New Chic. There are two aspects to how transgression has become the new “it” thing: committing a transgression and being transgressed. Seems these days everyone is positioning themselves along one axis or another, sometimes both.

Not many of us possess the ability to transgress others without consequences. To do so basically requires fuck-you money. Celebrity also helps. With those characteristics, however, one can get away with an awful lot of mischief and make oneself look pretty damn cool in the process (if one is impressed by such foolishness). At the top of the heap is our Bully-in-Chief, who is busy testing another man-child in a reckless exercise in brinkmanship that could easily blow up in our faces (and theirs, too, which would be criminal considering the mismatch of power — like a billionaire stealing from a fast food worker). Yet the impulse to puff up one’s chest and appear unwavering in resolve or whatever other silly justification enters the minds of status seekers is awfully strong. To rational minds, it looks like insanity. Sadly, the masses do not possess rational minds and so give the game credibility.

On the flip side, claiming victimization at imagined transgressions is another fantasy league populated by the emotionally needy. Snowflakes. Or the Strawberry Generation of people prone to spoil at the slightest whiff of life’s difficulties. The so-called microaggression and the demand for safe spaces are frequent power plays used by star players, where points are scored by cowing into submission administrators too timid to call bullshit on the charade. Berkeley administrators offering counseling for students “terrorized” by a speech delivered by Ben Shapiro (whether students actually attend is beside the point) is a good example. Feigning offense works when lying to oneself, too, so the master player gets double points for transgressing him- or herself. Well played. When the cycle of blaming and bullying will subside is anyone’s guess.

rant on/

As the next in an as-yet unnumbered series of Storms of the Century (I predict more than a dozen at least) is poised to strike nearly the entirety of the State of Florida, we know with confidence from prior experience, recent and not so recent, that any lessons we might take regarding human habitation situated along or near coastlines vulnerable to extreme weather events, now occurring with increasing frequency and vehemence, will remain intransigently unlearned. Instead, we’ll begin rebuilding on the very same sites as soon as construction labor and resources can be mustered and deployed. Happened in New Orleans and New Jersey; is about to happen in Houston; and will certainly happen all across Florida — even the fragile Florida Keys. I mean, shit, we can’t do without The Magic Kingdom and other attractions in the central-Florida tourist mecca, now can we?

This predictable spin around the dance floor might look like a tragicomic circus waltz (e.g., The Daring Young Man on the Flying Trapeze), or even out-of-tune, lopsided calliope music from the carousel, except that positioning ourselves right back in harm’s way would be better characterized as a danse macabre. I dub it the Builder’s Waltz, which could also be the Rebuilder’s Rumba, the Catastrophe Tango, the Demolition Jive … take your pick.

Obstinate refusal to apprehend reality as it slams into us is celebrated as virtue these days. Can’t lose hope even as dark forces coalesce all around us, right? Was it always so? Still, an inkling might be dawning on some addle-brained deniers that perhaps science-informed global warming and climate change news might actually be about something with real-world impact, such as dramatic reduction of oil refinery output or a lost citrus crop. So much for illusions of business as usual continuing unhindered into the foreseeable future. Instead, our future looks more like dominoes lined up to fall — like the line of hurricanes formed in the Atlantic. Good luck hunkering down and weathering once-in-a-lifetime storms that just keep coming. And rebuilding the same things in the same places, well, just let it go, man, ’cuz it’s already gone.

rant off/

Having been asked to contribute to a new group blog (Brains Unite), this is my first entry, which is cross-posted here at The Spiral Staircase. The subject of this post is the future of transportation. I’ve no expertise in this area, so treat this writing with the authority it deserves, which is to say, very little.

Any prediction of what awaits us must inevitably consider what has preceded us. Britain and the United States were both in the vanguard during the 19th and early 20th centuries when it came to innovation, and this is no less true of transportation than of any other good or service. I’m not thinking of the routine travels one makes in the course of a day (e.g., work, church, school) but rather long excursions outside one’s normal range, a radius that has expanded considerably since then. (This holds true for freight transportation, too, but I am dropping that side of the issue in favor of passenger transit.) What is interesting is that travel tended to be communal, something we today call mass transit. For example, the Conestoga wagon, the stagecoach, the riverboat, and the rail car are each iconic of the 19th-century American West.

Passenger rail continued into the mid-20th century but was gradually replaced (in the U.S.) by individual conveyance as the automobile became economically available to the masses. Air travel commenced about the same time, having transitioned fairly quickly from 1 or 2 seats in an exposed cockpit to sealed fuselages capable of transporting 30+ people (now several hundred) at once. Still, as romantic as air travel may once have been (it’s lost considerable luster since deregulation as airlines now treat passengers more like freight), nothing beats the freedom and adventure of striking out on the road in one’s car to explore the continent, whether alone or accompanied by others.

The current character of transportation is a mixture of individual and mass transit, but without consulting numbers at the U.S. Dept. of Transportation, I daresay that the automobile is the primary means of travel for most Americans, especially those forced into cars by meager mass transit options. My prediction is that the future of transportation will be a gradual return to mass transit for two reasons: 1) the individual vehicle will become too costly to own and operate and 2) the sheer number of people moving from place to place will necessitate large transports.

While techno-utopians continue to conjure new, exotic (unfeasible) modes of transportation (e.g., the Hyperloop, which will purportedly enable passengers to make a 100-mile trip in about 12 minutes), they are typically aimed at individual transport and are extremely vulnerable to catastrophic failure (like the Space Shuttles) precisely because they must maintain human environments in difficult spaces (low orbit, underwater, inside depressurized tubes, etc.). They are also aimed at circumventing the congestion of conventional ground transportation (a victim of its own success now that highways in many cities resemble parking lots) and shortening transit times, though the extraordinary costs of such systems far exceed the benefit of time saved.

Furthermore, as climate change ramps up, we will witness a diaspora from regions inundated by rising waters, typically along the coasts where some 80% of human population resides (so I’ve read, can’t remember where). Mass migration out of MENA is already underway and will be joined by large population flows out of, for example, the Indian subcontinent, the Indonesian archipelago, and dustbowls that form in the interiors of continents. Accordingly, the future of transportation may well be the past:

photo: National Geographic


photo: UN Refugee Agency