Posts Tagged ‘Philosophy’

Watched Everything Everywhere All at Once (DVD version) at home on my TV, which is where I see most films these days. Very few inspire me to trek to the theater anymore to overpay for seats and popcorn. Was pleased to enjoy this film quite a bit — at least before turning an analytical eye toward it. Let me provide a fun, glossy assessment before getting bogged down in troublesome detail.

The film introduces and trades heavily on characters from a supposed multiverse (a multitude of parallel universes branching indiscriminately from arbitrary decision points into an infinity of possibilities) “verse-jumping” into our universe to fix and repair damage done in one or more of the others. As plot devices go, this one is now quite commonplace and always (perhaps inevitably, given our preoccupation with ourselves) positions our universe (the only one we know until someone from outside intrudes) at the center of the others and as the linchpin in some grand plan to save the space-time continuum. It’s a worn trope yet allows storytellers immense freedom to conjure anything imaginable. Everything depicts disorienting alternative universes quite well, most of them (for no particular reason beyond having fun, I surmise) absurd variations of the familiar. Indeed, unlike most films where I sit in stone silence no matter what is presented, this one generated laugh-out-loud moments and gestures across the couch to the effect of “did you see that?” In short, the film produced reflexive responses (it goosed me), which is quite unusual considering how most films, despite lots of overwrought action and drama, fail to register more than a checkbox “yup, got it.”

Actors portraying the three or four main characters do well in their respective jobs, playing several versions of themselves from different universes with diverse experiences. Most of the film is chase-and-evade, devolving at times into a familiar martial-arts punchfest that has frankly lost all possibility of making an impact in the era of overpowered, invulnerable superheroes and magical unpredictability. Why filmmakers believe audiences want to see more of this drivel is beyond me, but I guess the animal curiosity to find out which make-believe character will prevail in a battle royale never gets old with mouth-breathers. I’m quite over it. The central conflict, however, wasn’t about the strongest punch. Rather, it was about persisting in the face of revealed meaninglessness a/k/a nihilism.

So here’s where hindsight analysis kinda ruined things for me. Although I recognize storytelling as elemental to modern cognition and consciousness, I don’t regard most narrative forms as art. Cinema, because of its financial interests and collaborative nature, rarely rises to the level of art. There are simply too many diverse elements that must be assembled under a unified aesthetic vision for that to occur often. Cinema is thus more entertainment than art, just like sports and games are entertainment, not art. Impressive skill may be demonstrated, which often produces enjoyable results, but I don’t conflate skill or mere craft with artistry. (I also tire of everything that provides moral and epistemological orientation being conflated with religion). So when films introduce super-serious subjects that really trouble me (e.g., overpopulation, institutional corruption, the climate emergency) but treat them lightly, I’m bothered. Everything does that with philosophy.

Coming to grips with nihilism and the absurdity of existence is the central feature of more than one 20th-century philosophy (and their variants). Downstream (or parallel?) are artistic genres that also express the idea, though in far less overt terms. One can easily get lost down a hole, seeking the bottom (alternatively, the root of things) but finding only the abyss. For that very reason, I have acquaintance with philosophical themes but have not truly sunk into them deeply. Nihilism is not something to mess with, even as a thought experiment or intellectual inquiry — especially if one is inclined to connect strongly with those same things. In Everything, the nihilist conclusion (i.e., that nothing matters) manifests absurdly as a giant, black, everything bagel that can literally suck a person into its hole. Well and good enough; probably best not to overexplain that McGuffin. But it demands a conclusion or resolution, which comes in the form of the mother rescuing the daughter. Ironically, it was the mother (from an alternative universe) who had introduced the daughter (also an alternative) to verse-jumping; the daughter then got lost down the hole and threatened to collapse the multiverse into the everything bagel in a final gesture of despair. In effect, the mother had tinkered with powers well beyond her control, unwittingly created the daughter-monster with out-of-control feelings and unexpected powers, and had to clean up her own mess. How does she (the mother) do it? Through the power of love.

OK, fine. Love (especially unconditional love, as opposed to romantic or familial love) is a universal salve capable of healing all wounds. Except that it’s not. When the film finally depicts the rescue, saving the daughter and multiverse from destruction, it comes across as flat, obvious, and ineffectual (to me at least) and breaks the tone and pacing of the film. Lots of films resort to the power of love to save the day (typically just before the stroke of midnight), but they usually (not always) have better set-ups, which is to say, their film universes cohere and deliver cogent conclusions rather than waving a magic love-wand over everything to solve and resolve. The writers of this film are adept at the enjoyable absurd parts that launch and propel the story but could not stick the landing. Introducing (albeit comically) doomsday philosophy but then failing to treat it seriously enough left me deeply conflicted and dissatisfied. Perhaps it’s a case where my suspension of disbelief was not complete enough. Or maybe I brought too much into the film from outside, but we all have inescapable frames of reference. I wasn’t exactly triggered, merely frustrated. YMMV

Search the tag Counter-Enlightenment at the footer of this blog to find roughly ten disparate blog posts, all circling around the idea that intellectual history, despite all the obvious goodies trucked in with science and technology, is turning decidedly away from long-established Enlightenment values. A fair number of resources are available online and in book form exploring various movements against the Enlightenment over the past few centuries, none of which I have consulted. Instead, I picked up Daniel Schwindt’s The Case Against the Modern World: A Crash Course in Traditionalist Thought (2016), which was gifted to me. The book was otherwise unlikely to attract my attention considering that Schwindt takes Catholicism as a starting point whereas I’m an avowed atheist, though with no particular desire to proselytize or attempt to convince others of anything. However, The Case Against is suffused with curious ideas, so it is a good subject for a new book blogging project, which in characteristic fashion (for me) will likely proceed in fits and starts.

Two interrelated ideas Schwindt puts forward early in the book fit with multiple themes of this blog, namely, (1) the discovery and/or development of the self (I refer more regularly to consciousness) and (2) the reductive compartmentalization of thought and behavior. Let’s take them in order. Here’s a capsule of the first issue:

(more…)

Following up the two previous entries in this series, the Feb. 2022 issue of Scientific American has a cover article by Adam Becker called “The Origins of Space and Time” with the additional teaser “Does spacetime emerge from a more fundamental reality?” (Oddly, the online title is different, making it awkward to find.) I don’t normally read Scientific American, which has become a bit like Time and Newsweek in its blurb-laden, graphics-heavy presentation intended patronizingly for general-interest readers. In fact, I’ll quote the pullout (w/o graphics) that summarizes the article:

How Spacetime Emerges. Space and time are traditionally thought of as the backdrop to the universe. But new research suggests they might not be fundamental; instead, spacetime could be an emergent property of a more basic reality, the true backdrop to the cosmos. This idea comes from two theories that attempt to bridge the divide between general relativity and quantum mechanics. The first, string theory, recasts subatomic particles as tiny loops of vibrating string. The second, loop quantum gravity, envisions spacetime being broken down into chunks — discrete bits that combine to create a seemingly smooth continuum.

Being a layperson in such matters, I’ll admit openly that I don’t fully grasp the information presented. Indeed, every breathless announcement from CERN (or elsewhere) about a new subatomic particle discovery or some research group’s new conjectures into quantum this-or-that I typically greet passively at best. Were I a physicist or cosmologist, my interest would no doubt be more acute, but these topics are so far removed from everyday life they essentially become arcane inquiries into the number of angels dancing on the head of a pin. I don’t feel strongly enough to muster denunciation, but discussion of another aspect of pocket reality is worth some effort.

My understanding is that the “more basic reality, the true backdrop” discussed in the article is multidimensionality, something Eric Weinstein has also been grappling with under the name Geometric Unity. (Bizarrely, Alex Jones has also raved about interdimensional beings.) If the universe indeed has several more undetectable dimensions (if memory serves, Weinstein says as many as 14) and human reality is limited to only a few, potentially breaking through to other dimensions and/or escaping boundaries of a mere four is tantalizing yet terrifying. Science fiction often explores these topics, usually in the context of space travel and human colonization of the galaxy. As thought experiments, fictional stories can be authentically entertaining and enjoyable. Within nonfiction reality, desire to escape off-world or into extra- or interdimensionality is an expression of desperation considering just how badly humans have fucked up the biosphere and guaranteed an early extinction for most species (including ours). I also chafe at the notion that this world, this reality, is not enough and that pressing forward like some unstoppable chemical reaction or biological infiltration is the obvious next step.

Ask parents what ambitions they harbor for their child or children and among the most patterned responses is “I just want them to be happy.” I find such an answer thoughtless and disingenuous, and the insertion of the hedge just to make happiness sound like a small ask is a red herring. To begin with, for most kids still in their first decade, happiness and playfulness are relatively effortless and natural so long as a secure, loving environment is provided. Certainly not a default setting, but it’s still quite commonplace. As the dreamy style of childhood cognition is gradually supplanted by supposedly more logical, rational, adult thinking, and as children become acquainted with iniquities of both history and contemporary life, innocence and optimism become impossible to retain. Cue the sullen teenager confronting the yawning chasm between desire and reality. Indeed, few people seem to make the transition into adulthood knowing with much clarity how to be happy in the midst of widespread travail and suffering. Instead, young adults frequently substitute self-destructive, nihilistic hedonism, something learned primarily (says me) from the posturing of movie characters and the celebrities who portray them. (Never understood the trope of criminals hanging at nightclubs, surrounded by drug addicts, nymphos, other unsavory types, and truly awful music, where they can indulge their assholery before everything inevitably goes sideways.)

Many philosophies recommend simplicity, naturalness, and independence as paths to happiness and moral rectitude. Transcendentalism was one such response to social and political complexities that spoil and/or corrupt. Yet two centuries on, the world has only gotten more and more complex, pressing on everyone, especially for information processing in volume and sophistication that does not at all come naturally to most and is arguably not part of our evolutionary toolkit. Multiple social issues, if one is to engage them fairly, hinge on legalistic arguments and bewildering wordplay that render them fundamentally intractable. Accordingly, many wave away all nuance and adopt pro forma attitudes. Yet the airwaves, social media, the Internet, and even dinner conversations are suffused by the worst sorts of hypercomplexity and casuistry that confound even those who traffic regularly in such rhetoric. It’s a very long way from “I just want to be happy.”

(more…)

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but to human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently, albeit temporarily, live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coin. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are ineffective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

Among those building fame and influence via podcasting on YouTube is Michael Malice. Malice is a journalist (for what organization?) and the author of several books, so he has better preparation and content than many who (like me) offer only loose opinion. His latest book (no link) is The Anarchist Handbook (2021), which appears to be a collection of essays (written by others, curated by Malice) arguing in theoretical support of anarchism (not to be confused with chaos). I say theoretical because, as a hypersocial species of animal, humans never live in significant numbers without forming tribes and societies for the mutual benefit of their members. Malice has been making the rounds discussing his book and is undoubtedly an interesting fellow with well-rehearsed arguments. Relatedly, he argues in favor of objectivism, the philosophy of Ayn Rand that has been roundly criticized and dismissed yet continues to be attractive especially to purportedly self-made men and women (especially duped celebrities) of significant wealth and achievement.

Thus far in life, I’ve disdained reading Rand or getting too well acquainted with arguments in favor of anarchism and/or objectivism. As an armchair social critic, my bias skews toward understanding how things work (i.e., Nassim Taleb’s power laws) in actuality rather than in some crackpot theory. As I understand it, the basic argument put forward to support any variety of radical individualism is that everyone working in his or her own rational self-interest, unencumbered by the mores and restrictions of polite society, leads to the greatest (potential?) happiness and prosperity. Self-interest is not equivalent to selfishness, but even if it were, the theorized result would still be better than any alternative. A similar argument is made with respect to economics, known as the invisible hand. In both, hidden forces (often digital or natural algorithms), left alone to perform their work, enhance conditions over time. Natural selection is one such hidden force now better understood as a component of evolutionary theory. (The term theory when used in connection with evolution is an anachronism and misnomer, as the former theory has been scientifically substantiated as a power law.) One could argue as well that human society is a self-organizing entity (disastrously so upon even casual inspection) and that, because of the structure of modernity, we are all situated within a thoroughly social context. Accordingly, the notion that one can or should go it alone is a delusion because it’s flatly impossible to escape the social surround, even in aboriginal cultures, unless one is totally isolated from other humans in what little remains of the wilderness. Of course, those few hardy individuals who retreat into isolation typically bring with them the skills, training, tools, and artifacts of society. A better example might be feral children, lost in the wilderness at an early age and deprived of human society but often taken in by a nonhuman animal (and thus socialized differently).

My preferred metaphor when someone insists on total freedom and isolation away from the madding crowd is traffic — usually automobile traffic but foot traffic as well. Both are examples of aggregate flow arising out of individual activity, like drops of rain forming into streams, rivers, and floods. When stuck in automobile congestion or jostling for position in foot traffic, it’s worthwhile to remember that you are the traffic, a useful example of synecdoche. Those who buck the flow, cut the line, or drive along the shoulder — often just to be stuck again a little farther ahead — are essentially practicing anarchists or me-firsters, whom the rest of us simply regard as assholes. Cultures differ with respect to the orderliness of queuing, but even in those places where flow is irregular and unpredictable, a high level of coordination (lost on many American drivers who can’t figger a roundabout a/k/a traffic circle) is nonetheless evident.

As I understand it, Malice equates cooperation with tyranny because people defer to competence, which leads to hierarchy, which results in power differentials, which transforms into tyranny (petty or profound). (Sorry, can’t locate the precise formulation.) Obvious benefits (e.g., self-preservation) arising out of mutual coordination (aggregation) such as in traffic flows are obfuscated by theory distilled into nicely constructed quotes. Here’s the interesting thing: Malice has lived in Brooklyn most of his life and doesn’t know how to drive! Negotiating foot traffic has a far lower threshold for serious harm than driving. He reports that relocation to Austin, TX, is imminent, and with it, the purchase of a vehicle. My suspicion is that to stay out of harm’s way, Malice will learn quickly to obey tyrannical traffic laws, cooperate with other drivers, and perhaps even resent the growing number of dangerous assholes disrupting orderly flow like the rest of us — at least until he develops enough skill and confidence to become one of those assholes. The lesson not yet learned from Malice’s overactive theoretical perspective is that in a crowded, potentially dangerous world, others must be taken into account. Repetition of this kindergarten lesson throughout human activity may not be the most pleasant thing for bullies and assholes to accept, but refusing to do so makes one a sociopath.

For more than a decade, I’ve had in the back of my mind a blog post called “The Power of Naming” to remark that bestowing a name gives something power, substance, and in a sense, reality. That post never really came together, but its inverse did. Anyway, here’s a renewed attempt.

The period of language acquisition in early childhood is suffused with learning the names of things, most of which is passive. Names of animals (associated closely with sounds they make) are often a special focus using picture books. The kitty, doggie, and horsie eventually become the cat, dog, and horse. Similarly, the moo-cow and the tweety-bird shorten to cow and bird (though songbird may be an acceptable holdover). Words in the abstract are signifiers of the actual things, aided by the text symbols learned in literate cultures to reinforce mere categories instead of examples grounded in reality. Multiply the names of things several hundred thousand times into adulthood and indeed throughout life and one can develop a formidable vocabulary supporting expressive and nuanced thought and speech. Do you know the differences between acute, right, obtuse, straight, and reflex angles? Does it matter? Does your knowledge of barware inform when to use a flute, coupe, snifter, shot (or shooter or caballito), nosing glass (or Glencairn), tumbler, tankard, goblet, sling, and Stein? I’d say you’ve missed something by never having drunk dark beer (Ger.: Schwarzbier) from a frosted schooner. All these varieties developed for reasons that remain invisible to someone content to drink everything from the venerable red Solo cup. Funnily enough, the red Solo cup now comes in different versions, fooling precisely no one.

Returning to book blogging, Walter Ong (in Orality and Literacy) has curious comparisons between primarily oral cultures and literate cultures. For example:

Oral people commonly think of names (one kind of words) as conveying power over things. Explanations of Adam’s naming of the animals in Genesis 2:20 usually call condescending attention to this presumably quaint archaic belief. Such a belief is in fact far less quaint than it seems to unreflective chirographic and typographic folk. First of all, names do give human beings power over what they name: without learning a vast store of names, one is simply powerless to understand, for example, chemistry and to practice chemical engineering. And so with all other intellectual knowledge. Secondly, chirographic and typographic folk tend to think of names as labels, written or printed tags imaginatively affixed to an object named. Oral folk have no sense of a name as a tag, for they have no idea of a name as something that can be seen. Written or printed representations of words can be labels; real, spoken words cannot be. [p. 33]

This gets at something that has been developing over the past few decades, namely, that as otherwise literate (or functionally literate) people gather more and more information through electronic media (screens that serve broadcast and cable TV, YouTube videos, prerecorded news for streaming, podcasts, and, most importantly, audiobooks — all of which speak content to listeners), the spoken word (re)gains primacy and the printed word fades into disuse. Electronic media may produce a hybrid of orality/literacy, but words are no longer silent, internal, and abstract. Indeed, words — all by themselves — are understood as being capable of violence. Gone are the days when “sticks and stones ….” Now, fighting words incite and insults sting again.

Not so long ago, it was possible to provoke a duel with an insult or gesture, such as a glove across the face. Among some people, defense of honor never really disappeared (though dueling did). History has taken a strange turn, however. Proposed legislation to criminalize deadnaming (presumably to protect a small but growing number of transgender and nonbinary people who have redefined their gender identity and accordingly adopted different names) recognizes the violence of words but then tries to transmute the offense into an abstract criminal law. It’s deeply mixed up, and I don’t have the patience to sort it out.

More to say in later blog posts, but I’ll raise the Counter-Enlightenment once more to say that the nature of modern consciousness is shifting somewhat radically in response to stimuli and pressures that grew out of an information environment, roughly 70 years old now but transformed even more fundamentally in the last 25 years, that is substantially discontinuous from centuries-old traditions. Those traditions displaced even older traditions inherited from antiquity. Such is the way of the world, I suppose, and with the benefit of Walter Ong’s insights, my appreciation of the outlines is taking better shape.

I have observed various instances of magical thinking in mainstream culture, especially here, which I find problematical. Although it’s not my ambition to disabuse anyone of magical thinking, which extends far beyond, say, religious thought, I was somewhat taken aback at the suggestion found in the comic at this link (not embedded). For those not familiar with Questionable Content (one of two online comics I read regularly), the comic presents an extended cast of characters, mostly in their early 20s, living in a contemporary New England college town. Those characters are supplemented by a few older parents and lots of AIs (in robot bodies). The AIs are not particularly futuristic but are simply accepted as a normal (if curious) part of the world of the comic. Major story arcs involve characters and AIs (the AIs are characters, I suppose) in the process of discovering and establishing themselves as they (the humans, anyway) transition into early adulthood. There are no great political themes or intrusions into life in a college town. Rather, the comic is largely about acceptance of difference. Often, that means washing away meaningful difference in the name of banal tolerance. Real existential struggle is almost entirely absent.

In the linked comic, a new character comes along and offers advice to an established character struggling with sexual attractions and orientation. The dialogue includes this exchange:

Character A: If tarot or astrology or religion halps you make sense of the world and your place in it, then why not use them?
Character B: But they’re not real. [emphasis in original]
Character A: It doesn’t matter, if you use them constructively!

There it is in a nutshell: believe whatever you want if it, um, halps. I’ve always felt that being wrong (i.e., using unreal or make-believe things) was a sufficient injunction against anchoring oneself to notions widely known to be false. Besides, isn’t it often remarked that the biggest fool is one who fools himself? (Fiction as a combination of entertainment and building a worldview is quite normative, but it’s understood as fiction, or to a lesser degree, as life imitating art and its inverse. Exceptions abound, which are regarded as psychopathy.) The instruction in that dialogue (part object lesson, part lesson in cognition) is not that it’s OK to make mistakes but that knowingly believing something false has worthwhile advantages.

Surveying examples where promulgating false beliefs has constructive and destructive effects is too large a project. Well short of that, nasty categories include fraud, gaslighting, and propaganda, which are criminal in many cases and ought to be in most others (looking at you, MSM! — or not, since I neither trust nor watch). One familiar benevolent category is expressed in the phrase fake it til you make it, often recommended to overcome a lack of confidence. Of course, a swindle is also known as a confidence game (or by its diminutive, a con), so beware overconfidence when asked by another to pay for something (e.g., tarot or astrology readings), take risks, or accept an ideology without question.

As philosophy, willful adoption of falsity for its supposed benefits is half-baked. Though impossible to quantify, my suspicion is that instances of positive outcomes are overbalanced by negative ones. Maybe living in a constructed reality or self-reinforcing fantasy is what people want. The comic discussed is certainly in line with that approach. However, while we dither and delude ourselves with happy, aspirational stories based on silliness, the actual world around us, including all the human institutions that used to serve us but no longer do, falls to tatters. Is it better going through life and eventually to one’s grave refusing to see that reality? Should childlike wonder and innocence be retained in spite of what is easily observable just by poking one’s head up and dismissing comforting lies? Decide for yourself.

Once in a while, when discussing current events and their interpretations and implications, a regular interlocutor of mine will impeach me, saying “What do you know, really?” I’m always forced to reply that I know only what I’ve learned through various media sources, faulty though they may be, not through first-hand observation. (Reports of anything I have observed personally tend to differ considerably from my own experience once the news media completes its work.) How, then, can I know, to take a very contemporary instance this final week of July 2020, what’s going on in Portland from my home in Chicago other than what’s reported? Makes no sense to travel there (or much of anywhere) in the middle of a public health crisis just to see a different slice of protesting, lawbreaking, and peacekeeping [sic] activities with my own eyes. Extending the challenge to its logical extremity, everything I think I know collapses into solipsism. The endpoint of that trajectory is rather, well, pointless.

If you read my previous post, there is an argument, not easily falsified, that what we understand about ourselves and the world we inhabit is actually a constructed reality. To which I reply: is there any other kind? That construction achieves a fair lot of consensus about basics, more than one might even guess, but that still leaves quite a lot of space for idiosyncratic and/or personal interpretations that conflict wildly. In the absence of stabilizing authority and expertise, it has become impossible to tease a coherent story out of the many voices pressing on us with their interpretations of how we ought to think and feel. Twin conspiracies foisted on us by the Deep State and MSM known as RussiaGate and BountyGate attest to this. I’ll have more to say about inability to figure things out when I complete my post called Making Sense and Sensemaking.

In the meantime, the modern world has in effect constructed its own metaphorical Tower of Babel (borrowing from Jonathan Haidt — see below). It’s not different languages we speak so much (though it’s that, too) as the conflicting stories we tell. Democratization of media has given each of us — authorities, cranks, and everyone in between — new platforms and vehicles for promulgating pet stories, interpretations, and conspiracies. Most of it is noise, and divining the worthwhile signal portion is a daunting task even for disciplined, earnest folks trying their best to penetrate the cacophony. No wonder so many simply turn away in disgust.

I admit (again) to being bugged by things found on YouTube — a miserable proxy for the marketplace of ideas — many of which are either dumb, wrongheaded, or poorly framed. It’s not my goal to correct every mistake, but sometimes, inane utterances of intellectuals and specialists I might otherwise admire just stick in my craw. It’s hubris on my part to insist on my understandings, considering my utter lack of standing as an acknowledged authority, but I’m not without my own multiple areas of expertise (I assert immodestly).

The initial purpose for this blog was to explore the nature of consciousness. I’ve gotten badly sidetracked writing about collapse, media theory, epistemology, narrative, and cinema, so let me circle back around. This is gonna be long.

German philosopher Oswald Spengler takes a crack at defining consciousness:

Human consciousness is identical with the opposition between the soul and the world. There are gradations in consciousness, varying from a dim perception, sometimes suffused by an inner light, to an extreme sharpness of pure reason that we find in the thought of Kant, for whom soul and world have become subject and object. This elementary structure of consciousness is not capable of further analysis; both factors are always present together and appear as a unity.

(more…)