
Continuing from part 1, which is altogether too much screed and frustration with Sam Harris, I now point to several analyses that support my contentions. First is an article in The Nation about the return of so-called scientific racism, which speaks directly about Charles Murray, Sam Harris, and Andrew Sullivan, all of whom are embroiled in the issue. Second is an article in The Baffler about constructing arguments ex post facto to conform to conclusions motivated in advance of evidence. Most of us are familiar with the constructed explanation, where in the aftermath of an event, pundits, press agents, and political insiders propose various explanatory narratives to gain control over what will eventually become the conventional understanding. The Warren Commission’s report on the assassination of JFK is one such example, and I daresay few now believe the report and the consensus it presents were anything but politically motivated and highly flawed. Both linked articles above are written by Edward Burmila, who blogs at Gin and Tacos (see blogroll). Together, they paint a dismal picture of how reason and rhetoric can be corrupted despite the sheen of scientific respectability.

Third is an even more damaging article (actually a review of the new anthology Trump and the Media) in the Los Angeles Review of Books by Nicholas Carr asking the pointed question “Can Journalism Be Saved?” Admittedly, journalism is not equivalent to reason or rationalism, but it is among several professions that employ claims of objectivity, accuracy, and authority. Thus, journalism commands attention and respect far in excess of what’s accorded the typical blogger (such as me) or watering-hole denizen perched atop a barstool. Consider this pullquote:

… the flaws in computational journalism can be remedied through a more open and honest accounting of its assumptions and limitations. C. W. Anderson, of the University of Leeds, takes a darker view. To much of the public, he argues, the pursuit of “data-driven objectivity” will always be suspect, not because of its methodological limits but because of its egghead aesthetics. Numbers and charts, he notes, have been elements of journalism for a long time, and they have always been “pitched to a more policy-focused audience.” With its ties to social science, computational journalism inevitably carries an air of ivory-tower elitism, making it anathema to those of a populist bent.

Computational journalism is contrasted with other varieties of journalism based on, say, personality, emotionalism, advocacy, or simply a mad rush to print (or pixels) to scoop the competition. This hyperrational approach has already revealed its failings, as Carr reports in his review.

What I’m driving at is that, despite frequent appeals to reason, authority, and accuracy (especially the quantitative sort), certain categories of argumentation fail to register on the average consumer of news and information. It’s not a question of whether arguments are right or wrong, precisely; it’s about what appeals most to those paying even a modest bit of attention. And the primary appeal for most (I judge) isn’t reason. Indeed, reason is swept aside handily when a better, um, reason for believing something appears. If one has done the difficult work of acquiring critical thinking and reasoning skills, it can be quite the wake-up call when others fail to behave according to reason, such as when they act against their own enlightened self-interest. The last presidential election was a case in point.

Circling back to something from an earlier blog, much of human cognition is based on mere sufficiency: whatever is good enough in the moment gets nominated then promoted to belief and/or action. Fight, flight, or freeze is one example. Considered evaluation and reason are not even factors. Snap judgments, gut feelings, emotional resonances, vibes, heuristics, and Gestalts dominate momentary decision-making, and in the absence of convincing countervailing information (if indeed one is even susceptible to reason, which would be an unreasonable assumption), action is reinforced and suffices as belief.

Yet more in part 3 to come.


The movie Gladiator depicts the protagonist Maximus addressing spectators directly at gladiatorial games in the Roman Colosseum with this meme-worthy challenge: “Are you not entertained?” Setting the action in an ancient civilization renowned for its decadent final phase prior to collapse, referred to as Bread and Circuses, allows us to share vicariously in the protagonist’s righteous disgust with the public’s blood lust while shielded from any implication of our own shame because, after all, who could possibly entertain blood sports in the modern era? Don’t answer that.


But this post isn’t about our capacity for cruelty and barbarism. Rather, it’s about the public’s insatiable appetite for spectacle — both fictional and absolutely for real — served up as entertainment. Professional wrestling is fiction; boxing and mixed martial arts are reality. The dynamic of audiences consuming base entertainment and, in the process, depleting the performers who provide it extends well beyond combat sports, however. For instance, it’s not uncommon for pop musicians to slowly destroy themselves once pulled into the attendant celebrity lifestyle. Three examples spring to mind: Elvis Presley, Michael Jackson, and Whitney Houston. Others go on hiatus or retire altogether from the pressure of public performance, such as Britney Spears, Miles Davis, and Barbra Streisand.

To say that the public devours performers and discards what remains of them is no stretch, I’m afraid. Who remembers countdown clocks tracking when female actors turn 18 so that perving on them is at last okay? A further example is the young starlet who is presumably legitimized as a “serious” actor once she does nudity and/or portrays a hooker but is then forgotten in favor of the next. If one were to seek the full depth of such devouring impulses, I suggest porn is the industry in which to have all one’s illusions shattered. For rather modest sums, there is absolutely nothing some performers won’t do on film (these days on video at RedTube), and naturally, there’s an audience for it. Such appetites are as bottomless as they come. Are you not entertained?

Speaking of Miles Davis, I take note of his hiatus from public performance in the late 1970s before his limited return to the stage in 1981 and early death in 1991 at age 65. He had cemented a legendary career as a jazz trumpeter but in interviews (if memory serves) dismissed the notion that he was somehow a spokesperson for others, saying dryly “I’m just a trumpet player, man ….” What galled me, though, were Don Cheadle’s remarks in the liner notes of the soundtrack to the biopic Miles Ahead (admittedly a deep pull):

Robert Glasper and I are preparing to record music for the final scene of Miles Ahead — a possible guide track for a live concert that sees the return of Miles Davis after having been flushed from his sanctuary of silence and back onto the stage and into his rightful light. My producers and I are buzzing in disbelief about what our audacity and sheer will may be close to pulling off ….

What they did was record a what-might-have-been track had Miles incorporated rap or hip hop (categories blur) into his music. It’s unclear to me whether the “sanctuary of silence” was inactivity or death, but Miles was essentially forced onstage by proxy. “Flushed” is a strange word to use in this context, as one “flushes” an enemy or prey unwillingly from hiding. The decision to recast him in such “rightful light” strikes me as rather poor taste — a case of cultural appropriation worse than merely donning a Halloween costume.

This is the wave of the future, of course, now that images of dead celebrities can be invoked, say, to sell watches (e.g., Steve McQueen) and holograms of dead musicians are made into singing zombies, euphemized as “virtual performance” (e.g., Tupac Shakur). Newly developed software can now create digitized versions of people saying and doing whatever we desire of them, such as when celebrity faces are superimposed onto porn actors (called “deepfakes”). It might be difficult to argue that in doing so content creators are stealing the souls of others, as used to be believed in the early days of photography. I’m less concerned with those meeting demand than with the demand itself. Are we becoming demons, the equivalents of the succubus/incubus, devouring or destroying frivolously the objects of our enjoyment? Are you not entertained?

I’ve been modestly puzzled of late to observe that, on the one hand, those in the U.S. and Canada who have only just reached the age of majority (a/k/a the threshold of adulthood, which is not strictly the same as “the age of sexual consent, marriageable age, school leaving age, drinking age, driving age, voting age, smoking age, gambling age, etc.” according to the link) are disregarded with respect to some political activism while, on the other hand, they’re admired for other political activism. Whether young adults are to be taken seriously seems to be issue specific. If one is agitating for some aspect of identity politics, or is a Social Justice Warrior (SJW), one can be discredited as simply being too young to understand things properly, whereas advocating gun control (e.g., in the wake of the Parkland, Florida shootings in February) is recognized as well within a youthful mandate. Survivors of violence and mayhem seem to be uniquely immune to gun advocates trotting out the meme “now is not the time.”

As it happens, I agree that identity politics is a load of horseshit and tighter gun control (no, not taking away everyone’s guns totally) needs to be tried. But I haven’t arrived at either position because youth are either too young to understand or made wise enough by horrific experience to understand. Hanging one’s positions on the (dis)qualification of age is a red herring, a meaningless distraction from the issues themselves. Rather, if thoughtful consideration is applied to the day’s issues, which I daresay is not an easy prospect, one should ideally arrive at positions based on any number of criteria, some of which may conflict with others. For instance, I used to be okay (not an enthusiastic supporter, mind you) with the death penalty on a number of grounds but changed my opinion for purely pragmatic reasons. The sheer cost of automatic appeals and other safeguards to ensure that innocents are not wrongly convicted and executed, a cost borne by U.S. taxpayers, is so onerous that to prosecute through to execution looks less like justice and more like maniacal vengeance. Life in prison without the possibility of parole is a much saner and less costly project in comparison.

With intractable debates and divisive issues (e.g., abortion, free speech, the right to bear arms, immigration, religion, the Israel/Palestine conflict, euthanasia, etc.) plaguing public life, one might wonder how we get everyone on board. Alternatively, how do we at least agree to be civil in spite of our disagreements? I have two replies but no solutions. The first is to recognize that some issues are indeed intractable and insoluble, so graceful acceptance that an opposing opinion or perspective will always be present is needed lest one twist and writhe inconsolably when one’s cherished perspective is not held universally. That’s not necessarily the same as giving up or succumbing to fatalism. Rather, it’s recognition that banging one’s head against certain walls is futile. The second is to recognize that opposing opinions are needed to avoid unhealthy excess in social environments. Put another way, heterodoxy forestalls orthodoxy. Many historical practices we now regard as barbaric were abandoned or outlawed precisely because consensus opinion swung from one side to the other. Neil Postman called this a thermostatic response in several of his books. Other barbaric behaviors have been only partially addressed and require further agitation to be invalidated fully. I won’t mention examples, but I could compile a list rather quickly.

Haven’t purged my bookmarks in a long time. I’ve been collecting material about technological dystopia already now operating but expected to worsen. Lots of treatments out there and lots of jargon. My comments are limited.

Commandeering attention. James Williams discusses his recognition that interference media (all modern media now) keep people attuned to their feeds and erode free will, ultimately threatening democratic ideals by estranging people from reality. An inversion has occurred: information scarcity and attention abundance have become information abundance and attention scarcity.

Outrage against the machines. Ran Prieur (no link) takes a bit of the discussion above (probably where I got it) to illustrate how personal responsibility for media habits has become confused: specifically, the idea that it’s okay for technology to be adversarial.

In the Terminator movies, Skynet is a global networked AI hostile to humanity. Now imagine if a human said, “It’s okay for Skynet to try to kill us; we just have to try harder to not be killed, and if you fail, it’s your own fault.” But that’s exactly what people are saying about an actual global computer network that seeks to control human behavior, on levels we’re not aware of, for its own benefit. Not only has the hostile AI taken over — a lot of people are taking its side against their fellow humans. And their advice is to suppress your biological impulses and maximize future utility like a machine algorithm.

Big Data is Big Brother. Here’s a good TED Talk by Zeynep Tufekci on how proprietary machine-learning algorithms we no longer control or understand, ostensibly used to serve targeted advertising, possess the power to influence elections and radicalize people. I call the latter down-the-rabbit-hole syndrome, where one innocuous video or news story is followed by another of increasing extremity until the viewer or reader reaches a level of outrage and indignation activating an irrational response.


I’m currently reading Go Wild by John Ratey and Richard Manning. It has some rather astounding findings on offer. One I’ll draw out is that the human brain evolved not for thinking, as one might imagine, but for coordinating complex physiological movements:

… even the simplest of motions — a flick of a finger or a turn of the hand to pick up a pencil — is maddeningly complex and requires coordination and computational power beyond electronics abilities. For this you need a brain. One of our favorite quotes on this matter comes from the neuroscientist Rodolfo Llinás: “That which we call thinking is the evolutionary internalization of movement.” [p. 100]

Almost all the computation is unconscious, or maybe preconscious, and it’s learned over a period of years in infancy and early childhood (for basic locomotion) and then supplemented throughout life (for skilled motions, e.g., writing cursive or typing). Moreover, those able to move with exceptional speed, endurance, power, accuracy, and/or grace are admired and sometimes rewarded in our culture. The obvious example is sports. Whether league sports with wildly overcompensated athletes, Olympic sports with undercompensated athletes, or combat sports with a mixture of both, thrill attaches to watching someone move effectively within the rule-bound context of the sport. Other examples include dancers, musicians, circus artists, and actors who specialize in physical comedy and action. Each develops specialized movements that are graceful and beautiful, which Ratey and Manning write may also account for nonsexual appreciation and fetishization of the human body, e.g., fashion models, glammed-up actors, and nude photography.

I’m being silly saying that jocks figgered it first, of course. A stronger case could probably be made for warriors in battle, such as a skilled swordsman. But it’s jocks who are frequently rewarded all out of proportion with others who specialize in movement. True, their genetics and training enable a relatively brief career (compared to, say, surgeons or pianists) before abilities ebb away and a younger athlete eclipses them. But a fundamental lack of equivalence is clear with artisans and artists, whose value lies less in their bodies than in the outputs their movements produce.

Regarding computational burdens, consider the various mechanical arms built for grasping and moving objects, some of them quite large. The mechanisms themselves (frame and hydraulics substituting for bone and muscle) are quite complex, but they’re typically controlled by a human operator rather than automated. (Exceptions abound, but they’re highly specialized, such as circuit-board manufacture or textile production.) More recently, robots have demonstrated considerable advancement in locomotion without a human operator, but they’re also narrowly focused in comparison with the flexibility of motion a human body readily possesses. Further, flying drones operate in wide-open space, while robots designed to move like dogs or insects use four or more legs for stability; the latter are typically built to withstand quite a lot of bumping and jostling. Upright bipedal motion is still quite clumsy in comparison with humans, excepting perhaps wheeled robots, which obviously don’t move as humans do.

Curiously, the movie Pacific Rim (sequel just out) takes notice of the computational or cognitive difficulty of coordinated movement. To operate the giant battle robots needed to fight Godzilla-like interdimensional monsters, two humans must be mind-linked. Maybe it’s a simple coincidence, a plot device to position humans in the middle of the action (and the robot) rather than killing from a distance via drone or clone, or maybe not. Hollywood screenwriters are quite clever at exploiting all sorts of material without necessarily divulging the source of inspiration. It’s art imitating life, knowingly or not.

In the sense that a picture is worth a thousand words, this cartoon caught my immediate attention (for attribution, taken from here):


Search engines reveal quite a few treatments of the central conflict depicted here, including other versions of essentially the same cartoon. Doubtful anything I could say would add much to the body of analysis and advice already out there. Still, the image called up a whole series of memories for me rather quickly, the primary one being the (only) time I vacationed in Las Vegas about a decade ago.


Long again this time and a bit contentious. Sorry for trying your patience.

Having watched a few hundred Joe Rogan webcasts by now (previous blog on this topic here), I am pretty well acquainted with guests and ideas that cycle through periodically. This is not a criticism as I’m aware I recycle my own ideas here, which is more nearly thematic than simply repetitive. Among all the MMA folks and comedians, Rogan features people — mostly academics — who might be called thought leaders. A group of them has even been dubbed the “intellectual dark web.” I dunno who coined the phrase or established its membership, but the names might include, in no particular order, Jordan Peterson, Bret Weinstein, Eric Weinstein, Douglas Murray, Sam Harris, Jonathan Haidt, Gad Saad, Camille Paglia, Dave Rubin, Christina Hoff Sommers, and Lawrence Krauss. I doubt any of them would have been considered cool kids in high school, and it’s unclear whether they’re any cooler now that they’ve all achieved some level of Internet fame on top of other public exposure. Only a couple seem especially concerned with being thought cool now (names withheld), though the chase for clicks, views, likes, and Patreon support is fairly upfront. That they can usually sit down and have meaningful conversations without rancor (admirably facilitated by Joe Rogan up until one of his own oxen is gored, less admirably by Dave Rubin) about free speech, Postmodernism, social justice warriors, politics, or the latest meme means that the cliquishness of high school has relaxed considerably.

I’m pleased (I guess) that today’s public intellectuals have found an online medium to develop. Lots of imitators are out there putting up their own YouTube channels to proselytize their own opinions. However, I still prefer to get deeper understanding from books (and to a lesser degree, blogs and articles online), which are far better at delivering thoughtful analysis. The conversational style of the webcast is relentlessly up-to-date and entertaining enough but relies too heavily on charisma. And besides, so many of these folks are such fast talkers, often talking over each other to win imaginary debate points or just dominate the conversational space, that they frustrate and bewilder more than they communicate or convince.


A year ago, I wrote about charges of cultural appropriation being levied upon fiction writers, as though fiction can now only be some watered-down memoir lest some author have the temerity to conjure a character based on someone other than him- or herself. Specifically, I linked to an opinion piece by Lionel Shriver in the NY Times describing having been sanctioned for writing characters based on ideas, identities, and backgrounds other than her own. Shriver has a new article in Prospect Magazine that provides an update, perhaps too soon to survey the scene accurately since the target is still moving, but nonetheless curious with respect to the relatively recent appearance of call-out culture and outrage engines. In her article, Shriver notes that offense and umbrage are now given equal footing with bodily harm and emotional scarring:

Time was that children were taught to turn aside tormentors with the cry, “Sticks and stones may break my bones, but words will never hurt me!” While you can indeed feel injured because Bobby called you fat, the law has traditionally maintained a sharp distinction between bodily and emotional harm. Even libel law requires a demonstration of palpable damage to reputation, which might impact your livelihood, rather than mere testimony that a passage in a book made you cry.

She also points out that an imagined “right not to be offended” is now frequently invoked, even though there is no possibility of avoiding offense if one is actually conscious in the world. For just one rather mundane example, the extraordinary genocidal violence of 20th-century history, once machines and mechanisms (now called WMDs) were applied to warfare (and dare I say it: statecraft), ought to be highly offensive to any humanitarian. That history cannot be erased, though I suppose it can be denied, revised, buried, and/or lost to living memory. Students or others who insist they be excused from being triggered by knowledge of awful events are proverbial ostriches burying their heads in the sand.

As variations of this behavior multiply and gain social approval, the Thought Police are busily mustering against all offense — real, perceived, or wholly imagined — and waging a broad-spectrum sanitation campaign. Shriver believes this could well spell the end of fiction as publishers morph into censors and authors self-censor in an attempt to pass through the SJW gauntlet. Here’s my counter-argument:

rant on/

I feel mightily offended — OFFENDED I say! — at the arrant stupidity of SJWs whose heads are full of straw (and strawmen), who are so clearly confused about what is even possible within the dictates and strictures of, well, reality, and who have accordingly retreated into cocoons of ideation from which others are scourged for failure to adhere to some bizarre, muddleheaded notion of equity. How dare you compel me to think prescribed thoughts emanating from your thought bubble, you damn bullies? I have my own thoughts and feelings deserving of support, maybe even more than yours considering your obvious naïveté about how the world works. Why aren’t you laboring to promote mine but instead clamoring to infect everyone with yours? Why is my writing so resoundingly ignored while you prance upon the stage demanding my attention? You are an affront to my values and sensibilities and can stuff your false piety and pretend virtue where the sun don’t shine. Go ahead and be offended; this is meant to offend. If it’s gonna be you or me who’s transgressed precisely because all sides of issues can’t be satisfied simultaneously, then on this issue, I vote for you to be in the hot seat.

rant off/

I remarked in an earlier blog that artists, being hypersensitive to emergent patterns and cultural vibes, often get to ideas sooner than the masses and express their sensibilities through creative endeavor. Those expressions in turn give watchers, viewers, listeners, readers, etc. a way of understanding the world through the artist’s interpretive lens. Interpretations may be completely fictitious, based on real-life events, or merely figurative as the medium allows. They are nonetheless an inevitable reflection of ourselves. Philistines who fail to appreciate that the arts function by absorbing and processing human experience at a deep, intuitive level may insist that the arts are optional or unworthy of attention or financial support. That’s an opinion not at all borne out in the culture, however, and though support may be vulnerable to shifts in valuation (e.g., withdrawal of federal funding for the NEA and PBS), the creative class will always seek avenues of expression, even at personal cost. Democratization has made the modes of production and distribution for some media quite cheap compared to a couple of decades ago. Others remain undeniably labor intensive.

What sparked my thinking are several TV series that have caught my attention despite my generally low level of attention to such media. I haven’t watched broadcast television in over a decade, but the ability to stream TV programming has made shows I’ve ignored for years far easier to tune in to on my own terms and schedule. “Tune in” is of course the wrong metaphor, but suffice it to say I’ve awarded some of my attention to shows that until now fell outside my scope, cinema being more to my liking. The three shows I’ve been watching (only partway through each) are The Americans, Homeland, and Shameless. The first two are political thrillers (spy stuff), whereas the last is a slice-of-life family drama, which often veers toward comedy but instead keeps delivering tragedy. Not quite the same thing as dark comedy. Conflict is necessary for dramatic purposes, but the ongoing conflict in each of these shows flirts with the worst sorts of disaster, e.g., the spies being discovered and unmasked and the family being thrown out of its home and broken up. The episodic scenarios the writers concoct to threaten catastrophe at every step or at any moment get tiresome after a while. Multiple seasons ensure that dramatic tension is largely dispelled, since the main characters are present year over year. (The trend toward killing off major characters in other popular TV dramas is not yet widespread.) But still, it’s no way to live, constantly in disaster mode. No doubt I’ve cherry-picked three shows from a huge array of entertainments on offer.

Where art reflects reality is that we all now live in the early 21st century under multiple, constantly disquieting threats, large and small, including sudden climate change and ecological disaster, nuclear annihilation, meteor impacts, eruption of the shield volcano under Yellowstone, the Ring of Fire becoming active again (leading to more volcanic and earthquake activity), geopolitical dysfunction on a grand scale, and of course, global financial collapse. This, too, is no way to live. Admittedly, no one was ever promised a care-free life. Yet our inability to manage our own social institutions or shepherd the earth (as though that were our mandate) promises catastrophes in the fullness of time that have no parallels in human history. We’re not flirting with disaster so much as courting it.

Sociologists and historians prepare scholarly works that attempt to provide a grand narrative of the times. Cinema seems to be preoccupied with planetary threats requiring superhero interventions. Television, on the other hand, with its serial form, plumbs the daily angst of its characters to drive suspense, keeping viewers on pins and needles while avoiding final resolution. That final resolution is inevitably disaster, but it won’t appear for a few seasons at least — after the dramatic potential is wrung out of the scenario. I can’t quite understand why these shows are consumed for entertainment (by me no less than anyone else) except perhaps to distract from the clear and present dangers we all face every day.

Oddly, there is no really good antonym for perfectionism. Suggestions include sloppiness, carelessness, and disregard. I’ve settled on approximation, which carries far less moral weight. I raise the contrast between perfectionism and approximation because a recent study published in Psychological Bulletin entitled “Perfectionism Is Increasing Over Time: A Meta-Analysis of Birth Cohort Differences From 1989 to 2016” makes an interesting observation. Here’s the abstract:

From the 1980s onward, neoliberal governance in the United States, Canada, and the United Kingdom has emphasized competitive individualism and people have seemingly responded, in kind, by agitating to perfect themselves and their lifestyles. In this study, the authors examine whether cultural changes have coincided with an increase in multidimensional perfectionism in college students over the last 27 years. Their analyses are based on 164 samples and 41,641 American, Canadian, and British college students, who completed the Multidimensional Perfectionism Scale (Hewitt & Flett, 1991) between 1989 and 2016 (70.92% female, M_age = 20.66). Cross-temporal meta-analysis revealed that levels of self-oriented perfectionism, socially prescribed perfectionism, and other-oriented perfectionism have linearly increased. These trends remained when controlling for gender and between-country differences in perfectionism scores. Overall, in order of magnitude of the observed increase, the findings indicate that recent generations of young people perceive that others are more demanding of them, are more demanding of others, and are more demanding of themselves.

The notion of perfection, perfectness, perfectibility, etc. has a long, tortured history in philosophy, religion, ethics, and other domains I won’t even begin to unpack. From the perspective of the above study, let’s just say that the upswing in perfectionism is about striving to achieve success, however one assesses it (education, career, relationships, lifestyle, ethics, athletics, aesthetics, etc.). The study narrows its subject group to college students (at the outset of adult life) between 1989 and 2016 and characterizes the social milieu as neoliberal, hyper-competitive, meritocratic, and pressured to succeed in a dog-eat-dog environment. How far back into childhood the study’s results (the agitation to self-perfect) extend is a good question. If the trope about parents obsessing and competing over preschool admission is accurate (may be just a NYC thang), then it goes all the way back to toddlers. So much for (lost) innocence purchased and perpetuated through late 20th- and early 21st-century affluence. I suspect college students are responding to awareness of two novel circumstances: (1) the likelihood that they will never achieve levels of success comparable to their parents’, especially financial success (a major reversal of historical trends), and (2) the recognition that to best enjoy the fruits of life, a quiet, reflective, anonymous, ethical, average life is now quite insufficient. Regarding the second of these, we are inundated by media showing rich celebrities (no longer just glamorous actors/entertainers) balling out of control, and onlookers are enjoined to “keep up.” The putative model is out there, unattainable for most but often awarded by randomness, undercutting the whole enterprise of trying to achieve perfection.