Archive for the ‘Advertising’ Category

Nick Carr has an interesting blog post (I’m late getting to it, as usual) highlighting a problem with our current information environment. In short, the constant information feed to which many of us subscribe and read on smartphones, which I’ve frequently called a fire hose pointed indiscriminately at everyone, has become the new normal. And when it’s absent, people feel anxiety:

The near-universal compulsion of the present day is, as we all know and as behavioral studies prove, the incessant checking of the smartphone. As Begley notes, with a little poetic hyperbole, we all “feel compelled to check our phones before we get out of bed in the morning and constantly throughout the day, because FOMO — the fear of missing out — fills us with so much anxiety that it feels like fire ants swarming every neuron in our brain.” With its perpetually updating, tightly personalized messaging, networking, searching, and shopping apps, the smartphone creates the anxiety that it salves. It’s a machine almost perfectly designed to turn its owner into a compulsive … from a commercial standpoint, the smartphone is to compulsion what the cigarette pack was to addiction

I’ve written about this phenomenon plenty of times (see here for instance) and recommended that wise folks might adopt a practiced media ecology by regularly turning their attention away from the feed (e.g., no mobile media). Obviously, that’s easier for some of us than others. Although my innate curiosity (shared by almost everyone, I might add) prompts me to gather quite a lot of information in the course of the day/week, I’ve learned to be restrictive and highly judgmental about which sources I read, printed text being far superior in most respects to audio or video. No social media at all, very little mainstream media, and very limited “fast media” of the type that rushes to publication before enough is known. Instead, I favor periodicals (monthly or quarterly) and books, which have longer paths to publication and tend to be more thoughtful and reliable. If I could never again be exposed to noisy newsbits containing, say, the word “Kardashian,” that would be an improvement.

Also, the basic economic structure underlying media since the advent of radio and television has been to provide content for free (interesting, entertaining, and hyperpalatable perhaps, but simultaneously pointless ephemera) in order to capture the attention of a large audience and then load up the channel with advertisements at regular intervals. Aware of that bargain, I now use ad blockers and streaming media to avoid being swayed by the manufactured desire that flows from advertising. If a site won’t display its content without my disabling the ad blocker, which is becoming more commonplace, then I don’t give it my attention. I can’t avoid all advertising, much as I can’t avoid having my consumer behaviors tracked and aggregated by retailers (and others), but I do better than most. For instance, I never saw any Super Bowl commercials this year, though they have become a major part of the spectacle. Sure, I’m missing out, but I have no anxiety about it. I prefer to avoid colonization of my mind by advertisers in exchange for cheap titillation.

In the political news media, Rachel Maddow has caught on that it’s advantageous to ignore a good portion of the messages flung at the masses like so much monkey shit. A further suggestion is that, given the pathological narcissism of the new U.S. president, denying him the rapt attention he craves and reinforcing only the most reasonable conduct of the office might be worth a try. Such an experiment would be like the apocryphal story of students conditioning their professor to lecture with his/her back to the class using positive/negative reinforcement, paying attention and being quiet only when his/her back was to them. Considering how much attention is trained on the Oval Office and its utterances, I doubt such an approach would be feasible even if it were only journalists attempting to channel behavior, but it’s a curious thought experiment.

All of this is to say that there are alternatives to being harried and harassed by insatiable desire for more information at all times. There is no actual peril to boredom, though we behave as though an idle mind is either wasteful or fearsome. Perhaps we aren’t well adapted — cognitively or culturally — to the deluge of information pressing on us in modern life, which could explain (partially) this age of anxiety when our safety, security, and material comforts are as good as they’ve ever been. I have other thoughts about what’s really missing in modern life, which I’ll save for another post.

Stray links build up over time without my being able to handle them adequately, so I have for some time wanted a way of purging them. I am aware of other bloggers who curate and aggregate links with short commentaries quite well, but I have difficulty making my remarks pithy and punchy. That said, here are a few that I’m ready to dispose of in this first attempt at purging my backlog.

Skyfarm Fantasies

Futurists have offered myriad visions of technologies that have no hope of being implemented, from flying cars to 5-hour workweeks to space elevators. The newest pipe dream is the Urban Skyfarm, a roughly 30-story tree-like structure with 24 acres of space using solar panels and hydroponics to grow food close to the point of consumption. Utopian engineering such as this crops up frequently (pun intended) and may be fun to contemplate, but in the U.S. at least, we can’t even build high-speed rail, and that technology is already well established elsewhere. I suppose that’s why cities such as Seoul and Singapore, straining to make everything vertical for lack of horizontal space, are the logical test sites.

Leaving Nashville

The City of Nashville is using public funds to buy homeless people bus tickets to leave town and go be poor somewhere else. Media spin is that the city is “helping people in need,” but it’s obviously a NIMBY response to a social problem city officials and residents (not everyone, but enough) would rather not have to address more humanely. How long before cities begin competing with each other over the number of people they can ship off to other cities? Call it the circle of life when the homeless start gaming the programs, revisiting multiple cities in an endless circuit.

Revisioneering

Over at Rough Type, Nick Carr points to an article in The Nation entitled “Instagram and the Fantasy of Mastery,” which argues that a variety of technologies now give “artists” the illusion of skill, merit, and vision by enabling work to be easily executed using prefab templates and stylistic filters. For instance, in pop music, the industry standard is to auto-tune everyone’s singing to hide imperfections. Carr’s summary probably is better than the article itself and shows us the logical endpoint of production art in various media undertaken without the difficult work necessary to develop true mastery.

Too Poor to Shop

The NY Post reported over the summer that many Americans are too poor to shop except for necessities. Here are the first two paragraphs:

Retailers have blamed the weather, slow job growth and millennials for their poor results this past year, but a new study claims that more than 20 percent of Americans are simply too poor to shop.

These 26 million Americans are juggling two to three jobs, earning just around $27,000 a year and supporting two to four children — and exist largely under the radar, according to America’s Research Group, which has been tracking consumer shopping trends since 1979.

Current population in the U.S. is around 325 million. Twenty percent of that number is 65 million; twenty-six million is 8 percent. Pretty basic math, but I guess NY Post is not to be trusted to report even simple things accurately. Maybe it’s 20% of U.S. households. I dunno and can’t be bothered to check. Either way, that’s a pretty damning statistic considering the U.S. stock market continues to set new all-time highs — an economic recovery not shared with average Americans. Indeed, here are a few additional newsbits and links stolen ruthlessly from theeconomiccollapseblog.com:
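
For anyone who wants to check the arithmetic, here’s a quick sketch in Python (the 325 million population figure is my own round number, not an official census count):

    # Back-of-the-envelope check of the NY Post figures.
    # Assumes a rough U.S. population of 325 million.
    us_population = 325_000_000
    too_poor_to_shop = 26_000_000  # figure cited in the article

    print(f"{too_poor_to_shop / us_population:.0%}")    # -> 8%, not 20 percent
    print(f"{0.20 * us_population / 1e6:.0f} million")  # -> 65 million would be 20 percent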

  • The number of Americans that are living in concentrated areas of high poverty has doubled since the year 2000.
  • In 2007, about one out of every eight children in America was on food stamps. Today, that number is one out of every five.
  • 46 million Americans use food banks each year, and lines start forming at some U.S. food banks as early as 6:30 in the morning because people want to get something before the food supplies run out.
  • The number of homeless children in the U.S. has increased by 60 percent over the past six years.
  • According to Poverty USA, 1.6 million American children slept in a homeless shelter or some other form of emergency housing last year.

For further context, theeconomiccollapseblog also points to “The Secret Shame of Middle Class Americans” in The Atlantic, which reports, among other things, that fully 47% of Americans would struggle to scrape together a mere $400 in an emergency.

How do such folks respond to the national shopping frenzy kicking off in a few days with Black Friday, Small Business Saturday, Charitable Sunday, and Cyber Monday? I suggest everyone stay home.

According to Jean-Paul Sartre, the act of negation (a/k/a nihilation) is a necessary step in distinguishing foreground objects from background. A plethora of definitions and formal logic ensure that his philosophical formulations are of only academic interest to us nowadays, since philosophy in general has dropped out of currency in the public sphere and fallen below awareness or concern even among most educated adults. With that in mind, I thought perhaps I should reinforce the idea of negation in my own modest (albeit insignificant) way. Negation, resistance, and dissent have familial relations, but they are undoubtedly distinct in some ways, too. However, I have no interest in offering formal treatments of terminology and so will gloss over the point and decline to offer definitions. Lump ’em all in together, I say. Still, I will make a distinction between passive and active negation, which is the point of this blog post.

Although the information environment and the corporations that provide electronic access through hardware and connectivity would have us all jacked into the so-called Information Superhighway unceasingly, and many people do just that with enormous relish, I am of a different mind. I recognize that electronic media are especially potent in commanding attention and providing distraction. Stowed away or smuggled in with most messaging is a great deal of perception and opinion shaping that is worse than merely unsavory; it’s damaging. So I go beyond passively not wanting handheld (thus nonstop) access to actively wanting not to be connected. Whereas others share excitement about the latest smartphone or tablet and the speed, cost, and capacity of the service provider for the data line on their devices, I don’t merely demur but insist instead, “keep that nonsense away from me.” I must negate those prerogatives, refuse their claims on my attention, and be undisturbed in my private thoughts while away from the computer, the Internet, and the constant flow of information aimed indiscriminately at me and everyone.

Of course, I win no converts with such refusals. When I was shopping for a new phone recently, the default assumption by the sales clerk was that I wanted bells and whistles. She simply could not conceive of my desire to have a phone that is merely a phone, and the store didn’t sell one anyway. Even worse, since all phones are now smartphones by default, I had a data block put on my account to avoid inadvertently connecting to anything that would require a data line. That just blew her mind, like I was forgoing oxygen. But I’m quite clear that any vulnerability to information either tempting me or forced on me is worth avoiding and that time away from the computer and its analogues is absolutely necessary.

Personal anecdote: I was shopping recently at an appliance retailer (looking at refrigerators) that had an embedded Apple store. At the counter displaying three models of the iPhone 6, the latest on offer, were three kids roughly 8 to 11 years old (I estimate). They were unattended by parents, who must have believed that if the kids were not causing problems, they were a-okay. The kids themselves were absolutely transfixed — enthralled, really — by the screens, standing silent and motionless (very unlike most kids) with either fierce concentration or utterly empty heads as they examined the gadgets. They were so zoomed in they took no notice at all of passersby. Parents of earlier generations used the TV as a pacifier or babysitter the same way, but these kids seemed even more hollow than typical, dull-eyed TV viewers. Only a few days later, at a concert I attended, I watched a child who apparently could not pry his eyes away from the tablet he was carrying — even as he struggled to climb the stairs to his seat. The blue glare of the screen was all over his face.

Both scenes were unspeakably sad, though I might be hard-pressed to convince anyone of that assessment had I intervened. These scenes play out again and again, demonstrating that the youngest among us are the most vulnerable and least able to judge when to turn away, to disconnect. Adults fare no better. Schools and device makers alike have succeeded in selling electronics as “educational devices,” but the reality is that instead of exploring the world around us, people get sucked into a virtual world and the glossy fictions displayed on screens. They ultimately become slaves to their own devices. I mourn for their corrupted mindscapes, distorted and ruined by parents and teachers who ought to be wiser but who themselves have been coopted and hollowed out by mass media.

A little more content lite (even though my complaint is unavoidable). Saw on Motherboard a report on a first-person, Web-based shopping game about Black Friday zombie mall shoppers. You can play here. It’s pure kitsch but does reinforce the deplorable behaviors of sale-crazed shoppers swarming over each other to get at goodies (especially cheap electronics), sometimes coming to blows. Videos of 2015 Black Friday brawls appeared almost immediately.

We apparently learn nothing year-over-year as we reenact our ritual feeding frenzy, lasting all the way through New Year’s Eve. (I never go out on Black Friday.) I might have guessed that big box retailers face diminishing returns with store displays torn apart, disgruntled shoppers, traumatized employees, and the additional cost of rent-a-cops to herd the masses and maintain order (which obviously doesn’t work in many instances). Yet my e-mail inbox keeps loading up with promotions and advertisements, even a day later. The video game in particular reminds me of Joe Bageant’s great line: “We have embraced the machinery of our undoing as recreation.”

This xkcd comic bugged me when I first saw it, but I didn’t give it much thought at the time because its dismissive approach to media is quite familiar and a bit tiresome:

On reflection, however, and in combination with other nonsense I’ve been reading, the irksome joke/not-joke hasn’t faded from my thinking. So I’ll be very unhip and respond without the appropriate ironic detachment that modern life demands of us, where everything is cool and chill and like, dude, whatever ….

I’ve been puzzling for some time over my increasingly visceral aversion to folks face-planted in their phones and tablets. It’s not merely their often being stumblebums clogging hallways, corridors, sidewalks, and elevators with rank inattention to traffic flows, though that public nuisance has been jangling my nerves to a startling degree. The answer, I was surprised to discover when I began reading Matthew Crawford’s highly regarded book The World Beyond Your Head, is my sense that those staring unwaveringly at their screens are in effect denying sociability in the most ordinary of ways by failing to acknowledge my presence with a nod or even eye contact. This is most surprising to me because I used to eschew common social graces (a couple decades ago) but have revised my thinking through recognition that, as social creatures, we take cues from each other ranging from inconsequential to life and death. None should be discarded. Even though I don’t expect soul-felt validation of my very person in day-to-day interactions, the notable absence of any acknowledgement whatsoever feels less passively neutral, more aggressively hostile. Indeed, I’ve heard stories of people wearing earbuds (without being jacked into anything) precisely to forestall anyone striking up a conversation. I call that protective headgear.

Crawford describes a familiar scene: travelers in an airport gate lounge zoned maniacally into one type of screen or another, some handheld, others mounted overhead, but in either case oblivious to each other in what might have been a social milieu in the days before electronic gadgetry and TVs (e.g., a train depot). Social conduct even in the traditional liquor bar is difficult to maintain when so many screens commandeer one’s attention. Blanket disregard for each other is understandable to Crawford because we now face so many arbitrary demands on our attention (e.g., advertising everywhere, now even on the trays the TSA uses at security checkpoints) that the response is often to cocoon oneself away from the world. Thus, according to Crawford, “we engage less than we once did in everyday activities that structure our attention.” His antidote to living in our heads, transfixed by representations of reality (as opposed to actuality), is to develop skilled practices that focus and refine attention. That was the subject of his previous book, Shop Class as Soulcraft, which I have not read. Thus, to be more authentically human, or to be “a powerful, independent mind working at full song,” is to be situated within “narrow and highly structured patterns of attention” that require bodily engagement and submission to constraints that remove the faux freedom of choice.

Here I must pause to register my dismay that Crawford fails to acknowledge Albert Borgmann and his description of focal practices. Many philosophers and their ideas are cited in the book so far (I’m up to p. 95), but to omit Borgmann is an egregious error someone should have caught. I also find it astonishing that Crawford quite clearly speaks my language and makes many of the same points I have been making here at The Spiral Staircase, though with far greater detail and thoroughness as book form requires. However, the language is often clunky and reads too much like a psychology text (which is why I stopped reading books by Robert Putnam — and Albert Borgmann — partway through). I will read to the end of The World Beyond Your Head, but I won’t turn it into a book blogging project since it’s so close to the things I already write about.

So what’s one to do in the presence of others who are steadfastly disengaged from everyone else? I recall last month stepping onto an elevator on the way home from work with maybe five others, none of whom had yet face-planted into a phone. We all looked at each other briefly, relaxed, not in that awkward elevator way, when inevitably one fellow dove into his pocket and produced a phone. I blurted out without exercising my usual self-editing restraint, “So you’re the one who just had to whip out his pacifier.” Luckily, it came across as a joke and everyone laughed, but I think my real intent was sanction. I resist the pull of electronics as much as I can (I have a cell phone but no data line), but I recognize that though I may be swimming upstream, I cannot redirect the flow of everyone’s attention inexorably into personal screens. Storms along the eastern seaboard last week knocked out power for many for a few hours. No doubt some (re)discovered what it means to be with their families (or alone with their thoughts) without electronic mediation. Did they look dumbly at each other and ask “What now?” (as was reported to me), or did they manage to be sociable?

An article in Wired pushes the meme that coal, while bearing the lion’s share of responsibility for releasing CO2 into the atmosphere, can be cleaned up to continue to provide (mostly electrical) energy for everything we use. Pshaw, I say. Comments at the magazine’s website also call bullshit on the article, going so far as to baldly accuse Wired of shilling for big energy, and note that hundreds of similar comments following publication of the story have been purged. Pro-and-con debate on the subject lies beyond my technical prowess, though I have my suspicions. Most interesting to me, however, is what’s not said.

The implicit assumption is that energy demand must be met somehow. Totally and utterly outside of consideration is demand destruction, whether through pricing, metering, or simple unavailability. Sure, there’s 100+ years of coal still available to be mined (or harvested, or exploited, or <choose your euphemism>). Guess we have no choice but to go after it, right? The author does shed some hazy light on environmental and health costs from burning coal, especially in China where it’s worst, but nowhere is there a suggestion that we might stop burning so much of the stuff, which I find a serious omission. Instead, in true technophiliac fashion, an unproven innovation will rescue us from the consequences of our own behavior and deliver salvation (BAU, I suppose, including gadgety distraction if that’s your idea of fun) in the form of “clean coal,” namely, underground resequestration of CO2 released in the process of burning coal. Basically, it’s the equivalent of continuing to dig the hole we’re in by attempting to refill it with its own pollutants. And never mind the delayed effects of what’s already done.

The “clean coal” meme was risible on its face when it appeared a few years ago. Innovation notwithstanding, it continues to be primarily the work of fiction authors, marketers, and, I guess, stringers for Wired. Several coal ash spills and tonnes upon tonnes of CO2 added to the atmosphere (increasing year over year without stalling) since the meme was hatched are far more convincing to me than hopes of a technofix. Facts and figures make better arguments most of the time. I have none to offer. Instead, let me simply point to everyday sights (and smells inferred from the visuals) we confront. Here is an image from twenty years ago of the city where I live:

Here’s a more recent one:

Such days have become a lot less exceptional. How far down this path do we intend to go? All the way, I surmise.

/rant on

The Pew Research Center offers what it calls a News IQ Quiz with the following blurb:

Test your knowledge of prominent people and major events in the news by taking our short 13-question quiz. Then see how you did in comparison with 1,052 randomly sampled adults asked the same questions in a national survey conducted online August 7-14 by the Pew Research Center. The new survey includes a mixture of multiple-choice questions using photographs, maps, charts, and text.

When you finish, you will be able to compare your News IQ with the average American, as well as with the scores of college graduates and those who didn’t attend college; with men and women; and with people your age as well as other ages.

First, there is no such thing as a news IQ. The presumption that awareness of news equates in any fashion to IQ is scurrilous. I scored 75%, which surprises me considering how much of the mainstream media (MSM) I ignore. Second, most of the “prominent people and major events” and hot issues filter down to me in time, but without the editorial spin that brands news organs. Third, I contend that attending to the news is essentially asking to be brainwashed and conned incessantly with respect to the truth (which we rarely glimpse except in the most banal sense) and what constitutes suitable expectations for life and living (which judging from the news are wildly distorted toward the tacky and salacious).

Filtering performs an important explanatory function in shaping the ways we understand ourselves and our existence within a larger outer reality. Perception and cognition filtered through language are normally regarded as better than preverbal, ontological existence in its raw forms, such as we all experience in infancy prior to acquiring language, perhaps because truth (the adult kind, not the kid kind) is so powerful and unpalatable that we either lose or never really had the ability to face up to it. Using words as symbols of thought, language performs an intermediary function by shaping mental activity, mostly of the intellectual variety, into stories, narratives, scenarios, and straightforward lies, each with its own subtle transformation of reality into something else, something quasifictional, something disembodied and distorted from the original source of direct experience. Figurative language, including similes, analogies, metaphors, euphemisms, metonymy, synecdoche, etc., establishes notions extended even further from direct perception. Again and again, I stumble across metaphors that offer explanations of how objective truth/reality (assuming such a thing exists) is not merely compared to something more readily relatable but is in fact spun around through various mental and perceptual agencies and faculties, typically with our own willingness to grant authority to these quasifictions.

Plato’s Allegory of the Cave, telling how we perceive shadows cast against the wall rather than the reality projecting those shadows, is perhaps the earliest of such metaphors. Arthur Schopenhauer’s philosophical work, The World as Will and Representation, is another exploration of underlying reality vs. our mental image or perception of it. Jean Baudrillard wrote another philosophic treatise on the same subject called Simulacra and Simulation. The field of semiotics deals with similar categorical divisions, namely, the signifier and the signified. No doubt there are others of which I am unaware, and truth be told, I dare to blog about this subject without really having delved too deeply into it (the books linked to above are unread). But then, I’ve always been an armchair intellectual, not a pundit, writer, or university professor with the time and position to devote to rigorous explorations.

Perhaps the most universal albeit pejorative term I have come across recently to describe the entire complex of mental associations that yield social consensus and cohesion comes from Joe Bageant: the hologram, as described in his blog/article “Escape from the Zombie Food Court” (and elsewhere). Nicholas Carr has a thought-provoking blog at Rough Type discussing online/offline experience, which might be more recognizable as digital/analog or virtual reality/meatworld. I also learned recently from Thomas Frank, a columnist at Harper’s who is quickly earning my admiration, that in the political realm, the preferred term to denote the appearance of actuality, not truth itself, is optics. (Worthy of note is the fact that the optic nerve connects to the emotional center of the brain, bypassing the logical/rational part. So video in particular, and even imagistic words (those that conjure pictures in the mind), play on our sensibilities more effectively than text, even though text is nominally perceived through the eye as well.)

In the age of marketing and mass media, the most significant means of shaping perception and consensus are undoubtedly television and cinema, including all the advertising and product placements that work insidiously to manufacture desire. Constantly served up for our brainwashing entertainment are the rich, powerful, young, beautiful, fashionable, and famous. The picture of the good life, or the American dream, that emerges from these ubiquitous images is glamorous and glitzy, much like the commodities with which these people surround themselves, but the picture typically bears little resemblance to day-to-day life as most of us experience it. And on closer inspection, the images/ideals are revealed to be hollow, sometimes even tawdry and trashy, mere fabrications used as inducements (carrots) to participate fully in commodity culture.

Given how omnipresent this interlocking set of filters is, it should be no surprise that the resulting ideology appears to sensitive souls completely false and meaningless yet paradoxically the only thing that matters, since all other competing worldviews are driven out of sight, mind, and existence. So, for instance, we continue to subscribe to the idea that political action can effect positive, meaningful change even though what actually appears before us is the political theater of profoundly dysfunctional institutions no longer able to solve fundamental problems of social organization and justice. This is the tragedy behind Lawrence Lessig’s latest TED Talk, which tantalizes the viewer (instead of the listener) with a whizbang PowerPoint show and falsely reified textual talking points but really recommends that the citizenry deal with the eternal problem of undue political influence flowing from deep pockets by throwing even more money at the problem, now siphoned from the entire population (sorta like an electoral tax). There is little danger, however, that the scales will fall from our eyes. The self-reinforcing nature of social consensus ensures that nothing outside the hologram will intrude until, at last, unavoidably, the entire, fragile daydream shatters like so many fallen tree ornaments.

I made a discomfiting observation in another venue, for which I was roundly criticized as being snarky and uncompassionate, namely, that the media and indeed the wider public were delighted, delirious, and drunk over the prospect of destruction by Sandy, the storm that recently washed over the East Coast. (I refuse to adopt any of the colorful names applied to the storm, which are merely marketing.) The excitement before, during, and after the event was palpable. Media outlets and people alike sprang into meaningless action (meaningless especially in the sense of trying to mitigate storm surge) and began chattering like finches. In the aftermath, government and corporate public relations departments lit up like newly activated phone trees, spreaching (my new portmanteau for spreading/preaching) about how well they were managing the crisis, not the least of them FEMA. After its utter failure responding to Hurricane Katrina, FEMA appears to have gotten its house in good order and become a valuable resource to the disaster stricken, at least according to reports I have gathered from some credible sources in positions to know.

I would never wish upon people the awful suffering inflicted by our now overexcited mother (Nature), nor do I take any righteous gratification in seeing those who knowingly put themselves in harm’s way struck down. But neither am I blind to the irony of so much wailing (why!? WHY!?) by those who lack normal risk aversion. Yes, cozying up to the beach is just as attractive on the Jersey Shore as in Japan and Indonesia. None of these locations can hold back waters known to inundate beaches periodically. (Nor can Venice, Italy, or the Maldives.) Those intervals are shortening, now that the water in the bathtub is simultaneously filling and sloshing. Global warming/climate change has undoubtedly won some converts now that an actual series of events provides the proof that eggheaded scientific reports and prognostications lack — at least for those blithely unable to extrapolate the obvious.

But even that isn’t really what sticks in the craw. Rather, it’s something I observed once before: our taste for destruction. Bertrand Russell observed that during WWI, the British rail stations were “crowded with soldiers, almost all of them drunk, half of them accompanied by drunken prostitutes, the other half by wives or sweethearts, all despairing, all reckless, all mad … I had supposed that most people liked money better than anything else, but I discovered that they liked destruction even better.” [quoting from John Gray’s Straw Dogs, p. 182] It’s not just TV news reporters stupidly defying gale winds and sideways rain to demonstrate verisimilitude; we all, to varying degrees, seek proving grounds and hardships against which to establish character, riding out the storm(s) if we can. The psychology points somewhere beyond need, beyond heedlessness, and is perhaps more accurately and succinctly described by Russell: mad.