Posts Tagged ‘Neologisms’

In my travels and readings upon the Intertubes, which proceed in fits and starts, I stumbled across roughly the same term — The NOW! People — used in completely different contexts and with different meanings. Worth some unpacking for idle consideration.

Meaning and Usage the First: The more philosophical of the two, this refers to those who feel anxiety, isolation, estrangement, disenfranchisement, and alienation from the world in stark recognition of the self-other problem and/or mind-body dualism. They seek to lose their identity and the time-boundedness that goes with being a separate self by entering a mental state characterized by the eternal NOW, much as animals without consciousness are believed to think. Projection forward and back more than a few moments in time is foreclosed; one simply exists NOW! Seminars and YouTube videos on radical nonduality are offered by Tony Parsons, Jim Newman, Andreas Müller, and Kenneth Madden, but according to my source (unacknowledged and unlinked), they readily admit that despite study, meditation, openness, and desire to achieve this state of mind, it is not prone to being triggered. It either happens or it doesn’t. Nonetheless, some experiences and behaviors allow individuals to transcend themselves at least to some degree, such as music, dance, and sex.

Meaning and Usage the Second: The more populist and familiar of the two, this refers to people for whom NOW! is always the proper time to do whatever the hell they most urgently desire with no consideration given to those around them. The more mundane instance is someone stopping in a doorway or on an escalator to check their phones for, oh, I dunno, Facebook updates and new e-mail. A similar example is an automobile driver over whom traffic and parking controls have no effect: someone double-parked (flashers optional) in the middle of the road or in a fire lane, someone who executes a U-turn in the middle of traffic, or someone who pointlessly jumps the line in congestion just to get a few car lengths ahead only to sit in yet more traffic. The same disregard and disrespect for others is evident in those who insist on saving seats or places in line, or on the Chicago L, those who occupy seats with bags that really belong on their laps or stand blocking the doorways (typically arms extended, looking assiduously at their phones), making everyone climb past them to board or alight from the train. These examples are all about someone commandeering public space as personal space at the anonymous expense of anyone else unfortunate enough to be in the same location, but examples multiply quickly beyond these. Courtesy and other social lubricants be damned! I want what I want right NOW! and you can go pound sand.

Both types of NOW! behavior dissolve the thinking, planning, orchestrating, strategizing mind in favor of narrowing thought and perception to this very moment. The first gives up willfulness and desire in favor of tranquility and contentedness, whereas the second demonstrates single-minded pursuit of a single objective without thought of consequence, especially to others. Both types of NOW! People also fit within the Transhumanist paradigm, which has among its aims leaving behind worldly concerns to float freely as information processors. If I were charitable about The NOW! People, I might say they lose possession of themselves by absorption into a timeless, mindless present; if less charitable, I might say that annihilation of the self (however temporary) transforms them into automatons.

The sole appeal I can imagine to retreating from oneself to occupy the eternal moment, once one has glimpsed, sensed, or felt the bitter loneliness of selfhood, is cessation of suffering. To cross over into selflessness is to achieve liberation from want, or in the Buddhist sense, Nirvana. Having a more Romantic aesthetic, my inclination is instead to go deeper and to seek the full flower of humanity in all its varieties. That also means recognizing, acknowledging, and embracing darker aspects of human experience, and yes, no small amount of discomfort and suffering. Our psycho-spiritual capacity demands it implicitly. But it takes strong character to go toward extremes of light and dark. The NOW! People narrow their range radically and may well be the next phase of human consciousness if I read the tea leaves correctly.

The English language has words for everything, and whenever something new comes along, we coin a new word. The latest neologism I heard is bolthole, which refers to the location one bolts to when collapse and civil unrest reach intolerable proportions. At present, New Zealand is reputed to be the location of boltholes purchased and kept by the ultrarich; it has the advantage of lying in the Southern Hemisphere, remote from the hoi polloi yet reachable by private plane or oceangoing yacht. Actually, bolthole is an older term now being repurposed, but it seems hip and current enough to be new coin.

Banned words are the inverse of neologisms, not in the normal sense that they simply fall out of use but in their use being actively discouraged. Every kid learns this early on when a parent or older sibling slips and lets an “adult” word pass his or her lips that the kid isn’t (yet) allowed to use. (“Mom, you said fuck!”) George Carlin made a whole routine out of dirty words (formerly) banned from TV. Standards have been liberalized since the 1970s, and now people routinely swear or refer to genitalia on TV and in public. Sit in a restaurant or ride public transportation (as I do), eavesdrop on a little of the speech within easy earshot (especially private cellphone conversations), and just count the casual F-bombs.

The worst field of banned-words nonsense is political correctness, which is intertwined with identity politics. All the slurs and epithets directed at, say, racial groups ought to be disused, no doubt, but we overcompensate by renaming everyone (“____-American”) to avoid terms that have little or no derogation. Even more ridiculous, at least one egregiously insulting term has been reclaimed as a badge of honor, an unbanned banned word, by the very group it oppresses. It takes Orwellian doublethink to hear that term — you all know what it is — used legitimately only by those allowed to use it. (I find it wholly bizarre yet fear to wade in with my own prescriptions.) Self-disparaging language, typically in a comedic context, gets an unwholesome pass, but only if one is within the identity group. (Women disparage women, gays trade on gay stereotypes, Jews indulge in jokey anti-Semitism, etc.) We all laugh and accept it as safe, harmless, and normal. President Obama is continuously mixed up in controversies over appearances (“optics”) and over what to call things — or not call them, as the case may be. For instance, his apparent refusal to call terrorism originating in the Middle East “Muslim terrorism” has been met with controversy.

I’m all for calling a thing what it is, but the term terrorism is too loosely applied to any violent act committed against (gasp!) innocent Americans. Recent events in Charleston, SC, garnered the terrorism label, though other terms would be more apt. Further, there is nothing intrinsically Muslim about violence and terrorism. Yeah, sure, Muslims have a word or doctrine — jihad — but it doesn’t mean what most think or are led to believe it means. Every religion across human history has some convenient justification for the use of force, mayhem, and nastiness to promulgate its agenda. Sometimes it’s softer and inviting, other times harder and more militant. Unlike Bill Maher, however, circumspect thinkers recognize that violence used to advance an agenda, like words used to shape narratives, is not the province of any particular hateful or hate-filled group. Literally everyone does it to some extent. Indeed, the passion with which anyone pursues an agenda is paradoxically celebrated and reviled depending on content and context, and it’s a long, slow, ugly process of sorting to arrive at some sort of Rightthink®, which then becomes conventional wisdom before crossing over into political correctness.

I have always remembered a striking line from the movie The Dancer Upstairs, where the police investigator, who is tracking the leader of Shining Path in Peru in the 1980s, says (paraphrasing from Spanish), “I think there is a revolution going on.” Elsewhere on the globe today, Arab Spring has morphed from a series of U.S.-instigated regime changes into an emerging Arab state (ISIS), though the process of establishing itself is violent and medieval. According to Tom Engelhardt, even the U.S. has a new political system rising out of the ruins of its own dysfunction. Unless I’m mistaken, a revolution is a political system being overthrown by mass uprising of the citizenry, whereas a coup is a powerful splinter within the current regime (often the military wing) seizing administrative control. What Engelhardt describes is more nearly a coup, and like the revolution in the quote above, it appears to be coalescing around us in plain sight, though that conclusion is scarcely spoken aloud. It may well be that Engelhardt has succeeded in crystallizing the moment. His five principal arguments are these:

  1. 1% Elections — distortion of the electoral system by dollars and dynasties.
  2. Privatization of the State — proper functions of the state transferred into the hands of privateers (especially mercenaries and so-called warrior corporations — nice neologism).
  3. De-legitimization of Congress and the Presidency — fundamental inability to govern, regulate, and/or prosecute at the Federal level, opening up a power vacuum.
  4. Rise of the National Security State (Fourth Branch of Government) — the dragnet complex revealed (in part) by whistle-blower Edward Snowden but plain to see post-9/11.
  5. Demobilization of the American People — surprising silence of the public in the face of such unwholesome developments.

Please read the article for yourself, which is very well written. (I am no great fan of the journalistic style but must acknowledge that Engelhardt’s work is terrific.) I especially like Engelhardt’s suggestion that a grand conspiracy (e.g., New World Order) is not necessary but that instead it’s all being improvised on the run. Let me offer a couple observations of my own.

Power has several attributes, such as the position to influence events, the resources to get things done, and the ability to motivate (or quell) the public through active management of perception. High offices (both government and boardroom, both elected and appointed) are the positions, the U.S. Treasury and the wealth of the 1% are the resources, and charismatic storytelling (now outright lying) is management of perception. Actors (a word chosen purposely) across the American stage have been maneuvering for generations to wield power, often for its own sake but more generally in the pursuit of wealth. One might assume that once personal wealth has been acquired, motivations would slacken, but instead, in not a few psychopaths, they divert to the maniacal building of multigenerational dynasties.

Pulling the levers of state in one capacity or another is a timeworn mechanism for achieving the proxy immortality of the American statesman. However, as dysfunction in the political arena has grown, corporations (including banks) have assumed the reins. Despite corporate personhood being conferred and recently expanded, largely via judicial fiat, the profit motive has reasserted itself as primary, since there is no such thing as a fully self-actualized corporation. Thus, we have the Federal Reserve System acting as a de facto corporation within government — but without conscience. Multiply that hundreds of times over and voilà: an American corporatocracy.

The effect has been extrapolated in numerous movies and television shows, all offering dystopic warnings of things to come where people, domestic and alien, are all expendable as power seeks to perpetuate itself. How far this can go before financial collapse, climate change, energy scarcity, or a host of other looming calamities overtakes us is yet to be seen. Some hold out hope for true revolution, but I believe that possibility has been contained. Considering how the world has been accelerating toward ecocide, I venture that at most a few more decades of desperate negotiations with fate are in store for us. Alternatively, I find it entirely feasible that the delicate web of interconnections that maintains life in all its manifestations could suffer a phase shift rather quickly, at which point all bets are off. Either way, in no one’s wildest imagination could our current civilization be considered the best we can do, much less the best of all possible worlds.

I am, as usual, late getting to the latest controversy in academe, which has been argued to death before I even became aware of it. Inside Higher Ed appears to have gotten there first, followed by editorials at The New York Times, The Los Angeles Times, and The Washington Post. At issue are trigger warnings, a neologism for what might otherwise be called parental advisories (thinking in loco parentis here), to be placed in syllabi and on classroom materials (at first fiction readings, but potentially history lessons and, frankly, just about anything else) that might trigger a panic attack or some other dolorous response from a student with a phobia or memory of a traumatic experience. The opinion articles linked above (Inside Higher Ed is more purely reporting) are all in agreement that trigger warnings are a bad idea.

Although articles in news organs are more nearly broadcasting and thus lack discussion (unless one ventures into the swamp of the comments section, which I rarely do), I indulged in a long discussion of the subject with fellow alumni of one of the institutions named in the reports. As with other issues, it developed so many facets that a snapshot determination became impossible if one attempted to accommodate or address all perspectives. Therein lies the problem: accommodation. Left-leaning liberals are especially prone to hypersensitivity to identity politics, which gained prominence in the late 1970s or early 80s. I quickly run afoul of anyone who takes such a perspective because I am notoriously white, male, well-educated, and middle class, so I must constantly “check my privilege.” When someone like me refuses others accommodation, it looks to them like raising the ladder behind me after I’ve safely ascended. I can appreciate, I think, how frustrating it must be to have one’s earnestness thwarted, and yet I admit I just don’t get it. At the risk of offending (trigger warning here), let me blunder ahead anyway.

The world (or as I’m beginning to call it more simply, reality) is a messy place, and each of us inevitably carries some phobia, trauma, or history that is unsavory. From one celebrated perspective, what doesn’t kill us makes us stronger; from another, we are trained to request accommodation. Accommodation used to be primarily for physical disabilities; now it’s for feelings, which some argue are just as debilitating. This is the province of every self-selected minority and special interest group, which has spawned predictable backlashes among various majority groups (e.g., the men’s movement, resurgent white supremacy). Naturally, any lobby, whether part of a minority or majority, will seek to promote its agenda, but I regard the brouhaha over trigger warnings as an example of the growing incidence of what’s been called the Strawberry Generation. It’s remarkable that students now regard themselves as dainty flowers in need of special protection lest they be trampled by, well, reality. So trigger warnings are being requested by students themselves, not on their behalf. With so many examples throughout even recent history of flagrant social injustice and oppression, it’s clear that everyone wants to proclaim their special combination of disadvantages and receive accommodation, all the better if multiplied by inclusion in several protected classes. It’s a claim of victimhood before the fact, or perhaps permanent victimhood if one is a survivor of some nastiness. (Disclaimer: real oppression and victimhood do exist, which I don’t intend to minimize, but they’re not caused by reading fiction or learning history, scandalous as those may be.)

In addition, what exactly is accomplished by virtue of warnings that one is about to encounter — what should it be called — messy material? Does one steel oneself against impact and thus limit its educational value, or does one expect to be excused from facing reality and receive an alternative assignment minus the offending material? Both are the antithesis of higher education. Arguments in the abstract are easy to ignore, so here are two specific examples: substitution or elimination of the words nigger and injun in modernized editions of Mark Twain’s Adventures of Huckleberry Finn, and biology textbooks that give consideration to (literally) unscientific accounts of creation and evolution. If one’s racial or religious background gives rise to excess discomfort over the use of one or another egregious trigger word (nigger in particular now having been reclaimed and repurposed for all sorts of uses, but with caveats) born of ripe historical context, or over what science (as opposed to religion) teaches, well, that’s the world (reality) we live in. Sanitizing education to avoid discomfort (or worse) does no one any favors. Discomfort and earnest questioning are inevitable if one is to learn anything worthwhile in the course of getting an education.

Peter Van Buren has a new book out and is flogging it at TomDispatch. He’s a good enough writer, so I have no objection to the promotional aspect of disseminating his own work. But as I read his article describing an America gone to seed, I realized that for all his writerly skill, he misses the point. As a former State Dept. administrator (charged with assisting Iraqi reconstruction) turned whistle-blower, Van Buren is clearly outside the mainstream media and somewhat outside mainstream opinion, yet he appears to be well within the dominant paradigm. His new spin on regime change takes as implicit all the teachings of economics and politics as systems ideally suited to engineering an equitable social contract where everyone benefits. But as cycles of history have shown, those systems are even more prone to manipulation by a power elite who care little about the people they pretend to serve. Whether that carelessness is learned or ingrained in the kleptocracy (er, plutocracy) is open to debate.

Van Buren’s article offers a few interesting tidbits, including a couple neologisms (I’m always on the lookout for new coin):

dirt shadow = the faint but legible image left behind by an uninstalled sign on the exterior of a closed storefront or building

street gravy = the dirt and grunge that collects over time on a homeless person

Neither is too picturesque. The second is obviously a (sad because it’s too hip) euphemism, since gravy suggests richness whereas the actuality is downright unpleasant. As Van Buren surveys, similar unpleasantness is currently experienced all across America in towns and niche economies that have imploded. Interestingly, his counterexample is a U.S. Marine Corps base, Camp Lejeune in North Carolina, which functions as a gated community with the added irony that it is supported by public funds. Van Buren also notes that, according to the Congressional Budget Office, an average active-duty service member receives a benefits and pay compensation package estimated to be worth $99,000, some 60 percent of it in noncash compensation.

As to the cause of our regime’s disarray, however, Van Buren busies himself with standard economic and political (one might even say military-industrial) explanations, demonstrating an inability to frame the decline of empire as the beginning of an epochal shift away from plentiful energy resources, famously termed The Long Emergency by James Howard Kunstler. (We ought to resurrect that phrase.) Other frames of reference are certainly not without their impacts, but the inability to connect all the dots to see the underlying cause is commonplace in the mainstream.

In contrast, consider this passage from Harvesting the Biosphere by Vaclav Smil:

There are two simple explanations why food production in traditional agricultural societies — despite its relatively high need for claiming new arable land — had a limited impact on natural ecosystems: very low population growth rates and very slow improvements in prevailing diets. Population growth rates averaged no more than 0.05% during the antiquity and they reached maxima of just 0.07% in medieval Eurasia — resulting in very slow expansion of premodern societies: it took Europe nearly 1,500 years to double the population it had when Rome became an empire, and Asian doubling was only a bit faster, from the time of China’s Han dynasty to the late Ming period. [pp. 118–119]

Smil goes on to provide exhaustive detail, much of it measurement (with acknowledged ranges of error), showing how modern mechanisms and energy exploitation have enabled rapid population growth. Although population has (apparently) not yet peaked, we are already sliding back down the energy gradient we climbed over the past 250 years and will soon enough face widespread food shortages (among other things) as productivity plummets due to diminishing energy inputs and accumulated environmental destruction (including climate change). Economics and politics do not possess solutions to that prospect. That’s the real dirt in the historical narrative, which remains largely uncovered (unreported) by paradigmatic thinking.

I have watched, blogged about, and embedded my share of TED Talks over the years I’ve been active as a blogger. As a cottage industry for the cognoscenti, TED is an impressive, multinational undertaking fed by lots of impressive scientists, researchers, and know-it-alls. What’s not to like? One spends about 12 min. watching and listening to someone simplify complex issues, recommend a handful of impossible solutions, and make promises to roll up one’s sleeves and do the hard work, but then one forgets about it all the next day. Or one might watch another talk, or a series of talks, to get the brain goosed before moving on. The most succinct criticism I have heard of the phenomenon called it insight porn. So after my initial excitement with TED and its voluminous insights, I grew more and more skeptical of various speakers’ claims. This was especially true after I learned that Allan Savory admitted to having been instrumental in the wholesale murder of 40,000 African elephants — a massacre undertaken with the full authority of the scientific consensus of the time (a consensus Savory is now attempting to reverse — oops, his bad).

In light of my misgivings about TED, I found it curious that a TED Talk by Benjamin Bratton has attracted lots of positive attention. His subject is TED Talks themselves, and he makes his case in a TED Talk, which is reproduced as a transcript in an article by The Guardian. There is much to admire in Bratton’s analysis but more perhaps to find objectionable. He occupies a difficult position, both behind the curtain and before it, which creates recursive conflicts. (I’m rather fond myself of busting through rhetorical frames, but I try not to stand inside them at the same time.) Further, I read the article before watching the talk in video form and found quite a difference between the two media. The transcript, free of visual distractions and audience response to the jokes, allowed me to penetrate much deeper. I suspect that a large part of that is not being entrained by sympathetic response.

Bratton offers several worthwhile insights, among them the cult of the solution (my term, not his), which becomes hypothetical since implementation rarely goes forward. This is true especially when it comes to reconceptualizing fondly held myths and self-delusions about what science and sociology can really do for our understandings about the ways the world really works. That’s one of Bratton’s related criticisms: the nonworkability of TED Talk solutions. It’s ironic that Bratton observes so many TED Talks foundering on practicality when at the same time his initial anecdote, about a scientific presentation that failed to motivate its listener (the advice: be more like Malcolm Gladwell and engage the emotions, transforming epiphany into entertainment), receives serious derision. There’s room for both interpretations, but some circumspection is needed here. Nothing could be more obvious than how various political and corporate entities have succeeded in motivating the public into action or inaction through propaganda, marketing campaigns, and emotional manipulation, whereas the sober, objective posture of scientific inquiry has failed utterly to get action on an extremely pressing civilizational disaster quite different from the one Bratton envisions. (Regular readers of this blog know what I’m referring to.) Bratton actually takes note of GOP dominance of messaging (what he calls bracketing reality) but fails to connect the dots.

The T-E-D in TED

Bratton also objects that the technology-education-design orientation of TED ought to be instead tech-econ-design. This reconfiguration got by my internal bullshit sensor on video, but the transcript fares less well. For instance, he notes that TED often delivers placebo technoradicalism (no lack of clever word formations) but concludes that it fails precisely because we’re too timid to actually embrace technology with enough gusto. He doesn’t seem to get that technological advance is at best mixed and at worst disastrous (e.g., hypercomplexity, WMDs, nuclear mismanagement and accidents, overconsumption of resources leading to population overshoot leading back to overconsumption). Bratton apparently still worships the tech idol, failing to recognize that the world it has delivered serves us rather poorly. Further, neither education nor economics has turned out to be much of a panacea for our most intractable problems. I’m unsure what Bratton thinks exchanging one for the other in the context of TED will accomplish. Last, redefining design as immunization rather than innovation sounds like something worthwhile but is the same hypothetical, unrealizable solution he criticizes earlier. The notion that we can engineer our way out of problems if only designers wise up and others stop gaming systems for profit or self-aggrandizement begs quite a lot of questions that go unaddressed.

In spite of his disavowal of a simple takeaway, Bratton offers standard PoMo word salad filled with specialized jargon:

… it’s not as though there is a shortage of topics for serious discussion. We need a deeper conversation about the difference between digital cosmopolitanism and cloud feudalism … I would like new maps of the world, ones not based on settler colonialism, legacy genomes and bronze age myths, but instead on something more … scalable … we need to raise the level of general understanding to the level of complexity of the systems in which we are embedded and which are embedded in us. This is not about “personal stories of inspiration”, it’s about the difficult and uncertain work of demystification and reconceptualisation: the hard stuff that really changes how we think …

Really changes how we think? For someone who writes about “entanglements of technology and culture,” Bratton has a serious misconception about how the brainchild of the Enlightenment can ultimately win the day and deliver salvation from ourselves. Myth and story and narrative are not mistakes to be replaced by rationalism; they are who and how we are in the world. They don’t go away by wagging the bony finger of science/tech but are only slowly forgotten, revised, and replaced by yet other myths, stories, and narratives. The depth of our entrenchment in such cultural baggage is one of the very things forestalling change. We can peer behind the curtain or see from outside the bubble sometimes, but we can’t escape them. We never could.

It’s been a long while since I’ve written about neologisms. They come across my radar with some regularity, though I don’t bother to collect them. It’s arguable, too, that since most neologisms arise in pop and hipster culture, there is no point to referring to this post as a “pop edition.” Those caveats in place, here goes:

The old prank of directing visitors to NYC to addresses on Avenue of the Americas got an update. Now the joke is to offer a restaurant recommendation: the Umbrella Room. Turns out this in-joke actually refers to one of the street vendor carts selling pretzels or hot dogs. I suppose the joke is especially gratifying in two parts: first when a New Yorker initiates it, then when the poor sap asks a second New Yorker for help locating the damn place. I think maybe I heard this term in a movie.

Considering Miley Cyrus has been on a graceless promotional bender for most of the year, I finally got around to learning what twerking is. She didn’t originate the move (dance? really?), but she’s probably more closely associated with the term than anyone else. Like other bits of Cyrus ephemera lodged in my brain, I’m none too happy to have my mind colonized by her nonsense. But with the media gaze still firmly fixed on her, the latest pop-tart sensation, it’s inevitable that some of her antics penetrate my defenses.

Editors of the estimable Oxford English Dictionary have named selfie the word of the year. Really? Word of the year? What is this, seventh grade? Google provides trend analysis for those who care. The meaning is utterly unimportant, and I’ve not bothered to provide a definition. Other than admitting new terms into the dictionary, when did such fluff warrant the attention of OED editors?

The most interesting one by far (for me, at least) is neckbeard, which refers (variously) to a nerdy enthusiast who doesn’t bother shaving his neck. It’s appeared derisively in several columns and blogs I read, though without apparent provocation or context. I especially like a definition found at Urban Dictionary:

Talkative, self-important nerdy men (usually age 30 and up) who, through an inability to properly decode social cues, mistake others’ strained tolerance of their blather for evidence of their own charm.

Other associations include excessive video gaming and social awkwardness. Is it only a matter of time before a feminine equivalent appears?

Update: I forgot to mention one that isn’t new coin exactly but is new to me, namely, four on the floor. This refers to the driving beat in dance music that is consistently weighted across the standard four-beat pattern, as opposed to the more traditional back-beat emphasis on two and four. It is no surprise to me that, while being a rather sophisticated musician, I’d never heard this term. Reason being, I don’t dwell on pop or dance or synth or rock. My tastes run more to classical and jazz. Thus, I fare very poorly at karaoke not because I can’t carry a tune but because I frankly don’t know many of the songs to sing. Plus, what’s being produced these days has little of the appealing tunefulness of, say, the Great American Songbook.