Posts Tagged ‘Celebrity’

The movie Gladiator depicts the protagonist Maximus addressing spectators directly at gladiatorial games in a provincial Roman arena with this meme-worthy challenge: “Are you not entertained?” Setting the action in an ancient civilization renowned for its decadent final phase prior to collapse, referred to as Bread and Circuses, allows us to share vicariously in the protagonist’s righteous disgust with the public’s blood lust while shielded from any implication of our own shame because, after all, who could possibly entertain blood sports in the modern era? Don’t answer that.


But this post isn’t about our capacity for cruelty and barbarism. Rather, it’s about the public’s insatiable appetite for spectacle — both fictional and absolutely for real — served up as entertainment. Professional wrestling is fiction; boxing and mixed martial arts are reality. The dynamic of audiences consuming base entertainment and, in the process, depleting the performers who provide it extends well beyond combat sports, however. For instance, it’s not uncommon for pop musicians to slowly destroy themselves once pulled into the attendant celebrity lifestyle. Three examples spring to mind: Elvis Presley, Michael Jackson, and Whitney Houston. Others, such as Britney Spears, Miles Davis, and Barbra Streisand, go on hiatus or retire altogether under the pressure of public performance.

To say that the public devours performers and discards what remains of them is no stretch, I’m afraid. Who remembers countdown clocks tracking when female actors turn 18 so that perving on them is at last okay? A further example is the young starlet who is presumably legitimized as a “serious” actor once she does nudity and/or portrays a hooker but is then forgotten in favor of the next. If one were to seek the full depth of such devouring impulses, I suggest porn is the industry to have all one’s illusions shattered. For rather modest sums, there is absolutely nothing some performers won’t do on film (these days on video at RedTube), and naturally, there’s an audience for it. Such appetites are as bottomless as they come. Are you not entertained?

Speaking of Miles Davis, I take note of his hiatus from public performance in the late 1970s before his limited return to the stage in 1981 and early death in 1991 at age 65. He had cemented a legendary career as a jazz trumpeter but in interviews (as memory serves) dismissed the notion that he was somehow a spokesperson for others, saying dryly “I’m just a trumpet player, man ….” What galled me, though, were Don Cheadle’s remarks in the liner notes of the soundtrack to the biopic Miles Ahead (admittedly a deep pull):

Robert Glasper and I are preparing to record music for the final scene of Miles Ahead — a possible guide track for a live concert that sees the return of Miles Davis after having been flushed from his sanctuary of silence and back onto the stage and into his rightful light. My producers and I are buzzing in disbelief about what our audacity and sheer will may be close to pulling off ….

What they did was record a what-might-have-been track had Miles incorporated rap or hip hop (categories blur) into his music. It’s unclear to me whether the “sanctuary of silence” was inactivity or death, but Miles was essentially forced onstage by proxy. “Flushed” is a strange word to use in this context, as one “flushes” an enemy or prey unwillingly from hiding. The decision to recast him in such “rightful light” strikes me as being in rather poor taste — a case of cultural appropriation worse than merely donning a Halloween costume.

This is the wave of the future, of course, now that images of dead celebrities can be invoked, say, to sell watches (e.g., Steve McQueen) and holograms of dead musicians are made into singing zombies, euphemized as “virtual performance” (e.g., Tupac Shakur). Newly developed software can now create digitized versions of people saying and doing whatever we desire of them, such as when celebrity faces are superimposed onto porn actors (called “deepfakes”). It might be difficult to argue that in doing so content creators are stealing the souls of others, as used to be believed in the early days of photography. I’m less concerned with those meeting demand than with the demand itself. Are we becoming demons, the equivalents of the succubus/incubus, devouring or destroying frivolously the objects of our enjoyment? Are you not entertained?


I watched a documentary on Netflix called Jim & Andy (2017) that provides a glimpse behind the scenes of the making of Man on the Moon (1999) where Jim Carrey portrays Andy Kaufman. It’s a familiar story of art imitating life (or is it life imitating art?) as Carrey goes method and essentially channels Kaufman and Kaufman’s alter ego Tony Clifton. A whole gaggle of actors played earlier incarnations of themselves in Man on the Moon and appeared as themselves (without artifice) in Jim & Andy, adding another weird dimension to the goings on. Actors losing themselves in roles and undermining their sense of self is hardly novel. Regular people lose themselves in their jobs, hobbies, media hype, glare of celebrity, etc. all the time. From an only slightly broader perspective, we’re all merely actors playing roles, shifting subtly or dramatically based on context. Shakespeare observed it centuries ago. However, the documentary points to a deeper sense of unreality precisely because Kaufman’s principal shtick was to push discomfiting jokes/performances beyond the breaking point, never dropping the act to let his audience in on the joke or provide closure. It’s a manifestation of what I call the Disorientation Protocol.


I’m a little gobsmacked that, in the aftermath of someone finally calling out the open secret of the Hollywood casting couch (don’t know, don’t care how this news cycle started) and netting Harvey Weinstein in the process, so many well-known actors have added their “Me, too!” to the growing scandal. Where were all these sheep before now? As with Bill Cosby and Bill Clinton, what good does it do to allow a serial abuser to continue unchallenged until years, decades later a critical mass finally boils over? I have no special knowledge or expertise in this area, so what follows is the equivalent of a thought experiment.

Though the outlines of the power imbalance between a Hollywood executive and an actor seeking a role (or other industry worker seeking employment) are pretty clear, creating a rich opportunity for the possessor of such power to act like a creep or a criminal, the specific details are still a little shrouded — at least in my limited consumption of the scandal press. How much of Weinstein’s behavior veers over the line from poor taste to criminality is a difficult question precisely because lots of pictorial evidence exists showing relatively powerless people playing along. It’s a very old dynamic, and its quasi-transactional nature should be obvious.

In my idealized, principled view, if one has been transgressed, the proper response is not to slink away or hold one’s tongue until enough others are similarly transgressed to spring into action. The powerless are duty bound to assert their own power — the truth — much like a whistleblower feels compelled to disclose corruptions of government and corporate sectors. Admittedly, that’s likely to compound the initial transgression and come at some personal cost, great or small. But for some of us (a small percentage, I reckon), living with ourselves in silent assent presents an even worse option. By way of analogy, if one were molested by a sketchy uncle and said nothing, I can understand just wanting to move on. But if one said nothing yet knew the sketchy uncle had more kids lined up in the extended family to transgress, then stepping up to protect the younger and weaker would be an absolute must.

In the past few decades, clergy of the Catholic Church sexually abused many young people and deployed an institutional conspiracy to hide the behaviors and protect the transgressors. Exposure should have broken trust bonds between the church and the faithful and invalidated the institution as an abject failure. Didn’t quite work out that way. Similar scandals and corruption across a huge swath of institutions (e.g., corporate, governmental, military, educational, entertainment, and sports entities) have been appearing in public view regularly, yet as a culture, we tolerate more creeps and criminals than we shame or prosecute. (TomDispatch.com is one of the sites that regularly reports these corruptions with respect to American empire; I can scarcely bear to read it sometimes.) I suspect part of that is a legitimate desire for continuity, to avoid burning down the house with everyone in it. That places just about everyone squarely within the “Me, too!” collective. Maybe I shouldn’t be so gobsmacked after all.

Caveat: This thought experiment definitely comes from a male perspective. I recognize that females view these issues quite differently, typically in consideration of far greater vulnerability than males experience (excepting the young boys in the Catholic Church example).

The Internet is now a little more than two decades old (far more actually, but I’m thinking of its widespread adoption). Of late, it’s abundantly clear that, in addition to being a wholesale change in the way we disseminate and gather information and conduct business, we’re running live social experiments bearing psychological influence, some subtle, some invasive, much like the introduction of other media such as radio, cinema, and TV back in the day. About six years ago, psychologists coined the term digital crowding, which I just discovered, referring to an oppressive sense of knowing too much about people, which in turn provokes antisocial reactions. In effect, it’s part of the Dark Side of social media (trolling and comments sections being other examples), one of numerous live social experiments.

I’ve given voice to this oppressive knowing-too-much on occasion by wondering why, for instance, I know anything — largely against my will, mind you — about the Kardashians and Jenners. This is not the sole domain of celebrities and reality TV folks but indeed anyone who tends to overshare online, typically via social media such as Facebook, less typically in the celebrity news media. Think of digital crowding as the equivalent of seeing something you would really prefer not to have seen, something no amount of figurative eye bleach can erase, something that now simply resides in your mind forever. It’s the bell that can’t be unrung. The crowding aspect is that now everyone’s dirty laundry is getting aired simultaneously, creating pushback and defensive postures.

One might recognize in this the familiar complaint of Too Much Information (TMI), except that the information in question is not the discomfiting stuff such as personal hygiene, medical conditions, or sexual behaviors. Rather, it’s an unexpected over-awareness of everyone’s daily minutiae as news of it presses for attention and penetrates our defenses. Add it to the deluge that is causing some of us to adopt information avoidance.

I don’t have the patience or expertise to prepare and offer a detailed political analysis such as those I sometimes (not very often) read on other blogs. Besides, once the comments start filling up at those sites, every possible permutation is trotted out, muddying the initial or preferred interpretation with alternatives that make at least as much sense. They’re interesting brainstorming sessions, but I have to wonder what is accomplished.

My own back-of-the-envelope analysis is much simpler and probably no closer to (or farther from) being correct, what with everything being open to dispute. So the new POTUS was born in 1946, which puts the bulk of his boyhood in the 1950s, overlapping with the Eisenhower Administration. That period has lots of attributes, but the most significant (IMO), which would impact an adolescent, was the U.S. economy launching into the stratosphere, largely on the back of the manufacturing sector (e.g., automobiles, airplanes, TVs, etc.), and creating the American middle class. The interstate highway system also dates from that decade. Secondarily, there was a strong but misplaced sense of American moral leadership (one might also say authority or superiority), since we took (too much) credit for winning WWII.

However, it wasn’t great for everyone. Racism, misogyny, and other forms of bigotry were open and virulent. Still, if one was lucky enough to be a white, middle-class male, things were arguably about as good as they would get, which many remember rather fondly, whether through rose-colored glasses or otherwise. POTUS as a boy wasn’t middle class, but the culture around him supported a worldview that he embodies even now. He’s also never been an industrialist, but he is a real estate developer (some would say slumlord) and media figure, and his models are taken from the 1950s.

The decade of my boyhood was the 1970s, which spanned the Nixon, Ford, and Carter Administrations. Everyone could sense the wheels were already coming off the bus, and white male entitlement was far diminished from previous decades. The Rust Belt was already a thing. Like children from the 1950s forward, however, I spent a lot of time in front of the TV. Much of it was goofy fun such as Gilligan’s Island, The Brady Bunch, and interestingly enough, Happy Days. It was innocent stuff. What are the chances that, as a boy plopped in front of the TV, POTUS would have seen the show below (excerpted) and taken special notice considering that the character shares his surname?

Snopes confirms that this is a real episode from the TV show Trackdown. Not nearly as innocent as the shows I watched. The coincidences that the character is a con man, promises to build a wall, and claims to be the only person who can save the town are eerie, to say the least. Could that TV show be lodged in the back of POTUS’ brain, along with so many other boyhood memories, misremembered and revised the way memory tends to do?

Some have said that the great economic expansion of the 1950s and 60s was an anomaly: a constellation of conditions that combined to produce an historical effect, a Golden Era by some reckonings, that cannot be repeated. We simply cannot return to an industrial or manufacturing economy that had once (arguably) made America great. And besides, the attempt would accelerate the collapse of the ecosystem, which is already in free fall. Yet that appears to be the intention of POTUS, whose early regression to childhood is a threat to us all.

I discovered “The Joe Rogan Experience” on YouTube recently and have been sampling from among the nearly 900 pod- or webcasts posted there. I’m hooked. Rogan is an impressive fellow. He clearly enjoys the life of the mind but, unlike many who are absorbed solely in ideas, has not ignored the life of the body. Over time, he’s also developed expertise in multiple endeavors and can participate knowledgeably in discussion on many topics. Webcasts are basically long, free-form, one-on-one conversations. This lack of structure gives the webcast ample time to explore topics in depth or simply meander. Guests are accomplished or distinguished in some way and usually have fame and wealth to match, which often affects content (i.e., Fitzgerald’s observation: “The rich are different from you and me”). One notable prerequisite for an invitation appears to be a strong media presence.

Among the recurring themes, Rogan trots out his techno-optimism, which is only a step short of techno-utopianism. His optimism is based on two interrelated developments in recent history: widespread diffusion of information over networks and rapid advances in medical devices that can be expected to accelerate, to enhance human capabilities, and soon to transform us into supermen, bypassing evolutionary biology. He extols these views somewhat regularly to his guests, but alas, none of the guests I’ve watched seem able to fathom the ideas well enough to take up the discussion. (The same is true of Rogan’s assertion that money is just information, which is reductive and inaccurate.) They comment or joke briefly and move on to something more comfortable or accessible. Although I don’t share Rogan’s optimism, I would totally engage in discussion of his flirtation with Transhumanism (a term he doesn’t use). That’s why I’m blogging about Rogan here, in addition to the fact that I lack the conventional distinction and fame to score an invitation to be a guest on his webcast. Plus, he openly disdains bloggers, many of whom moderate comments (I don’t) or otherwise channel discussion to control content. Oh, well.


/rant on

With a new round of presidential debates upon us (not really debates if one understands the nature of debate or indeed moderation — James Howard Kunstler called it “the gruesome spectacle of the so-called debate between Trump and Clinton in an election campaign beneath the dignity of a third-world shit-hole”), it’s worthwhile to keep in the front of one’s mind that the current style of public discourse does not aim to provide useful or actionable information with regard to either the candidates or the issues. Rather, the idea is to pummel the hapless listener, watcher, or reader into a quivering jangle of confusion by maintaining a nonstop onslaught of soundbites, assertions, accusations, grandstanding, and false narratives. Our information environment abets this style of machine-gun discourse, with innumerable feeds from InstaGoogTwitFaceTube (et cetera), all vying simultaneously for our limited attention and thereby guaranteeing that virtually nothing makes a strong impression before the next bit of BS displaces it in a rapid succession of predigested morsels having no nutritional content or value for earnest consumers of information (as opposed to mouth-breathers seeking emotional salve for their worst biases and bigotry). Many feeds are frankly indecipherable, such as when the message is brutally truncated and possessed of acronyms and hashtags, the screen is cluttered with multiple text scrolls, or panel participants talk over each other to claim more screen time (or merely raise their asshole quotient by being the most obnoxious). But no matter so long as the double barrels keep firing.

I caught Republican nominee Donald Trump’s campaign manager Kellyanne Conway being interviewed by some banal featherweight pulling punches (sorry, no link, but she’s eminently searchable). Conway proved adept at deflecting obvious contradictions and reversals (and worse) of the Trump campaign by launching so many ideological bombs that nothing the interviewer raised actually landed. Questions and conflicts just floated away, unaddressed and unanswered. Her bizarre, hyperverbal incoherence is similar to the candidate’s stammering word salad, and ironically, both give new meaning to the decades-old term “Teflon” when applied to politics. Nothing sticks because piling on more and more complete wrongness and cognitive dissonance overwhelms and bewilders anyone trying to track the discussion. Trump and Conway are hardly alone in this, of course, though their mastery is notable (but not admirable). Talking heads gathered in panel discussions on, say, The View or Real Time with Bill Maher, just about any klatch occupying news and morning-show couches, and hosts of satirical news shows (some mentioned here) exhibit the same behavior: a constant barrage of high-speed inanity (and jokes, omigod the jokes!) that discourages consideration of an idea before driving pell-mell onto the next.

Thoughtful persons might pause to wonder whether breathless, even virtuoso delivery results from or creates our abysmally short attention spans and lack of serious discussion of problems plaguing the nation. Well, why can’t it be both? Modern media is all now fast media, delivering hit-and-run spectacle to overloaded nervous systems long habituated to being goosed every few moments. (Or as quoted years ago, “the average Hollywood movie has become indistinguishable from a panic attack.”) Our nervous systems can’t handle it, obviously. We have become insatiable information addicts seeking not just the next fix but a perpetual fix, yet the impatient demand for immediate gratification — Internet always at our fingertips — is never quelled. Some new bit will be added to the torrent of foolishness sooner than it can be pulled down. And so we stumble like zombies, blindly and willingly, into a surreality of our own making, heads down and faces blue from the glare of the phone/tablet/computer. Of course, the shitshow is brightly festooned with buffoon candidates holding court over the masses neither intends to serve faithfully in office. Their special brand of insanity is repeated again and again throughout the ranks of media denizens (celebrity is a curse, much like obscene wealth, or didn’t you know that?) and is seeping into the ground water to poison all of us.

/rant off

A couple of posts ago, I used the phrase “pay to play” in reference to our bought-and-paid-for system of political patronage. This is one of those open secrets we all recognize but gloss over because, frankly, in a capitalist economy, anything that can be monetized and corrupted will be. Those who are thus paid to play enjoy fairly handsome rewards for doing not very much, really. Yet the paradigm is self-reinforcing, much like the voting system, with its promises of increased efficiency and effectiveness at greater levels of participation. Nothing of the sort has proven to be true; it’s simply a goad we continue to hear, some believing in the carrot quite earnestly, others holding their noses and ponying up their dollars and votes, and still others so demoralized and disgusted with the entire pointless constellation of lies and obfuscations that refusing to participate feels like the only honest response. (Periodic arguments lobbed my way that voting is quite important have failed to convince me that my vote matters a whit. Rather, it takes a bizarre sort of doublethink to conclude that casting my ballot is meaningful. Of late, I’ve succumbed to sustained harangues and shown up to vote, but my heart’s not in it.) I can’t distinguish so well anymore between true believers and mere manipulators except to observe that the former are more likely to be what few civic-minded voters remain and the latter are obviously candidates and their PR hacks. Journalists? Don’t get me started.

The phrase put me in mind of two other endeavors (beyond politics) where a few professionals enjoy being paid to play: sports and performing arts. Both enjoy heavy subscription among the masses early in life, as student sports and performing groups offer training and experience. The way most of us start out, in fact, we actually pay to play through classes, lessons, training, dues, and memberships that provide access to experts and put us in position to reap rewards later in life. Maybe you attended tennis camp or music camp as a kid, or you paid for a college education (defrayed perhaps by activity scholarships) majoring in athletics or theater. Lots of variations exist, and they’re not limited to youth. As an endurance athlete, I continue to pay entrance fees to race organizers for the opportunity to race on courses with support that would otherwise be unavailable without the budget provided by participants, sponsorship notwithstanding. Chicago’s popular 16-inch softball leagues are pay-to-play sports.

A second phase might be giving it away for free. As with paying to play, pure enjoyment of the endeavor works as a strong motivation and justification. This is probably more common in the community-level performing arts, where participation is just plain fun. And who knows? Exposure might lead to a big break or discovery. It’s also what motivates quite a lot of amateur athletes, especially for sports that have not gone mainstream. Olympic athletes (tertiary events) might fall roughly into this category, especially when their primary incomes are derived elsewhere. A third phase is being paid to play. If the audience or fan base is big enough, the financial rewards and fame can be considerable. However, those who enter the professional ranks don’t always demonstrate such great prowess, especially early on. More than a few blow up and flame out quickly, unable to sustain the spark that launched their careers. There’s also being paid to play but earning well short of a livable wage, which borders on giving it away for free or at least for too little. A final phase is being paid not to play. A mean interpretation of that would be that one is screwing up or blocking others’ opportunities to the point where it becomes worthwhile to pay someone to not show up or to go away. A more charitable interpretation would be that one’s employment contract includes time-off benefits that require continuous payments even when not playing.

As with my post about the differences between the Participation, Achievement, and Championship Models, I’m now content for numerous endeavors to be pay to play, play for free, or play for too little. Participation makes it worthwhile under any payment regime, the alternative typically being sitting at home on my couch wasting my time in front of the TV. I never made it to the enviable position of being paid to play (full-time, anyway) or paid not to play. Still, as an individual of some attainment and multiple areas of expertise, I admit finding it irksome to observe some truly awful people out there pulling in attention and wealth despite rather feeble efforts or abilities. The meritocracy may not be dead, but it often looks comatose.

I’m not paying close attention to the RNC in Cleveland. Actually, I’m ignoring it completely, still hoping that it doesn’t erupt in violence before the closing curtain. Yet I can’t help but hear some relevant news, and I have read a few commentaries. Ultimately, the RNC sounds like a sad, sad nonevent put on by amateurs, with many party members avoiding coming anywhere near. What’s actually transpiring is worthy of out-loud laughter at the embarrassment and indignities suffered by participants. The particular gaffe that caught my attention is cribbing from Michelle Obama in the speech delivered on Monday by Melania Trump. The speech writer, Meredith McIver, has accepted blame for it and characterized it as an innocent mistake.

Maybe someone else has already said or written this, but I suspect the claim of innocent plagiarism is probably true precisely because that’s the standard in quite a lot of academe these days. Students get away with it all the time, just not on a national stage. Reworking another’s ideas is far easier than coming up with one’s own original ideas, and Melania Trump has no reputation (near as I can tell) as an original thinker. The article linked to above indicates she admires Michelle Obama, so the plagiarism is, from a twisted perspective, an encomium.

The failure of Trump campaign officials to review the speech (or if they did review it, then do so effectively) is another LOL gaffe. It doesn’t rise to the level of the McCain campaign’s failure to vet Sarah Palin properly and won’t have any enduring effects, but it does reflect upon the Trump campaign’s ineptitude. My secret fear is that ineptitude is precisely why a lot of folks intend to vote for Trump: so that he can accelerate America’s self-destruction. It’s a negative view, and somewhat devil-may-care, to say “let’s make it a quick crash and get things over with already.” Or maybe it’s darkly funny only until suffering ramps up.

The first Gilded Age in the U.S. and the Roaring Twenties were periods that produced an overabundance of wealth for a handful of people. Some of them became synonymous with the term robber baron precisely for their ability to extract and accumulate wealth, often using tactics that, to say the least, lacked scruples when they weren’t downright criminal. The names include Rockefeller, Carnegie, Astor, Mellon, Stanford, Vanderbilt, Duke, Morgan, and Schwab. All have their names associated in posterity with famous institutions. Some are colleges and universities, others are banking and investment behemoths, yet others are place names and commercial establishments. Perhaps the philanthropy they practiced was not entirely generous, as captains of industry (then and today) seem to enjoy burnishing their legacies with a high level of name permanence. Still, one can observe that most of the institutions bearing their names are infrastructure useful to the general public, making them public goods. This may be partly because the early 20th century was still a time of nation building, whereas today is arguably a time of decaying empire.

The second Gilded Age in the U.S. commenced in the 1980s and is still going strong as measured by wealth inequality. However, the fortunes of today’s tycoons appear to be directed less toward public enrichment than toward self-aggrandizement. The very nature of philanthropy has shifted. Two modern philanthropists appear to be transitional: Bill Gates and Ted Turner. The Gates Foundation has a range of missions, including healthcare, education, and access to information technology. Ted Turner’s well-publicized $1 billion gift to the United Nations Foundation in 1997 was an open dare to encourage similar philanthropy among the ultrarich. The Turner Foundation website’s tagline is “protecting & restoring the natural world.” Not to be ungrateful or uncharitable, but both figureheads are renowned for highhandedness in the fashion in which they gathered up their considerable fortunes and are redirecting some portion of their wealth toward pet projects that can be interpreted as a little self-serving. Philanthropic efforts by Warren Buffett appear to be less about giving away his own fortune to charities or establishing institutions bearing his name than about using his notoriety to raise charitable funds from other sources and thus stimulate charitable giving. The old saying applies especially to Buffett: “no one gets rich by giving it away.” More galling, perhaps, is another group of philanthropists, who seem to be more interested in building shrines to themselves. Two entries stand out: The Lucas Museum (currently seeking a controversial site in Chicago) and The Walmart Museum. Neither resembles a public good, though their press packages may try to convince otherwise.

Charity has also shifted toward celebrity giving, with this website providing philanthropic news and profiles of celebrities complete with their causes and beneficiaries. With such a wide range of people under consideration, it’s impossible to make any sweeping statements about the use or misuse of celebrity, the way entertainers are overcompensated for their talents, or even how individuals such as Richard Branson and Elon Musk have been elevated to celebrity status primarily for being rich. (They undoubtedly have other legitimate claims to fame, but they’re overshadowed in a culture that celebrates wealth before any other attribute.) And then there are the wealthy contributors to political campaigns, such as the Koch brothers, George Soros, and Sheldon Adelson, just to name a few. It’s fair to say that every contributor wants some bang for their buck, but I daresay that political contributors (not strictly charity givers) expect a higher quotient of influence, or in terms more consistent with their thinking, a greater return on investment.

None of this takes into account the charitable work and political contributions stemming from corporations and unions, or indeed the umbrella corporations that exist solely to raise funds from the general public, taking a sizeable share in administrative fees before passing some portion on to the eventual beneficiary. Topical charities and scams also spring up in response to whatever is the latest natural disaster or atrocity. What’s the average citizen to do when the pittance they can donate pales in comparison to that offered by the 1% (which would be over 3 million people in the U.S. alone)? Or indeed how does one guard against being manipulated by scammers (including the burgeoning number of street panhandlers) and political candidates into throwing money at fundamentally insoluble problems? Are monetary gifts really the best way of demonstrating charity toward the needy? Answers to these questions are not forthcoming.

Update: Closure has been achieved on the Lucas Museum coming to Chicago. After 2 years of litigation blocking any building on his proposed site on the lakefront, George Lucas has decided to seek a site in California instead. Both sides had to put their idiotic PR spin on the result, but most people I know are relieved not to have George Lucas making inroads into Chicago architecture. Now if only we could turn back time and stop Donald Trump.