Why are young people so bad at interacting with old people?
It is something I have wondered for a very long time, and the thought sprang to mind again as I watched Michael Haneke’s Amour, which centers on an elderly man and woman, Georges and Anne, and their struggles to coexist with the much younger folk in their lives.
First is their daughter, Eva, who sits at her stricken mother’s bedside talking incessantly about money, oblivious to Anne’s obvious physical discomfort and utter lack of interest in such trivial matters.
Then there is Alexandre, a piano prodigy and former student of Anne’s, who visits shortly after her stroke and lacks the tact and maturity to overlook the physical maladies she plainly does not wish to talk about.
Finally, the last straw: The horrible visiting nurse, devoid of any and all empathy for her patients, who is shocked—shocked!—to be dismissed for incompetence, huffily telling Georges she has never been so reprimanded in her life.
There is a disconnect here that extends far beyond the terrain of Haneke’s film.
On certain social and political matters, the existence of a marked generational divide is both clear and uncontroversial.
The story of the 2008 presidential election was precisely one of the young versus the old. Among voters under 30, Barack Obama beat John McCain by a score of 66-32, while McCain beat Obama 51-47 among voters 60 and older. The 2012 election saw similar figures for Obama and Mitt Romney.
The hot social issue of the time, same-sex marriage, follows the same thrust, supported by a supermajority of young’uns and opposed by a majority of old fogies.
Of course, we could recite statistics till the cows come home. It doesn’t mean that we necessarily know anything.
The case for the inevitability of gay marriage acceptance is as follows: Today’s old people are the final holdouts against the cause, so it is only a matter of waiting for them to die before support becomes universal.
The problem with this formulation (apart from its abject callousness) is the assumption that the opinions and general disposition of most old people are a simple function of the time in which they came of age. That is to say, that a person’s outlook stays more or less the same throughout his or her life—whatever one believed about gay marriage in 1963 will be retained in 2013 and forevermore.
Thinking in this way, we neglect to consider the effect of aging itself on a person’s inner and outer natures.
Recall the old witticism, widely but falsely attributed to Winston Churchill, “If you’re not a liberal when you’re 25, you have no heart. If you’re not a conservative by the time you’re 35, you have no brain.”
While the particular sentiment here is a bit flippant, its broader implication is an essential one: With age, people change.
We do wrong, in other words, in regarding people of different ages as if they are somehow of a separate species that we will never completely understand.
They are, rather, a species we will ourselves one day become. We should be careful not to assume otherwise so boldly, on marriage or anything else.
I hasten to add that I make these observations not merely as a check against treating one’s elders with scorn, but also against treating one’s elders with an overindulgence of affection.
A particular bugaboo of mine is the way so many of us—with the purest of intentions, to be sure—speak to old folks as if they are infants, our voices assuming an unnaturally high pitch and our words carrying an air of condescension, as though old age and senility were one and the same, the former implying the latter.
I am skeptical whether most of the recipients of this odd behavior appreciate it. While I certainly should not presume to speak for an entire generation of people, I am fairly confident that in my own latter years, should I reach them, I would hope to be spoken to like the adult I would then be.
In the meanwhile, I try my level best (with varying results) to regard all my fellow Homo sapiens as if age did not separate us and were nothing more than the series of cosmic accidents that it is.
We are all one. That’s amour.
In this month’s 60 Minutes/Vanity Fair poll, all the questions were about love.
While most of the queries were boring and predictable—“Do you believe in love at first sight?” “How important is good sex to a successful relationship?”—there was one consideration that caught my eye, and is worth pondering at greater length: “Which marriage vow is the hardest to keep?”
Is it “To always be faithful”? “In sickness and in health”? “For richer and for poorer”? Or perhaps, simply, “For better and for worse”?
In other words, how might we truly take the measure of one person’s love for another? That is, of course, assuming such a thing can be measured at all.
These impossible questions are the subject of Amour, the amazing new movie by Michael Haneke, which opens in Boston this weekend.
What the film is about—indeed, all that the film is about—is that it’s easy enough for two halves of a marriage to declare their love for one another when they’re young, healthy and relatively carefree. It is with the arrival of difficulty, disease and death that the measure of one’s devotion is put to the test.
Amour is the story of Anne and Georges, a long-married couple now in their 80s. After a lifetime of mutual self-sufficiency, Anne suffers a stroke and requires Georges’s support—moral and physical—in ways neither of them is used to or particularly adept at handling.
What makes Amour great—nay, what makes it tolerable—is its understanding that true love, in the context of a long marriage, has very little to do with sex or even romance, and everything to do with commitment, sacrifice and accepting that some things are more important than your own happiness.
In one sequence, we see Georges helping Anne drink a glass of water through a straw, something she is no longer able to do herself. Anne is deeply demoralized by having to go about so basic a task in this manner, and Georges’s own impatience is evident as well.
Georges’s measure of devotion here is proved not by the pleasure he might derive from assisting his wife, but by the obvious agony it costs him. Scenes of him helping Anne off the toilet, raising her from bed and cutting up her vegetables make a similar point: He doesn’t particularly enjoy doing any of these things, but his marriage vows demand it.
The movie contains no musical score, no moments of overt melodrama, no yelling and shouting—no “action,” at least by the standards of conventional cinema. Amour is largely a series of long, static shots as the characters carry on their lives as best they know how.
As a movie, Amour would be unbearably tedious were it not so well-acted, well-directed and, well, true. It is dramatic in the sense that life itself is dramatic. It works because we understand why Anne and Georges behave as they do—even if we might have acted differently in a comparable situation.
But then we can’t know such things until they actually happen. People express love in different ways, and there are certain forms we might not notice or appreciate until after the fact. In his first Late Late Show monologue following the death of his father, Craig Ferguson very affectingly recounted the way his father never expressed emotion, but that through four decades of hard work as a postal worker, providing steady support for his wife and kids, “I was never in any doubt that he loved me.”
In its way, Amour is a cautionary tale against entering into a marriage lackadaisically, not taking the commitment seriously and not thinking things through. It is an institution that is not for the fainthearted.
As America grapples with the changing meaning of marriage in today’s society, we have come to recognize that for a time marriage was largely about commitment, but that today it is largely about love.
What Amour suggests above all else is that these two enigmatic concepts are not mutually exclusive. Those traditional marriage vows, as old as the hills, are not a hindrance to true love, but rather are the means to its fullest expression. For better and for worse.
Perhaps you missed the news, but something mildly remarkable—yet largely unremarked-upon—occurred at last Thursday’s announcement of this year’s Academy Award nominees.
That is, David O. Russell’s screwball romantic comedy Silver Linings Playbook became the first movie in 31 years to secure nominations in all four acting categories, with citations for leads Jennifer Lawrence and Bradley Cooper and supporting players Jacki Weaver and Robert De Niro.
Never in the history of the Oscars has a single film won in all four departments (Silver Linings Playbook is not expected to, either); on only two occasions has a single film won in three (A Streetcar Named Desire and Network).
A much likelier headline on Hollywood’s biggest night, February 24, is for Daniel Day-Lewis to become the first man ever to win three Oscars for performances in leading roles, having won in 1989 for My Left Foot and in 2007 for There Will Be Blood. Katharine Hepburn won for Lead Actress four times; to date, no man has won more than twice.
If all else fails, perhaps Quvenzhané Wallis will win and become the youngest Best Actress in history or, alternatively, Emmanuelle Riva will win and become the oldest.
What does this all mean? You guessed it: Not a goddamn thing.
It’s trivia—the little bits of information we have no reason to know or care about, except that it’s just too much fun.
Everyone has their area of expertise—the one subject on which their breadth of knowledge is unquestioned and completely out of proportion.
Like Bradley Cooper’s character in Silver Linings Playbook, one of my pet subjects is U.S. presidents. Whenever I meet a fellow presidential trivia buff, it is only a matter of time before one of us asks the other what the “S” in “Harry S Truman” stood for. (Answer: Nothing.)
My inkling is that everyone wants to be a go-to human encyclopedia about something. To an extent, it doesn’t really matter what it is. The more obscure, the better—after all, you don’t want too much competition. Otherwise, you risk being Steve Carell in Little Miss Sunshine, whose claim to fame is being the country’s No. 2 scholar of Marcel Proust.
The reason our obsession with trivia is so vexing, and so interesting, is because it is so meaningless.
For all the fun facts about U.S. presidents I have managed to cram into my head, I am under no illusion that they are of any practical use. That the Maxwell House slogan “Good to the Last Drop” was reputedly coined by Teddy Roosevelt might be amusing, but it tells us absolutely nothing about U.S. history.
I wonder: Is there some evolutionary reason for this seemingly irrational attraction to the inconsequential?
Dave Barry has expounded at length about the curious way our brains seem wired to store utterly useless information (and really annoying pop songs) at the expense of more pertinent things like credit card information and where we left our keys.
Could the cause of this phenomenon also explain our inclination to memorize frivolous data on purpose? Are we trying to protect ourselves from seriousness and profundity, from exerting ourselves beyond what is absolutely necessary?
No doubt there is a wealth of research by cognitive scientists that can provide explanations for all of these questions, which I could freely look up at any old time. But somehow I’m not feeling up to it today. Too much Oscar trivia to get to.
President Barack Obama will formally begin his second term next Sunday, January 20, and on the following day the nation will mark his second inauguration on the steps of the U.S. Capitol in Washington, D.C.
While provisions for the ceremony have largely proceeded according to plan, the administration ensnared itself in one significant controversy by, for the second time, hiring a radioactive clergyman to perform the pageant’s benediction.
In this case, the pastor in question is a gentleman by the name of Louie Giglio, an evangelical from Georgia best known as the founder of the Passion Movement, an extremely well-attended outfit that holds conferences and events throughout the country. Giglio has been roundly praised for drawing attention to the horrors of human trafficking.
However, upon the announcement of Giglio’s participation in Monday’s festivities, he became equally known for having delivered a sermon in the 1990s in which he condemned homosexuality in no uncertain terms, calling it “less than God’s best for his creation,” and assailed the gay rights movement as having an “aggressive agenda […] to seize by any means necessary the feeling and the mood of the day, to the point where the homosexual lifestyle becomes accepted as a norm in our society and is given full standing as any other lifestyle, as it relates to family.”
While the latter portion of this thought is undoubtedly true, various gay rights organizations did not care for the pastor’s tone and applied pressure on the Presidential Inaugural Committee to rescind its invitation to Giglio, who swiftly withdrew from the program in the interests of not being a distraction. The Inaugural Committee, for its part, said it had been unaware of the dicey sermon when it selected Giglio for the Inauguration Day gig.
Predictably—and in light of the similar spectacle of Rick Warren at the 2009 inauguration—many on the left excoriated Obama for again anointing such a divisive figure to the ostensibly unifying role of wishing the president and the country well. What, they ask, is so terribly difficult in finding a member of the clergy who does not have a record of public condemnation toward gays or any other group?
For me, however, the real question is the one nobody seems to be asking: Why does the inauguration of the president include a benediction in the first place?
The United States, one must never tire of saying, is a secular country bound by a secular constitution. We have no official state religion, and our founding documents’ only mention of religious faith is to limit its role in the public square.
We rightly prohibit government-sponsored religious displays on public property, mandating that the freedom of religion guaranteed by the First Amendment is the business of individuals and private organizations, not the government. The right to worship includes the right not to worship as well. As far as the government is concerned, no particular faith is superior to any other, and none at all shall be observed or practiced on the proverbial public dime.
The inclusion of a clergyman invoking religious language in a foundational American public exercise such as a presidential inauguration—as has occurred at every such ceremony since 1937—would seem to be a textbook violation of this most sacred of national principles.
Yet even if one rejects this whole premise—as the present administration evidently does—one need not look far to see why the task of locating a preacher with an unblemished record of inclusion is a troublesome one.
Churches are, by their very nature, exclusionary. To believe in one god is to reject all the others. As Richard Dawkins cheekily put it, “We are all atheists about most of the gods that societies have ever believed in. Some of us just go one god further.”
There is very little, if anything, that all of America’s houses of worship agree on. Accordingly, anything that one priest, rabbi or imam says that is particular to his or her own faith is destined to offend adherents of other faiths, not to mention some within the speaker’s own church.
Should a religious leader manage an entire address without inciting umbrage from a sizable chunk of the public, he or she does so through an appeal to a common humanism, which only further suggests that said speaker need not be associated with a religious organization.
Why do we need our national quadrennial transfer of power to be “blessed”? Why invite such controversy to a setting in which it is not welcome and does not belong? Constitutional questions aside, is it really worth all the trouble?
Here is a heart-warming story left over from the holidays.
A woman in Denham Springs, Louisiana, suspecting a neighbor had kidnapped her dog, erected a Christmas-style light display on her roof in the shape of a giant middle finger, directed at said neighbor.
The woman, Sarah Childs, along with the American Civil Liberties Union of Louisiana, is currently suing the city of Denham Springs, after being told by police that she must remove the display, which the authorities said constitutes “disturbing the peace” and is not protected speech under the U.S. Constitution.
In the English language—nay, in all languages—few mysteries are more vexing than the existence of profanity. How very odd it is that we would invent words only to forbid their use.
The comic George Carlin explored this curiosity to great effect throughout his career, most famously when he singled out the seven most frowned-upon words of all—these being “shit,” “piss,” “fuck,” “cunt,” “motherfucker,” “cocksucker” and “tits”—later expanding the list well into the thousands.
The whole concept of words being offensive might have long died out by now, except that enough of us Homo sapiens have opted to be co-conspirators in this dance by being offended by words.
The middle finger—rather, the extending thereof—is more or less a silent variation of this same concept. We, as a culture, have decided such a gesture is offensive—an affront to decency—without quite being able to explain why.
A central question in the Denham Springs case—and a worthy question in general—is whether “the finger” is merely rude and disrespectful or is, in fact, obscene.
This distinction is not without precedent. In 1976, an appellate court in Hartford, Connecticut ruled that raising one’s middle finger at another person is “offensive, but not obscene,” the judge reasoning that “for the finger gesture to be obscene it must be significantly erotic or arouse sexual interest.” The Connecticut Supreme Court upheld the decision.
As the appellate court noted at the time, “flipping the bird” can be traced as far back as Ancient Greece, where the gesture did indeed carry an explicitly sexual connotation, appearing in the work of Aristophanes and elsewhere, in both playful and contemptuous contexts.
In America, anthropologists trace use of “the finger” to Italian immigrants, with the first recorded instance in the States occurring in 1886 during a baseball game between the Boston Beaneaters (later known as the Braves) and the New York Giants.
The other crucial consideration as to whether Sarah Childs should be made to remove the light display from her roof is the all-important quandary as to when speech can be defined as an action, and therefore restricted.
In the seminal 1919 Supreme Court case Schenck v. United States, Justice Oliver Wendell Holmes, Jr., popularized the hypothetical scenario of “falsely shouting fire in a theatre and causing a panic” as an example of “words used […] of such a nature as to create a clear and present danger.”
In point of fact, Childs was informed by police that her unsightly roof message could be interpreted as threatening toward her neighbors. While it is undeniable that the display’s intended and perceived message is one of hostility, I find it difficult to entertain the proposition that one could then come to feel physically threatened by it—unless, of course, Childs rigged it to snap off her roof and tumble onto her neighbor’s driveway as he walked out to get the mail.
My own quaint view is that the right to free expression must include expressions that are rude, disrespectful and—within the boundaries of local laws—intimidating. Displaying a giant, illuminated middle finger might not endear one to one’s community, and may well incite some kind of a backlash, but that is hardly grounds for prohibiting such behavior.
Letting your entire neighborhood know that you are an immature, sociopathic nut is what freedom is all about.
In a recent column, I spent so much time excoriating Zero Dark Thirty—in particular, the disingenuousness of director Kathryn Bigelow and screenwriter Mark Boal—that I failed to mention how very much I enjoyed it.
As much as Bigelow’s follow-up to The Hurt Locker demanded a thorough rebuke, so also does it deserve a resounding defense.
I began last time with Gustave Flaubert’s plea regarding Harriet Beecher Stowe’s Uncle Tom’s Cabin: “Is it necessary to utter one’s ideas about slavery? Show it, that’s enough.” My recommendation was, and is, that Zero Dark Thirty ought to be considered in this “show, don’t tell” context with respect to its depictions of torture.
As an extension of this thought, I offer the following quotation from Andrew Sullivan: “The case against torture is simply that it is torture, that capturing human beings and ‘breaking’ them physically, mentally, spiritually is a form of absolute evil that negates the core principle of human freedom and autonomy on which the West is founded. It is more fatal to our way of life and civilization than terrorism.”
If Sullivan is correct, then Flaubert is correct. If torture is axiomatically, viscerally and morally repugnant, then Bigelow’s film need not make any comment on it other than simply showing it being done. Those who are repulsed by torture will conclude the movie is against its use, while those who are not might think differently.
It is suggestive of the film’s greatness, not failure, that its politics can be subject to utterly contradictory interpretations by its viewers. The very existence of a debate over the film’s intentions is the most persuasive argument yet for Bigelow’s and Boal’s contention that Zero Dark Thirty is a movie without an agenda.
I am reminded of the brief, but passionate, brouhaha that erupted in early 2005 regarding Clint Eastwood’s film Million Dollar Baby, in which (spoiler alert!) the character played by Eastwood is compelled to assist in ending the life of a stricken dear friend. Critics argued that because Eastwood’s character was clearly intended to be sympathetic—the “hero,” as it were—the film was effectively in favor of assisted suicide.
To this, Roger Ebert countered that a freethinking person could just as easily see the film and conclude that Eastwood was a good man who made a bad decision, and that such a phenomenon does not diminish the movie one whit.
I would optimistically wager that a similar point might be made about Zero Dark Thirty, although in this case it’s a bit more complicated—first because Bigelow’s film is based on real events, and second because its implications reach far beyond the conscience of a single person.
My own view, having seen the thing once, is that Zero Dark Thirty does not glorify or justify torture, although one can be forgiven for concluding to the contrary.
The film shows the employment of waterboarding, stress positions and so forth as part of the amassing of intelligence that led to the killing of Osama bin Laden, but it does not suggest that the intelligence that actually cracked the case was a direct result of said “techniques.”
What we see, rather, is a prisoner providing valuable information to two CIA agents as they offer him hummus and fruit across a picnic table in the warm sunshine—that is, as they treat him with basic human dignity.
The complication is that this follows a sequence in which this man is indeed tortured, and his present tip-off might well spring from fear of being tortured again. Would he not have cooperated had he been treated humanely the whole time? Or, perhaps, might a lack of torture have made his information even better?
It is a complex and nasty business. Good on Bigelow for dealing with complexity and nastiness. Few American filmmakers go to such trouble. I wish more of them did.
Of course, we are hardly done with the hard questions about the long journey from September 11, 2001 to May 1, 2011. Was torture necessary to gather the intelligence we required to conduct the so-called war on terror? If so, does that axiomatically make it justified? Or is Andrew Sullivan correct that some things—certain fundamental American values—are simply more important?
On the practicality question, I refer you to Nice Guy Eddie from Reservoir Dogs, who cautioned a pair of cop-torturers, “If you beat this guy long enough, he’ll tell you he started the goddamn Chicago fire—now that don’t necessarily make it so!”
On the moral question, I leave it up to you.
I can’t say exactly when my infatuation with Chris Christie began.
Perhaps it was that instant-classic press conference, shortly after Christie became governor of New Jersey, in which he gamely responded to a reporter’s query about his “confrontational tone” by, well, confronting him. “This is who I am,” said the governor. “Like it or not, you guys are stuck with me for four years.”
Or it might have been earlier, when he was still campaigning against incumbent Jon Corzine, who had made sloppy, obvious allusions to Christie’s wide girth in TV attack ads, about which Christie deadpanned to Wolf Blitzer, “I’ll let you in on a little secret, Wolf: I’m overweight.”
Whatever his faults, Christie at least earns points for cheeky self-awareness.
As a political figure, Christie, who is up for re-election this November, has served as a Rorschach test of sorts since his earliest days as New Jersey’s chief executive. You can tell a fair amount about a person based on how he or she reacts to Christie’s antics in Trenton and on the national stage. It’s interesting, in part, because the two camps do not always divide along traditional political lines.
Of course, the most obvious reason for this phenomenon is that Christie himself does not divide along traditional political lines—more precisely, he has no qualms about criticizing members of his own political party or praising members of the opposition.
He is presently experiencing a wellspring of support and affection from New Jersey voters, 73 percent of whom said they approve of his job performance, according to a recent poll.
Certainly such uncommon popularity began in response to Christie’s leadership amidst Hurricane Sandy, the calamitous storm that struck the coast in late October, during which Christie famously—or infamously, for Republicans—heaped praise upon the Obama administration for its swift action in that emergency.
In recent weeks, Christie further grated Republicans and delighted Democrats by attacking the House for failing to hold a vote on a bill to provide hurricane relief funding, saying, “There is only one group to blame for the continued suffering of these innocent victims: The House majority and their speaker, John Boehner.”
Theories abound as to what Christie is “up to” in these moments of rhetorical treachery, most of them rooted in the assumption that he will eventually run for president, possibly as early as 2016.
The theory I most prefer, which also serves to explain Christie’s curious cross-section of admirers, is that he is not up to anything at all. That he is—to coin a phrase—who he is. That he will say exactly what is on his mind, no matter the political peril involved, and if you don’t like it, you can shove it.
This is an attitude that Americans tend to like. The man who won’t be pushed around. The gruff father figure who will do what he thinks is best for his children. The righteous bully whom you would love to have on your side.
What is more, in addition to embodying much that Americans appreciate in a leader, Christie is the antidote to much that Americans hate: The shady, the equivocating, the pandering and the intellectually bankrupt who are prepared to say whatever they think their public wants to hear and who tend to run the show in Washington, D.C.
But the mystery is hardly solved. I, for one, have every reason in the world to dislike the governor, from his tendencies toward the anti-intellectual—asked about his views on evolution, he replied, “That’s none of your business”—to his stated opposition to the one issue I really care about, same-sex marriage.
Yet I’m crazy about the guy all the same, and I suspect my admiration cannot be explained as simple masochism on my part. Nor do I think I am the only person suffering from this cognitive dissonance.
Should Christie decide to seek the White House in 2016 or sometime thereafter, the question will arise (as it already has), “Is he too fat to be president?” The premise, I presume, is that a man of Christie’s rotundity is a man without discipline, and thus a man unsuited to the profound difficulties of the Oval Office.
But there is an alternative narrative to this. That Christie’s weight, in this most judgmental and politically correct of times, is emblematic of his most marked and invaluable characteristic: His ability and self-confidence to just not give a crap.