With a Clinton running for president, Donald Trump aiming to “Make America Great Again,” and PC culture making a noxious comeback, countless commentators have appropriately noted the “politics of nostalgia.” Especially among whites, data of many sorts suggest a longing for the past—a past where “life for people like [them]” was better, where their family ties and civic commitments and religious communities were stronger, where their future economic prospects seemed brighter. My friend Dimitri Halikias is writing a series of reviews on a recent book by political scientist Yuval Levin discussing this very topic.
But despite a rise in the collective awareness of this “politics of nostalgia,” discussion of the much broader culture of nostalgia has been, with a few notable exceptions, quite absent. For at least the last five years, nearly every medium of artistic expression has seen an obsession with nostalgia driving much of our present cultural formation (well, re-formation) and engagement.
The most egregious culprit producing this hypernostalgia is our film industry. At my local movie theater, there are nine films currently on screen: among them, Alvin and the Chipmunks: The Road Chip, the fourth installment since 2007 in a franchise about a group dating back to the ’50s; Ben-Hur, a remake of the 1959 classic; Jason Bourne, the fifth film in the franchise based on the 1980 novel; Mechanic: Resurrection, a sequel to a 2011 film that was itself a remake of a 1972 original; Pete’s Dragon, Disney’s remake of its own 1977 film of the same name; Suicide Squad, which features a host of DC Comics characters we’ve seen for over half a century; and Star Trek Beyond, which, well, you get the point. (If I went to my theater and wanted to watch a film with an original script, my sole options would be some horror flick and Sausage Party.) Also currently playing in other theaters: sequels to The Purge and Ice Age and remakes of The BFG, Ghostbusters, and Tarzan.
And that’s just what’s in theaters right now in August 2016. As New York Times columnist Ross Douthat put it back in January, “The biggest blockbuster of 2015 was about . . . Darth Vader’s grandchildren. It is directed by a filmmaker who’s coming off rebooting . . . Star Trek. And the wider cinematic landscape is defined by . . . the recycling of comic-book properties developed between the 1940s and the 1970s.” Coming soon: more Jedi, more Avengers, and probably at least one more James Bond.
Even on TV—a medium that, to be fair, has continued to produce original, mass-market material in this “Golden Age”—there’s apparently a market for Fuller House, which earlier this spring was renewed for a second season. Jimmy Fallon brought the Saved by the Bell cast onto his show last year for a sketch on a recreated Bayside High set, and the cast of Friends held a reunion at the start of the year.
To prove that our hypernostalgia in television isn’t limited to the ’90s: the show 24, after eight relatively successful (and solid) seasons from 2001 to 2010, returned a full four years later with a ninth season in 2014; a spinoff, titled 24: Legacy, is scheduled to premiere in 2017. The producers of the best show of the last decade, Breaking Bad, just couldn’t help themselves when the show finished: Less than two years later, they put out the spinoff Better Call Saul. And The Simpsons, while not a reboot, a remake, a spinoff, or a sequel, is still on air—beginning its 28th season(!) in a few short weeks.
This unfortunate phenomenon also affects the books we read, the musicals we watch, the games we play, and even the clothes we wear. According to Publishers Weekly, the number-one bestselling work of fiction in 2015 was Go Set a Watchman, Harper Lee’s sequel to To Kill a Mockingbird, which itself made the list at number seven. Also in the top 20 bestsellers of 2015: The Great Gatsby, Fahrenheit 451, and The Alchemist. As of this moment, Harry Potter and the Cursed Child tops Barnes & Noble’s list of the top 100 bestsellers of 2016. Moreover, in the last year, the musical that has swept the nation—more so than any musical in my lifetime—features the story of a famous Founding Father we’ve presumably all learned about in grade school. Hamilton admittedly builds on our history and delivers it with a wonderful twist, but it’s nevertheless an old story with an old character. And suddenly in 2016, it became okay for folks in their 20s, 30s, and 40s to walk around with their faces dug into their phones as they swiped up to catch Pokémon. Yes, Charizard was once the object of my fascination and affection—but that was when I was 10, playing the Red and Gold versions on my Game Boy Color and trading cards (against the school rules) with my friends—other 10-year-olds.
Facebook, Twitter, and the rest are no help. They do more than just promote all these media; they actively cultivate this disposition of hypernostalgia in each of us individually. They’ve developed hashtags—most notably, #tbt (short for Throwback Thursday) and #fbf (short for Flashback Friday), both of which are often not even used on their specified days—to encourage us to share something from our pasts with others and then feel rewarded through likes, retweets, and so forth. And in the last year or so, Facebook has taken it upon itself to show you, the individual user, a “memory” from your own past that got lots of likes so that you can share it again and get even more likes.
Even in clothing, what’s vintage and what’s retro are often what’s in: dresses, suits, glasses, shoes, the works. Pick up the latest style magazine, and you’ll inevitably see something making a comeback. With no effort, I found a post titled “Trucker Hats Are Coming Back,” written just last week on GQ’s website; in it, one finds the following passage:
With looser-cut ’90s-throwback pants making a return, ’80s cars on the rise, and even square-toed shoes inching their way into the current zeitgeist, could it really be that long before the early aughts [2000s] are due for a re-up? Are Ed Hardy graphics and tracksuits next? Wait . . . it’s already happening.

Reboots, remakes, spinoffs, sequels, continuations, comebacks, flashbacks, throwbacks: None of this is new. To give just one example, Twelve Angry Men was originally written as a television play in 1954, was made into a truly fantastic film a few years later, and was remade 40 years later, in 1997, in (and with) color.* As one writer argues, our society tends to produce nostalgically in 20-year cycles: reliving the ’60s in the ’80s, the ’70s in the ’90s, the ’80s in the 2000s, and now the ’90s in the 2010s.
Nor is any of it necessarily bad. These things give us a connection to the past, a relationship that might be weaker without them. Certainly, adapting old stories to our modern tastes requires its own kind of creativity; indeed, Hamilton, in all its artistic brilliance, stands out. Certainly, even more-recent remakes can be valuable when the stories are introduced to different audiences; House of Cards and The Office, both American remakes of British originals, are perfect examples. Certainly, reading great works of literature is always a good thing. And certainly, on a more individual level, reliving happy moments from our own pasts can even keep us grounded at times, and sharing those moments with others can be a valuable (often necessary) way to reconnect.
But what is new in our cultural moment is the level of obsessive nostalgia we’ve reached, and what is bad are the costs of its extremity. In the article linked to above, Douthat quotes historian Jacques Barzun to argue that this phenomenon is best explained by a “decadence” defined as a “falling off” whereby the “forms of art as of life seem exhausted” and so “repetition and frustration are the intolerable result.”
That description seems quite apt. We’re not just reliving the ’90s; we’ve mostly exhausted all decades available to us: sporting styles from the ’30s (tab and club collars are in; haven’t you heard?), conjuring up characters from the entire century, posting photos from last month with “#fbf.” On a cultural scale, we’re losing the desire to create new stories, we’re losing the ability to let go of the characters within them when those stories reach natural endings, and we’re more than ever forcing these stories to live beyond their years. Hollywood doles out ever more superhero flicks, J. K. Rowling expands her magical world further still, and the Pokémon franchise now promotes an old game through a new app—all for one reason: It all sells. What’s more, these examples all have something in common: the development of their own “universes”—the Marvel Universe, the Wizarding World, the Pokémon Universe. Pixar and many others have given in to this impulse. Even when the stories or characters themselves are original, by placing them in a given “universe,” the creators give a nod to our hypernostalgia, thereby retaining us, their loyal consumers.
So long as audiences keep demanding these recycled stories, characters, backdrops, and memories, the producers in Hollywood, on Broadway, in Silicon Valley, and elsewhere will keep spitting them out. There’s nothing wrong with looking fondly upon the past, but keeping our focus there crowds out our own originality, creativity, and ability to move on.
We’ve definitely overstayed our visas on our many trips down memory lane, and it’s time we brought ourselves back to the present.
_____
*Music, it seems, is one cultural medium that stands as an exception to this new phenomenon of obsessive nostalgia. For many decades, we’ve had tribute bands, Elvis and MJ impersonators, greatest-hits albums, and covers upon covers upon covers. I admit that I can’t quite put my finger on it, but I strongly sense that despite these unoriginal elements of music spread across the decades, the lack of originality (for want of a better phrase) has much less to do with any sense of nostalgia than it does in the other media I’ve discussed. But check out this book review in the Atlantic of Simon Reynolds’s Retromania: Pop Culture’s Addiction to Its Own Past, which is almost exclusively about music and which explains why I’m wrong.