by Niles Schwartz
Maybe I should have focused on market analysis. Or baseball statistics. Or ultimate Frisbee. The downhill snowball, accumulating mass and speed, seems to plow into the movie enthusiast again and again, as the columnists herald the Death of Movie Culture. Daily life reinforces it: in the last week alone, about ten people have told me, in casual talk, that they rarely, if ever, see movies anymore. And if they do go, it certainly isn’t a critic who planted the seed (e.g., “to get my kids out of the house, I took them to Hotel Transylvania”). Dissatisfied voices in the critical community lament that the young have no respect for the artists who paved the way, and that the Internet, television, and video games have had detrimental effects on the business models of Hollywood deal-making. If you click on the Movies link at Yahoo.com, the news isn’t strewn with reviews, trends, or upcoming projects. More often it’s celebrity gossip, and strangely, most of the celebrities being talked about aren’t known for their film work. Yes, I’m discouraged.
Last Thursday on Tommy Mischke’s WCCO radio show, we were set to dissect The Master, the year’s most-discussed motion picture, which has inspired chatter about its aesthetic approach, its performances, its meaning and mysteries, and now the divided reactions of viewers. After doing my prep – writing and reading loads of material, collecting sound clips, and assembling montages from director Paul Thomas Anderson’s body of work – Mischke threw a curve, and essentially threw in the towel on talking earnestly about the movie.
“I want to start this differently,” he said (and I paraphrase) as the On-Air light went green. “If anyone listening has seen The Master, please call in right now with your reaction, whether you hated it, loved it, or didn’t understand it.” And the air was dead as we waited. “This is what concerns me,” Mischke then said. The question was, how could we have a meaningful discussion if no one out there had seen the movie? Maybe a See It/Rent It/Skip It review of Hotel Transylvania was more in order.
Of course, maybe the movie’s audience was sleeping (I like that option better than “we have no audience”). The younger, working podcasters would pick up on things the next day. And even if people hadn’t seen The Master, more people had heard or read about it – in the masses of reviews, commentaries, and articles tying the story to the now-hot issue of Scientology (thanks to Tom Cruise and Katie Holmes) – than had heard about the new book by Steven Pinker (whom Mischke interviewed the night before during a stellar hour). But the silence muffled a serious approach to the picture and its themes, and I sort of felt defeated – a hobbling movie snob struggling to articulate a prepared series of points before an unsympathetic jury who just wanted some popcorn and Ron Howard movies (thus I felt a strange kinship with President Obama during his stilted debate performance the following Wednesday). The sense was that this amazing and singular film was not even worth talking or arguing about, even though it was the most prominently featured title during the recent rush of film festivals, and incessantly posted about on sites like Huffington Post, The Atlantic, Salon, and Slate (not exactly the elitist pomp of “Lord Chumley’s Oxford Graduate Journal” or “The Quarterly Post-Modern Marxist Deconstructionist”).
It wasn’t the fact that no one called that was disturbing. It was the sense of “why bother?” that I intuited. Even if 30 million people go to the movies every weekend, in most cases they’re not there to be emotionally tangled in the ambiguities of The Master, or to sympathetically engage with Oslo, August 31, or catch the nihilistic reflections in Cosmopolis or Killer Joe. Indeed, the user reviews on Moviefone.com for The Master are hilariously hostile, and, set alongside the Top Critics at RottenTomatoes.com, they show a kind of growing Movie Class Warfare. But should movies be about a big “conversation” or discourse surrounding a film or group of films, tying them to culture – from Bonnie and Clyde to Apocalypse Now to Hannah and Her Sisters to Pulp Fiction to Brokeback Mountain to Avatar? Is it all a collective pool of sensations, ideas, images, feelings, and reflections for everyone to mingle about?
Probably not. Talking isn’t necessary, and sometimes you just want to love something (or hate it). After Moonrise Kingdom, I didn’t want to do much of anything but walk around for a few hours with the feeling Wes Anderson implanted in me. There’s no reason to make a “Socrates Cafe” of it (even though I suppose that’s what one wants to do on the radio). We like to idealize the “Movie Experience,” thinking of it as a group of people dreaming together in the dark. But that’s a romantic blunder. People rudely yelped and made disturbances in theaters in the 1940s, much as they use their cellphones now. And though Martin Scorsese has always romanticized the movies, Hugo being an ode to that special place where dreams are made, don’t forget his earliest features. In Who’s That Knocking at My Door? (1968), the Scorsese alter ego (Harvey Keitel) flirts with a girl by discussing John Wayne westerns (“Everyone should love westerns. It would solve everyone’s problems if they loved westerns”), even though cinema doesn’t really mean anything to her. In Mean Streets (1973), part of the culture of film-going includes tolerating – and giggling at – angry patrons goading each other. A respectable friend of mine from Brooklyn, who taught English at the University of Minnesota for many years, recalls how in the 1940s he was part of the unofficial “Alfonso Bedoya” fan club, and during classic films like The Treasure of the Sierra Madre, he and his pals would holler Bedoya’s name and applaud when he came on screen. Maybe that would piss me off more than the guy texting in front of me.
I think the point is that there was once a more vital role for the moviegoer and movie critic, and that “Film Culture” was a sturdy thing on which you could plant your interests and not look, well, terribly silly, like a Dungeons and Dragons geek. One would have to get a time machine to be sure, but I have a feeling that Blow-Up, The Conversation, and Hannah and Her Sisters meant more in their respective years than they would had they been released right now (though in a lot of ways it’s unfair to set up the comparison). But back then there was no such thing as The Wire, Breaking Bad, Louie, Girls, Homeland, Treme, and Mad Men – the works of creative narrative that are endlessly discussed and seem to mean something. The current auteurs, verging on becoming household names the way Coppola, Scorsese, Spielberg, and Lucas did, are Milch, Simon, Dunham, Chase, Winter, and of course writer, director, editor, and star Louis C.K.
But as Shame director Steve McQueen expressed last year, there are things movies can do in two hours that HBO still cannot, in terms of getting at more abstract points, or trapping viewers in that large dark room with no respite for a little while, forcing them to be in dialogue with what they’re seeing. Though some film directors have jumped from movies to HBO with cable movies (Philip Kaufman, Curtis Hanson, Jay Roach, and soon Steven Soderbergh), it’s hard to imagine some of the great films of recent years premiering on the smaller screen. As the most vocal of malcontents, David Denby, writes, “Such films as Sideways, The Squid and the Whale, and Capote have a fineness, a nuanced subtlety, that would come off awkwardly on television.” Is he right? Thinking of Sideways, where the duplicitous characters are exposed in scene after scene, to watch it in private (as opposed to the shared space of an equally voyeuristic public) might change the way the whole thing is processed. The movie is about private embarrassments and failures that no one wants to admit, the protagonists drinking (like Paul Giamatti’s Miles) or fucking (Thomas Haden Church’s Jack) to escape. The catharsis and humor are in watching it with strangers, where it’s not only funny, but healing.
Denby (The New Yorker) is joined by other voices commenting on this dying animal: David Thomson (The New Republic), Matt Zoller Seitz (Indiewire), and Andrew O’Hehir (Salon) have written columns detailing the fall of the seventh art, with Denby and Thomson both publishing full-length books on the topic this month: Denby’s Do the Movies Have a Future? and Thomson’s The Big Screen. O’Hehir’s column from last Friday joins similar pieces penned in the last year by Pictures at a Revolution author Mark Harris (“The Day the Movies Died” in GQ) and Manohla Dargis (“Now Playing: The Usual Chaos” in The New York Times, with A.O. Scott). Seitz disparages the lack of seriousness younger moviegoers bring to older films, unable to watch a film on its own terms. Then there are well-known bloggers like Jeffrey Wells of Hollywood Elsewhere, who tears apart the lack of context in a younger audience that can’t see farther back than a couple of decades.
These influential voices are met with reactive counterarguments, put together most elegantly by The New Yorker blogger Richard Brody, who offers a long-view challenge insisting that cinema is always undergoing change, with new means of production and exhibition arriving in accordance with culture and technology. His sentiments remind me of Francis Ford Coppola’s irony regarding the aging Don Tommasino in The Godfather: old men like Tommasino are always saying that the young have no respect, the joke being that old men have been saying so for thousands of years. Critics in line with Brody, like Glenn Kenny, Peter Labuza, and Scott Tobias, shrug the cinematic eschatologists off. Brody points to Wim Wenders’ Room 666, where Michelangelo Antonioni, interviewed about the future of film, is not dispirited but enthusiastic about where the new technology will lead us. Watching films on video, at home, is just one more step in an ongoing evolution. Creativity, for Brody and company, is always there. And we should remember that decadence, too, was always there. There were always shitty movies, shitty viewers, shitty critics.
Like other narratives of eschatology, film has been in a state of decay since well before I was born: the world’s always been ending. The common antagonist, whether in the mid-1950s, the 1960s, or right now, seems to be television. Cinema fought back with Cinemascope, 70mm projection, and 3D, and by pushing the censorship envelope. Now, in our current Golden Age of Cable and streaming technology, the cinema has never been so vulnerable. Critics and cultural commentators, myself included, love to propose big theses, taking a handful of titles and pointing to the social energies creating a kind of unconscious. Whether we’re right or wrong, the critic’s role becomes creative, and the discrimination of focus determines a light or dark outlook. A lot of these griping folks love to use examples to vocalize some ongoing grudge. Denby obviously detests much in CGI and longs for adult-oriented stories, rich in narrative and character, instead of comic books. Thomson misses the pre-Godard days of the old star system, when movies didn’t treat the audience’s sentiments with contempt. Jeffrey Wells, meanwhile, will probably be spending the next few months pointing to John Ford and Henry Fonda’s vision of Abraham Lincoln, which “feels right,” while Steven Spielberg (with whom Wells has apparently always had a beef) and Daniel Day-Lewis ruin the president with a historically accurate shrill voice. Spielberg is obviously getting the movies wrong, right?
In 1994, one could point to the burgeoning state of independent film as low-budget Miramax films became huge: Pulp Fiction, The Piano, The Crying Game. But at the same time, you could have looked at the nostalgic heart-tugging of Forrest Gump and The Shawshank Redemption, or similar manipulations in Schindler’s List, and prophesied a decline. Was Tarantino’s Pulp Fiction video-store junk-food defecation? A conscious statement of post-modern referential energies? Or simply a director, born from that VHS womb along with a bunch of Elmore Leonard novels, writing only what he could write? And while we love to fire arrows at the Academy Awards and see their choices as signs of decline, wasn’t a string of relatively recent Best Picture winners – The Departed, No Country for Old Men, The Hurt Locker – much more interesting and rich than any of the similarly rewarded films from the entire 1980s, and most – if not all – of the 1990s? The King’s Speech defeating The Social Network is not an unprecedented disappointment, and neither is the head-slapping ridiculousness of selecting Crash over Brokeback Mountain.
No, we’re not making Lawrence of Arabia, The Godfather, or maybe even – as Denby complains – Michael Clayton, but look, we have Moonrise Kingdom, Beasts of the Southern Wild, The Master, the films of David Fincher, Hugo, Moneyball, The Tree of Life, and Inglourious Basterds. The coming two months are promising, with Argo, Silver Linings Playbook, and maybe Cloud Atlas. Here I suppose a learned statistician would have to take into account the ages of the various filmmakers, the budgets, the grosses, and the reviews, comparing them to previous decades, in order to make a clear evaluation of whether or not things are in decline. For example, one should probably give Denby props for pointing out how John Huston and Alfred Hitchcock churned out a studio movie once a year, most of them intelligent and exceedingly well-crafted. It took Bennett Miller six years to get from Capote to Moneyball, David O. Russell six to get from I Heart Huckabees to The Fighter, P.T. Anderson five to get from There Will Be Blood to The Master, and it will have taken Alfonso Cuaron seven to go from his magnificent Children of Men to next year’s Gravity. Is this by choice, with everyone working at a very leisurely, Kubrickian pace? I’m sure it’s a more complicated thing than Denby – or I – can define.
What of the studios? They’ve always been driven by money while paying lip service to art. Robert Altman skewered them in 1992’s The Player by having the greasy exec Griffin Mill (Tim Robbins) give a speech promising to find the next Huston and Orson Welles, when in fact everything’s a zero-sum game of economics and Oscars (and there’s the irony that Welles was destroyed by the system). At that time, however, the major money releases weren’t comic book adventures (Batman had been released three years earlier, but hadn’t spawned much like it), but Spielberg/Lucas blockbusters and action-hero star vehicles (Stallone, Schwarzenegger, Willis – the Expendables in their younger years). Today, The Player with an Avengers subtext feels perversely hilarious: the decadence of that Hollywood strung out on HGH. The thought of such a film – a 2012 version of The Player – feels like it would be too much to bear. Altman always said that he made a different product from what the studios manufactured, like he made hats and they made shoes. There are still a few luminous hat-makers, to be sure, but the shoes have become nuclear jet-skis.
Even in those “good old days,” the late great Marxist critic Robin Wood was attacking the coming empire of Spielberg and Lucas, whose films were distracting us from real-world problems: “Spectacle – the sense of reckless, prodigal extravagance, no expense spared – is essential: the unemployment lines in the world outside may get longer and longer, we may even have to go out and join them, but if capitalism can still throw out entertainments like Star Wars (the films’ very uselessness an aspect of the prodigality), the system must be basically okay, right? Hence, as capitalism approaches its ultimate breakdown, through that series of escalating economic crises prophesied by Marx well over a century ago, its entertainments must become more dazzling, more extravagant, more luxuriously unnecessary.”
Wood wrote that in the 1980s, just as Geek sensibilities (before they were “Geek”) were being drilled into us. That mindset now seems to rule (the biggest film of the year is the Geek Wet Dream The Avengers, directed by Geek god Joss Whedon, and another recycler of incessant pop-reference, Family Guy’s Seth MacFarlane, has just been asked to host the 2013 Oscars). Matt Zoller Seitz’s recent chagrin with snark-happy audiences unable to take a classic movie at face value sounds a lot like Wood’s complaint about young viewers: “[The] same young people who sit rapt through Star Wars find it necessary to laugh condescendingly at, say, a von Sternberg/Dietrich movie or a Ford western in order to establish their superiority to such passé simple-mindedness.” Seitz, writing about a recent From Russia With Love screening, notes, “It’s up to the individual viewer to decide to connect or not connect with a creative work. By ‘connect,’ I mean connect emotionally and imaginatively—giving yourself to the movie for as long as you can, and trying to see the world through its eyes and feel things on its wavelength.” Instead, the audience was muttering wryly about James Bond’s sexual harassment issues. The interactivity of the Internet, extending to Reality TV, has probably contaminated us. We’re “too contemporary,” to borrow a phrase from David Cronenberg’s Cosmopolis. I love midnight movies in the suburbs (like at the Muller Family Theaters’ Willow Creek cinema), but I almost had a rage-induced stroke listening to some kids behind me during an Exorcist screening.
The most aggravating difficulty in engaging with these debates over the sterility of our entertainments is the “I loved this article until you started trashing that one movie I like” factor. Wood tore into the Bourne franchise and Paul Thomas Anderson’s Boogie Nights and Magnolia, while David Denby praises director Paul Greengrass’ handling of Bourne’s action sequences as an anomaly in the larger decline of Hollywood action movies. Meanwhile, Denby is quick to lump Christopher Nolan’s Dark Knight films and Inception into the same box of empty junk food to which he has assigned The Avengers. With those films, “sensation has been carried to the point of a brazenly beautiful nihilism, in which a modishly ‘dark’ atmosphere of dread and disaster overwhelms any kind of plot logic or sequence or character interest.” In fact, I think Nolan is ironically commenting on a sociological addiction to sensation, an overabundance of technological “noise” that stomps out clarity – which is why the muffled voice of last summer’s Bane was, to me, an obviously deliberate provocation, along with the film’s leaps in continuity and logic: Nolan’s spectacles are commentaries on spectacle and our hollowness. (I concede that The Avengers is, meanwhile, a very fun, but hollow, cash machine.)
At least Denby’s general attitude about the Death of Film allows that a handful of luminaries persevere: Martin Scorsese, Terrence Malick, Wes Anderson, David Fincher, Steven Soderbergh, and so on. David Thomson is a critic who’s long since given up, having grown cold to anything straying far from the point in time when he must have become infatuated with movies (specifically, it seems, the ’40s and ’50s). The attitude applies to film generally, but also to directors Thomson once championed. He was one of Martin Scorsese’s biggest enthusiasts, but has been unwilling to follow the filmmaker down the more sagacious and prestigious route he’s pursued since the 1990s; an admirer of early Roman Polanski, he is dismissive of all the fugitive filmmaker’s work in exile, including The Pianist and the information-age masterwork The Ghost Writer (I’m convinced that if a younger filmmaker had made either of those films, Thomson would justly acclaim them as the work of a prodigious visionary). I agree with Richard Brody’s assessment of Thomson, and it applies to a lot of commentators: it isn’t that movies are dead, it’s Thomson who’s deadened to movies. I once met Camille Paglia, and, briefly discussing Thomas Mann, she mentioned Visconti’s adaptation of Death in Venice. “They don’t make them like that anymore,” she said, an expression of which I’ve always been suspicious. Perhaps they don’t make them like Death in Venice, but did they make them like Mulholland Drive and Uncle Boonmee back then?
The conflict between a positive and a negative outlook has much to do with the technology with which the art is constructed. Film purists are persistent in their disparagement of the digital, and in a lot of ways it makes sense, but a complaint about the arts extends beyond the topic at hand to a vaster entropy that envelops everyone and everything, so that the critic is not merely evaluating a film, but also – however subconsciously – laying out their own anxieties about the future. I cannot say that I disagree with Denby’s fears of the digital, where big movie action, constructed by graphic designers rather than photographed and cohesively sewn together, is close to animation, oftentimes exuding weightlessness and buffoonery. I write about it often: “Information Age Cinema” is split between films that are hip to the sculpting control of technological change and films that readily feed into the noise and speed, the nicotine-like rush it provides for an audience. One is reflective, the other an escapist narcotic. One progressive, the other, in my opinion, regressive. The Internet gives us more knowledge than ever before, writes Nicholas Carr in The Shallows, but less depth. Even as Brody argues against Denby that movies have always had special effects and been crafted with artifice (Citizen Kane is almost as much of a special effects movie as Star Wars), the realm of the graphic designer feels different from the landscape of real-world miniatures and light. Director Terry Gilliam notes how the organic feeling of objects being photographed under light is so different from the wizardry of a computer creating something out of nothing. Our eyes cannot be fooled about the accidents of space, age, and matter.
But different directors handle the new technology in different ways. The new paradigm of digital is part of their process, and we’re not supposed to interpret what we’re seeing the same way we would interpret an old-fashioned miniature shot. When Roger Ebert says that he knows when a movie “feels right” or not, I can’t agree (it gets close to that Jeffrey Wells argument about Lincoln’s voice not being right). David Fincher and Michael Mann both make dramatic films with strong protagonists, their narratives oftentimes – I would say now exclusively – showing how systematic constructs, joining hands with technological capabilities, conflict with symbolic human exchanges and organic relationships. But both filmmakers are technophiles, having embraced digital tools for aesthetic purposes in addition to convenience (and, I’ve argued, for meaningful and subtextual gravitas in their work). When people complain about the differences between film and digital, Mann explains that it’s akin to developments in architectural design: a brick building is not a steel building, though they are, of course, both “buildings.” “No device is intrinsically more moral than another,” writes Brody; “no technique is intrinsically better than any other; a fixed-focus shot isn’t better than a zoom, a dolly shot isn’t better than a hand-held move, direct sound isn’t better than dubbing, color isn’t better than black-and-white, and film isn’t better than digital.”
But is Resident Evil no worse than Indiana Jones or Paul Verhoeven’s Total Recall? And are those blockbusters from the 1980s any worse than the classier spectacles of The Towering Inferno or The Poseidon Adventure? For me, in my 21st-century bubble, I again require that time machine. The makers of a lot of bygone “trash,” be it Verhoeven or John Milius (Red Dawn), are considered by some to be intelligent filmmakers (Verhoeven is perhaps a genius). How do they measure up against the arm-flexing, cocky, nihilistic grandstanding of Roland Emmerich and Michael Bay? And are the patterns of speed in film art irresistible, the outcome of the digital, and so an essential thing with which we’ll all have to evolve? Scorsese, Coppola, Ang Lee, David Fincher, Ridley Scott, Baz Luhrmann, Michael Mann, even Bernardo Bertolucci and Wim Wenders have embraced digital and 3-D. While Brian De Palma, with his recent Passion, is reportedly making a dialectical examination of the digital eye’s “truth” (and has apparently been hampered by digital projection misfires at the New York Film Festival, a brutal irony amidst all this “death of film” stuff), Anderson’s The Master, a film about intoxication, poison, and jaundiced reality, paints in a darkroom with the chemicals of celluloid: the image is organically contaminated.
The question of adult movies at the multiplex is another pressing problem, and again, TV may be the future there as the two-hour drama dies off. Adult movies are still around, but they are not widely distributed or seen (unless they’re being pushed for Academy Awards). The recent Richard Gere thriller Arbitrage and the comic sci-fi drama Robot and Frank, with Frank Langella, are two examples. Both are by young directors (Nicholas Jarecki and Jake Schreier), and both are about aging men struggling to hold onto their sense of control: they don’t want to believe that they’re mortal. Gere’s Ponzi-scheming anti-hero has no sense of ethics as he struggles to preserve his fortune, performing the duties of family man and business icon while committing fraud and covering up a mistress’ accidental death. But he’s on the verge of financial collapse, getting grey, and on Lipitor. When his wife (Susan Sarandon) says that he’s going to be the richest man in the cemetery, he responds, “I don’t want to be in the cemetery.” Jarecki cleverly has him give his granddaughter a dinosaur doll as a present, which should make us wonder who the dinosaur here really is. Is it Gere’s Wall Street maverick who’s going extinct? Is it this kind of moviemaking? Or is it the audience watching it, since the screening I attended for Arbitrage was almost exclusively filled with senior citizens?
The same goes for Robot and Frank, except Langella’s dementia-ridden codger, assisted by technology (a robot voiced by Peter Sarsgaard), uses the tool – of which he’s initially wary – to regain a fading sense of who he was, what he did (as a safecracker), and ultimately who he loved. The public libraries in Robot and Frank‘s future world are fading away, becoming museums, and the film is bittersweet in how it equates the inevitable decay of Frank’s memory with a cultural one. Both films cast Susan Sarandon as the forgotten wife, which is appropriate because she’s sort of an iconic matron for a lost world of mature mainstream movies (Atlantic City, Bull Durham, Thelma and Louise, Dead Man Walking), but here she’s also ushering in a hopeful new era with emerging directors. Whether Jarecki or Schreier will develop is the question. Or will they, like Greg Mottola (The Daytrippers, Adventureland), be relegated to the land of HBO and Showtime, directing pilots for Aaron Sorkin and David Milch?
A more nerve-slicing hope comes from Rian Johnson and his recent Looper, a fantastic futuristic yarn about a contract hit-man (Joseph Gordon-Levitt) who dispenses with people sent from the future, something he’s fine with until he realizes his next target is his aged self (Bruce Willis). Johnson is a talented filmmaker interested in how the past is always scripting things out for us, authoring us, so to speak. His cult-film debut, Brick (2005), is Hammett set in a contemporary high school, its characters acting out (without irony) archetypal roles and scenarios from 1940s noir. The characters talk in an almost incomprehensible antique language, while the main gumshoe (Gordon-Levitt) credits his wording to Mrs. Kasprzyk’s “Accelerated English” class. Johnson’s follow-up, the overly whimsical misfire The Brothers Bloom (2009), follows two fraternal con men (Adrien Brody, Mark Ruffalo) whose fates are the fulfillment of a script. The younger Bloom (Brody) says to his elder Stephen (Ruffalo), “I’ve only ever lived my life through roles that weren’t for me. They were written for me by you.” Their story is rife with literary allusion (Ulysses, Homer, Dostoyevsky, Melville), and the sense is that future outcomes are predestined by authors (“There are no unwritten lives, only badly written ones.”)
Death of Movie Culture pundits be damned, Looper might be grim about the future, but it doesn’t rule out the possibility that an aware and creative personality can author or rewrite what’s to come. Indeed, the film, a noir set in Kansas (an early image of feet sticking out of a body bag summons a memory of The Wizard of Oz, while the name of an urban cabaret – La Belle Aurore – alludes to Casablanca, though it’s useful to also remember that it’s a text font), reminds us how movies endlessly circulate through our reality and choices. Gordon-Levitt’s boss (Jeff Daniels) derisively mocks the “20th century affectation” of his employees, who wear clothes copied from old movies which were, Daniels reminds us, only copied from other movies before that. The pressure is to “be new,” but as another Bruce Willis time-travel quandary, Twelve Monkeys, reminded us, the authentic self existing presently probably doesn’t exist. When Looper‘s protagonist gets some perspective, he’s able to see beyond his time loop and possibly change the future. The financial rewards of the present moment (the Loopers take their money, knowing that in 30 years they’ll be killed off, victims of their employment) indicate a historical blankness, a history taken away as individuals are sent back to the working period preceding their leisures and marriages. As with the best genre films, Looper is a conscious action thriller with dazzling set-pieces that don’t dull the senses or neglect the reality of space. There very well may be “cinema entropy” or film culture death, but Johnson’s Looper, an R-rated film that refuses to sentimentalize its characters, is a creative big-release exercise in sci-fi identities with an anxiety of memory, and that such a film can be financed is cause for hope.
And so, even if it becomes increasingly pointless, people will nevertheless find ways to persist in talking or writing about movies, with or without an audience, just as creative people like Rian Johnson will keep on trying to make interesting stories. Honestly, yes, I do think things are bad. This is the landscape of Honey Boo Boo and Resident Evil, where serious adult movies are The Best Exotic Marigold Hotel (the same people who like that tripe will oftentimes hate Moonrise Kingdom and The Master), and a recent South Park episode hilariously tells us that the “bar has been set lower than ever before” (it’s meaningful that it’s the maverick artist, James Cameron, who plunges into the depths to raise it again). But things have always been bad. From Brian Williams back to Herodotus, the news has always been about sucky events. Suckiness is the way of things. The economy, like the movies, sucks now particularly, but as far as I’m concerned, it sucked in the years preceding the 2008 collapse too (the GNP may have been huge, with billionaires having a hell of a time, but some of the jobs where I worked had pay freezes in effect since 2001). And amid the suckiness, to which we’ve been conditioned, we should appreciate the good things all the more. I once read about a conversation between Scorsese and De Palma in the early 1980s. Neither was able to get the movies he wanted to make off the ground (I think Scorsese’s funding for The Last Temptation of Christ had fallen through), and both were thinking (much like Soderbergh currently) that maybe there was no place for them in the film industry. But what would they do? Teach? Write? After a while, they resigned themselves to their fate. They would just keep on making films, for other people if not for themselves. Like Nick Nolte’s artist says in Scorsese’s short film Life Lessons (1989), art isn’t about choice. “You make art because you have to, because you got no choice. It’s not about talent. It’s about no choice but to do it.” So it is with film critics and their praises or discontents. We adulate something, or bitch about something, because we have to, and have been doing so forever, and will do so forever, ever more.