How an article by Martin Scorsese prompted me to reflect on my own relationship with cinema.
The first time I watched Vertigo, I hated it. I was thirteen years old and just coming off a serious Agatha Christie high, having inhaled ten of her most delectable murder mysteries in as many weeks. There was something irresistible, almost magical about the way those stories invited me in, gently guiding me each time to a smug self-assurance that I had finally outwitted the master herself…only to find myself flat on my face by the end. To my hyperactive teenage imagination, the detective genre was a burst of incomparable stimulation: each book I picked up left me wanting more, not less. So when I saw Vertigo proudly perched atop AFI’s list of the best American mystery films, I jumped in head first, fully expecting an unpredictable, twisty puzzle on par with the Queen of Crime’s best work. It soon turned out that the real mystery of the film was figuring out what people liked about it so much. After all, what kind of whodunit clumsily reveals to the audience who actually “done it” with a third of its runtime still remaining? Calling a film that seemed entirely uninterested in playing the game a “mystery” felt like an insult to the very term.
It might come as a surprise, then, that over the years I have grown not only to appreciate Vertigo for what it is – a truly great film – but to regard it as the single greatest film ever made, period. And the person I have to thank for this complete about-face is my high school literature teacher Jeremy Lum, who in a cruel twist of fate assigned me Vertigo for the final project of my senior-year film class. Tasked with delivering a 50-minute lecture on the film’s hidden “meaning,” I recoiled at the prospect of having to sit through its doldrums of a plot all over again, this time without even the slightest expectation of suspense. And then something strange happened. Freed from my single-minded obsession with the surface-level mystery, I began noticing tiny details that had totally escaped my attention the first time around: the green light of a neon sign, a strategically placed mirror, the spiraling motion of the camera. By the end, I was overcome by an urge I had never once felt before—the burning need to immediately watch not just another similar film but this very film, to keep digging into its tangled threads until all the pieces finally fit together. After my third time through, Vertigo did more than just make sense. I felt as if it had spoken directly to my soul.
I have found myself thinking about Vertigo a lot recently, particularly in the context of a certain New York Times op-ed by Martin Scorsese. You know, the one where he calls out Marvel movies for not being real “cinema.” Because the distinction that Scorsese draws between entertainment and cinema first manifested itself to me as the difference between Agatha Christie and Alfred Hitchcock. One wrote stories that are utterly intoxicating right up until the exact moment you “solve” them, at which point they are emptied of both their charm and their purpose. The other crafted stories that simply can’t be solved, stories that continue to frustrate, challenge, and haunt you long after the closing credits appear on screen. What made me eventually fall in love with Vertigo was the same quality that led me to hate it at first—it forced me to confront something new and unexpected, to reconsider my preconceptions of what a good mystery should be, and to think about why I sought out such films or books in the first place. In its own strange way, it was infinitely more unpredictable than any of the Agatha Christie novels I had read, all of which delivered finely tuned variations on a single unchanging formula.
I certainly don’t say all this to reproduce Scorsese’s original argument, which is expressed about as eloquently and convincingly as it could be in his own article. Instead, I bring it up to suggest that perhaps there’s a personal process of discovery absolutely essential to any genuine appreciation of the arts, as Scorsese would define them. And that maybe his assertion that Marvel movies ought not to be considered cinema, while entirely justifiable and – in my opinion – correct, isn’t as plainly self-evident as he might like to think. But before I do any of that, I want to contextualize Scorsese’s comments by examining some of the broader historical trends that inform his understanding of cinema. And to do that, we need to take a quick detour all the way back to the 1930s, when the Golden Age of Hollywood was just beginning to take hold.
At the close of the previous decade, two events had unfolded in quick succession that would indelibly change the American film industry: the commercialization of synchronized sound technology, and the onset of the Great Depression. The impact of the first of these developments is obvious; by effectively doubling the sensory range of cinema, synchronized sound (introduced to mass audiences by The Jazz Singer in 1927) enabled filmmakers to imbue their work with a realism unmatched in the history of representation. But the second provided something far more important for a fledgling industry: a reason to go to the movies in the first place. At a moment when millions of Americans were seeing their livelihoods crumble before their eyes, larger-than-life films like Grand Hotel and Flying Down to Rio promised a unique release from the gritty despair of everyday life. By its very design, classical Hollywood cinema was never intended as anything other than a particularly potent and concentrated form of escapism, a lucrative source of “lowbrow” entertainment targeted at the masses.
The result was something that resembled an economy of scale more than an artistic breeding ground. In the true spirit of American capitalism, Hollywood transformed itself into a veritable filmmaking factory, applying Henry Ford’s principles of industrial production to the goal of manufacturing movies as efficiently as possible. In this commercialized landscape, studios wielded an unimaginable level of control over every aspect of the creative process. Actors and actresses became their exclusive property; rather than choosing which projects to take on, even the most successful stars were contractually obligated to appear in roles assigned by their studio, a system that led to extensive typecasting. Films were written, directed, and edited on an assembly line in which individual technicians rotated under the guidance of the producer; Michael Curtiz, for instance, owed the privilege of directing Casablanca more to the good fortune of being available at the right time than to any particular artistic merit. The defining characteristic of this era, however, was the birth of the genre film—the countless Westerns, slapstick comedies, musicals, and melodramas that have since become staples of mid-20th-century American culture. By allowing studios to draw on well-defined, formalized sets of cinematic and narrative conventions, genres offered both a template for simplifying production decisions and a means of differentiating their products from the competition.
At the same time these trends were playing out in Hollywood, an alternative approach to cinema was slowly but surely developing in Europe. In 1951, a group of influential French film critics founded the magazine Cahiers du Cinéma and unwittingly laid the groundwork for the French New Wave, perhaps the most famous movement in all of film history. Talented as they were as directors, figures like Jean-Luc Godard and François Truffaut made a far more important contribution with their critical efforts, through which they formalized a new framework for evaluating cinema: the auteur theory. By analyzing the unique signature or style of American “authors” such as Alfred Hitchcock, John Ford, and Howard Hawks across their entire bodies of work, the New Wave critics argued that the true meaning of a film arises from the director’s conscious use of aesthetic choices to tell a story with an underlying subjective purpose. This displacement of attention from the studio to the director became the basis for cinema to be considered a legitimate art form; films were no longer seen as economic projects made by consensus, but rather as deeply personal visions willed into creation by a singular artist.
For years, these two opposing modes of film practice – Hollywood’s studio-driven genre cinema and Europe’s auteur-driven art cinema – developed in parallel, coexisting in an uneasy tension. And then a series of events brought the American studio system’s viability into question. First came a landmark Supreme Court antitrust decision against Paramount Pictures, which crippled the “Big Five” studios of the day by forcing them to sell off their theater chains, opening the door for independent filmmakers and smaller studios to enter the market. Stripped of their virtual monopoly on the business, major studios like Paramount and MGM now had to be significantly more selective about the number and quality of the films they produced. This, coupled with the rapid growth of television, eroded the appeal of films that functioned solely on the level of pure entertainment. The final blow came in the late 1960s with the end of the Hays Production Code, the set of industry guidelines under which Hollywood had censored its own films for decades. As more foreign films entered the American market, filmmakers successfully pushed back against these decades-old restrictions on their artistic freedom, effectively bringing the heyday of genre cinema to an end. So for a director like Scorsese, who released his first feature in 1967, “cinema” came to mean this particular auteur-driven style of filmmaking, itself defined in opposition to Hollywood’s older genre fare.
Today, however, the pendulum has swung almost all the way back. Although it’s undergone a rebranding of sorts, the original tension between the genre and auteur styles of cinema is very much alive, having encoded itself in contemporary Hollywood as the battle between the “franchise” film and the “independent” film (to see this, look no further than Scorsese’s own complaint that Marvel films “lack something essential to cinema: the unifying vision of an individual artist”). It is a battle that independent filmmakers are losing badly. By rewriting conventional superhero storylines in the form of massive multi-film franchises with ensemble casts, Marvel has given genre cinema a modern makeover, profiting handsomely off movies that are “sequels in name but…remakes in spirit.” In isolation, the financial success of this commercialized approach to film production is certainly not a problem. But as Scorsese’s own struggles to secure funding for The Irishman illustrate, the rise of the franchise film has come directly at the expense of our ability to watch art films in the format and setting they have always been intended for.
Consider the following fact: the last seven winners of the Academy Award for Best Picture (Parasite, Green Book, The Shape of Water, Moonlight, Spotlight, Birdman, and 12 Years a Slave) together earned a domestic gross of nearly 375 million dollars. Now that’s a tidy sum of money, but it amounts to less than half of the total domestic gross brought in by Marvel’s most recent entry, Avengers: Endgame, alone. In fact, the domestic earnings of six of the foremost auteur pieces of recent years plus Green Book only slightly exceeded Endgame’s total production budget. In an environment where financial and creative control is so heavily skewed away from individual artists and towards massive production houses, Scorsese’s fears about traditional cinema as an auteur product being on its deathbed feel entirely justified. And the situation is only about to get worse, with the rise of streaming services and the COVID pandemic putting even more economic pressure on the theater business. It is because of this existential threat to his own art form that Scorsese felt compelled to write an article defending his claim that Marvel movies aren’t cinema.
Judging by the extent of the backlash to Scorsese’s article, this is by no means a popular opinion. Partly, this might be an issue of semantics—what Scorsese considers “cinema” is not necessarily what everyone else does, or should. But I also think this is partly an issue of the vast majority of people simply not wanting anything more from cinema than what Marvel offers. And this brings me back to my original point about “discovery.” The reason so many of us are able to appreciate literary classics, whether it’s Huckleberry Finn or Crime and Punishment or Hamlet, is that years of grade school literature classes have given us the opportunity to discover for ourselves the beauty of the written word. It’s one thing to be told that a novel “means” something; it’s another thing entirely to learn how a novel can mean something, to engage firsthand in the process of uncovering fundamental truths about ourselves and our lives within the pages of a text. And it’s only when the latter occurs that we begin to actively seek out books capable of providing a pleasure far deeper and richer than any mere diversion could offer. This, I believe, is the point that Scorsese, who in his article argues that the solution to cinema’s dying appeal is simply to produce more authentically cinematic films, misses entirely. Given that most of us just aren’t conditioned to think of film as an art form at all, what is really needed is a way of encouraging people to be open to seeing films differently from the way they have been trained to see them for so long.
One way to do that is to teach more film classes in school, or at least to include more films within the standard English literature curriculum. After all, if my high school hadn’t offered an elective on film literature, I probably would never have understood what makes a movie like Vertigo so special, nor would I have gone on to take any of the film theory classes that I loved so much in college. In today’s Internet-driven world, however, the kind of discovery I’m talking about certainly does not need to take place in the classroom—which makes the lack of high-quality online resources for understanding cinema all the more frustrating. Yes, publications like The New Yorker, Slate, Vox, and The Atlantic run frequent articles on both contemporary and classic films, but these typically take the form of evaluative reviews, rankings, or interviews and only rarely venture into the realm of actual film analysis; the same goes for most popular movie-themed blogs and podcasts. At the other end of the spectrum, scholarly film criticism (i.e., academic essays, journal articles, and textbooks) is often incredibly dense and inaccessible to people who haven’t studied film theory in a more formal setting. What I have so often wished for is something between these two extremes: a place for discovering how “great films” use visuals, music, editing, lighting, and camerawork to convey some deeper meaning beyond their surface-level stories. Apart from some admittedly stellar video essays on YouTube, I’ve mostly struck out. And that, in a nutshell, is what motivated me to start this blog. By sharing some of my thoughts, theories, and interpretations of films especially close to my heart, I want to document my journey towards an artistic appreciation of cinema, in the hope of encouraging one or two people out there to have their very own Vertigo moment.
There’s also a more personal reason behind my decision to launch such a blog at this particular point in time. As I begin a career unrelated to the humanities, I have found myself grappling with the question of how best to continue my own film education now that I’ve left college and no longer have the same formal structures around me to support my learning. These reflections have made me realize that my most valuable educational experiences haven’t come from readings or lectures, but from conversations—from roundtable discussions with my peers and professors to intense late-night debates with my closest friends. By forcing us to condense our muddled impressions and opinions into critical arguments and to engage with diverse ways of thinking about the same topic, conversation turns learning from a passive experience into an active one. The Fourth Act is my attempt to ensure that I never stop having these kinds of conversations about cinema. And while part of that certainly entails using this blog as a diary to sustain a healthy film-related dialogue with myself, any conversation loses its value when it ceases to be bidirectional. So, if any of you find these posts even remotely interesting, please reach out and share your thoughts, reactions, and criticisms. At a time when human connection feels ever more tenuous, it is my hope that any conversations this website sparks might serve as a reminder of what cinema ultimately does best: bring people together.