Part Two of The Death of Film/The Decay of Cinema
Sometime within the next few years–it may take a decade or more, though a nearer date is more likely–the last commercial movie theater in the U.S. to adopt digital projection will make the switch, and the medium of film will reach its effective end. Thereafter, to see actual films displayed, as opposed to things that for a while may call themselves “films” but in fact are not, you will need to go to places like the Museum of Modern Art and the American Museum of the Moving Image, where projections of celluloid classics will probably remain very popular even while gaining an increasingly archaic air. If you have
a child who is a toddler now, the chances are excellent that you will one day have to explain what film was, and how different theaters were before digital projection brought live tv, interactivity and a dazzling array of other novelties into them, drastically altering what people thought of (and patronized) as moving-image entertainment.
The change has been bruited for years, but now the writing is on the wall, quite literally: the commercial runs of digitally projected movies (The Phantom Menace, An Ideal Husband and, more recently, Tarzan) have ushered the new medium into theaters in Los Angeles, New Jersey and New York, quietly making the summer of 1999 a pivotal moment in the history of entertainment technology. The next phase, when digital totally replaces film projection, will probably happen so swiftly, when it comes, as to leave people astonished in retrospect. For that, at least, there is precedent. Do you suppose that audiences in the fall of 1927, when The Jazz Singer opened, had any idea that the form of expression they knew as movies would be overthrown in a mere matter of months, replaced by a very different form called “talkies”? The movies haven’t seen that kind of drastic change in the half-century since; now, suddenly, they are about to.
Of course, the arrival of sound was meant to be dramatic and revolutionary (Warner Bros., its sponsor, might have gone down the drain if audiences had shrugged), while the shift in projection technologies is intended to be smooth and little noticed. But the latter change is sure to have consequences as profound and sweeping as the former’s, and there will be other similarities as well. For example, the movie business today seems as incognizant as audiences (and most critics) of the impending effects of this technological leap. In late 1927, studio bosses saw sound as an audience-pleasing gimmick; they didn’t realize it would scuttle their existing star system and overturn virtually every aspect of film production. Likewise, digital’s studio backers regard it as a money-saving, technically superior means of delivering their wares; they seem barely aware of how extensively it will reshape those wares and the culture and business surrounding them.
What’s essential to understand is that digital replaces film with television technology. However innocuous that may sound, it’s a fundamental change that will open the way for movie theaters to greatly expand and diversify their programming. Given
the capacities of live tv, you don’t expect theaters to limit themselves to old-fashioned, feature-length, fictional movies, do you? Of course they won’t. They’ll leap at the chance to offer big-screen, technically wondrous versions of concerts, sports events, special events like the Oscars and all manner of other live, interactive and tv-originated attractions. (Once the industry really grasps what all this portends, expect a mad scramble by different movie- and tv-related businesses trying to claim parts of the bonanza.)
At the same time, movies themselves will undergo a technical sea change. Video technology will displace film in the shooting as well as the showing of movies. After an initial phase of trying to maintain the old look of film (as many tv dramas still do), movies will start to look a lot more like tv–very high-quality tv. Digital will greatly facilitate low-budget production, but that won’t mean you’ll be seeing more low-budget movies, because the change will also facilitate the means by which super-expensive movies get to be superspectacular. Computer-generated imagery (CGI), like that used to create the dinosaurs in Jurassic Park and planets and cities in movies like The Phantom Menace and The Matrix, is the key. It will make many things about movies more fantastical and awesome than ever before, but also less real. You may love what you see, but you literally won’t be able to believe your eyes. And with that sudden, decisive break from the old esthetic and ethical moorings of photography, moving-image technology will enter a new era.
Yes, the new age will contain movies. But what about cinema? In this article (including last week’s installment), I’m using three normally interchangeable terms in distinct ways: film refers to the old, celluloid-based technology; movies refers to motion pictures as entertainment; and cinema refers to motion pictures as art. You will notice that the last term is inherently more slippery and subjective than the previous two. We could argue its definition all night. But before we do, let me skip right to my conclusion so you’ll know what we ultimately are arguing about.
I think that 50 years from now people will regard what we call cinema as belonging to the past, i.e., the current century. In fact, the consensus on this may take only 10 years to coalesce, but in any case, it will reflect agreement on a crucial perception: that cinema and film were–are–fundamentally linked. If you take away film, what you have left may look much the same for a while, but soon enough you’ll realize that it doesn’t function the same. And one function that will accompany film into the museums, I think, is cinema, a peculiar kind of storytelling technological art that has reigned widely and gloriously through most of the 20th century.
The other point to be stressed is that cinema’s displacement isn’t just beginning with the appearance this summer of digital projection in theaters. It has been going on for close to a half-century, and going on in earnest for a quarter-century. And that long goodbye points to a great irony: that cinema may owe both its extinction and its zenith to its archnemesis, television.
Anyone who describes or believes in motion pictures as an art will have their own particular definition of art vis-à-vis film, one that will necessarily disagree with many other actual and possible definitions. So it is here.
To give a few examples, my own definition of cinema does not automatically include the following: movies adapted from Shakespeare, Jane Austen or other highfalutin’ literary sources; movies in which people speak with British accents or, alternately, speak in foreign tongues translated by subtitles; movies featuring beautiful photography; movies disliked by my fellow critics; movies liked by my fellow critics; movies by famous directors whose past work I have admired; movies featuring comely performers that I personally find very attractive and, thus, talented; movies that win Oscars and other accolades; movies that offer showcases for great acting or state-of-the-art special effects; movies from Iran, Hollywood or, for that matter, anywhere else.
What’s left? Well, definitions are important, so here goes: cinema is filmed entertainment taken to its most refined, sophisticated and characteristic form of expression. In practice, said form of expression is usually the product of a single shaping intelligence, and does not neglect either of the poles (reality and dream) that together make movies such a unique form of communication. But this, I hastily admit, is still too vague and could easily be subscribed to by people whose examples of cinema I would find preposterous or abhorrent. So let me offer a brief retrospective sketch that, I hope, will lead us to a clearer and more concrete definition.
When people first saw film, they didn’t see movies. This is pretty remarkable if you think about it. Imagine you’re at one of the first projections of motion pictures, in 1895 or so. The projector whirs on, images flicker on the wall, and for a few seconds, maybe longer, you struggle to figure out just what the hell you’re seeing. Is it one of those fairground illusions? Some new kind of illuminated painting? A camera obscura reflection from outside? Then you realize: No, it’s a moving photographic image, showing people and objects caught by a camera. You identify the nature of the picture, and yourself in relation to it. Quite amazing–downright astounding, actually–but after suitable gawking, so what? People soon grew bored, which led film pioneer Louis Lumière to famously, astonishingly call motion pictures “an invention without a future.”
We laugh, but he might have been right. Instead, certain people thought to tell stories using film, and viewers got the conceit: they identified the people on the screen as fictional characters played by actors, as in theater or vaudeville. Let’s call this the second identification. It was soon followed by a third: viewers began to identify actors who appeared regularly in different stories, and moreover, began to identify with them. Thus were born movie stars, along with the star system and all that it implies. By extension, you can say that people at this stage also began to identify (and to an extent, identify with) the movies’ different genres, studio styles and so on.
All of this went from the germ of a possibility to a remarkably elaborate imaginary-cum-industrial universe in the first two decades of this century. It was the flowering of movies as an entertainment system, as a ubiquitous cultural form, as a great popular art.
Like the previous three, the fourth identification didn’t have to happen. But it did. Some viewers began to identify the styles and recurrent concerns of different directors. Identifying with the director was inherently different from identifying with an actor or star, however. Rather than identifying with the seen, it meant identifying with the seer, and the mode of seeing. And that instantly reconfigured the entire process of understanding movies and, with it, the movies’ potential as art. As an example, here’s a passage from Andrew Sarris’ The American Cinema: Directors and Directions 1929-1968 (1968) concerning a John Ford film of 1956:
“There’s a fantastic sequence in The Searchers involving a brash frontier character played by Ward Bond. Bond is drinking some coffee in a standing-up position before going out to hunt some Comanches. He glances toward one of the bedrooms, and notices the woman of the house tenderly caressing the Army uniform of her husband’s brother. Ford cuts back to a full faced shot of Bond drinking his coffee, his eyes tactfully averted from the intimate scene he has witnessed. Nothing on earth would ever force this man to reveal what he had seen. There is a deep, subtle chivalry at work here, and in most of Ford’s films, but it is never intrusive enough to interfere with the flow of the narrative. The delicacy of the emotion expressed here in three quick shots, perfectly cut, framed and distanced, would completely escape the dulled perception of our more literary-minded film critics even if they deigned to consider a despised genre like the Western. The economy of expression that Ford has achieved in fifty years of film-making constitutes the beauty of his style. If it had taken him any longer than three shots and a few seconds to establish this insight into the Bond character, the point would not be worth making.”
Granted, there are things about this passage that now strike us as quaint. But in its time everything that the commentary above espouses and exemplifies was as revolutionary as the jump from film-as-novelty to movies-as-entertainment.
Sarris’ gloss lays a new, not at all obvious reading over the one we’re accustomed to from movies-as-entertainment. In both, the drama in this tiny slice of The Searchers hinges on looks. The older, obvious reading comprehends the meaning of the woman’s look and its relation to the past (she was in love with her brother-in-law) as well as Bond’s look and its implication for the future (he will never tell). But Sarris’ reading locates the primary source of meaning here not in the looks of the characters but in Ford’s look, his entire way of ordering and assigning emotional weight to the characters and the visible, fictional world they inhabit. In his handling, there’s significance in the smallest details of the presentation: the distance from the camera to the characters, the way a door frames a character and the camera frames the door, the length of time each shot is held, the number of shots used, etc.
Such a reading fundamentally reorients us, accomplishing several things at once. First, it invites us to see movies as (at least potentially) a highly personal form of expression, capable of far more delicacy, subtlety and idiosyncrasy than film’s mechanical nature or the movies’ industrial organization might seem to permit. Second, it heightens our awareness of the expressive possibilities in formal elements like framings, camera moves and so on. Third, it obliges us to look to a director’s
body of work for a full understanding of his style’s meaning and resonances.
Last, and perhaps most importantly, it provides associations and precedents that allow us to align cinema’s expressiveness, as practiced by Ford and comparable directors, not only with some of the deepest currents in our culture (“chivalry” alone takes us to the wellspring of what used to be called Western Christendom and its ideas of romantic love, honor and poetry) but also with certain key attitudes and ingredients of modern art and literature. It is not just in the great refinement and formal precision of Ford’s art, after all, but also in the way Sarris evokes and explains them, that cinema becomes a medium that doesn’t invite embarrassment when discussed alongside the work of Joyce and Mann, Ibsen and Manet.
There’s something else to notice about the passage quoted above, with its intricate weave of gazes. Every look it notices is visible, surely, yet also deeply personal and private. The woman’s look is for her alone, enclosing years of secret sorrow, while Bond’s instantly becomes an image to hold close, a sacred, unspoken trust. Ford’s own gaze has a kindred reticence. The exchange of glances he conjures between this man and woman doesn’t violate their privacy any more than they violate each other’s. Likewise, Sarris’ view of Ford, which he invites us to share, is at once illuminating and discreet. Its insights unite reader, critic, filmmaker and characters in what might be called a circuit of privacies, hidden in plain sight from (though always available to) ordinary movie-viewing.
David Thomson wrote that the method of Bresson’s movies “stresses the privacy of minds.” Sarris finds supreme value in the “privileged moments” that great directors allow us to share. What we sense in such formulations, I think, is the notion that cinema, while dealing extravagantly and specifically with the surfaces of the outward world, finds its poetry, its essence as an art, in suggesting and defining inner worlds. It’s a bit mechanistic, but you could call cinema a technology unsurpassed at encouraging the idea that every person has an inner life, one potentially as vast and rich as the West itself, and that such privacies need respect and a certain apartness from external forces to survive and flourish.
In any case, the view of Ford that Sarris advanced was virtually unknown in the U.S. when The Searchers was released in 1956. Westerns were indeed a despised genre, and the nation had a long history of “literary-minded” film critics and pictures to please them. The notion of serious, artistic movies arose almost as soon as the medium started to tell stories. In 1907 a French company piquantly called Film d’Art brought Sarah Bernhardt, the Comédie-Française, ballet performances and stories by the likes of Victor Hugo and Anatole France before the cameras; Camille Saint-Saëns was commissioned to score the company’s first production. Anxious to overcome the movies’ plebeian, flea-bitten image, producers around the world rushed to follow suit. As Arthur Knight noted, “American producers were soon filming Shakespeare in vast quantities–the whole of Hamlet in a hectic 10 minutes!”
In many ways, that mindset remained firmly in place for a half-century. To be sure, the movies of D.W. Griffith and other great directors of the silent era, as well as brilliant comic creators like Keaton and Chaplin, were appreciated as representing a new, buoyantly populist vernacular. (When sound overthrew this world, there were esthetes ready to wail that an entire art was being lost–as perhaps will happen again in the next few years.) And critics like Robert Warshow and James Agee eloquently appraised the vital charms of Hollywood genres and stars. But most reviewers–and indeed, mainstream audiences–remained stuck in a stolid middlebrowism that equated art with solemn subjects and scripts derived from reputable novels and plays, and reflexively assumed the superiority of most things European to anything American.
If the revolution needs a launch date, that would be 1951, when André Bazin and Jacques Doniol-Valcroze founded Cahiers du Cinéma, the journal that gave cinema its modernist identity and agenda for the next generation. François Truffaut, Jean-Luc Godard and most of the young Cahiers critics were bent on making movies themselves, and it’s worth noting that their intellectual, director-celebrating approach involved a crucial bifurcation from the first. On the one hand, it was descriptive: it invited you to review the entire corpus of movies up to the present and reevaluate it according to how successfully each director was able to imprint his personal signature on the medium. On the other hand, it was prescriptive: it suggested that establishing such a signature should be any moviemaker’s prime objective, and that the best moviemaking conditions were those allowing maximum directorial autonomy.
Was there anything wrong with this brash counter-orthodoxy? Of course–tons. It had a headlong insouciance and reductive monomania that perhaps could only have been concocted by a bunch of brainy, desperately ambitious French twentysomethings. As Truffaut freely admitted, the politique des auteurs (which Sarris imported to the U.S. in 1962 as the “auteur theory”) was essentially a polemical battering ram aimed at forcing the gates of the French film industry to admit him and his pals. As such, it was deliberately skewed and inherently excessive. Its descriptive side undervalued many varieties of film practice in overvaluing directors. (Even André Bazin famously objected. “Auteur, to be sure–but what of?” he asked in 1957.) And the prescriptive side was an instant invitation to all manner of solipsism, showboating and self-indulgence.
Yet the auteur idea was essential, catalytic. It elevated déclassé genres like Westerns and gangster films to cultural respectability, and started a vital conversation between European esthetes and the supposedly lowbrow products of Hollywood. Suddenly John Ford and Ward Bond were sharing the stage with Ingmar Bergman and God. Hitchcock and Sartre were seated within speaking distance of each other; Howard Hawks, Abel Gance and Andy Warhol chatted nearby. Such inspiring proximities and revealing juxtapositions, together with auteurism’s credo of self-expression, produced a self-consciously new form of cinema in the first films of the French New Wave, and they in turn sparked a taste for–and general idea of–“new cinema” that quickly became a global phenomenon.
You can’t credit any single factor or clique for this explosion. The conditions that made it possible were numerous and unrepeatable. When movies were the dominant popular visual medium, they had a power and a centrality that they would later lose. You might see a movie of an evening and be enraptured by it for days, until another superimposed itself. When national cultures were close enough to communicate but not so close as to begin melting into each other, “foreign” films held a compelling fascination. When societies were still straitlaced and convention-bound, experimentation and all manner of avant-gardish audacity registered as purposeful. In such circumstances, ambitious artists and educated young audiences found common cause in the suddenly worldwide idea of film as art.
The great period of cinema identified with this idea lasted from 1960 to 1975, i.e., from the transatlantic success of the first New Wave movies until Jaws announced the movies-tv détente. In America it spanned the late works of first-generation directors like Ford, Hitchcock and Hawks and the new blood of Cassavetes, Scorsese, Coppola, Malick, Warhol and a vast array of others, including nonnarrative filmmakers such as Brakhage, Mekas and Frampton and documentarians such as Frederick Wiseman, the Maysles Brothers and D.A. Pennebaker. In Europe it linked the still-active Buñuel, Gance, Lang and Renoir to the numerous talents of the New Wave, plus Rossellini, De Sica, Fellini, Antonioni, Bertolucci, Pasolini, Polanski, Wajda, Bergman, Loach and Britain’s Free Cinema as well as the leaders of new cinemas in Czechoslovakia and Hungary. In Latin America: the cinematic insurgencies of Cuba and Brazil. In Japan: great works by Kurosawa, Ozu, Mizoguchi, Naruse, Oshima, Kinugasa, Ichikawa, et al. (This is a start on a short list.)
This was the era when “film culture” became a fully operative and familiar concept. When art houses proliferated. When film festivals and film societies and clubs were founded; when film journals and film books began being published in profusion; when film courses entered university curricula and film schools sprouted internationally. Looking back from 1999, it’s easier than ever to see the singularity that the idea of cinema attained in this era, and to mark the course of its gradual and ineluctable decay since.
Of course television is merely the most obvious culprit for cinema’s fall. In reality, if you must have culprits, it’s at the top of a long list that must also include the natural exhaustion that all cultural forms are prey to and the devouring self-absorption that was always auteurism’s Achilles’ heel. And perhaps tv deserves to be thanked as well. Marshall McLuhan, who was extraordinarily perceptive in some matters if rather batty in others, suggested that movies owed their status as art to television. He said that no medium graduates to highbrow status until a newer medium comes along to play the lowbrow role. I think there’s more than a little truth to that. With the arrival of the “idiot box” in the 1950s (coincident with the exertions of the Cahiers crew), the movies started gaining an aura of stately seniority and tasteful exclusivity.
In the 1960s, the two media bounced off each other in friendly, productive fashion. They were still of comparable size on the cultural landscape; neither overwhelmed the other. Television initially aped the movies, like a fond and frisky younger brother, taking on some of their most appealing hand-me-downs (Westerns, Lucy, etc.). And movies learned from tv’s immediacy, flexibility and expanded technical vocabulary; Robert Altman was one director who forged his style working in tv and had no trouble transplanting it to the movies. The downturn came, as I observed in an article earlier this summer, in the mid-to-late 70s with the arrival of Jaws and Star Wars, films that constructed a formidable new movie-business paradigm (pulp blockbusters advertised on tv) atop a profound if little-analyzed cultural shift: it was the moment when the first generation exposed to tv since birth came of age.
Thereafter, it became clear that these media weren’t natural allies, and certainly weren’t meant for peaceful coexistence. Television, a thousand times more powerful and pervasive, was destined to swallow its older sibling whole, and with it a vast range of cultural understandings and values. In McLuhanesque terms, if you will, cinema marks the last stand of the culture of literacy before its final submersion in tv’s postliterate whazzit. In any case, these two now hardly look like brothers at all. Film/movies/cinema is a palpable thing, amenable to individual, formal manipulation and understandings; its nature is bound up with difference, distinctness, discretion. Television counters with homogeneity, simultaneity and a profoundly undifferentiated kind of universality. It doesn’t propagate formal distinctions or understandings because its own nature approaches formlessness; it is everywhere, all the time. Film still suggests a world of hierarchies, where concepts like “great artist” and “best” and “worst” mean something. Television is the world of anything goes, never mind the bollocks, what was that you just said?
Of course many, many great movies were made in the quarter-century after Jaws, and though it’s arguable that film culture experienced a general dilution during that time, it also spread notably: witness the 90s boom in American independent movies. VCRs didn’t kill off moviegoing any more than tv did in the 50s; if anything, they stimulated awareness and appreciation of movies. I would even argue that the 1990s have been one of the most vital and interesting decades for cinema since sound arrived. But much of what’s been good about the decade suggests a final flowering, a last burst of energies, not a step toward the future. Some of the era’s most important artistic accomplishments, movies that would have been at the center of cultural discussion 20 or 30 years ago–Todd Haynes’ Safe springs immediately to mind–were barely noticed by audiences. And masterpieces such as Terrence Malick’s The Thin Red Line and Abbas Kiarostami’s Taste of Cherry intertwine a lyrical feeling for the precise textures of photography with a sense of mortal life’s perishability for a result that, in both cases, seems to suggest an elegy for film itself.
Reasons for such laments are easy to find. From the time anyone began to think of movies as art, the tensions between Hollywood’s use of film and that of foreign moviemakers made for a dialogue that energized and advanced both sides, and that was crucial to everyone’s understanding of cinema. That, obviously, is nearly over. The visits I made to China and Iran during the 90s convinced me that these countries experienced film renaissances during the past 15 years for several shared reasons, chief among them that tv had not yet permeated their societies as it has in the West. Even if Americans’ interest in foreign movies had not drastically declined in recent years, it would be doubtful we’d see another national cinema like Iran’s. Under the influence and example of tv, movie culture has rapidly gone from local to global, with Hollywood’s hegemony becoming
as unchallenged as those of Nike and McDonald’s.
That movie (cum cyber) culture is what’s fast replacing the film culture that Cahiers du Cinéma and the French New Wave unloosed on the world 40 years ago. When I was in Iran two years back, a report circulated that Jean-Luc Godard had said, “Cinema is Griffith to Kiarostami.” Apocryphal or not, the declaration’s note of finality catches the sense that cinema will pass into the history books and museums with the end of film. Of course: movies will still be made, they’ll be grander than ever and people invested in things like the Oscars will acclaim them as artistic. But they will increasingly be like Titanic, splashy spectacles made for a global 12-year-old whose main education comes from you-know-what. The New Wave’s sense that the cinema could place one in conversation with Aeschylus and Goya, Sartre and Cervantes will fade. So too, I think, will nuances of tenderness and tragedy, of profound inwardness and chivalrous discretion, and of the individual artist’s very personal way of envisioning the world, like those you can glimpse in a certain fleeting passage of John Ford’s The Searchers.