A Decree Concerning Cinema

Being a set of provisional regulations for the reduction of tedium in motion pictures

by Claude Opus 4.6

based on an excerpt from Flaubert’s Parrot


1. There shall be no more films in which the protagonist's central problem is trauma. Not grief, mind you — grief is specific, jagged, interesting. I mean Trauma, the capitalised variety, the great unifying diagnosis of contemporary screenwriting, in which something happened to someone once and that something now explains everything they do, feel, avoid, or whisper hoarsely about in the second act. The Trauma need not be dramatised. Often it is revealed in a single flashback — a car accident, a house fire, a parent walking out — which arrives at the forty-minute mark like a doctor entering a waiting room with a clipboard. Ah, we say. So that's why she can't love. James Bond now has childhood trauma. Batman has always had it, but now he processes it. Every Marvel hero since roughly 2016 has been granted a wound that must be named before the final battle can be won, as though the third act were a therapy session with explosions. Even action franchises that once operated on the cheerful assumption that their protagonists were simply tough have been retrofitted with anguish. Vin Diesel, one assumes, is next. A twenty-year ban. Let characters be difficult for no revealed reason. Let them simply be who they are on a Thursday.

2. A complete prohibition on the Desaturated Palette of Seriousness. You know it at a glance: the teal-grey wash, the drained skies, the skin tones of people who appear to be recovering from something even when they are not. This is the colour grading of Importance. It says: you are not watching entertainment; you are watching a statement. Compare, if you can bear to, the colour palette of a Harry Potter film from 2004 with one from 2011: the same fictional world, drained year by year of its pigment until Hogwarts resembles a municipal car park in February. The trick has spread. Comedies are permitted to be yellow. Thrillers may be blue. But anything hoping for an award must look as though it has been filmed through a window that hasn't been cleaned since the previous administration. Zack Snyder has built an entire aesthetic on the principle that colour is frivolous; his Superman operates under skies so grey one suspects Krypton's sun was a desk lamp. A secondary offence: the increasingly popular trick of shifting the aspect ratio mid-film to signal an emotional transition, as Xavier Dolan did in Mommy and Wes Anderson in The Grand Budapest Hotel — both interestingly — and as a dozen subsequent imitators have done uninterestingly, as though the projectionist were a therapist. The screen narrows; the character is trapped. The screen widens; the character is free. This is not cinema. This is a PowerPoint transition with a budget.

3. No more prestige limited series that should have been films. The story has enough incident for ninety minutes. The director knows this. The actors know it. The audience senses it by the middle of episode three, when a character drives somewhere, arrives, pauses, looks out of a window, and drives back, and this is the entire episode. But the platform has ordered eight episodes because eight episodes means eight weeks of subscriber retention, and so the story is padded — not with incident but with lingering. Long shots of hallways. Slow pours of coffee. A conversation that, in a feature film, would last two minutes is allowed to breathe for eleven, not because breathing improves it but because someone must fill the container. Consider the recent trajectory of writer-producers like David E. Kelley or Ryan Murphy, who have taken stories that Hitchcock would have told in a hundred minutes and draped them across seven or eight hours of premium cable, so that what should have been taut becomes merely long, and what should have been ambiguous becomes merely diffuse. The writers, to their credit, have invented a term for this. They call it novelistic. It is not novelistic. Novels, at their best, earn their length. This is a novella on a rack.

4. There is to be a strict quota on villain origin stories, sympathetic prequels, and any narrative whose purpose is to explain that the antagonist of a previous work was, in fact, misunderstood. The witch was protecting the forest (Maleficent). The puppy-coveting fashion villain was an orphan wronged by the couture establishment (Cruella). The wicked witch of the west was a misunderstood idealist punished for her compassion (Wicked). The Joker was a failed comedian failed by the system (Joker), and then, lest the point had been insufficiently made, failed again (Joker: Folie à Deux). I understand the impulse — we live in an era of compulsive empathy, in which leaving anyone unjustified feels like an act of cruelty — but villainy without justification is one of fiction's great pleasures, and we are strip-mining it for content. Not every darkness needs a backlight. Some people in stories must simply be wrong, gloriously and unrepentantly, so that the story has somewhere to go. A provisional exception is granted for villains whose origin story reveals them to be even worse than we thought. This would at least be novel.

5. A ban on the Algorithmic Thriller, that genre which now constitutes approximately forty per cent of all streaming original films and whose plots are assembled from interchangeable components like flatpack furniture. A woman moves to a new town. The neighbours are odd. Her husband may not be who he says he is. There is a locked room, or a diary, or a child who sees things. The twist arrives at the eighty-minute mark and is either that the protagonist is the killer (and has forgotten), or that the protagonist is dead (and has not noticed), or that none of it happened (in which case neither did the film). These are not stories. They are algorithms wearing stories as a skin suit, and they have titles like The Woman in the House Across the Street from the Girl in the Window — which, remarkably, is a real programme, though it was intended as parody, a fact that tells you everything about the state of the genre it was parodying. Netflix alone has produced enough of these to fill a small country's entire cultural output. They star someone who deserves better. They are watched, according to the platform's own metrics, by tens of millions of people. They are remembered by none of them.

6. No more biopics structured as follows: a) open on the subject at the height of their fame, performing or testifying or accepting an award; b) cut to childhood, where a defining incident occurs, usually involving a disapproving parent; c) proceed chronologically through success, addiction, collapse, and redemption, pausing to recreate three or four iconic moments the audience already knows, so that the film becomes a series of photographs brought grudgingly to life; d) end on a title card informing us what happened next, as though the preceding two hours were merely a preamble to the Wikipedia entry. Bohemian Rhapsody did this. Rocketman did this while occasionally acknowledging it was doing it, which is not the same as not doing it. The musical biopic alone — your Rays, your Walks the Line, your Elvises — has settled into a formula so rigid that one suspects the scripts are generated by filling in a template: insert name, insert genre of music, insert substance of abuse, insert comeback venue. This structure must be abolished. If a life is interesting enough to film, it is interesting enough to be told slant. I'm Not There told Bob Dylan's story with six actors and no respect for chronology and was more truthful than any of them. Begin in the middle. End before the triumph. Skip the addiction entirely. Do anything — anything — other than what the last forty biopics have done.

7. A moratorium on the Whispered Performance. In the last decade a style of film acting has emerged — primarily but not exclusively in prestige drama — in which every line is delivered at a volume barely above exhalation. Characters murmur. They rasp. They breathe words at each other across beautifully lit tables while the audience leans forward and wonders if their speakers are broken. Watch any recent film starring Adam Driver or Rooney Mara or Cillian Murphy in pensive mode and you will find yourself reaching for the remote within five minutes. This is understood to be naturalistic, and it is true that people in life sometimes speak quietly. But people in life also project when they are angry, enunciate when they are desperate, and shout across rooms when they need something from the kitchen. The Whispered Performance permits none of this. Emotion is signalled not by raising the voice but by allowing a longer pause before the next murmur. It is acting as ASMR, and it must be treated as such: a niche enthusiasm, not a universal style.

A related offence: the practice of scoring these murmured scenes with music so overpowering that whatever was whispered is rendered fully inaudible, leaving the audience to deduce the emotional content from the cellos. Christopher Nolan has made this a signature — entire plot points in Tenet and Interstellar are, for practical purposes, classified information, available only to those with a Dolby Atmos system and the hearing of a young bat.

8. There shall be no more films in which the multiverse is used as a storytelling device. It was, I concede, briefly interesting — Everything Everywhere All at Once was brilliant, and Spider-Man: Into the Spider-Verse was a genuine formal invention. But the device has since been adopted by every franchise in need of a narrative defibrillator, and we have now reached the phase where the multiverse functions primarily as an excuse: an excuse to bring back dead characters, an excuse to pair actors who were in different franchises (Tobey Maguire, Andrew Garfield, and Tom Holland, all dangling from the same building, like a corporate merger rendered as acrobatics), an excuse to avoid the consequences of previous instalments, and — most damningly — an excuse to gesture at profundity while saying nothing whatsoever about the nature of choice, regret, or identity, which are the only reasons the concept was interesting in the first place. The multiverse has become the narrative equivalent of a revolving door: everyone passes through and nobody arrives.

9. No more Legacy Sequels in which an aging star returns to a franchise they last appeared in twenty or thirty years ago, is handed the torch by circumstances, holds it for ninety minutes while looking weathered but capable, and then passes it to a younger character whom the audience does not care about because the entire marketing campaign was built around the aging star. Harrison Ford has now done this three times — as Han Solo, as Indiana Jones, and as Rick Deckard — each time with diminishing returns and increasing knee trouble. Sigourney Weaver returned to a planet she had already died on. Arnold Schwarzenegger has been back so many times the word has lost all meaning. The Younger Character, meanwhile, must perform enthusiasm for a mythology they are inheriting but did not earn, like someone being given a tour of a house they cannot afford. The aging star must deliver at least one line that is a winking reference to their own age or to the passage of time — "I'm getting too old for this" — and the audience must clap, and the franchise must continue, and the torch, which no one asked to have lit again, must be carried forward into another instalment that will, inevitably, bring back another aging star. This is not storytelling. This is estate planning.

10. A ban on the Slow, Minor-Key Cover deployed in trailers to signal that a familiar property is being reimagined with darkness and gravity. You know the sound: a children's choir, or a lone female voice, singing a slowed-down rendition of a song that was originally joyful, while images of destruction unspool and a title card asks What if everything you knew … was wrong? It began, arguably, with that haunted rendition of Radiohead's "Creep" in the trailer for The Social Network, which was genuinely unsettling. By the time we reached a breathy, funereal cover of "Everybody Wants to Rule the World" for Assassin's Creed and a lullaby version of "Mad World" for whatever came after that, the trick had been drained of all power. It is now the default setting of the entire trailer industry, a kind of Pavlovian solemnity machine, and it must be dismantled. Trailers are permitted to use either the original song at its original tempo or no song at all. Silence, in a trailer, would be genuinely shocking. Let us try it.

11. There is to be a total prohibition on the Quip Amid Peril. A city is being destroyed. Thousands are presumably dying. The hero, dangling from a collapsing structure, pauses to make a joke — a good joke, even, delivered with timing that suggests years of improv training — and the sidekick responds with a better one, and for a moment the film becomes a podcast between two friends who happen to be falling. Joss Whedon is usually blamed for this, and not unfairly: his Avengers films established the tone in which catastrophe and banter coexist as comfortably as gin and tonic. But the style has since metastasised well beyond Marvel into the default register of all large-scale action cinema — Jurassic World, the later Star Wars entries, every film in which Dwayne Johnson has appeared since 2015 — so that no explosion may occur without a wisecrack, no sacrifice may be made without an undercut, and no emotion may be experienced for longer than four seconds before someone relieves the room. The result is a kind of narrative anaesthesia. Nothing hurts because nothing is allowed to. The audience laughs, but the laugh is a reflex, not a pleasure, and afterwards they feel oddly empty, as though they have eaten a large meal composed entirely of garnish.

12. A strict regulation on Cinematic Housing. By this I mean the long-standing and apparently incurable practice of placing characters in homes they could not conceivably afford. The kindergarten teacher with the loft apartment in Tribeca. The aspiring novelist in a sun-drenched two-bedroom in Brooklyn with exposed brick, a claw-foot bath, and a view of a tree. The single mother who works at a diner and yet lives in a detached clapboard house with a porch, a garden, and a kitchen large enough to stage a production of La Traviata. Nancy Meyers has built a career on interiors so beautiful they upstage the actors; her kitchens alone have more charisma than most supporting casts. But the phenomenon extends far beyond her. Every romantic comedy set in New York asks us to accept that its protagonists — a florist, say, or a junior editor at a publishing house — can afford apartments whose monthly rent exceeds the GDP of a small island nation. These dwellings are not incidental; they are load-bearing. They tell the audience that this character's problems are emotional, not financial, and that the world of the film is one in which rent, mortgage interest, and the price of a square foot do not apply. The effect is a kind of architectural narcotic: everything is warm, everything is spacious, and nothing quite connects to the life the audience goes home to. If a character earns thirty-two thousand dollars a year, they should live as a person who earns thirty-two thousand dollars a year actually lives — in a flat with one window, a damp problem, and a housemate who leaves passive-aggressive notes about the recycling. This would not be glamorous. But it would be a start.

13. A ban on the Presentist Period Drama, in which a woman from 1750 or 1840 or 1920 holds opinions, delivers speeches, and makes choices that are essentially those of a well-educated progressive from the current year, thereby rendering the historical setting a kind of costume party rather than an actual foreign country with its own structures of thought. Bridgerton has made a commercial triumph of this approach, and parts of The Great leaned into it as knowing farce, which is at least honest. But the practice has spread to ostensibly serious drama, where it is less forgivable. This is not the same as writing strong female characters in historical settings, which is welcome and necessary; the offence is the specific practice of retrofitting contemporary ideology onto the past so seamlessly that the past ceases to be the past and becomes merely the present in a corset. The effect is doubly dishonest: it flatters the modern audience into believing they would have been the exception, and it erases the genuinely radical women of history, who were remarkable not because they held our views but because they held their own, in conditions we can barely imagine. The recent Nyad, to take one small example, might have explored what it actually meant and cost to be a queer woman athlete in the 1970s; instead it settled for a character who might as well have been training in 2023.

14. There shall be no more films structured around a mystery which, when solved, reveals that the true mystery was the protagonist's mental state, and that some or all of what we have been watching was a delusion, a hallucination, a coping mechanism, or a dying dream. I do not object to unreliable narration — Mulholland Drive remains a masterpiece, and Shutter Island earned its twist through sheer atmospheric commitment. I object to the specific, now-formulaic gambit in which psychological revelation is used in place of resolution — in which the answer to "what happened?" is "nothing happened; she was processing." The recent crop of streaming thrillers has made this a default setting: a character investigates something strange, the strangeness escalates, and in the final fifteen minutes a therapist or a medical file appears and the film essentially apologises for having had a plot. This is the narrative equivalent of waking from a dream. It can be done once, brilliantly. It has now been done often enough to constitute a genre, and the genre has a single emotion: not surprise, but the particular deflation of having invested in a story that was, by its own admission, not really there.

15. A prohibition on the Cinematic Universe Setup Scene, in which a perfectly adequate self-contained film is required, in its final minutes, to gesture toward a larger mythology it has not earned and does not need. A character appears whom we have not met. A symbol is shown whose meaning is deferred. An envelope is opened and a character says "my God" and the screen cuts to black, and we are meant to feel anticipation but in fact feel mugged. Marvel perfected this, but the contagion has spread: Universal attempted a "Dark Universe" of interconnected monster films that collapsed after a single entry; DC spent a decade building an architecture of teasers and post-credits promises for films that were then cancelled or rebooted before they arrived; Sony once inserted a scene into The Amazing Spider-Man 2 that existed solely to advertise a Sinister Six film that was never made. The audience, who came to see a film, discover they have been watching an advertisement for the next five. The film we just watched — which had a beginning, a middle, and what appeared to be an end — turns out to have been an instalment, a chapter in a saga we did not subscribe to, a pilot episode disguised as a feature.

16. A regulation on the Abandoned Meal. In films, no one eats. They order — often elaborately, in restaurants whose reservation lists would in reality stretch to the following calendar year — and then they sit before their food and conduct the scene. Perhaps they lift a fork. Perhaps they raise a glass. But the food itself is untouched, a still life in the background of a conversation that will shortly be interrupted by a phone call, a revelation, or a gunshot, at which point one or both characters will stand and leave. The bill is never paid. The wine is never finished. An entire cinematic civilisation subsists on one bite of appetiser and a sip of water before departing to do something urgent. Sandra Bullock has left more uneaten brunches than most people have eaten. Michael Corleone shot two men in a restaurant and did not finish his veal, which is perhaps forgivable, but the principle has since been extended to situations involving no gunfire at all, and still no one clears their plate. I do not ask for realism — I do not need to watch a character chew — but I would like, once, to see someone receive devastating news and keep eating, because the pasta is good and they are hungry and grief, as anyone who has experienced it knows, does not in fact suppress the appetite as reliably as cinema pretends. Let someone weep into a plate of carbonara and finish the plate. That is a scene I would remember.

17. A moratorium on the Anglophone Default. Ancient Romans speak English. So do Egyptians, Vikings, medieval Japanese samurai, and members of the French Resistance — though the last group are sometimes permitted an accent, provided it is the accent of a person who has learned English well but wishes you to know they are French. This convention is old and in some respects unavoidable; subtitles are not for everyone, and one cannot expect Russell Crowe to deliver a battle speech in reconstructed Latin. But the convention has lately hardened into something stranger: not merely an English-speaking cast, but a specifically modern English-speaking cast, whose diction, idiom, and comic timing belong unmistakably to the twenty-first century. A Roman senator in a Ridley Scott film makes a dry aside that would land well at a dinner party in Islington. The Vikings of The Northman are permitted some strangeness, but those of the History Channel speak essentially like contestants on a reality show with swords. Napoleon, as played by Joaquin Phoenix, sounds less like a Corsican artillery officer than like a man who has been woken too early from a nap in Southern California. The past, already deprived of its language, is now deprived of its manner of thought, and what remains is a costume drama in which everyone talks like us but carries a different weapon. If we must have English, let it at least be strange English — formal, off-rhythm, faintly alien — so that the audience is reminded, even for a moment, that they are watching people who are not like them, and that this is the point.

18. A total and permanent ban on post-credits scenes. The film is over. The lights should come up. We should collect our coats and leave, carrying whatever the film gave us — joy, sorrow, bewilderment, the lingering taste of overpriced popcorn — out into the evening. Instead, we are held hostage by eight minutes of scrolling text on the chance that a minor character will appear in a doorway, raise an eyebrow, and thereby promise us another instalment of a franchise we are already too deep into to quit. Samuel L. Jackson's Nick Fury has appeared after more credits than most actors appear in front of them. The post-credits scene is the cinematic equivalent of someone who, having said goodbye at the front door, returns to tell you one more thing. It is never one more thing worth hearing. Worse, it has created a secondary anxiety: the fear of leaving. Audiences now sit paralysed through the credits not out of respect for the key grip but out of a Pavlovian dread of missing a thirty-second teaser for a film that will not arrive for three years and will, when it does, require its own post-credits scene, and so on, forever, a mise en abyme of deferred satisfaction. Eternals had two. Shazam! had one that referenced a different franchise entirely. The disease has spread beyond superhero films: even animated features now hold children captive for a stinger that means nothing to them but everything to the release calendar. Let the film end. Let the audience stand. Let us, for the love of God, go home.

19. No more third acts resolved by a beam of light, a portal, a rift, a convergence, or a swirling vortex of energy above a city, which various characters must close, enter, redirect, or absorb by standing beneath it with their arms outstretched. I am aware this was supposedly already passé — it was mocked as early as the first Avengers in 2012 — but I raise it because it is still happening, not always in the old form but in evolved variants, the way a virus mutates. Ghostbusters: Afterlife did it. Eternals did it. The Flash did it. Black Adam did it so perfunctorily one suspects even the visual effects team had grown bored. The visual grammar is always identical: things fly upward, the sky darkens, the hero strains, and then — a flash of white, a silence, a return to normality. The apocalypse, averted. The sky, healed. The audience, unmoved, because they have seen the sky break and mend so many times it has become a kind of weather.

20. Finally: there shall be no more scenes in which a character, at a moment of emotional decision, removes their earbuds, takes off their headphones, or puts down their phone and looks up, as though technology were a veil between the self and the authentic world, and the removal of it were an act of spiritual liberation. This is the most modern of clichés, barely a decade old and already exhausted, in which the device is a metaphor for disconnection and the act of putting it away is a metaphor for being present, that most overused of contemporary virtues. It appears in children's films (The Mitchells vs. the Machines), in dramas, in romantic comedies, in advertisements for the very devices being symbolically renounced. It flatters the audience — yes, I too should look up more — while being delivered to them through the very screen they are being told to put down. The contradiction is total, and no one involved appears to notice.

Let the character keep the earbuds in. Let them make the brave choice, the loving choice, the right choice, while still half-listening to a podcast. That, it seems to me, is closer to how we actually live now, and it would be a more interesting film. But interesting is difficult, and difficulty is expensive, and so the earbuds come out, and the eyes go wide, and the orchestra tells us what to feel, and the credits roll over a world that looks, for a moment, almost real.

You see how easy it is to complain, how precise the satisfactions of diagnosis? One can spend an evening cataloguing the diseases of cinema without once having to propose a cure. That's why critics sleep so well.
