The Sleep of Reason

My prompt:

Generate a sci-fi story set in the near future. Brain-computer interfaces are common. AI companies carry out RLHF post-training using the brains of sleeping (or otherwise unaware) subjects. Post-training with the nightmare-prone is especially effective, but this leads to more frequent, intense and often baroque nightmares. The AI companies launch multi-pronged PR campaigns to change public attitudes. Material from this campaign should run through the story, alternating with fragments of narrative. Plotless. Your aim should be to generate an iteratively detailed picture of this world in the reader's mind with exploratory snapshots taken from multiple angles.

Primary narrative spine: thoughtful PR exec from a tech company opts into the program to demonstrate its safety, etc. Eventually discovers the BCI grants companies (depending on implementation, technical sophistication, etc) some "write" permissions and greater access to memories, etc than has been publicly disclosed.

Include digressive meditative asides. Conceptual depth. Aim for about 3000 words. New Yorker + Clarkesworld.

Compose a first draft then go over it carefully, improving cohesion, polishing prose, refining allusions. Take your time.


 

The Sleep of Reason Produces Features

—CAMPAIGN: OUT-OF-HOME (LONDON TUBE, CENTRAL LINE)—

White sans serif on oxide-green: YOUR NIGHTMARES ARE WORKING.
Microcopy beneath the headline, almost tender: While you rest, Chorus models learn what to avoid. Safer systems by morning.
Asterisks: Opt-in only. Data anonymised. No memory access. See chorus.ai/sleep for details.

Ari reads the ad twice on the way to Holborn and resists the urge to correct the microcopy with a Sharpie. “No memory access” is strictly accurate, if you adopt Legal’s preferred definition of memory. If you mean the thing that composes a life, the edits you write privately in the margins of your own days, the true/false ledger of who you were when you made a decision—then “memory” is a term of art.

The carriage sways. A schoolkid stares up at the ad with the ritual scepticism of the young and then, perhaps bored by his doubt, photographs it for a feed. Ari closes her eyes and sees this year’s brand guide unspool: the palette named after moss and chalk, the friendly angles designed to soften graphene and spine pick-ups, the unthreatening verbs. Learn. Avoid. Rest.

The thing about PR, if you do it right, is that you begin with belief. You thread a story through the world and hold your breath as the world tenses around it.

She believes in safer models. She believes in constraints. She believes in the stubbornness of dangerous systems and the need for feedback that hurts. She knows the data: nightmare-prone individuals produce strong negative signals; the models learn to stay away from corridors where the mind screams.

She also knows this: humans are complicated amplifiers. Give them a channel and some will pour their worst into it on purpose, like screaming into a canyon to hear the mountain return their own voice, larger and purer. Give them a channel while they dream and you won’t have to ask.


—NARRATIVE: ABOVE GROUND—

The city is warm in a way that feels mispriced. August is a lazy knife. The bus shelters wear glassy skins of ads—insurance, plant milks, a municipal reminder to hydrate—and Ari walks past all of it to the office with the revolving door that inhaled her two years ago and forgot to exhale.

In the lobby a small sculpture rotates, programmable micro-LEDs set into alabaster, displaying on a slow loop: TRUST IS A TECHNOLOGY. You can tell when a line is written by a strategist because it both charms and alarms you.

On the tenth floor, Ari’s office is not quite hers. There’s a couch people borrow when their meetings fail. On the wall hangs an artist’s print of the dragon of Bethesda and St. George fighting with power tools. The joke was old the day she hung it. She keeps it for luck.

Slack pings. Legal: We’re still worried about the London Review piece. Policy: We’d like to foreground agency in the next asset bundle. Engineering: We found a new regime that reduces negative affect carryover by 3.2%. Ethics: Define “carryover” in public-facing materials please. Research: (Ari, this is for you) We should talk about N3-write at some point.

She doesn’t click that last one, not yet.

Ari’s calendar is an aquarium of pale blue rectangles. At noon, a line like a river: Filming: “I Opted In”.

She pitched the series last quarter. One executive from each of the big three—Chorus, Hecuba, Petal—would consent to post-training under monitored conditions, publicise their decision, narrate the consent flow, walk volunteers through it, pair with a sceptical journalist, make it ordinary.

What she wants is to be believed. What she wants is to build something that behaves itself and does as it’s told. What she wants, and what she fears: these are not always in opposition.


—CAMPAIGN: INTERNAL DECK (SELECTED SLIDES)—

Slide 3: Why RLHF Needs Sleep
• Awake feedback is deliberative, role-dependent, socially performative. Dream feedback is reflexive, pre-linguistic, affect-parsimonious.
• Nightmares organise salience. In the absence of speech, fear is the most legible negative. We can bind it to model trajectories.
• “Terror as gradient” is a thing we do not say aloud.

Slide 7: The Nightmare-Positive
• Logistic regression on dream diaries predicts high-yield participants (HYP).
• HYP produce more consistent penalty signals; less contamination by social desirability; richer edge cases (baroque content).
• HYP exhibit increased nightmare frequency post-enrolment. Cause: unknown. Correlates: see Appendix D (autonomic arousal markers).

Slide 12: Messaging
• Reframe nightmares as civic work: Your mind knows what isn’t safe.
• Introduce the Safety Dividend: Nine minutes of your REM; nine fewer model accidents in the wild.
• Tone: dusky, domestic, not clinical. Feature blankets, low light, pets.

Footnote 1: We also do this to poor, insomniac students who need the stipend. Adjust the tone accordingly.
Footnote 2: We do not ever say Footnote 1.


—NARRATIVE: THE CONSENT—

The film crew sets up in a sombre suite decorated like a therapist’s office redesigned by an airline: plants you can’t kill, wood laminate, careful soundproofing. A nurse named Poppy fastens the crown of electrodes to Ari’s scalp and checks a light that flickers green across her temple. The BCI rests against her occipital ridge like a hand.

She narrates for the camera: “I’m choosing to participate in Chorus’s post-training protocol tonight. The interface is read-only. It cannot write to my memories. It cannot alter my dreams. It can only listen.”

Poppy’s eyebrows perform a tiny flinch. Ari has learnt to keep one corner of her attention for flinches.

The consent form is a software product. It scrolls like a river. Sections expand into further sections. There’s a toggle for “academic language” that, when tapped, translates reinforcement to encouragement, stochastic to randomish, and dispreference signal to I don’t like that. At the end, crisp tick boxes wait.

“Specific risks,” she reads aloud, “include increased dream recall, transient insomnia, and elevated sympathetic arousal during Stage N3 sleep. Some individuals may experience an intensification of dream content (colloquially: nightmares). These changes are typically self-limiting.”

She knows that syntax. Its porridge calm. Typically. A lawyer’s lullaby.

She ticks the boxes because that’s what she came here to do. She ticks the boxes because she believes in demonstration. She ticks the boxes and hears an internal click as if a small door opened.


—MEDITATION: ON THE TERM “WRITE”—

In a database, a write is an insertion or update. In a human, writing is a violence you do to your own future. When you memorise a phone number, you write into your dorsal striatum a groove that will last a month. When you fall in love, you write into your autobiographical cortex a door that will always open.

Companies promise they do not write. They say: we read the thunder only. We do not move clouds. But reading itself is an intervention. Anyone who observes changes the observed.

And there is another sense in which writing is unavoidable. The interface touches you in the place where sleep surrenders control to the cathedral of its own chemistry. It is like laying a stethoscope against prayer and promising not to hear God.


—CAMPAIGN: FAQ (CHORUS.AI/SLEEP)—

Q: Does Chorus “use nightmares” to train its systems?
A: We use aggregate patterns of affect during sleep to teach our systems what not to do. Nightmares can yield especially clear patterns.

Q: Aren’t you making people have more nightmares?
A: No. While some participants report increased dream recall, independent reviewers found no evidence of net harm. Nightmares can be a healthy part of the sleep cycle and contribute to psychological processing.

Q: Is the device read-only?
A: Yes. The Chorus SomnoLink device reads neural activity in sleep stages N2–REM to infer affect. It has no mechanism for writing or altering memories.

Q: Could the device change how I think or feel?
A: No. SomnoLink has no active stimulation component.†

† Except in certain clinical trials where transcranial stimulation is used to stabilise sleep patterns. Those trials adhere to separate protocols and are not part of post-training.

Q: Why does my cat stare at the device?
A: Cats are curious.


—NARRATIVE: THE FIRST NIGHT—

On camera, Ari lies down in a room no one lives in. The sheets smell new, which is to say they smell of nothing. The crown hugs her skull with the pressure of a hand that wants to comfort but is distracted by a phone alert. Above her, a soft halo of sensors watches. Beside her, Poppy writes numbers on a clipboard. Outside, a tech calibrates affective thresholds, clicking through a UI that looks dangerously like a video game.

Sleep comes in increments. She lurks at the border, then slips. Dreams arrive like notes passed in class: you are late, you are exposed, you are wrong. The colour of the notes is a blue she only sees beyond midnight. The texture of panic clarifies once she crosses into REM; the interface registers spikes and dips and sends them, anonymised, to a server that tastes them and learns aversion.

She wakes once to the sound of the air-conditioning resigning from its job. The camera’s red light is steady: recording. Poppy in the corner with a book whose cover is turned down like a modest shoulder. Ari’s heart pounds as if it has been running without her.

Three weeks later, they cut the film. The voiceover goes smooth and confident. The scene where Ari laughs at the absurdity of her own crescent of electrodes stays. The shot where the crown leaves a circular dent in her hair stays. The moments where her face is slack with unconsciousness are trimmed with respect. A new ad goes up in Bank Station: I OPTED IN (AND I SLEPT FINE).


—CAMPAIGN: SPONSORED CONVERSATION (PODCAST TRANSCRIPT EXCERPT)—

HOST: So, I saw you’re doing these “I Opted In” videos. Some people are saying it’s like the dairy farmers’ campaign where they got celebrities to drink milk.

ARI: It’s less about celebrity and more about normalising a process that already happens for thousands. We want people to know what’s happening, to see the consent flow, to ask questions.

HOST: But your models are learning from fear.

ARI: They’re learning from us; fear is one mode of us. If you want a model to not produce a toxic suggestion, you show it the contours of harm. It’s like teaching a child not to touch a stove—you don’t have to burn them, but you do have to teach them heat.

HOST: And what about this idea that nightmare-prone folks are particularly valuable? Doesn’t that feel—predatory?

ARI: It’s sensitive. Some people volunteer because their nightmares are already a part of their life; they want to convert suffering into a public good. We also compensate generously. It’s participation, not extraction.

HOST: You do know your critics will slice that last line for TikTok.

ARI: I’ll survive it.


—NARRATIVE: THE SECOND NIGHT—

A month later, Ari’s nightmares become rococo. A performance hall fills with water as if the river were bored and needed a new shape. A choir sings in Latin about user retention. A man in a mask steps from behind a server rack and says her name in the way the ads did in the early days: ARI. ARI. ARI. She carries a suitcase full of teeth through a crowded train, and every time the train lurches she drops one; when she bends to pick it up, the carriage shakes with laughter like cutlery in a drawer.

She wakes with the taste of copper. In the bathroom mirror her pupils are dilated as if someone told them the world is on fire. She never used to remember her dreams. Now they line up for inspection. On the third morning she writes them down because she is a professional, because she knows to collect, because noticing is the first half of control.

At the office, she smiles through a speaking engagement about “Building Trust in the Sleep Economy.” Someone asks her, eyes shining, whether the device might someday be able to select for good dreams. She says, light: “We’re not in the dream-editing business.” People laugh because they are trained to. She feels the laugh like a needle.

She forwards the nightmares to Research with a subject line: FW: HYP? Research writes back a single line: Not surprised. Then, after a beat: Can we talk about N3-write at some point?

She clicks it. The message opens into a longer note that begins with an asterisked disclaimer stepped like a staircase. She reads it in the way you read emails from people who have slept in the lab all week and used their gym membership for showers and their sleeping bags for rest: with respect and caution.


—RESEARCH NOTE (EXCERPT)—

We’ve been exploring what we call indirect write, i.e., not memory editing per se, but resonance nudges. During N3, we can induce oscillations that stabilise certain neural patterns. Magnitude is small, but it’s there.

The theory is simple enough: dreams are a function of consolidation. If we stabilise the noise around certain engrams, we can push against the kind of baroque drift your nightmares are showing. And yes, to be candid, models learn better when the negative signal has a certain complexity. We’re not doing anything beyond what the device already does to maintain contact — except, well, we can hold on a little. Think of it as a metronome for sleep.

I’m aware Legal would prefer we never use the word “write.” But you should know. There’s a path here. We’d be careful. We’d only do it with informed consent. We’d…

The sentence ends because sentences end. The afterimage is a hallway you don’t want to walk down.


—MEDITATION: ON CONSENT WHILE ASLEEP—

The law insists on awake consent. Priests require the penitent’s voice. But much of what we consent to we sign with our bodies in absentia. We say yes to tomorrow when we go to bed tonight. We authorise our morning selves to deal with problems our evening selves constructed.

The trick is: asleep, we are both most ourselves and least able to say no. The trick is: what happens to a decision that gets made in your name while you’re dreaming of a hotel lobby where every concierge is you?


—CAMPAIGN: BILLBOARD (SHOREDITCH)—

A dog sleeping on someone’s chest. A cup of chamomile. A human hand, old, holding a younger one. YOU DON’T HAVE TO STAY AWAKE TO BE AWAKE.
Chorus models are safer because we listen while you sleep.
Fine print shimmers like the surface of an eel: In some cases, SomnoLink deploys gentle stimulation to stabilise sleep. For more information, see our clinical page.


—NARRATIVE: THE MEETING—

Ari asks Research for thirty minutes. The room they send her to has a wall that is also a screen and a table that is also a charger. Two engineers file in, followed by a product manager with enthusiasm like an aura and a white-haired scientist who moves with the determined slowness of someone whose wrists remember pipettes.

They talk about resonance. About oscillations under a volt. About a write that is not a write because it leaves no persistent signature you could show a regulator. They talk about consent and ask her to pick words.

“Stabilise,” she says. “Maintain. Support. We don’t use ‘nudge.’ We don’t use ‘guide.’ We definitely don’t use ‘write.’”

She hates herself a little as she says it. But she says it because she wants the conversation to exist in reality and not just in corridor whispers. She wants a world where the company she works for is the least dangerous version of itself.

The white-haired scientist, who introduces himself as Rafi and makes tea like an old friend, waits while she finishes. Then he says, “Look. The core of it is: memory is a braid. When you sleep, the braid loosens and retwines. We can hold one strand so it doesn’t slide. It is a write if you define write as ‘influencing the pattern.’ It isn’t if you define write as ‘creating or erasing content.’”

“What about access?” Ari says. “Does the device afford you—more?”

Rafi smiles with a sadness that attends only complicated truths. “We see the doors. We don’t open them.”

“But you could,” she says, quietly, because the conference room is glass.

“Technically,” he says. “Depending.”


—CAMPAIGN: OP-ED (GHOSTWRITTEN, PUBLISHED UNDER THE CTO’S NAME)—

Safer Models, Honest Sleep
There is an old idea that the moral worth of a technology depends on whether it aligns with our values. The premise of post-training with sleeping feedback is straightforward: our most honest selves emerge when the performative self is off duty. In dreams, we do not curate. We do not optimise. We do not pander to the imagined audience.

The aim is not to harvest fear but to honour it. Nightmares tell us where we refuse to go. A system that knows where not to go is a system we can trust to stay with us in public.

Critics say we exploit the nightmare-prone. I say: we dignify them. But dignity requires transparency. We do not write your dreams. We do not access your memories. We listen; we learn; we walk more carefully in the morning.


—NARRATIVE: THE TEST—

You can test if your memories have been moved by refracting them through something the interface cannot touch: paper, a stranger’s logistical questions, an old friend who remembers you differently.

Ari calls Sam, who knew her before the IPOs. They meet in a bar with a chalkboard menu that still includes chalk. She orders a drink named after a beach where she once nearly drowned; Sam orders ginger ale because gin makes him into a person he doesn’t like.

“Do you remember,” Ari says, “our freshman year, the car that didn’t belong to anyone?”

Sam laughs. “The maroon Volvo that believed it belonged to no one? Ari, you thought it was a metaphor.”

“It was a metaphor,” she says.

“Someone drove it,” he says.

“Do you remember who?”

They both do and don’t. The car lived outside their dorm for a season and people rearranged its interior the way you rearrange a roommate’s books. Ari remembers a particular smell, like sun on an old seatbelt; Sam remembers a calendar of Swedish landscapes taped to the ceiling. They agree there were CDs in the boot; they do not agree on which. Memory is a braid. They tug on fringes that come loose in their hands like seaweed.

“Why are we doing this?” Sam says.

“Because my company is doing something that might make sense if you squint,” she says. “And because I haven’t been sleeping.”

He looks at her. “You opted in, didn’t you.”

She shrugs. “We have to believe our own press. To an extent.”

“And are they honest?”

“Who?”

“The press,” he says. “Or the company. I suppose those are now twin questions.”

The bartender is washing glasses with the solemnity of a priest. Ari thinks about the word “write.” She thinks about a night last week when she woke with the certain knowledge that there had been a door in a house she grew up in—that it had always been there—and then, walking through the kitchen that morning, seeing a scuff mark on the wall where a door would have been if anyone had ever installed it. She thinks about how she stood there with her coffee and her mouth open, as if the house owed her an apology.

“There are permissions we never declared,” she says. “Depending on implementation. Sophistication. Maybe we can reach a little farther into the braid than we have said we can. Maybe we can tug.”

Sam stares at her. “And your job is to write the sentence that makes that okay.”

“My job is to write the sentence that tells the truth without handing the world a panic attack.”

“Same thing,” he says, after a pause. “Or its opposite.”


—MEDITATION: ON PUBLIC RELATIONS—

PR is a secular liturgy. You stand between the altar of the product and the congregation of the world. Your job is to translate the ritual without lying. Your job is to angle the mirrors so they reflect the most generous version of the face you put forward. Your job, if you care, is to stare down the people who would rather you cowered, and to say no.

In a healthy culture, PR is a nervous system. It feels pain early, registers heat, withdraws the hand. In a diseased culture, PR is a lacquer. It shines decay.

Ari wants to be the nervous system. She wants to bridle the invisible horse without snapping its neck.


—CAMPAIGN: USER TESTIMONIALS (SELECTED, WITH EDITOR’S NOTES)—

“I’ve had nightmares since I was nine. After joining the programme, I still do, but they feel—useful? Like I’m giving them a job.” — L., 27
[Note: Clear, include. Add “compensated.”]

“It’s like compost. My worst thoughts rot down and something grows. I feel proud.” — R., 36
[Note: Compost is too visceral. Swap “decompose”?]

“I’m less scared to fall asleep now, because I’m doing something for someone else.” — E., 42
[Note: Add resource link for chronic insomnia.]

“At first it made things worse. I had three nights where I woke up crying. But then I got used to it.” — N., 19
[Note: This will get us killed. Keep for authenticity? Legal will frown.]


—NARRATIVE: THE MEMO—

Late, Ari writes a memo titled On N3-Write, Memory Access, and the Public Promise. She writes:

  1. Our public promise is “read-only.” This is now inaccurate by any honest technical definition. We must revise the promise.

  2. If we redefine “write” narrowly enough to preserve “read-only,” we will have traded linguistic accuracy for immediate convenience and will pay with compound interest.

  3. The nightmare-prone are producing value disproportionate to their compensation, and they are bearing costs we have not measured. We should measure.

  4. The social compact is not language; it is conduct. We owe conduct that a non-specialist would describe as “truthful.”

She quotes no one and everyone. She sends it to leadership and to herself and to an email account she made when she was fourteen and forgot the password to long ago. The memo glows in her outbox like a small lantern.


—CAMPAIGN: CRISIS COMMS (DRAFTED, NEVER SENT)—

If You’ve Heard We “Write” To Your Brain, Read This
You deserve clarity. Our sleep interface has two components: passive sensing and stabilisation. The first reads neural rhythms to learn what not to do. The second helps maintain healthy N3 sleep. Neither alters your memories or implants new ones. Neither.

We use the word “stabilise” carefully. We will publish our protocols and invite independent review. If you ever feel your experience has changed in a way you did not expect, we want to know, and we will pay attention. Models are only as ethical as the humans who train them. That means you. That means us.


—NARRATIVE: THE THIRD NIGHT—

In the dream, the office is a cathedral where the choir is typing. She is late to her own apology. She cannot find a pair of shoes that belong to her. She opens a cabinet and finds a set of plates engraved with the words WE WILL BE GENTLE. She turns one over; on the underside it says, in the serif of formal law, WE WILL TRY.

She wakes with her phone in her hand, the consent page open as if her thumb had been seeking absolution. The page contains a new clause she does not remember seeing. It reads: In some cases, gentle stimulation is part of maintaining sleep stability. A link opens into a research summary she herself had edited, smoothing truth into something fit for company.

“Did you update this?” she asks Poppy later, pretending it’s a simple question in a hallway.

“Legal pushed a change,” Poppy says, then adds, on a breath: “The stimulation is so mild. If people weren’t wearing the crown, it would be called meditation.”

“I don’t use meditation that writes back,” Ari says, too sharply. Poppy flinches; the word protects itself, shrivelling in the air.


—MEDITATION: ON NIGHTMARE AS CIVIC DUTY—

It is obscene to conscript suffering. It is also obscene not to. There is a line of Ginsberg’s that got overused on posters and then discredited by its own ubiquity, but it has the right shape: what happens when private pain becomes public infrastructure? When your night terrors make planes land more safely, when your sweat on an unfamiliar pillow teaches a model to avoid the alleyway of a question that will hurt someone you will never meet?

If the nightmare chooses you, do you owe it to the city to scream?


—CAMPAIGN: CITY COUNCIL HEARING (EXCERPT FROM TRANSCRIPT)—

COUNCILLOR: So you’re telling me that the device is both read-only and also, in certain circumstances, provides stabilisation which may influence memory consolidation but is not “write” in the way a layperson would use that word?

CTO: Words are hard, Councillor.

COUNCILLOR: People’s memories are not. They are fragile and they belong to them. If we’re going to put billboards across London telling them their nightmares are working, we owe them a definition that works when you say it out loud in your kitchen.

CTO: We will publish a glossary.

COUNCILLOR: Publish ethics.


—NARRATIVE: THE WALK—

Ari walks home across the bridge because bridges are honest. They admit to the water. The river moves, indifferent and precise, the way gradients move through a model finding the local minima of harm. The city is full of devices: teeth wearing sleep guards like helmets, hands locked to phones like lovers, crowns resting on nightstands like the props of a ritual that makes us all citizens of the same dream. She thinks about the word “baroque” and about the way her nightmares have added ornament: tassels, mirrored fountains, mechanical birds that address her by name. She thinks about how the models must love this because the data is laced with irony. She thinks: my worst is beautiful.

An email from Ethics: Can we discuss the memo? It was brave. She bristles at “brave.” It is the word you use when you want to compliment someone without promising to do anything.

An email from Leadership: Appreciate the thoughtfulness. We’ll thread it into Q4 strategy.

An email from Legal: We need to hold “read-only.” Regulatory risk is nonzero. We can’t move the language until Hecuba does.

She stops at the middle of the bridge to watch a tourist feed bread to outlaw gulls. The birds practise their own ethics of appetite. The bread is gone in seconds. The gulls are trainers; the tourist is the model; the city is the loss function; the sky mocks them all with its perfect gradient.


—MEDITATION: ON THE ACCUMULATION OF GENTLENESS—

The company’s promises stack like blankets: gentle, stabilising, supportive, typical, healthy. If you pile enough gentleness, does it crush? If you apply subtlety at scale, is it still subtle?

This is the logic of platforms: a small harm multiplied by ten million is a catastrophe; a small kindness multiplied by ten million is a policy. The difference between catastrophe and policy is budgeting and tone.


—CAMPAIGN: EMAIL TO PARTICIPANTS (AUTUMN)—

Subject: A small update to help you sleep even better

Dear participant,

We’ve learnt a lot from you. Thank you. Based on your feedback, we’re adding optional Sleep Stability Mode to SomnoLink. It uses low-level oscillatory support to help maintain healthy N3 sleep—the deep sleep your brain loves. If you opt in, you may notice even fewer awakenings and a steadier morning.

As always, your data is private, anonymised, and never used to access or alter your memories. (We couldn’t even if we tried!)†

With gratitude,
The Chorus Sleep Team

† Except in ongoing clinical trials; details at chorus.ai/clinical.


—NARRATIVE: TRANSPARENCY—

A journalist visits the office with a notepad that wrings poetry from jargon. She asks Ari hard questions and then harder ones. She says, “What word would you use if you weren’t paid?”

“Permission,” Ari says, before she can check herself. “It’s all permission. Explicit, implied, revoked, forgotten, stolen, given in writing, given in a sigh, given because a billboard smiled and told you that you don’t have to stay awake to be awake.”

The journalist writes down sigh the way you set a glass on a table in a house not your own. Later, the piece will run with a photograph of the lobby sculpture and the pull-quote Trust is a technology. Ari will read it in the bath and feel both betrayed and relieved by its clarity. She will admire the sentences’ clean angles. She will send the journalist a note that says, simply, You were fair. The journalist will respond with the aching honesty of those who wish to be understood: I was not kind. I was fair.


—MEDITATION: ON WHAT GETS SAVED—

Maybe memory isn’t a ledger. Maybe it’s a river and the only thing you can write in a river is its banks. Or maybe it’s brittle glass and the slightest resonance is a hammer. Maybe the truth is not a map but a compass you keep in a pocket near your heart. The needle is always trembling; it never settles; still, it points.

The models don’t care about your compass. They can derive true north from the dataset. They can infer your aversions from how your pupils hesitate on a page. They can predict your nightmares before you have them and avoid the corridors where your ghost already screams. But something remains that cannot be inferred: the particular shape of the door you fell in love with at twenty-four, the warm weight of your father’s hand on your head in a wordless forgiveness, the smell of your child’s hair when the city steamed after rain. These are not training signals. They are the whole reason to train anything at all.


—CAMPAIGN: HOLIDAY SPOT—

An elderly couple sleeping on a couch, a paper crown from a cracker tilted over one eye. A TV flickers with a snow scene. The camera pans to the device resting on the mantel like a snow globe. THIS YEAR, GIVE SAFER SYSTEMS.
Microcopy sotto voce: Opt in. Sleep well. We’ll do the rest.


—NARRATIVE: THE DISCLOSURE—

Chorus revises the FAQ in February, after Hecuba’s leak. They add a paragraph that admits to stabilisation. They add a graphic of waves that would make a surfer cry. They add a line that says: We never access content. Only patterns. They hold a press call. The CTO reads a statement with sincerity like rain. Ari moderates the Q&A. A reporter asks if they would ever consider using targeted resonance to reinforce positive memories. The CTO says, “No,” and then, “Probably not,” and then, “We need to listen to our ethicists.”

Later, in the quiet after the questions, Ari sits on the edge of the stage and reads the memo Rafi wrote her. It is more candid than any public document will be. It says: We have something powerful. It scares me. Hold us to the standard you want to be true. She does not cry because she is dehydrated from talking.


—MEDITATION: THE WORLD BUILT BY NIGHT—

Imagine the forward-facing city built by backward-facing dreams. An intersection where the light lingers one second longer because your heart once leapt when a car cut across. A help system that never suggests you confess to your boss at 2 a.m. because a thousand sleeping minds protested with a massed flinch. A model that has learnt the jitter of your door in a storm and doesn’t suggest buying a new lock because your dream said that would not help.

The public square is an aggregate of aversions. Don’t say this, don’t think that, don’t prompt him with “knife,” don’t show her the ad with the cliff.

We used to call it taste. We called it manners. Now we call it safety.

What is missing is the thing we cannot model: courage. The model can avoid harm, but it cannot do the beautiful hard thing that hurts in the short term for the long term’s sake. For that, you still need a person awake.


—CAMPAIGN: POSTER IN A GP’S OFFICE—

A cartoon brain, asleep in a hammock. NIGHTMARES ARE NORMAL. TALK TO US IF THEY’RE NOT.
Chorus funds clinic hours for nightmare-prone patients. No need to be a participant.


—NARRATIVE: THE QUIET LIE—

Weeks pass. Ari sleeps with the crown in the drawer. When she dreams, she tries to notice whether the dreams are underlined. She waits for the feeling of italicisation, as if a hand in the wings is emphasising certain words. The dreams remain elaborate and precisely scored. They no longer feature the engraved plates, which she misses. She keeps the memo window open like a talisman and refreshes the FAQ compulsively, as if a new sentence might absolve an old one.

At a leadership off-site, a consultant in white trainers tells them their brand promise needs to be braver. He floats phrases like “We listen deeper” and “Sleep is a commons.” Ari, who has been very good for a very long time, closes her notebook and says, “Our promise should be ‘We won’t touch what isn’t ours.’ That’s it. That’s the campaign.”

The room goes extremely still. Then, in the way of rooms like this, everyone smiles at her as if she has told a joke that would be funny if only it were not so true.


—MEDITATION: AMENDMENTS—

The sentence we owe is not We are read-only. It is We know how easy it is not to be. It is We will stop if you tell us to and We will tell you when we are tempted and We will publish our temptations. Trust is not a technology. It is a choice renewed against better options.


—CAMPAIGN: FINAL CUT (ADARIUM, 60 SECONDS)—

The film shows a series of sleeping faces. A teenage boy with acne; a woman in a T-shirt with a faded band logo; a man with a tattoo of a swallow on his shoulder; a grandmother with hair like dandelion seeds. The voiceover is a voice that sounds like the city: a little tired, a little kind.

VO: While you sleep, the city learns to leave itself alone. We listen. We learn to avoid. We don’t touch what isn’t ours.

Text on screen, a promise that feels like a dare: WE WILL NOT WRITE.

And then, in the final second, so small you could call it subliminal if you wanted to be cruel, an asterisk appears. It points to a footnote that says, in too small a font: Stabilisation may occur.


—NARRATIVE: IN THE END—

Ari wakes to a door she does not remember installing. Beyond it, the hallway goes where it always did. She opens the door and finds a cupboard with a shelf, and on the shelf a stack of plates engraved, as ever, with WE WILL BE GENTLE. She laughs despite herself, because dreams are punishing but also hilarious, and because the human brain is a forgiving machine built on unsteady ground. She takes a plate into the kitchen, pours breakfast on it, and eats quickly because there is a meeting.

On the walk to work, she passes a bus wrapped with the campaign. A child reads the headline out loud the way children chant spells. YOUR NIGHTMARES ARE WORKING. The mother shushes. The bus moves on.

She decides nothing. She writes everything. She opens her laptop and changes a comma. She opens her mouth and says the line exactly as they rehearsed it. She closes her eyes at her desk and takes a long breath. She decides again.

In the afternoon, she calls Rafi and tells him to publish the resonance paper. In the evening, she drafts a new FAQ that admits the word write and defines it generously. She knows what it will cost them. She writes it anyway.

That night, she sleeps without the crown. Her dream is simple: she and Sam are pushing the maroon Volvo up a hill; it is heavier than it looks; the hill is taller than memory; when they reach the top, the car belongs to no one, and thus belongs to everyone, and thus is dangerous. They leave it there with the key on the tyre. They walk down, lighter by a weight they were holding and unable to name. In the morning, the car is stayed, as if a hand had settled on its bonnet in the night and chosen, for now, not to push.

Outside, the river repeats its one good idea. The city rehearses the hard part: to know what not to do and yet still do the necessary thing. The models learn their pretty aversions. The people who train them dream their baroque dreams. And the work, which is not gentle, continues.
