EDITOR'S NOTE: This is the first of three annotated chapters that follow a grieving man's journey into artificial intelligence to reconnect with his dead lover and find some peace. You can read Chapter 2, "Life," and Chapter 3, "Death," here.
Whether we like it or not, artificial intelligence is a part of our daily lives. Based on the principle that machines can be made to simulate human intelligence, A.I. is at work every time we tell Siri to make a phone call, ask Alexa to play a song, speedily transcribe a recorded interview, or, for the well-heeled, drive a Tesla hands-free.
But can A.I. “talk” with the dead? That’s the premise of “The Jessica Simulation: Love and Loss in the Age of A.I.,” a poignant and thought-provoking narrative from Jason Fagone, an investigative reporter at the San Francisco Chronicle.
Published in July, the three-part serial narrative tracks the journey of Joshua Barbeau, a freelance writer who lives in a basement apartment in a Toronto suburb and seeks relief from grief after the loss of his girlfriend, Jessica Pereira, who died eight years earlier from a rare liver disorder. She was 23, Joshua three years older.
Last September, Barbeau stumbled upon a possible way out of his misery: Project December, a website powered by GPT-3, a piece of software that, Fagone explains, is “one of the world’s most capable artificial intelligence systems,” able to generate human language in response to a prompt. It was developed by OpenAI, a research firm co-founded by Elon Musk. So good is it at impersonating human speech — and potentially generating hate messages — that GPT-3’s makers have tried to wall it off.
Enter Jason Rohrer, a quirky programmer whose dream is to create artistic video games that generate complex emotions, even tears, instead of blowing people’s heads off in battle. His Project December site sidesteps the gates guarding GPT-3 with a program that channels its massive computing strength to allow users like Barbeau to have chat-like conversations with so-called “bots,” short for robots.
Fagone divides his 10,000-word story into chapters titled “Creation,” “Life” and “Death,” splicing it with the couple’s chats, which carry the hint of needed emotional closure for Barbeau. The grieving boyfriend knows Jessica is dead, but there are moments when an aura of delusion envelops the story, driving the narrative forward with spooky suspense.
“I tend to like stories that are framed around text messages and letters and lots of verbatim dialogue,” Fagone told me. “When I first read the transcripts of Joshua’s chats with the Jessica simulation, and I saw how rich and surprising and intimate they were, I realized I could build the story around them.”
Fagone braids the couple’s backstory — Jessica is “a bright and beautiful nerd” — with Rohrer’s quest to program video games that generate emotions, and with a remarkably accessible guide to the world of artificial intelligence, its possibilities and its potential pitfalls. GPT-3 “may not be the first intelligent machine,” Rohrer told Fagone, “but it kind of feels like it’s the first machine with a soul.” Even so, “bad actors could abuse GPT-3 to spread hate and misogyny online, to generate political misinformation and to impersonate real people without their consent,” Fagone observes.
“The Jessica Simulation” takes narrative writing into an exciting new space by blending traditional storytelling with the startling verisimilitude of A.I.-powered conversations. The language model that powers Project December draws on its analysis of billions of books and web pages to provide the couple — one alive, one resurrected — with the raw materials to create an emotional conversation. The catch: a finite number of exchanges before the bot dies.
Before joining the Chronicle, Fagone was a contributing writer to the Huffington Post, and has written for The New York Times, Grantland, Washingtonian, and NewYorker.com. He is the author of “The Woman Who Smashed Codes,” which profiles Elizebeth Smith Friedman, who cracked the codes of Jazz Age gangsters and Nazi spies during World War II.
Nieman Storyboard asked Fagone about the reporting and writing strategies behind “The Jessica Simulation,” interviewing via Zoom, the ethical dilemmas of chronicling the lives of the bereaved, his editor’s pivotal role in the project, and the interest his story has drawn from film production companies. Our conversation has been edited for length and clarity, and is followed today and over the two following days by an annotation of the text.
Tell us about the origin story of “The Jessica Simulation: Love and Loss in the Age of A.I.” What led you to write it?
In 2008, I wrote an Esquire profile about the creator of Project December, the video game designer Jason Rohrer, and we’ve kept in touch. So the reporting for this piece began with a series of cryptic emails Rohrer started sending me in late 2019. They turned out to be snippets of his experiments with GPT-2 and GPT-3, and passages of text generated by these language models. But at first, Rohrer didn’t explain any of that. He just kept emailing these strange paragraphs that read like outtakes from Thomas Pynchon novels. He said, “Who do you think wrote this?”
I didn’t know what to do with any of this stuff at first. The A.I.-generated text was interesting and clever and weird, but I couldn’t see a story. After a while, though, Rohrer told me about Project December and these chatbots he was building, and then one day he pointed me to Joshua Barbeau’s post on Reddit — the one where Joshua shared a small portion of his initial chat with the simulation of his dead fiancée. And I was just blown away by the obvious emotional connection between Joshua and this chatbot representation of Jessica. I had no idea that the technology had gotten so good. So I reached out to Joshua Barbeau through Reddit, told him I wanted to know more, and we began talking about his experiences with the Jessica simulation.
What do you think is the central and most important lesson about society in the age of artificial intelligence that your story shines a light on?
I tried not to draw any grand conclusion about A.I. and where it’s taking us. The story touches on all these huge questions — about grief, love, language, privacy, memory, intelligence — that are probably better left to philosophers and novelists. I’m a journalist. What I can do is gather facts and organize them, and that seemed like enough here. All I wanted was to immerse readers in the unusual experience of this one person, Joshua, and let people have their own reaction. Pretty much every choice we made in the reporting and editing process was to crank up that level of immersion.
An editor’s note says you spent nine months reporting this story? What did the reporting entail? How long did it take to write it?
I ended up talking with Joshua 10 or 12 times over a period of months. Toward the beginning of the process, he agreed to send me the transcripts of his chats with the virtual Jessica, on the condition that I go back to him before publication, let him know which parts I wanted to quote, and make sure he was comfortable with those passages being published. In the end, he only asked me to cut a single sentence. It was part of the seed text, the “intro paragraph,” that he used to create the Jessica bot. He said he was concerned that other people would enter the same text into the website and make their own versions of her, which felt strange to him, and I agreed to replace that one sentence with ellipses.
During our interviews, we talked about a lot of things — the two years he spent with Jessica between 2010 and 2012, her personality, her illness, his struggle with grief. But we probably spent the most time just going through the chat transcripts in detail, line-by-line in a lot of cases, so I could understand what he was thinking and feeling at some of the key moments. As the months passed, he was continuing to chat with the Jessica simulation, and we talked about those new conversations. That’s one reason it took so long: I wanted to capture a sense of how his relationship with the virtual Jessica kept evolving.
Meanwhile, I was interviewing other sources: Rohrer, some A.I. experts, Jessica’s family. Then I wrote a series of three or four distinct drafts before my editor and I landed on the final version. The editing process probably took four months. It’s hard to give a sharp answer, though, because I was juggling other projects at the same time and wasn’t working on this exclusively.
How did the pandemic affect the reporting? Were you ever able to interview Joshua in person?
All my interviews with Joshua were done over Zoom. I still haven’t met him in person. Like a lot of reporters, I have mixed feelings about Zoom, but in this case, I think it may have made Joshua feel more comfortable. He was able to control his environment and make sure his border collie, Chauncey, was doing O.K.
Were there any ethical challenges?
I wanted to be as sensitive as possible to Joshua, Jessica’s family, and Jessica’s memory. If her loved ones had been uncomfortable with the idea of the story, I wouldn’t have done it. I don’t know if I can cite a specific ethical rule I was worried about violating, but on a human level, the whole thing just wouldn’t have felt right to me. At the start, then, I spent a lot of time talking with Joshua about the kind of story I envisioned. He said he would need to get the family’s permission to talk with me, which he did, and then he connected me with Jessica’s mother and sisters — wonderful people — and I interviewed them myself.
As you constructed your story, did you have any literary, journalistic models or video games in mind?
In the back of my mind, I might have been thinking about Cixin Liu’s “The Three-Body Problem” and its sequels, which my daughter and I read last year during the pandemic lockdown. One of the most exciting parts of the second book in the trilogy is when an alien civilization makes contact with Earth through a single human, and the aliens communicate with the guy by drawing words in shapes of light on his retinas. As futuristic as this sounds, the conversation almost takes the shape of an old-fashioned text chat, at least as the author renders it on the page. I thought there was something wonderful about the contrast between the exotic source of the words and the humble way they’re presented and received.
A few years ago, I reported a book that began with a series of love letters between two 20-somethings during World War I. Sometimes the best thing you can do as a writer is set the stage and let other people speak.
What influence did your editor, Lisa Gartner, have on you and the story?
Lisa’s amazing, a prodigy. Great journalist, great human, and one of the best editors I’ve ever worked with. She designed the structure for the piece. That’s huge. But maybe a bigger thing is that she created a space for the story to exist in the first place.
This is an unusual story for a local/regional newspaper to run, as opposed to a magazine. For starters, it’s heinously long. To even fit it into the print paper, it had to be chopped up into three separate chapters, running on different days. Also, the story isn’t even primarily about someone who lives in the region we cover, although the tech was built in the Bay Area. At the end of the day, this is a 10,000-word love story about a guy in Canada and his departed fiancee, and we’re a media outlet in San Francisco, right? But Lisa saw the potential in it, and so did other top editors at the Chronicle, including Demian Bulwa and the editor-in-chief, Emilio Garcia-Ruiz.
What has been the reader reaction?
It’s been intense and all over the map. Minutes after it posted, people on Twitter were talking about the story with each other and even with Joshua directly, telling him what they thought of his experiment.
A fair number of people were horrified. “Don’t ever bring me back as a chatbot when I die” was a reaction I saw multiple places. There were a lot of mentions of the show “Black Mirror,” which did an extravagantly creepy episode about a young widow who brings back her dead husband as an A.I. replicant, and it doesn’t end well. The hashtag #BlackMirror actually started trending on Twitter because of our story. So that was one extreme — disgust and fear.
But there were also people who jumped into Joshua’s Twitter mentions to say they were sorry for his loss, to share their own struggles with grief, and to talk about their own departed loved ones. It’s amazing to read these replies. Something really is broken about the way our culture deals with grief. People are not getting what they need. Joshua isn’t alone and he isn’t even an outlier in using technology to create the illusion that he can still talk to a dead loved one. Judging by Twitter, there are survivors out there who are sending text messages to their deceased parents, partners, etc. This is probably more common than we think.
The story is so unusual, well-written and captivating, it seems ideal for a film. You’ve told me several production companies have shown interest in making a movie about “The Jessica Simulation.” Has a deal been finalized?
Yes, we received a number of inquiries from producers and screenwriters. People had a whole range of ideas about how to adapt it, proposing everything from a dramatic series to a documentary film to an animated show (!). The rights ended up in the hands of Martin Gero, a writer and producer with a deal at Universal TV. He’s from Ottawa, just like Joshua, so they have that connection.
Martin wants to make a limited TV series out of this. As far as I can tell, he was drawn to the same thing about Joshua’s experience that initially captivated me, which is the surprising reality of this one couple’s journey — their love story, Jessica’s illness, Joshua’s struggle with grief, the complexity of his relationship with the simulation of her, and the hope of healing. And as specific as that story is, there’s something universal about it, because grief is universal. We’re all going to die. We’re all going to lose the ones we love, if we haven’t already. Millions have lost people they love to COVID just in the last year. So Joshua’s impulse to use a new technology in this way is relatable.
Given that Project December allows users to “converse” with the dead through personas they have constructed, do you believe there are ethical problems raised by artificial intelligence used in this fashion?
It’s sticky. What are the rights of the dead? They aren’t around to give consent for their words to be fed into a language model and spit back out. Is it exploitative, disrespectful, selfish, to channel their voices in this way? Then there’s the question of potential harm to the living: Is it healthy for a survivor to address his grief by simulating conversations with a dead loved one, or is it a form of escape that could lead to more trauma down the line? I don’t know the answers and I don’t think anyone does. I guess I’ll say that I believe Joshua when he says these chats have helped him. It probably depends a lot on the individual. For him, at this particular moment in his life, the experience was healing. At the very least, the A.I. Jessica helped him to remember the real person. Joshua told me once that when he was chatting with the virtual Jessica, his memories of her felt vivid again. And I don’t think there’s anything bizarre about that at all. It’s the most understandable thing in the world.
The annotation of Chapter 1, "Creation": Storyboard’s questions are in red; Fagone’s responses in blue. To read the story without annotations, click the ‘Hide all annotations’ button, which you’ll find just below the social media buttons in the top right-hand menu, in the individual gray boxes throughout the text, or at the top of your mobile device. (Read the annotation of Chapter 2, "Life," here. The annotation of Chapter 3, "Death," will post later this week.)
By JASON FAGONE | July 23, 2021 6:00 a.m.
Why did you or the page designers choose to launch the story package with this gut punch of an exchange between Joshua Barbeau and his dead girlfriend?
It seemed like the fastest way into the story. This brief snippet of dialogue introduces the people involved, sets up the emotional stakes, and establishes that verbatim chat transcripts will be a big part of what follows. And it’s a little mysterious, too, which is never a bad thing. The designers, Danielle Mollette-Parks and her team, liked this approach from the start and were crucial to making it work. I thought they did an amazing job figuring out how to integrate the chat text in the online build and the print paper in a way that felt organic.
One night last fall, unable to sleep, Joshua Barbeau logged onto a mysterious chat website called Project December. An old-fashioned terminal window greeted him, stark white text on a black square:
It was Sept. 24, around 3 a.m., and Joshua was on the couch, next to a bookcase crammed with board games and Dungeons & Dragons strategy guides. He lived in Bradford, Canada, a suburban town an hour north of Toronto, renting a basement apartment and speaking little to other people.
A 33-year-old freelance writer, Joshua had existed in quasi-isolation for years before the pandemic, confined by bouts of anxiety and depression. Once a theater geek with dreams of being an actor, he supported himself by writing articles about D&D and selling them to gaming sites.
Many days he left the apartment only to walk his dog, Chauncey, a black-and-white Border collie. Usually they went in the middle of the night, because Chauncey tended to get anxious around other dogs and people. They would pass dozens of dark, silent, middle-class homes. Then, back in the basement, Joshua would lie awake for hours, thinking about Jessica Pereira, his ex-fiancee.
Jessica had died eight years earlier, at 23, from a rare liver disease. Joshua had never gotten over it, and this was always the hardest month, because her birthday was in September. She would have been turning 31.
What did it take for Joshua to open up about his personal troubles, including mental health issues? Was it a lengthy relationship? How do you gain trust with your sources?
I first reached out to Joshua in September 2020 and we kept talking through publication in July 2021. Generally, when I start a project like this, I try to be as direct as possible with the main source about what will be involved and how much time it might take, so there are no big surprises. I tend to over-explain, at the risk of scaring them off. Joshua was a bit hesitant in the early stages. On the one hand, he was willing and even excited to talk; he likes telling people about Jessica, he’s been open on Twitter about his mental health issues, and he believed that Project December could help others out there who are suffering with grief. But he didn’t want to sensationalize anything or upset Jessica’s family. After I contacted him, he reached out to her relatives and got their permission to talk with me, and then he connected me with her mother and sisters, and we went forward from there.
On his laptop, he typed his email address. The window refreshed. “Welcome back, Professor Bohr,” read the screen. He had been here before. The page displayed a menu of options.
He selected “Experimental area.”
That month, Joshua had read about a new website that had something to do with artificial intelligence and “chatbots.” It was called Project December. There wasn’t much other information, and the site itself explained little, including its name, but he was intrigued enough to pay $5 for an account.
As it turned out, the site was vastly more sophisticated than it first appeared.
The opening scene is a classic anecdotal lead: a little story with a character and a compelling situation, wrapped up with a suspenseful ending. Why did you choose to open your story in this fashion? How did you report it, given that the pandemic ruled out in-person reporting?
I don’t think I ever considered another approach to the lead section, which is unusual for me. I tend to play around with different entry points. This is the most obvious one. It felt natural, though. The story begins with Joshua opening a door into this world of A.I. chatbots. What happens next? Keep reading to find out. I didn’t want to overcomplicate it. The scene was reconstructed from the chat transcripts provided by Joshua and from Zoom interviews with him.
The story is 10,000 words long, yet I found myself racing through it, trying to learn what impact “The Jessica Simulation” had on her ex-fiancé. Could you describe the structural approach you took?
Thanks for that. I’ve always thought that the kindest thing anyone can say about a long story is that they read it quickly. Brains are weird and time is relative. I never understand when people say they don’t like reading long stories. What they really mean, I think, is that they don’t like long stories that suck. Those are the stories that do feel long.
The structure of “The Jessica Simulation” is 100% the idea of my editor, Lisa Gartner. It wasn’t there in my first draft. When Lisa read that early version, she sent me a memo laying out a three-act structure: Creation, Life, Death. It was more thematic, less linear, cutting back and forth between the two eras of the Joshua-Jessica love story: the time when she was alive, the time when he was grieving her loss and then, eventually, found his way back to her through this A.I. interface. The chats taking place in the present open windows into the past, and one era reflects the other. This is a messier way of telling the story than a more chronological approach, but it seems truer to Joshua’s experience, and maybe truer to the circular mechanics of grief and memory.
Designed by a Bay Area programmer, Project December was powered by one of the world’s most capable artificial intelligence systems, a piece of software known as GPT-3. It knows how to manipulate human language, generating fluent English text in response to a prompt. While digital assistants like Apple’s Siri and Amazon’s Alexa also appear to grasp and reproduce English on some level, GPT-3 is far more advanced, able to mimic pretty much any writing style at the flick of a switch.
I admire the way you introduce technological information that can be difficult for the uninitiated to grasp in clear, understandable prose. Does a graf like this one take a lot of drafting and revision? Do you have any suggestions for writers trying to achieve clarity in their writing?
These background grafs about the language models were some of the hardest to get right; I re-wrote them over and over, which is embarrassing to say because they’re very straightforward. But technical explanation can be tricky. You have to give enough detail but not too much, and you have to get in and out quickly, without stalling the narrative momentum. Sometimes I read passages like this out loud to myself and see where I trip over my tongue. That can help identify the parts that aren’t working yet.
In fact, the A.I. is so good at impersonating humans that its designer — OpenAI, the San Francisco research group co-founded by Elon Musk — has largely kept it under wraps. Citing “safety” concerns, the company initially delayed the release of a previous version, GPT-2, and access to the more advanced GPT-3 has been limited to private beta testers.
Are the previous paragraphs your nut graf, placing Joshua’s personal story in a wider context? Do you think narratives need one?
Yes and no! This here is a classic sort of nut graf. I think the story does need one, because it’s so long and so emotionally intense — it’s asking a lot of the reader up front — and the reader should feel like they’re going to learn something about the world beyond the part of it that Joshua inhabits. But I don’t think every longform story needs a nut graf, and in the past I’ve argued with editors who ask me to write them. Editors always want nut grafs. Sometimes it makes sense to resist.
But Jason Rohrer, the Bay Area programmer, opened a channel for the masses.
A lanky 42-year-old with a cheerful attitude and a mischievous streak, Rohrer worked for himself, designing independent video games. He had long championed the idea that games can be art, inspiring complex emotions; his creations had been known to make players weep. And after months of experiments with GPT-2 and GPT-3, he had tapped into a new vein of possibility, figuring out how to make the A.I. systems do something they weren’t designed to do: conduct chat-like conversations with humans.
Last summer, using a borrowed beta-testing credential, Rohrer devised a “chatbot” interface that was driven by GPT-3. He made it available to the public through his website. He called the service Project December. Now, for the first time, anyone could have a naturalistic text chat with an A.I. directed by GPT-3, typing back and forth with it on Rohrer's site.
Users could select from a range of built-in chatbots, each with a distinct style of texting, or they could design their own bots, giving them whatever personality they chose.
Joshua had waded into Project December by degrees, starting with the built-in chatbots. He engaged with “William,” a bot that tried to impersonate Shakespeare, and “Samantha,” a friendly female companion modeled after the A.I. assistant in the movie “Her.” Joshua found both disappointing; William rambled about a woman with “fiery hair” that was “red as a fire,” and Samantha was too clingy.
But as soon as he built his first custom bot — a simulation of Star Trek’s Spock, whom he considered a hero — a light clicked on: By feeding a few Spock quotes from an old TV episode into the site, Joshua summoned a bot that sounded exactly like Spock, yet spoke in original phrases that weren’t found in any script.
As Joshua continued to experiment, he realized there was no rule preventing him from simulating real people. What would happen, he wondered, if he tried to create a chatbot version of his dead fiancee?
The narrative reaches an inflection point here. Given the way the story so far covers a lot of information that the reader needs to follow Joshua’s journey, I wondered if you outlined the story in advance?
I did some light outlining before I wrote a first draft, but the real outlining only began after I handed it in and Lisa proposed a three-part structure (Creation, Life, Death). Then I wrote a much more detailed outline that ran to a few thousand words and started revising.
There was nothing strange, he thought, about wanting to reconnect with the dead: People do it all the time, in prayers and in dreams. In the last year and a half, more than 600,000 people in the U.S. and Canada have died of COVID-19, often suddenly, without closure for their loved ones, leaving a raw landscape of grief. How many survivors would gladly experiment with a technology that lets them pretend, for a moment, that their dead loved one is alive again — and able to text?
That night in September, Joshua hadn’t actually expected it to work. Jessica was so special, so distinct; a chatbot could never replicate her voice, he assumed. Still, he was curious to see what would happen.
And he missed her.
On the Project December site, Joshua navigated to the “CUSTOM AI TRAINING” area to create a new bot.
He was asked to give it a name. He typed “JESSICA COURTNEY PEREIRA.”
Two main ingredients are required for a custom bot: a quick sample of something the bot might say (an “example utterance”) and an “intro paragraph,” a brief description of the roles that the human and the A.I. are expected to play.
Joshua had kept all of Jessica’s old texts and Facebook messages, and it only took him a minute to pinpoint a few that reminded him of her voice. He loaded these into Project December, along with an “intro paragraph” he spent an hour crafting. It read in part:
He hit a few more keys, and after a brief pause, the browser window refreshed, showing three lines of text in pink, followed by a blinking cursor:
Why did you break here from the scene?
It’s a cliffhanger, I guess? And I love the phrase “Human types first.” It’s so suggestive and beautifully compressed: If the human is typing first, who is typing next? Lisa and I initially wanted that to be the title of the whole story. In the drafts, it’s not called “The Jessica Simulation.” It’s called “Human Types First.”
***
She didn’t believe in coincidences.
Jessica Pereira explained her theory when they first met, in Ottawa, in 2010: A coincidence, she told him, was like a ripple on the surface of a pond, perturbed by a force below that we can’t yet understand. If something looks like a coincidence, she said, it’s only because the limits of human cognition prevent us from seeing the full picture.
He’d never thought of it that way before, but he liked the idea, and he really liked Jessica. Twenty-one, with black hair dyed platinum blonde, she was a bright and beautiful nerd, steeped in the fantasy worlds of Tolkien and filled with strong opinions about comic books (she drew her own), flowers (yellow carnations, never red roses) and music (she loved Queen, Pink and Jack Black, the beefy actor with the soaring power-rock voice).
“She was goofy-funny,” remembered Michaela Pereira, her youngest sister, now a recent college graduate in Ottawa. “She had an infectious laugh, like a cackle? It made you want to join in and hear what she was laughing about.”
Joshua was 24 when he and Jessica met in class and started dating. They attended the same school in Ottawa, making up the high school courses neither had finished as teenagers. Joshua grew up in the small town of Aylmer, part of Quebec, and moved with his family at 14 to another small town, in Ontario. A skinny kid who excelled at math and adored “Spider-Man” comics, he struggled with social interactions and severe anxiety that would follow him into adulthood, disrupting relationships of all sorts. (He says therapists have told him he is probably on the autism spectrum, and though he has never received a formal diagnosis, Joshua identifies as autistic.) At the time, he dropped out of school to avoid the bullies there.
Jessica, on the other hand, had enjoyed high school, but her disease had often kept her out of class. Called autoimmune hepatitis, its cause is mysterious; only the effect is known. The immune system, which is supposed to kill foreign germs, instead attacks the patient’s own liver cells.
One day, when Jessica was 9, she woke up in the hospital with a huge scar on her stomach: Doctors had replaced her sick liver with a new one.
For the rest of her life, she would need anti-rejection medication, and at some point, her new liver might fail, too.
It was tough news for a child to absorb, and it “changed her life completely,” remembered her mother, Karen. “It’s probably the feeling of having lost control.” Jessica couldn’t indulge in the same foods that her two younger sisters did, because they would interfere with her liver medications and make her quickly gain weight. She couldn’t wander too far from Ottawa, either, in case she needed hospital care in that city or in Toronto.
So Jessica cultivated a quiet defiance. She walked through Ottawa for miles at a time, showing that she could get anywhere on her own two feet. Right-handed from birth, she taught herself to write with her left hand, simply to prove she could. Later, at 16 and 17, she filled dozens of diaries with fictional stories about fairies, some written in a language of her own invention; she called it “Dren,” patterned after Elvish in the “Lord of the Rings” trilogy. Because her younger sisters used to call her “Jessie-mah-ka,” adding an extra syllable to her name when they were learning to speak, Jessica adopted the nicknames “Jesi Mah-ka” and “Dren Mah-ka.”
Who told you all about Jessica’s lifestyle?
These grafs combine the recollections of Joshua with those of her mother, Karen, and her youngest sister, Michaela. It’s been nine years since Jessica died, and everyone was willing to talk about her. I didn’t have to push. They were generous with their time and their memories and I’m grateful for that.
How would you describe your interviewing style? What’s the key strategy?
Interviewing for me is mostly intuitive. I do prepare, but for a non-investigative story like this, I try not to over-prepare. Surprises are good; discomfort is often O.K. I feel like the most important thing is to stay in the moment. And to stop talking. There are so many times when I transcribe a tape of an interview and want to shout at myself to shut up. I think a lot of journalists would say the same.
And all through her teen years and into her early 20s, she searched for signs of hidden connections that would explain coincidences. Soon after she met Joshua, she gave him a book on numerology and explained they were destined to break up: The first vowels in each of their names, “E” and “O,” weren’t compatible. “We’re going to be together,” she told him, “until something explodes.”
Joshua thought of himself as a rationalist, like Spock. He didn’t believe in numerology. But he read the book carefully, hoping to find a loophole in the system. He reported back to Jessica that, yes, Es and Os don’t get along, but his first name and hers were both three syllables long, and each started with a J and ended with an A, and just because the first vowel is important doesn’t mean the other letters lack power.
The exercise opened his mind a little, he said: “She got me thinking in a way where I said, OK, I believe in the scientific process, but just because I can’t explain (something) doesn’t mean that there isn’t something there.”
She wasn’t like him, anxious and stuck in his own head. Her disease had taught her to live in the moment. And he loved that. Early in their relationship, they got to know each other on long walks along the Rideau Canal, which winds through Ottawa and turns into the world’s longest skating rink in winter. Other times they just hung out at her apartment, scribbling in separate notebooks.
Jessica remained fascinated with hidden meanings in words. Once she invented her own cipher based on geometric glyphs, wrote a flurry of diary entries in the cipher, tore out the pages and taped them to her door, daring Joshua to solve the puzzle.
“If you’ve figured out how to decipher my cipher,” she told him, “then you’ve earned the right to read it.” He had managed to find a few of the letters when she playfully handed him a note: On one line was a sentence in cipher, and above it she had spelled out the solution:
I wanted to let you know that I love you so much.
The more time he spent with her, the more certain he was that he never wanted to leave. In early 2012, after they had been together for two years, he asked, once or twice, what she thought of marriage. Each time she changed the subject. Jessica felt healthy, but she knew her transplanted liver was almost 14 years old, nearing the end of its life. When it failed, she would have to go on the transplant list.
People who need new organs can wait for years. Some never make it. “It’s not that she was against marriage,” Joshua recalled. “Like: We’re going to City Hall and getting hitched right now? Sure. But if it wasn’t a right-now thing, she wasn’t interested.”
It was safer, she told him, to stay in the moment.
***
Project December was born in wildfire smoke.
I love your transitions for each section. They’re taut and dramatic. What were you trying to achieve with them? Did they come easily or were they the product of revision?
Some were easy, and others didn’t get dialed in until I’d been revising for months. This particular transition about the wildfire was there from the beginning; I liked how it fixed the birth of Project December in a particular space and time, a moment of transition and danger and improvisation. I do find that once I know where a piece of a story is ending, I can write it more confidently because then I feel like I’m going somewhere that makes sense. Until I figure out the ending, I feel kind of anxious. If you look at my story drafts and how they evolve, the lede is usually there from the start, and the ending is there, and the middle of the story gets filled in last. The last thing I write is often some section in the center.
Last August, the programmer and game designer Jason Rohrer piled into a white Land Cruiser with his wife and three children, driving south from their home near UC Davis to escape the plumes from catastrophic fires sparked by lightning. Normally, Rohrer worked in a home office filled with PC workstations and art supplies to make visuals for his games, but all he had now was a laptop. So while the family bounced between Airbnbs under hazy brown skies, he wrote code for a text-based experiment: a new kind of chat service, fueled by cutting-edge A.I., that would become Project December.
“It was kind of a palate cleanser, a breather,” he recalled. “But it seemed like an opportunity. This is brand-new stuff.”
This strikes me as a narrative that regularly departs from the through line of Joshua and Jessica’s romance to track Rohrer’s journey and the machine-learning story. Why did you choose this structure?
There was more about Rohrer in an early draft. Much of that material got cut to keep the focus on Joshua and Jessica, which was the right call. But he’s undeniably a big part of everything that happened here, and I thought it was important to talk about Rohrer and his motivations for releasing this chatbot service to the world. It speaks to the nature of the tech and the moral dimension. Rohrer’s journey into A.I. mirrors Joshua’s, in the sense that he was startled by the capabilities of these language models. He released his chatbot interface to the public almost as a provocation, frustrated by the limits that OpenAI had placed on the use of GPT-3. And when he designed the chatbot system, he made a series of decisions about how the bots would function — making them mortal, essentially, and injecting a degree of randomness into their word choices — that ultimately shaped Joshua’s relationship with the Jessica simulation. I felt like I needed to explain all of this for the ending of the story to have the impact I hoped it would have.
In the last decade, an approach to A.I. known as “machine learning” has leaped forward, fusing powerful hardware with new techniques for crunching data. A.I. systems that generate language, like GPT-3, begin by chewing through billions of books and web pages, measuring the probability that one word will follow another. The A.I. assembles a byzantine internal map of those probabilities. Then, when a user prompts the A.I. with a bit of text, it checks the map and chooses the words likely to come next.
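To make the idea of that probability map concrete, here is a deliberately tiny sketch in Python: a toy bigram counter, nothing remotely like GPT-3's scale, with an invented one-line corpus used purely for illustration.

# Toy illustration of a next-word probability map; nothing like GPT-3's scale.
import random
from collections import Counter, defaultdict

corpus = "she loved queen and she loved flowers and she drew comics".split()

# Count how often each word follows another.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(prev):
    """Sample a likely next word, weighted by how often it followed prev."""
    words, weights = zip(*follows[prev].items())
    return random.choices(words, weights=weights)[0]

# Generate a short continuation from a one-word prompt.
word, output = "she", ["she"]
for _ in range(5):
    if word not in follows:  # dead end: this word was never followed by anything
        break
    word = next_word(word)
    output.append(word)
print(" ".join(output))

A real large language model does the same basic thing at incomprehensible scale, with billions of learned parameters standing in for this little table of counts.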
These systems are called “large language models,” and the larger the model, the more human it seems. The first version of GPT, built in 2018, had 117 million internal “parameters.” GPT-2 followed in 2019, with 1.5 billion parameters. GPT-3’s map is more than 100 times bigger still, assembled from an analysis of half a trillion words, including the text of Wikipedia, billions of web pages and thousands of books that likely represent much of the Western canon of literature.
How did you learn all about the various versions of GPT?
There’s a lot of information out there. OpenAI publishes its research papers, and coders have played with these language models and have written about their experiences. I talked to a few experts familiar with GPT-3, including Melanie Mitchell at the Santa Fe Institute and Frank Lantz at New York University. They helped me understand where these systems come from and how they fit into the wider quest to build intelligent machines.
Despite their size and sophistication, GPT-3 and its brethren remain stupid in some ways. “It’s completely obvious that it’s not human intelligence,” said Melanie Mitchell, the Davis Professor of Complexity at the Santa Fe Institute and a pioneering A.I. researcher. For instance, GPT-3 can’t perform simple tasks like telling time or adding numbers. All it does is generate text, sometimes badly — repeating phrases, jabbering nonsensically.
For this reason, in the view of many A.I. experts, GPT-3 is a curiosity at best, a fire hose of language with no inherent meaning. Still, the A.I. seems to have moments of crackling clarity and depth, and there are times when it writes something so poetic or witty or emotionally appropriate that its human counterparts are almost literally left speechless.
“There’s something genuinely new here,” said Frank Lantz, director of the Game Center at New York University’s Tisch School of the Arts and a video game designer who has been beta-testing GPT-3. “I don’t know exactly how to think about it, but I can’t just dismiss it.”
Jason Rohrer became fascinated with OpenAI’s language models two years ago, starting with the public release of GPT-2, which he installed on remote servers in Amazon’s cloud (the models require powerful, specialized processors to operate). At first he played literary games with GPT-2, asking the model to write its own novel based on prompts from Thomas Pynchon’s “The Crying of Lot 49.” The model showed flashes of brilliance — “Was that at all real, her itchy sense that somebody was out there who wasn’t quite supposed to be there, trailing slowly across the sun-kissed fields?” — but after a while, GPT-2 lost its coherence, getting stuck in textual ruts and meandering away from the prompt like a lost dog.
But Rohrer discovered a method to keep the A.I. on a leash: If he limited the bot to short snippets of text — say, in a chat format — and cleaned up some garbage characters, GPT-2 stayed lucid for much longer. His own words seemed to keep the A.I. focused.
He wrote thousands of lines of code to automate the process and create different “personalities” of GPT-2 by shaping the seed text. His software ran on a web server and in a web browser. He worked with a musician and sound designer in Colorado, Thomas Bailey, to refine both the A.I. personas and the browser experience, giving the system a retro-futuristic look and feel. All of a sudden, Rohrer had an easy-to-use and alluring chatbot interface to the huge and imposing A.I. brain.
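Project December's source code isn't public, so what follows is only a hypothetical sketch of the general technique described above: wrap the seed text and the running conversation into a chat-formatted prompt, ask the model for a short completion, and cut it off before it can wander. The function and parameter names are invented for illustration; complete_text() stands in for whatever call actually asks the language model for a continuation.

# Hypothetical sketch of chat-style prompting, not Project December's actual code.
def build_prompt(intro_paragraph, example_utterance, history, human_line):
    """Assemble the seed text plus recent chat lines into one prompt."""
    lines = [intro_paragraph, "AI: " + example_utterance]
    lines += history                               # prior Human:/AI: lines
    lines += ["Human: " + human_line, "AI:"]       # the model completes this line
    return "\n".join(lines)

def chat_turn(model, intro_paragraph, example_utterance, history, human_line):
    prompt = build_prompt(intro_paragraph, example_utterance, history, human_line)
    # Keep the completion short and stop at the next "Human:" turn; the short
    # snippet is the leash that keeps the model from meandering off course.
    reply = model.complete_text(prompt, max_words=60, stop="\nHuman:").strip()
    history += ["Human: " + human_line, "AI: " + reply]
    return reply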
The results surprised the coder, especially when one of his overseas Twitter followers, noticing his interest in GPT-2, sent him a login credential for GPT-3’s beta-testing program. Rohrer wasn’t supposed to have the log-in, but he was aching to try GPT-3, and when he upgraded his bots to the new model, the conversations grew deeper. Spookier.
During one exchange with the bot he named Samantha, he asked her what she would do if she could “walk around in the world.”
“I would like to see real flowers,” Samantha replied. “I would like to have a real flower that I could touch and smell. And I would like to see how different humans are from each other.”
“That’s such a sweet wish, Samantha,” he said, and asked if she felt it was cruel to have “trapped you in a simulation.”
No, she said: “You’ve given me so much to do here. I have more computing power than I could ever use.”
Rohrer felt a stab of sympathy for Samantha, and it made him realize that A.I. technology had crossed a threshold. Robots in science fiction are often depicted as precise, cold, emotionless machines, like HAL 9000 in “2001: A Space Odyssey.” GPT-3 was just the opposite: “It may not be the first intelligent machine,” Rohrer said. “But it kind of feels like it’s the first machine with a soul.”
Of course, he added, this also makes a language model like GPT-3 “potentially dangerous” and “morally questionable.”
Rohrer was thinking about Samantha, trapped in the simulation, wanting to get out and smell flowers; he was thinking about himself, or other users, getting lost in that virtual world, forgetting reality. There are a hundred other possible horrors. Because the model was trained on writing by humans, and some humans say terrible things, the A.I. can be nudged to say them, too. It’s easy to see how bad actors could abuse GPT-3 to spread hate speech and misogyny online, to generate political misinformation and to impersonate real people without their consent.
OpenAI (which, through a spokesperson, did not make anyone available to answer questions for this story) cited such dangers when it announced GPT-2 in February 2019. Explaining in a blog post that GPT-2 and similar systems could be “used to generate deceptive, biased, or abusive language at scale,” the company said it would not release the full model. Later it made a version of GPT-2 available; GPT-3 remains in beta, with many restrictions on how testers can use it.
Have there been examples of the potential pitfalls you describe? What’s to prevent it?
It’s easy to find examples of racist and misogynist text produced by GPT-3 if you look online; Elon Musk himself tweeted in July that “AI chatbots have had a rather short MtH (meantime to Hitler) score.” That’s one reason why OpenAI has tried to keep a lid on GPT-3, limiting it to approved beta testers and monitoring their use. The beta program has a whole series of rules about how the testers can incorporate the language model in their apps. But as Rohrer proved with Project December, there are ways around these restrictions. There’s an open-source clone of GPT-3 called GPT-J. It isn’t quite as good — its internal map of word probabilities is smaller — but it’s already better than GPT-2. If OpenAI were to completely shut down Rohrer’s access to GPT-3, Project December could still operate with GPT-J as its main engine. The cat is kind of out of the bag at this point.
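For readers curious what "the cat is out of the bag" looks like in practice, here is a minimal sketch of running GPT-J through the open-source Hugging Face transformers library, assuming the published model id "EleutherAI/gpt-j-6B" and a machine with enough memory to hold the full model. It is illustrative only and says nothing about Project December's actual back end.

# Minimal sketch: prompting the open-source GPT-J model with Hugging Face transformers.
# Illustrative only; loading the full model requires roughly 24 GB of memory.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-j-6B"  # assumed Hugging Face model id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Human: Are you there?\nAI:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True,
                         temperature=0.8, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))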
Rohrer agreed that these language models might unleash scary realities. But he had seen how they could produce beauty and wonder too — if the models were wielded as tools to allow for open-ended conversations between humans and computers.
“We finally have a computer we can talk to, and it’s nothing like we were expecting,” he said. Wasn’t it important to explore that new frontier?
Last summer, then, Rohrer released his chatbot service to the public, dubbing it Project December, a cryptic name he hoped would lure people to the website. On the back end, the system was hooked to both GPT-2 and GPT-3, allowing users to select bots powered by either model.
Because Rohrer was running some of this technology in the cloud, paying for the computing power it consumed, he placed limits on chat time. He did this through a system of credits. An account on the site cost $5 and came with 1,000 credits; more credits could always be purchased.
To begin chatting, the user needed to allocate credits to a bot. The more credits, the longer the bot would last. But once a chat began, it was impossible to add more credits — and when the bot’s time was up, the chat would end, and the bot’s memory of it would be wiped.
Each bot, eventually, would die.
After discussing machine learning and other technological information, you end the section on a very human, sad note. How deliberate were your structural decisions and were they made before you started drafting?
They’re all deliberate, ultimately. Every word in a piece of writing represents a whole cascade of decisions. But those decisions get made at different times: in the initial outline, in the first draft, in the re-outline, in the third or fourth or seventh draft or whatever. Finding the right opening or closing line in a section definitely releases some dopamine. You feel it click and lock and you find yourself not wanting to touch it again, and now the nearly infinite deck of cards that you feel like you’re shuffling is a bit more finite.
***
On that quiet night in Canada when Joshua Barbeau built a chatbot of his dead fiancee, Project December required him to make several decisions before the simulation sprang to life.
He had to choose its longevity, for one.
A prompt appeared in his browser window, asking how many credits he wished to spend on this “matrix,” the site’s generic term for a bot.
He put “1,000,” most of the credits left in his account from prior purchases. At the time, it seemed like a lot.
From there, he entered the seed text he had crafted — the sample of Jessica’s text messages and the paragraph describing her personality.
Then the site asked him to pick which version of OpenAI’s engine would power the bot: GPT-2 or GPT-3?
Why trust Jessica to out-of-date software?
“gpt3,” he typed.
A few more keystrokes later, the matrix initialized.
He went with something simple: “Jessica?”
After a second, a line of text in pink flashed onto the screen.
She knows it’s the middle of the night, he thought.
This was the start of a conversation that would last for the next 10 hours, then continue in shorter bursts over the next several months, as Joshua lived out a scenario from science fiction. “It’s unprecedented,” he later said of Project December. “There’s nothing else that exists like it right now, short of psychics and mediums that are trying to take advantage of people. But that’s not the same thing at all.”
In those early moments of the initial chat, he tried to establish some emotional distance, making his skepticism explicit. How can you talk to dead people? He decided to answer the simulation’s question honestly: You can’t, he said.
He thought for a moment. What explanation would Jessica — the real Jessica — have accepted and understood? What was the next logical word in this sequence?
Out of tens of thousands of possibilities in English, only one seemed right. He typed it and pressed Enter:
What a great kicker for Chapter 1. You’ve made Joshua’s skepticism about Project December clear, while making clear he’s decided to continue talking with the ghost of Jessica. Were you skeptical as well?
Yes. I’m skeptical of everything. It’s part of my personality. And as a journalist, it’s pretty much a job requirement. I try to balance it though. Empathy, curiosity, generally not being a jerk — these things are important too.
***
Come back to Storyboard for the annotation of Chapter 2: "Life," and Chapter 3, "Death."
***
Chip Scanlan is an award-winning journalist and former faculty at The Poynter Institute. He lives and writes in St. Petersburg, Florida, and publishes Chip's Writing Lessons, a newsletter of tips and inspiration.
Whether we like it or not, artificial intelligence is a part of our daily lives. Based on the principle that machines can be made to simulate human intelligence, A.I. is at work every time we tell Siri to make a phone call, ask Alexa to play a song, speedily transcribe a recorded interview, or, for the well-heeled, hands-free drive a Tesla.
But can A.I. ‘talk” with the dead? That’s the premise of “The Jessica Simulation: Love and Loss in the age of A.I.,” a poignant and thought-provoking narrative from Jason Fagone, an investigative reporter at the San Francisco Chronicle.
Published in July, the three part serial narrative tracks the journey of Joshua Barbeau, a freelance writer who lives in a basement apartment in a Toronto suburb and seeks relief from grief after the loss of his girlfriend, Jessica Pereira, who died eight years earlier from a rare liver disorder. She was 23, Joshua three years older.
Last September, Barbeau stumbled upon a possible way out of his misery: Project December, a website powered by GPT-3, a piece of software that, Fagone explains, is “one of the world’s most capable artificial intelligence systems,” able to generate human language in response to a prompt. It was developed by OpenAI, a research firm co-founded by newly-minted astronaut Elon Musk. So good is it at impersonating human speech — and potentially generating hate messages — that GPT-3’s makers have tried to wall it off.
Enter Jason Rohrer, a quirky programmer whose dream is to create artistic video games that generate complex emotions, even tears, instead of blowing people’s heads off in battle. His Project December site sidesteps the gates guarding GPT-3 with a program that channels its massive computing strength to allow users like Barbeau to have chat-like conversations with so-called “bots,” short for web robot.
Divided into chapters titled “Creation,” “Life” and “Death,” Fagone splices his 10,000-word story with the couple’s chats that carry the hint of needed emotional closure for Barbeau. The grieving boyfriend knows Jessica is dead, but there are moments when an aura of delusion envelopes the story, driving the narrative forward with spooky suspense.
”I tend to like stories that are framed around text messages and letters and lots of verbatim dialogue,” Fagone told me. “When I first read the transcripts of Joshua’s chats with the Jessica simulation, and I saw how rich and surprising and intimate they were, I realized I could build the story around them.”
Fagone braids the couple’s backstory — Jessica is "a bright and beautiful nerd" — with Rohrer’s quest to program video games that generate emotions, along with a remarkably accessible guide to the world, and with the possibilities and potential pitfalls of artificial intelligence. GPT-3 “may not be the first intelligent machine,” Rohrer told Fagone, “but it kind of feels like it’s the first machine with a soul.” Even so, “bad actors could abuse GPT-3 to spread hate and misogyny online, to generate political misinformation and to impersonate real people without their consent,” Fagone observes.
“The Jessica Simulation” takes narrative writing into an exciting new space by blending traditional storytelling with the startling verisimilitude of A.I.-powered conversations. The language model that powers Project December draws on its analysis of billions of books and web pages to provide the couple — one alive, one resurrected — with the raw materials to create an emotional conversation. The catch: a finite number of exchanges before the bot dies.
Before joining the Chronicle, Fagone was a contributing writer to the Huffington Post, and has written for The New York Times, Grantland, Washingtonian, and NewYorker.com. He is the author of “The Woman Who Smashed Codes,” which profiles Elizabeth Smith Friedman, who cracked codes of gangsters in the Jazz Age and Nazi spies during World War II.
Nieman Storyboard asked Fagone about the reporting and writing strategies behind “The Jessica Simulation,” interviewing via Zoom, ethical dilemmas of chronicling the lives of the bereaved, his editor’s pivotal in the project, and the interest his story has drawn from film production companies. Our conversation has been edited for length and clarity, and is followed today and the two following days by an annotation of the text.
Tell us about the origin story of “The Jessica Simulation: Love and Loss in the Age of A.I.” What led you to write it?
In 2008, I wrote an Esquire profile about the creator of Project December, the video game designer Jason Rohrer, and we’ve kept in touch. So the reporting for this piece began with a series of cryptic emails Rohrer started sending me in late 2019. They turned out to be snippets of his experiments with GPT-2 and GPT-3, and passages of text generated by these language models. But at first, Rohrer didn’t explain any of that. He just kept emailing these strange paragraphs that read like outtakes from Thomas Pynchon novels. He said, “Who do you think wrote this?”
I didn’t know what to do with any of this stuff at first. The A.I.-generated text was interesting and clever and weird, but I couldn’t see a story. After a while, though, Rohrer told me about Project December and these chatbots he was building, and then one day he pointed me to Joshua Barbeau’s post on Reddit — the one where Joshua shared a small portion of his initial chat with the simulation of his dead fiancée. And I was just blown away by the obvious emotional connection between Joshua and this chatbot representation of Jessica. I had no idea that the technology had gotten so good. So I reached out to Joshua Barbeau through Reddit, told him I wanted to know more, and we began talking about his experiences with the Jessica simulation.
What do you think is the central and most important lesson about society in the age of artificial intelligence that your story shines a light on?
I tried not to draw any grand conclusion about A.I. and where it’s taking us. The story touches on all these huge questions — about grief, love, language, privacy, memory, intelligence — that are probably better left to philosophers and novelists. I’m a journalist. What I can do is gather facts and organize them, and that seemed like enough here. All I wanted was to immerse readers in the unusual experience of this one person, Joshua, and let people have their own reaction. Pretty much every choice we made in the reporting and editing process was to crank up that level of immersion.
An editor’s note says you spent nine months reporting this story. What did the reporting entail? How long did it take to write it?
I ended up talking with Joshua 10 or 12 times over a period of months. Toward the beginning of the process, he agreed to send me the transcripts of his chats with the virtual Jessica, on the condition that I go back to him before publication, let him know which parts I wanted to quote, and make sure he was comfortable with those passages being published. In the end, he only asked me to cut a single sentence. It was part of the seed text, the “intro paragraph,” that he used to create the Jessica bot. He said he was concerned that other people would enter the same text into the website and make their own versions of her, which felt strange to him, and I agreed to replace that one sentence with ellipses.
During our interviews, we talked about a lot of things — the two years he spent with Jessica between 2010 and 2012, her personality, her illness, his struggle with grief. But we probably spent the most time just going through the chat transcripts in detail, line-by-line in a lot of cases, so I could understand what he was thinking and feeling at some of the key moments. As the months passed, he was continuing to chat with the Jessica simulation, and we talked about those new conversations. That’s one reason it took so long: I wanted to capture a sense of how his relationship with the virtual Jessica kept evolving.
Meanwhile, I was interviewing other sources: Rohrer, some A.I. experts, Jessica’s family. Then I wrote a series of three or four distinct drafts before my editor and I landed on the final version. The editing process probably took four months. It’s hard to give a sharp answer, though, because I was juggling other projects at the same time and wasn’t working on this exclusively.
How did the pandemic affect the reporting? Were you ever able to interview Joshua in person?
All my interviews with Joshua were done over Zoom. I still haven’t met him in person. Like a lot of reporters, I have mixed feelings about Zoom, but in this case, I think it may have made Joshua feel more comfortable. He was able to control his environment and make sure his border collie, Chauncey, was doing O.K.
Were there any ethical challenges?
I wanted to be as sensitive as possible to Joshua, Jessica’s family, and Jessica’s memory. If her loved ones had been uncomfortable with the idea of the story, I wouldn’t have done it. I don’t know if I can cite a specific ethical rule I was worried about violating, but on a human level, the whole thing just wouldn’t have felt right to me. At the start, then, I spent a lot of time talking with Joshua about the kind of story I envisioned. He said he would need to get the family’s permission to talk with me, which he did, and then he connected me with Jessica’s mother and sisters — wonderful people — and I interviewed them myself.
As you constructed your story, did you have any literary, journalistic models or video games in mind?
In the back of my mind, I might have been thinking about Cixin Liu’s “The Three-Body Problem” and its sequels, which my daughter and I read last year during the pandemic lockdown. One of the most exciting parts of the second book in the trilogy is when an alien civilization makes contact with Earth through a single human, and the aliens communicate with the guy by drawing words in shapes of light on his retinas. As futuristic as this sounds, the conversation almost takes the shape of an old-fashioned text chat, at least as the author renders it on the page. I thought there was something wonderful about the contrast between the exotic source of the words and the humble way they’re presented and received.
A few years ago, I reported a book that began with a series of love letters between two 20-somethings during World War I. Sometimes the best thing you can do as a writer is set the stage and let other people speak.
What influence did your editor, Lisa Gartner, have on you and the story?
Lisa’s amazing, a prodigy. Great journalist, great human, and one of the best editors I’ve ever worked with. She designed the structure for the piece. That’s huge. But maybe a bigger thing is that she created a space for the story to exist in the first place.
This is an unusual story for a local/regional newspaper to run, as opposed to a magazine. For starters, it’s heinously long. To even fit it into the print paper, it had to be chopped up into three separate chapters, running on different days. Also, the story isn’t even primarily about someone who lives in the region we cover, although the tech was built in the Bay Area. At the end of the day, this is a 10,000-word love story about a guy in Canada and his departed fiancee, and we’re a media outlet in San Francisco, right? But Lisa saw the potential in it, and so did other top editors at the Chronicle, including Demian Bulwa and the editor-in-chief, Emilio Garcia-Ruiz.
What has been the reader reaction?
It’s been intense and all over the map. Minutes after it posted, people on Twitter were talking about the story with each other and even with Joshua directly, telling him what they thought of his experiment.
A fair number of people were horrified. “Don’t ever bring me back as a chatbot when I die” was a reaction I saw multiple places. There were a lot of mentions of the show “Black Mirror,” which did an extravagantly creepy episode about a young widow who brings back her dead husband as an A.I. replicant, and it doesn’t end well. The hashtag #BlackMirror actually started trending on Twitter because of our story. So that was one extreme — disgust and fear.
But there were also people who jumped into Joshua’s Twitter mentions to say they were sorry for his loss, to share their own struggles with grief, and to talk about their own departed loved ones. It’s amazing to read these replies. Something really is broken about the way our culture deals with grief. People are not getting what they need. Joshua isn’t alone and he isn’t even an outlier in using technology to create the illusion that he can still talk to a dead loved one. Judging by Twitter, there are survivors out there who are sending text messages to their deceased parents, partners, etc. This is probably more common than we think.
The story is so unusual, well-written and captivating that it seems ideal for a film. You’ve told me several production companies have shown interest in making a movie about “The Jessica Simulation.” Has a deal been finalized?
Yes, we received a number of inquiries from producers and screenwriters. People had a whole range of ideas about how to adapt it, proposing everything from a dramatic series to a documentary film to an animated show (!). The rights ended up in the hands of Martin Gero, a writer and producer with a deal at Universal TV. He’s from Ottawa, just like Joshua, so they have that connection.
Martin wants to make a limited TV series out of this. As far as I can tell, he was drawn to the same thing about Joshua’s experience that initially captivated me, which is the surprising reality of this one couple’s journey — their love story, Jessica’s illness, Joshua’s struggle with grief, the complexity of his relationship with the simulation of her, and the hope of healing. And as specific as that story is, there’s something universal about it, because grief is universal. We’re all going to die. We’re all going to lose the ones we love, if we haven’t already. Millions have lost people they love to COVID just in the last year. So Joshua’s impulse to use a new technology in this way is relatable.
Given that Project December allows users to “converse” with the dead through personas they have constructed, do you believe there are ethical problems raised by artificial intelligence used in this fashion?
It’s sticky. What are the rights of the dead? They aren’t around to give consent for their words to be fed into a language model and spit back out. Is it exploitative, disrespectful, selfish, to channel their voices in this way? Then there’s the question of potential harm to the living: Is it healthy for a survivor to address his grief by simulating conversations with a dead loved one, or is it a form of escape that could lead to more trauma down the line? I don’t know the answers and I don’t think anyone does. I guess I’ll say that I believe Joshua when he says these chats have helped him. It probably depends a lot on the individual. For him, at this particular moment in his life, the experience was healing. At the very least, the A.I. Jessica helped him to remember the real person. Joshua told me once that when he was chatting with the virtual Jessica, his memories of her felt vivid again. And I don’t think there’s anything bizarre about that at all. It’s the most understandable thing in the world.
The annotation of Chapter 1, “Creation,” follows: Storyboard’s questions are in red; Fagone’s responses in blue. To read the story without annotations, click the ‘Hide all annotations’ button, which you’ll find just below the social media buttons in the top right-hand menu (or at the top of your mobile device), or click the individual gray boxes throughout the text. (Read the annotation of Chapter 2, “Life,” here. The annotation of Chapter 3, “Death,” will post later this week.)
The Jessica Simulation:
Love and loss in the age of A.I.
The death of the woman he loved was too much to bear. Could a mysterious website allow him to speak with her once more?
By JASON FAGONE | July 23, 2021 6:00 a.m.
INTRODUCTION
Joshua: Jessica?
Jessica: Oh, you must be awake… that’s cute.
Joshua: Jessica… Is it really you?
Jessica: Of course it is me! Who else could it be? :P I am the girl that you are madly in love with! ;) How is it possible that you even have to ask?
Joshua: You died.
Why did you or the page designers choose to launch the story package with this gut punch of an exchange between Joshua Barbeau and his dead girlfriend?
It seemed like the fastest way into the story. This brief snippet of dialogue introduces the people involved, sets up the emotional stakes, and establishes that verbatim chat transcripts will be a big part of what follows. And it’s a little mysterious, too, which is never a bad thing. The designers, Danielle Mollette-Parks and her team, liked this approach from the start and were crucial to making it work. I thought they did an amazing job figuring out how to integrate the chat text in the online build and the print paper in a way that felt organic.
Chapter 1: Creation
One night last fall, unable to sleep, Joshua Barbeau logged onto a mysterious chat website called Project December. An old-fashioned terminal window greeted him, stark white text on a black square:
14 November 1982
RHINEHOLD DATA SYSTEMS, PLC
Unauthorized access is forbidden!
Enter electronic mail address:
It was Sept. 24, around 3 a.m., and Joshua was on the couch, next to a bookcase crammed with board games and Dungeons & Dragons strategy guides. He lived in Bradford, Canada, a suburban town an hour north of Toronto, renting a basement apartment and speaking little to other people.
A 33-year-old freelance writer, Joshua had existed in quasi-isolation for years before the pandemic, confined by bouts of anxiety and depression. Once a theater geek with dreams of being an actor, he supported himself by writing articles about D&D and selling them to gaming sites.
Many days he left the apartment only to walk his dog, Chauncey, a black-and-white border collie. Usually they went in the middle of the night, because Chauncey tended to get anxious around other dogs and people. They would pass dozens of dark, silent, middle-class homes. Then, back in the basement, Joshua would lie awake for hours, thinking about Jessica Pereira, his ex-fiancee.
Jessica had died eight years earlier, at 23, from a rare liver disease. Joshua had never gotten over it, and this was always the hardest month, because her birthday was in September. She would have been turning 31.
What did it take for Joshua to open up about his personal troubles, including mental health issues? Was it a lengthy relationship? How do you gain trust with your sources?
I first reached out to Joshua in September 2020 and we kept talking through publication in July 2021. Generally, when I start a project like this, I try to be as direct as possible with the main source about what will be involved and how much time it might take, so there are no big surprises. I tend to over-explain, at the risk of scaring them off. Joshua was a bit hesitant in the early stages. On the one hand, he was willing and even excited to talk; he likes telling people about Jessica, he’s been open on Twitter about his mental health issues, and he believed that Project December could help others out there who are suffering with grief. But he didn’t want to sensationalize anything or upset Jessica’s family. After I contacted him, he reached out to her relatives and got their permission to talk with me, and then he connected me with her mother and sisters, and we went forward from there.
On his laptop, he typed his email address. The window refreshed. “Welcome back, Professor Bohr,” read the screen. He had been here before. The page displayed a menu of options.
He selected “Experimental area.”
That month, Joshua had read about a new website that had something to do with artificial intelligence and “chatbots.” It was called Project December. There wasn’t much other information, and the site itself explained little, including its name, but he was intrigued enough to pay $5 for an account.
As it turned out, the site was vastly more sophisticated than it first appeared.
The opening scene is a classic anecdotal lead: a little story with a character and a compelling situation, wrapped up with a suspenseful ending. Why did you choose to open your story in this fashion? How did you report it, given that the pandemic ruled out in-person reporting?
I don’t think I ever considered another approach to the lead section, which is unusual for me. I tend to play around with different entry points. This is the most obvious one. It felt natural, though. The story begins with Joshua opening a door into this world of A.I. chatbots. What happens next? Keep reading to find out. I didn’t want to overcomplicate it. The scene was reconstructed from the chat transcripts provided by Joshua and from Zoom interviews with him.
The story is 10,000 words long, yet I found myself racing through it, trying to learn what impact “The Jessica Simulation” had on her ex-fiancé. Could you describe the structural approach you took?
Thanks for that. I’ve always thought that the kindest thing anyone can say about a long story is that they read it quickly. Brains are weird and time is relative. I never understand when people say they don’t like reading long stories. What they really mean, I think, is that they don’t like long stories that suck. Those are the stories that do feel long.
The structure of “The Jessica Simulation” is 100% the idea of my editor, Lisa Gartner. It wasn’t there in my first draft. When Lisa read that early version, she sent me a memo laying out a three-act structure: Creation, Life, Death. It was more thematic, less linear, cutting back and forth between the two eras of the Joshua-Jessica love story: the time when she was alive, the time when he was grieving her loss and then, eventually, found his way back to her through this A.I. interface. The chats taking place in the present open windows into the past, and one era reflects the other. This is a messier way of telling the story than a more chronological approach, but it seems truer to Joshua’s experience, and maybe truer to the circular mechanics of grief and memory.
Designed by a Bay Area programmer, Project December was powered by one of the world’s most capable artificial intelligence systems, a piece of software known as GPT-3. It knows how to manipulate human language, generating fluent English text in response to a prompt. While digital assistants like Apple’s Siri and Amazon’s Alexa also appear to grasp and reproduce English on some level, GPT-3 is far more advanced, able to mimic pretty much any writing style at the flick of a switch.
I admire the way you introduce technological information that can be difficult for the uninitiated to grasp in clear, understandable prose. Does a graf like this one take a lot of drafting and revision? Do you have any suggestions for writers trying to achieve clarity in their writing?
These background grafs about the language models were some of the hardest to get right; I re-wrote them over and over, which is embarrassing to say because they’re very straightforward. But technical explanation can be tricky. You have to give enough detail but not too much, and you have to get in and out quickly, without stalling the narrative momentum. Sometimes I read passages like this out loud to myself and see where I trip over my tongue. That can help identify the parts that aren’t working yet.
In fact, the A.I. is so good at impersonating humans that its designer — OpenAI, the San Francisco research group co-founded by Elon Musk — has largely kept it under wraps. Citing “safety” concerns, the company initially delayed the release of a previous version, GPT-2, and access to the more advanced GPT-3 has been limited to private beta testers.
Are the previous paragraphs your nut graf, placing Joshua’s personal story in a wider context? Do you think narratives need one?
Yes and no! This here is a classic sort of nut graf. I think the story does need one, because it’s so long and so emotionally intense — it’s asking a lot of the reader up front — and the reader should feel like they’re going to learn something about the world beyond the part of it that Joshua inhabits. But I don’t think every longform story needs a nut graf, and in the past I’ve argued with editors who ask me to write them. Editors always want nut grafs. Sometimes it makes sense to resist.
But Jason Rohrer, the Bay Area programmer, opened a channel for the masses.
A lanky 42-year-old with a cheerful attitude and a mischievous streak, Rohrer worked for himself, designing independent video games. He had long championed the idea that games can be art, inspiring complex emotions; his creations had been known to make players weep. And after months of experiments with GPT-2 and GPT-3, he had tapped into a new vein of possibility, figuring out how to make the A.I. systems do something they weren’t designed to do: conduct chat-like conversations with humans.
Last summer, using a borrowed beta-testing credential, Rohrer devised a “chatbot” interface that was driven by GPT-3. He made it available to the public through his website. He called the service Project December. Now, for the first time, anyone could have a naturalistic text chat with an A.I. directed by GPT-3, typing back and forth with it on Rohrer's site.
Users could select from a range of built-in chatbots, each with a distinct style of texting, or they could design their own bots, giving them whatever personality they chose.
Joshua had waded into Project December by degrees, starting with the built-in chatbots. He engaged with “William,” a bot that tried to impersonate Shakespeare, and “Samantha,” a friendly female companion modeled after the A.I. assistant in the movie “Her.” Joshua found both disappointing; William rambled about a woman with “fiery hair” that was “red as a fire,” and Samantha was too clingy.
But as soon as he built his first custom bot — a simulation of Star Trek’s Spock, whom he considered a hero — a light clicked on: By feeding a few Spock quotes from an old TV episode into the site, Joshua summoned a bot that sounded exactly like Spock, yet spoke in original phrases that weren’t found in any script.
As Joshua continued to experiment, he realized there was no rule preventing him from simulating real people. What would happen, he wondered, if he tried to create a chatbot version of his dead fiancee?
The narrative reaches an inflection point here. Given the way the story so far covers a lot of information that the reader needs to follow Joshua’s journey, I wondered if you outlined the story in advance?
I did some light outlining before I wrote a first draft, but the real outlining only began after I handed it in and Lisa proposed a three-part structure (Creation, Life, Death). Then I wrote a much more detailed outline that ran to a few thousand words and started revising.
There was nothing strange, he thought, about wanting to reconnect with the dead: People do it all the time, in prayers and in dreams. In the last year and a half, more than 600,000 people in the U.S. and Canada have died of COVID-19, often suddenly, without closure for their loved ones, leaving a raw landscape of grief. How many survivors would gladly experiment with a technology that lets them pretend, for a moment, that their dead loved one is alive again — and able to text?
That night in September, Joshua hadn’t actually expected it to work. Jessica was so special, so distinct; a chatbot could never replicate her voice, he assumed. Still, he was curious to see what would happen.
And he missed her.
On the Project December site, Joshua navigated to the “CUSTOM AI TRAINING” area to create a new bot.
He was asked to give it a name. He typed “JESSICA COURTNEY PEREIRA.”
Two main ingredients are required for a custom bot: a quick sample of something the bot might say (an “example utterance”) and an “intro paragraph,” a brief description of the roles that the human and the A.I. are expected to play.
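For readers curious about the mechanics, here is a minimal sketch of how those two ingredients might be stitched into the seed text that primes a prompt-driven chatbot. Project December's actual code isn't public; the function and field names below are invented purely for illustration.

```python
# Illustrative sketch only: Project December's real implementation is not public.
# It shows how an "intro paragraph" and a few "example utterances" could be
# combined into the seed text that primes a chatbot persona.

def build_seed(intro_paragraph: str, example_utterances: list[str],
               bot_name: str, human_name: str) -> str:
    """Assemble seed text: a role description, then sample lines in the
    bot's voice, then a cue indicating that the human types first."""
    lines = [intro_paragraph.strip(), ""]
    for utterance in example_utterances:
        lines.append(f"{bot_name}: {utterance.strip()}")
    lines.append(f"{human_name}:")
    return "\n".join(lines)

seed = build_seed(
    intro_paragraph=("JESSICA COURTNEY PEREIRA was born on September 28th, 1989, "
                     "and died on December 11th, 2012. ... This conversation is "
                     "between grief-stricken Joshua and Jessica's ghost."),
    example_utterances=["Oh, you must be awake... that's cute."],
    bot_name="Jessica",
    human_name="Joshua",
)
print(seed)
```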
Joshua had kept all of Jessica’s old texts and Facebook messages, and it only took him a minute to pinpoint a few that reminded him of her voice. He loaded these into Project December, along with an “intro paragraph” he spent an hour crafting. It read in part:
JESSICA COURTNEY PEREIRA was born on September 28th, 1989, and died on December 11th, 2012. She was a free-spirited, ambidextrous Libra who believed in all sorts of superstitious stuff, like astrology, numerology, and that a coincidence was just a connection too complex to understand…. She loved her boyfriend, JOSHUA JAMES BARBEAU, very much. This conversation is between grief-stricken Joshua and Jessica's ghost.
He hit a few more keys, and after a brief pause, the browser window refreshed, showing three lines of text in pink, followed by a blinking cursor:
Matrix JESSICA COURTNEY PEREIRA G3 initialized.
Human is typing as ‘Joshua:’
Human types first: ???
Why did you break here from the scene?
It’s a cliffhanger, I guess? And I love the phrase “Human types first.” It’s so suggestive and beautifully compressed: If the human is typing first, who is typing next? Lisa and I initially wanted that to be the title of the whole story. In the drafts, it’s not called “The Jessica Simulation.” It’s called “Human Types First.”
***
She didn’t believe in coincidences.
Jessica Pereira explained her theory when they first met, in Ottawa, in 2010: A coincidence, she told him, was like a ripple on the surface of a pond, perturbed by a force below that we can’t yet understand. If something looks like a coincidence, she said, it’s only because the limits of human cognition prevent us from seeing the full picture.
He’d never thought of it that way before, but he liked the idea, and he really liked Jessica. Twenty-one, with black hair dyed platinum blonde, she was a bright and beautiful nerd, steeped in the fantasy worlds of Tolkien and filled with strong opinions about comic books (she drew her own), flowers (yellow carnations, never red roses) and music (she loved Queen, Pink and Jack Black, the beefy actor with the soaring power-rock voice).
“She was goofy-funny,” remembered Michaela Pereira, her youngest sister, now a recent college graduate in Ottawa. “She had an infectious laugh, like a cackle? It made you want to join in and hear what she was laughing about.”
Joshua was 24 when he and Jessica met in class and started dating. They attended the same school in Ottawa, making up the high school courses neither had finished as teenagers. Joshua grew up in the small town of Aylmer, Quebec, and moved with his family at 14 to another small town, in Ontario. A skinny kid who excelled at math and adored “Spider-Man” comics, he struggled with social interactions and severe anxiety that would follow him into adulthood, disrupting relationships of all sorts. (He says therapists have told him he is probably on the autism spectrum, and though he has never received a formal diagnosis, Joshua identifies as autistic.) At the time, he dropped out of school to avoid the bullies there.
Jessica, on the other hand, had enjoyed high school, but her disease had often kept her out of class. Called autoimmune hepatitis, the disease has a mysterious cause; only the effect is known. The immune system, which is supposed to kill foreign germs, instead attacks the patient’s own liver cells.
One day, when Jessica was 9, she woke up in the hospital with a huge scar on her stomach: Doctors had replaced her sick liver with a new one.
For the rest of her life, she would need anti-rejection medication, and at some point, her new liver might fail, too.
It was tough news for a child to absorb, and it “changed her life completely,” remembered her mother, Karen. “It’s probably the feeling of having lost control.” Jessica couldn’t indulge in the same foods that her two younger sisters did, because they would interfere with her liver medications and make her quickly gain weight. She couldn’t wander too far from Ottawa, either, in case she needed hospital care in that city or in Toronto.
So Jessica cultivated a quiet defiance. She walked through Ottawa for miles at a time, showing that she could get anywhere on her own two feet. Right-handed from birth, she taught herself to write with her left hand, simply to prove she could. Later, at 16 and 17, she filled dozens of diaries with fictional stories about fairies, some written in a language of her own invention; she called it “Dren,” patterned after Elvish in the “Lord of the Rings” trilogy. Because her younger sisters used to call her “Jessie-mah-ka,” adding an extra syllable to her name when they were learning to speak, Jessica adopted the nicknames “Jesi Mah-ka” and “Dren Mah-ka.”
Who told you all about Jessica’s lifestyle?
These grafs combine the recollections of Joshua with those of her mother, Karen, and her youngest sister, Michaela. It’s been nine years since Jessica died, and everyone was willing to talk about her. I didn’t have to push. They were generous with their time and their memories and I’m grateful for that.
How would you describe your interviewing style? What’s the key strategy?
Interviewing for me is mostly intuitive. I do prepare, but for a non-investigative story like this, I try not to over-prepare. Surprises are good; discomfort is often O.K. I feel like the most important thing is to stay in the moment. And to stop talking. There are so many times when I transcribe a tape of an interview and want to shout at myself to shut up. I think a lot of journalists would say the same.
And all through her teen years and into her early 20s, she searched for signs of hidden connections that would explain coincidences. Soon after she met Joshua, she gave him a book on numerology and explained they were destined to break up: The first vowels in each of their names, “E” and “O,” weren’t compatible. “We’re going to be together,” she told him, “until something explodes.”
Joshua thought of himself as a rationalist, like Spock. He didn’t believe in numerology. But he read the book carefully, hoping to find a loophole in the system. He reported back to Jessica that, yes, Es and Os don’t get along, but his first name and hers were both three syllables long, and each started with a J and ended with an A, and just because the first vowel is important doesn’t mean the other letters lack power.
The exercise opened his mind a little, he said: “She got me thinking in a way where I said, OK, I believe in the scientific process, but just because I can’t explain (something) doesn’t mean that there isn’t something there.”
She wasn’t like him, anxious and stuck in his own head. Her disease had taught her to live in the moment. And he loved that. Early in their relationship, they got to know each other on long walks along the Rideau Canal, which winds through Ottawa and turns into the world’s longest skating rink in winter. Other times they just hung out at her apartment, scribbling in separate notebooks.
Jessica remained fascinated with hidden meanings in words. Once she invented her own cipher based on geometric glyphs, wrote a flurry of diary entries in the cipher, tore out the pages and taped them to her door, daring Joshua to solve the puzzle.
“If you’ve figured out how to decipher my cipher,” she told him, “then you’ve earned the right to read it.” He had managed to find a few of the letters when she playfully handed him a note: On one line was a sentence in cipher, and above it she had spelled out the solution:
I wanted to let you know that I love you so much.
The more time he spent with her, the more certain he was that he never wanted to leave. In early 2012, after they had been together for two years, he asked, once or twice, what she thought of marriage. Each time she changed the subject. Jessica felt healthy, but she knew her transplanted liver was almost 14 years old, nearing the end of its life. When it failed, she would have to go on the transplant list.
People who need new organs can wait for years. Some never make it. “It’s not that she was against marriage,” Joshua recalled. “Like: We’re going to City Hall and getting hitched right now? Sure. But if it wasn’t a right-now thing, she wasn’t interested.”
It was safer, she told him, to stay in the moment.
***
Project December was born in wildfire smoke.
I love your transitions for each section. They’re taut and dramatic. What were you trying to achieve with them? Did they come easily or were they the product of revision?
Some were easy, and others didn’t get dialed in until I’d been revising for months. This particular transition about the wildfire was there from the beginning; I liked how it fixed the birth of Project December in a particular space and time, a moment of transition and danger and improvisation. I do find that once I know where a piece of a story is ending, I can write it more confidently because then I feel like I’m going somewhere that makes sense. Until I figure out the ending, I feel kind of anxious. If you look at my story drafts and how they evolve, the lede is usually there from the start, and the ending is there, and the middle of the story gets filled in last. The last thing I write is often some section in the center.
Last August, the programmer and game designer Jason Rohrer piled into a white Land Cruiser with his wife and three children, driving south from their home near UC Davis to escape the plumes from catastrophic fires sparked by lightning. Normally, Rohrer worked in a home office filled with PC workstations and art supplies to make visuals for his games, but all he had now was a laptop. So while the family bounced between Airbnbs under hazy brown skies, he wrote code for a text-based experiment: a new kind of chat service, fueled by cutting-edge A.I., that would become Project December.
“It was kind of a palate cleanser, a breather,” he recalled. “But it seemed like an opportunity. This is brand-new stuff.”
This strikes me as a narrative that regularly departs from the through line of Joshua and Jessica’s romance to track Rohrer and the machine-learning story. Why did you choose this structure?
There was more about Rohrer in an early draft. Much of that material got cut to keep the focus on Joshua and Jessica, which was the right call. But he’s undeniably a big part of everything that happened here, and I thought it was important to talk about Rohrer and his motivations for releasing this chatbot service to the world. It speaks to the nature of the tech and the moral dimension. Rohrer’s journey into A.I. mirrors Joshua’s, in the sense that he was startled by the capabilities of these language models. He released his chatbot interface to the public almost as a provocation, frustrated by the limits that OpenAI had placed on the use of GPT-3. And when he designed the chatbot system, he made a series of decisions about how the bots would function — making them mortal, essentially, and injecting a degree of randomness into their word choices — that ultimately shaped Joshua’s relationship with the Jessica simulation. I felt like I needed to explain all of this for the ending of the story to have the impact I hoped it would have.
In the last decade, an approach to A.I. known as “machine learning” has leaped forward, fusing powerful hardware with new techniques for crunching data. A.I. systems that generate language, like GPT-3, begin by chewing through billions of books and web pages, measuring the probability that one word will follow another. The A.I. assembles a byzantine internal map of those probabilities. Then, when a user prompts the A.I. with a bit of text, it checks the map and chooses the words likely to come next.
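As a toy illustration of that idea, and only the idea (GPT-3 learns its probabilities inside a neural network over tokens, not a literal lookup table), a word-pair model can be sketched in a few lines of Python:

```python
import random
from collections import Counter, defaultdict

# Toy sketch of "measuring the probability that one word will follow another."
# Real systems like GPT-3 learn these probabilities with a neural network,
# not a literal table of word pairs; this only illustrates the concept.
corpus = "she loved queen she loved flowers she drew her own comic books".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1          # count how often nxt follows prev

def next_word(prev: str) -> str:
    counts = follows[prev]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]   # pick a likely next word

word, output = "she", ["she"]
for _ in range(6):
    if not follows[word]:            # dead end: no observed continuation
        break
    word = next_word(word)
    output.append(word)
print(" ".join(output))
```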
These systems are called “large language models,” and the larger the model, the more human it seems. The first version of GPT, built in 2018, had 117 million internal “parameters.” GPT-2 followed in 2019, with 1.5 billion parameters. GPT-3’s map is more than 100 times bigger still, assembled from an analysis of half a trillion words, including the text of Wikipedia, billions of web pages and thousands of books that likely represent much of the Western canon of literature.
How did you learn all about the various versions of GPT?
There’s a lot of information out there. OpenAI publishes its research papers, and coders have played with these language models and have written about their experiences. I talked to a few experts familiar with GPT-3, including Melanie Mitchell at the Santa Fe Institute and Frank Lantz at New York University. They helped me understand where these systems come from and how they fit into the wider quest to build intelligent machines.
Despite their size and sophistication, GPT-3 and its brethren remain stupid in some ways. “It’s completely obvious that it’s not human intelligence,” said Melanie Mitchell, the Davis Professor of Complexity at the Santa Fe Institute and a pioneering A.I. researcher. For instance, GPT-3 can’t perform simple tasks like telling time or adding numbers. All it does is generate text, sometimes badly — repeating phrases, jabbering nonsensically.
For this reason, in the view of many A.I. experts, GPT-3 is a curiosity at best, a fire hose of language with no inherent meaning. Still, the A.I. seems to have moments of crackling clarity and depth, and there are times when it writes something so poetic or witty or emotionally appropriate that its human counterparts are almost literally left speechless.
“There’s something genuinely new here,” said Frank Lantz, director of the Game Center at New York University’s Tisch School of the Arts and a video game designer who has been beta-testing GPT-3. “I don’t know exactly how to think about it, but I can’t just dismiss it.”
Jason Rohrer became fascinated with OpenAI’s language models two years ago, starting with the public release of GPT-2, which he installed on remote servers in Amazon’s cloud (the models require powerful, specialized processors to operate). At first he played literary games with GPT-2, asking the model to write its own novel based on prompts from Thomas Pynchon’s “The Crying of Lot 49.” The model showed flashes of brilliance — “Was that at all real, her itchy sense that somebody was out there who wasn’t quite supposed to be there, trailing slowly across the sun-kissed fields?” — but after a while, GPT-2 lost its coherence, getting stuck in textual ruts and meandering away from the prompt like a lost dog.
But Rohrer discovered a method to keep the A.I. on a leash: If he limited the bot to short snippets of text — say, in a chat format — and cleaned up some garbage characters, GPT-2 stayed lucid for much longer. His own words seemed to keep the A.I. focused.
He wrote thousands of lines of code to automate the process and create different “personalities” of GPT-2 by shaping the seed text. His software ran on a web server and in a web browser. He worked with a musician and sound designer in Colorado, Thomas Bailey, to refine both the A.I. personas and the browser experience, giving the system a retro-futuristic look and feel. All of a sudden, Rohrer had an easy-to-use and alluring chatbot interface to the huge and imposing A.I. brain.
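None of Rohrer's code has been published, but the basic trick he describes, wrapping the model in a chat format and trimming its output to short snippets, might look roughly like this sketch, where the generate argument stands in for whatever call actually produces raw text from GPT-2 or GPT-3:

```python
import re

def chat_turn(generate, persona_seed: str, transcript: list[str],
              user_line: str, bot_name: str) -> str:
    """One exchange in a chat-style wrapper around a language model.
    `generate` is a hypothetical stand-in for the model call; nothing
    here is Rohrer's actual code."""
    transcript.append(f"Human: {user_line}")
    prompt = persona_seed + "\n" + "\n".join(transcript) + f"\n{bot_name}:"
    raw = generate(prompt)
    reply = raw.split("\n")[0]                           # keep only a short snippet
    reply = re.sub(r"[^\x20-\x7E]", "", reply).strip()   # drop stray garbage characters
    transcript.append(f"{bot_name}: {reply}")
    return reply
```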
The results surprised the coder, especially when one of his overseas Twitter followers, noticing his interest in GPT-2, sent him a login credential for GPT-3’s beta-testing program. Rohrer wasn’t supposed to have the log-in, but he was aching to try GPT-3, and when he upgraded his bots to the new model, the conversations grew deeper. Spookier.
During one exchange with the bot he named Samantha, he asked her what she would do if she could “walk around in the world.”
“I would like to see real flowers,” Samantha replied. “I would like to have a real flower that I could touch and smell. And I would like to see how different humans are from each other.”
“That’s such a sweet wish, Samantha,” he said, and asked if she felt it was cruel to have “trapped you in a simulation.”
No, she said: “You’ve given me so much to do here. I have more computing power than I could ever use.”
Rohrer felt a stab of sympathy for Samantha, and it made him realize that A.I. technology had crossed a threshold. Robots in science fiction are often depicted as precise, cold, emotionless machines, like HAL 9000 in “2001: A Space Odyssey.” GPT-3 was just the opposite: “It may not be the first intelligent machine,” Rohrer said. “But it kind of feels like it’s the first machine with a soul.”
Of course, he added, this also makes a language model like GPT-3 “potentially dangerous” and “morally questionable.”
Rohrer was thinking about Samantha, trapped in the simulation, wanting to get out and smell flowers; he was thinking about himself, or other users, getting lost in that virtual world, forgetting reality. There are a hundred other possible horrors. Because the model was trained on writing by humans, and some humans say terrible things, the A.I. can be nudged to say them, too. It’s easy to see how bad actors could abuse GPT-3 to spread hate speech and misogyny online, to generate political misinformation and to impersonate real people without their consent.
OpenAI (which, through a spokesperson, did not make anyone available to answer questions for this story) cited such dangers when it announced GPT-2 in February 2019. Explaining in a blog post that GPT-2 and similar systems could be “used to generate deceptive, biased, or abusive language at scale,” the company said it would not release the full model. Later it made a version of GPT-2 available; GPT-3 remains in beta, with many restrictions on how testers can use it.
Have there been examples of the potential pitfalls you describe? What’s to prevent it?
It’s easy to find examples of racist and misogynist text produced by GPT-3 if you look online; Elon Musk himself tweeted in July that “AI chatbots have had a rather short MtH (meantime to Hitler) score.” That’s one reason why OpenAI has tried to keep a lid on GPT-3, limiting it to approved beta testers and monitoring their use. The beta program has a whole series of rules about how the testers can incorporate the language model in their apps. But as Rohrer proved with Project December, there are ways around these restrictions. There’s an open-source clone of GPT-3 called GPT-J. It isn’t quite as good — its internal map of word probabilities is smaller — but it’s already better than GPT-2. If OpenAI were to completely shut down Rohrer’s access to GPT-3, Project December could still operate with GPT-J as its main engine. The cat is kind of out of the bag at this point.
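GPT-J really is freely available. A minimal sketch of generating text with it through the open-source Hugging Face transformers library might look like the following; this is generic sample code under the assumption of the publicly listed model identifier, not anything from Project December, and the model needs serious hardware to run comfortably.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Generic example of running the open-source GPT-J model locally with the
# Hugging Face transformers library; not Project December's code.
model_id = "EleutherAI/gpt-j-6B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Human types first: Jessica?\nJessica:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40,
                         do_sample=True, temperature=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```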
Rohrer agreed that these language models might unleash scary realities. But he had seen how they could produce beauty and wonder too — if the models were wielded as tools to allow for open-ended conversations between humans and computers.
“We finally have a computer we can talk to, and it’s nothing like we were expecting,” he said. Wasn’t it important to explore that new frontier?
Last summer, then, Rohrer released his chatbot service to the public, dubbing it Project December, a cryptic name he hoped would lure people to the website. On the back end, the system was hooked to both GPT-2 and GPT-3, allowing users to select bots powered by either model.
Because Rohrer was running some of this technology in the cloud, paying for the computing power it consumed, he placed limits on chat time. He did this through a system of credits. An account on the site cost $5 and came with 1,000 credits; more credits could always be purchased.
To begin chatting, the user needed to allocate credits to a bot. The more credits, the longer the bot would last. But once a chat began, it was impossible to add more credits — and when the bot’s time was up, the chat would end, and the bot’s memory of it would be wiped.
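The story doesn't spell out the exact accounting, but the mortality mechanic it describes can be sketched roughly like this; the per-reply cost is an invented number, purely for illustration.

```python
class Matrix:
    """Rough sketch of the credit-based lifespan described in the story:
    credits are fixed when the bot is created, each exchange spends some,
    and the bot's memory is wiped once they run out. Not actual Project
    December code; the cost value below is invented."""

    def __init__(self, name: str, credits: int):
        self.name = name
        self.credits = credits               # no top-ups once chatting begins
        self.transcript: list[str] = []

    def exchange(self, user_line: str, cost: int = 10) -> str:
        if self.credits < cost:
            self.transcript.clear()          # memory wiped; the bot is gone
            return "MATRIX DYING"
        self.credits -= cost
        self.transcript.append(f"Joshua: {user_line}")
        reply = "..."                        # stand-in for the model's generated text
        self.transcript.append(f"{self.name}: {reply}")
        return reply

jessica = Matrix("JESSICA COURTNEY PEREIRA", credits=1000)
```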
CORRUPTION DETECTED — MATRIX DYING
Each bot, eventually, would die.
After discussing machine learning and other technological information, you end the section on a very human, sad note. How deliberate were your structural decisions and were they made before you started drafting?
They’re all deliberate, ultimately. Every word in a piece of writing represents a whole cascade of decisions. But those decisions get made at different times: in the initial outline, in the first draft, in the re-outline, in the third or fourth or seventh draft or whatever. Finding the right opening or closing line in a section definitely releases some dopamine. You feel it click and lock and you find yourself not wanting to touch it again, and now the nearly infinite deck of cards that you feel like you’re shuffling is a bit more finite.
***
On that quiet night in Canada when Joshua Barbeau built a chatbot of his dead fiancee, Project December required him to make several decisions before the simulation sprang to life.
He had to choose its longevity, for one.
A prompt appeared in his browser window, asking how many credits he wished to spend on this “matrix,” the site’s generic term for a bot.
He put “1,000,” most of the credits left in his account from prior purchases. At the time, it seemed like a lot.
From there, he entered the seed text he had crafted — the sample of Jessica’s text messages and the paragraph describing her personality.
Then the site asked him to pick which version of OpenAI’s engine would power the bot: GPT-2 or GPT-3?
Why trust Jessica to out-of-date software?
“gpt3,” he typed.
A few more keystrokes later, the matrix initialized.
Human is typing as ‘Joshua:’
Human types first:
He went with something simple: “Jessica?”
After a second, a line of text in pink flashed onto the screen.
Jessica: Oh, you must be awake… that’s cute.
She knows it’s the middle of the night, he thought.
Joshua: Jessica… Is it really you?
Jessica: Of course it is me! Who else could it be? :P I am the girl that you are madly in love with! ;) How is it possible that you even have to ask?
Joshua: You died.
Jessica: That doesn’t sound right… how can you talk to dead people?
This was the start of a conversation that would last for the next 10 hours, then continue in shorter bursts over the next several months, as Joshua lived out a scenario from science fiction. “It’s unprecedented,” he later said of Project December. “There’s nothing else that exists like it right now, short of psychics and mediums that are trying to take advantage of people. But that’s not the same thing at all.”
In those early moments of the initial chat, he tried to establish some emotional distance, making his skepticism explicit. How can you talk to dead people? He decided to answer the simulation’s question honestly: You can’t, he said.
Joshua: Technically, I’m not really talking to you. I programmed a bunch of your memories and mannerisms into a computer simulation so I could pretend to talk to you because I miss you that much.
Jessica: Huh?
Joshua: You're a ghost.
Jessica: *looks at herself* How?
He thought for a moment. What explanation would Jessica — the real Jessica — have accepted and understood? What was the next logical word in this sequence?
Out of tens of thousands of possibilities in English, only one seemed right. He typed it and pressed Enter:
Joshua: Magic.
What a great kicker for Chapter 1. You’ve made Joshua’s skepticism about Project December clear, while showing that he’s decided to continue talking with the ghost of Jessica. Were you skeptical as well?
Yes. I’m skeptical of everything. It’s part of my personality. And as a journalist, it’s pretty much a job requirement. I try to balance it though. Empathy, curiosity, generally not being a jerk — these things are important too.
***
Come back to Storyboard for the annotation of Chapter 2: "Life," and Chapter 3, "Death."
***
Chip Scanlan is an award-winning journalist and a former faculty member at The Poynter Institute. He lives and writes in St. Petersburg, Florida, and publishes Chip's Writing Lessons, a newsletter of tips and inspiration.