Project December was an experimental AI system created by developer Jason Rohrer that allowed users to create customized chatbots using OpenAI’s GPT-3 language model. The platform gained significant attention in 2021 when it was used by Joshua Barbeau to create a simulation of his deceased fiancée Jessica.
Technical Implementation
The system provided a straightforward interface for users to create chatbots by:
- Inputting biographical details and personality traits
- Providing example messages to establish tone and speech patterns
- Setting parameters like “temperature” to control randomness and creativity
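The workflow above amounts to prompt assembly for a completion-style model. A minimal sketch of how such a persona prompt might be built, assuming a GPT-3-style completions interface; the `build_persona_prompt` helper and its field names are illustrative assumptions, not Project December's actual code:

```python
def build_persona_prompt(name, biography, traits, example_messages, user_message):
    """Compose a single completion prompt that establishes a persona."""
    lines = [
        f"The following is a conversation with {name}.",
        f"Background: {biography}",
        f"Personality: {', '.join(traits)}",
        "",
    ]
    # Few-shot example messages teach the model the persona's tone
    # and speech patterns.
    for speaker, text in example_messages:
        lines.append(f"{speaker}: {text}")
    lines.append(f"User: {user_message}")
    lines.append(f"{name}:")  # left open for the model to complete
    return "\n".join(lines)

prompt = build_persona_prompt(
    name="Jessica",
    biography="Loved music and painting.",
    traits=["warm", "playful", "curious"],
    example_messages=[
        ("User", "Hey, how was your day?"),
        ("Jessica", "Pretty good! I spent the afternoon sketching."),
    ],
    user_message="What are you thinking about right now?",
)

# A request to a GPT-3-style completions endpoint would then pass this
# prompt along with sampling parameters; "temperature" controls randomness
# (higher values produce more varied, creative replies).
request = {"prompt": prompt, "temperature": 0.9, "max_tokens": 150, "stop": ["User:"]}
```

The key design point is that everything, biography, personality, and example dialogue, is flattened into one text prompt ending with the persona's name, so the model's continuation reads as that persona speaking.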
Project December was essentially a customizable front-end for GPT-3, offering users more direct control over the AI’s persona than most commercial implementations. This flexibility allowed for highly personalized interactions but also opened the door to applications beyond OpenAI’s intended use cases.
The Joshua Barbeau Case
In 2021, Project December gained widespread attention when Joshua Barbeau used it to simulate conversations with his late fiancée Jessica, who had died eight years earlier. By inputting her old messages, biographical details, and examples of her writing style, Barbeau created a chatbot that could respond in a manner strikingly similar to Jessica’s real communication style.
This use case became a significant moment in AI ethics discussions, raising questions about digital resurrection, grief processing, and the consent of the deceased. Barbeau reported finding comfort in the conversations, though he recognized the simulation was not truly Jessica.
Controversy and Shutdown
OpenAI eventually cut off Project December's access to GPT-3 after Rohrer declined to accept its conditions for continued use, which included content monitoring and restrictions on simulating specific individuals. Rohrer disputed the decision, arguing that the system provided therapeutic benefits to grieving users like Barbeau.
The controversy highlighted tensions between:
- The potential therapeutic benefits of AI-assisted grief processing
- Ethical concerns about simulating specific real people without consent
- Questions about appropriate AI safety limitations
- Debates over who controls how powerful AI models can be used
Legacy and Impact
Despite its relatively short lifespan, Project December left a significant impact on discussions about AI companions and digital resurrection. It demonstrated both the technical feasibility of creating convincing simulations of specific individuals and the complex ethical questions such capabilities raise.
While the name “Project December” itself carried no stated cinematic reference, the system’s flagship chatbot persona, “Samantha,” was named after the AI assistant in the film “Her” (2013), reflecting the ongoing dialogue between fictional portrayals of AI companions and their real-world implementations.
Project December remains an important case study in AI ethics education and is frequently referenced in discussions about appropriate boundaries and safeguards for increasingly capable language models.
Connections
- Used by Joshua Barbeau for digital resurrection
- Connected to Digital Resurrection
- Related to AI Ethics in Companionship
- Example of GPT-3 applications
- Featured in DeepResearch - Real-World AI Waifu Creations and Experiments