AI DUNGEON, a text-based fantasy simulation originally built on OpenAI’s GPT-2 and later upgraded to GPT-3, has been producing bizarre stories since May 2019. Similar to early text adventure games like Colossal Cave Adventure, you can choose from a range of standard settings – Fantasy, Mystery, Apocalyptic, Cyberpunk, Zombies – before selecting a character class and name, and generating a story.
Here was mine: “You are Mr. Magoo, a survivor trying to eke out a living in a post-apocalyptic world by scavenging among the ruins of what’s left. You have a backpack and a canteen. You haven’t eaten anything for two days, so you’re desperately searching for food.” So began Magoo’s tale of woe, about 300 words long, in which he, “half-mad with hunger,” encounters “a man in white robes” (Jesus? Gordon Ramsay?) before getting stabbed in the neck after offering him a greeting kiss.
As lame as this story might be, it points to a complex copyright issue that the gaming industry is just beginning to grapple with. I created a story using my imagination – but to do so, I used an AI assistant. So, who wrote the story? And who gets paid for the work?
AI Dungeon was developed by Nick Walton, a former researcher at a deep learning lab at Brigham Young University in Utah, who is now CEO of Latitude, a company that bills itself as “the future of AI-generated games.” While not a mainstream title, AI Dungeon has nonetheless attracted millions of players. As Magoo’s story illustrates, the player drives the narrative with actions, dialogue, and descriptions; AI Dungeon responds with text, much like a dungeon master – or a kind of fantasy improvisation.
In several years of experimenting with the tool, people have generated far more compelling, D&D-esque narratives than mine, as well as videos like “I broke AI Dungeon with my terrible writing.” It has also sparked controversy, particularly as users began to prompt it to create sexually explicit content involving children. And as AI Dungeon – and similar tools – continue to evolve, they will raise trickier questions about authorship, ownership, and copyright.
Many games offer you tools to create worlds. Classic series like Halo or Age of Empires include sophisticated map editors; Minecraft has sparked an open, imaginative form of play whose influence is clear in the Fuse and Ultrahand abilities of The Legend of Zelda: Tears of the Kingdom; others, like Dreams or Roblox, are less games than platforms on which players can create further games.
Historically, claims to ownership of in-game creations or user-generated content (IGCs or UGCs) have been made moot by “take it or leave it” end-user license agreements – the dreaded EULAs no one reads. Generally, this means players relinquish any ownership of their creations by launching the game. (Minecraft is a rare exception here. Its EULA has long granted players ownership of their IGCs, with comparatively few community uprisings.)
AI adds new complexities. Laws in both the US and the UK dictate that only humans can claim authorship. In a game like AI Dungeon, where the platform essentially enables a player to “write” a narrative with the help of a chatbot, claims to ownership can become murky: Who owns the output? The company that developed the AI, or the user?
“There is a big discussion nowadays, especially about prompt engineering, about the extent to which, as a player, you shape your personality and your free and creative choices,” says Alina Trapova, a law professor at University College London who specializes in AI and copyright and has authored several papers on the copyright issues of AI Dungeon. Currently, this gray area is circumvented with a EULA. AI Dungeon’s is particularly vague. It states that users can “pretty much use the content they create however they want.” When I asked Latitude via email if I could turn my Mr. Magoo bedtime story into a play, book, or film, the support line responded promptly: “Yes, you own all rights to the content you created with AI Dungeon.”
However, games like AI Dungeon (and games created using ChatGPT, like Love in the Classroom) are based on models that have siphoned off human creativity to generate their own content. Fanfiction authors have found their own ideas surfacing in writing tools like Sudowrite, which uses OpenAI’s GPT-3, the precursor to GPT-4.
Things get even more complicated when someone pays the required $9.99 per month to integrate Stable Diffusion, the text-to-image generator, to conjure accompanying images for their AI Dungeon stories. Stability AI, the company behind Stable Diffusion, has been sued by visual artists and media company Getty Images.
As generative AI systems grow, the term “plagiarism machines” is gaining traction.