Beyond the theft, beyond the fraud, I worry that AI in storytelling will make us worse as people. Since the dawn of time, human beings have sat around campfires telling each other stories as a way to share our values. 🧵
Societies are formed by shared storytelling. What makes a hero? What makes a villain? How do we care for one another? Be it through religion or movies or literature, most of our values are shaped by the stories we tell.
Jack Nicholson was right: “We learn how to kiss, or to drink, talk to our buddies, all the things that you can’t really teach in social studies or history, we all learn them at the movies.” So what happens when we turn that power and responsibility over to an algorithm?
Even the great stories that aren't meant to impart a particular moral are almost always asking difficult questions meant to challenge our morals and ideas.
Art doesn't happen when an artist asks, “What do people want to hear?” That's called entertainment. Art happens when an artist says, “This is what people NEED to hear.” And LLMs only feed people what they WANT.
It's happening already with streaming algorithms and the self-selection of social media feeds. But even the hackiest filmmakers will usually try to imbue a tiny shred of truth and wisdom in their latest Netflix slop.
So what happens when our stories stop making us wonder? What happens when they stop challenging our preconceptions? What happens when our age-old method of making us moral gets reduced to, “What do YOU want to hear?”