This blog post literally wrote itself.
Okay, my fellow word nerds, before you get all up in your righteous selves and tell me that of course this post didn’t literally write itself, let me assure you I mean what I said: The damn thing literally did.
Much of itself, anyway.
If you’re anything like my tech hubby, you may have found yourself mesmerized by news of an artificial intelligence engine, ChatGPT, that is proving itself to be a remarkably efficient, knowledgeable, and accessible AI writing tool. You can simply talk to this thing in regular language, like you would any other Joe, and IT TALKS BACK and tells you things. Anything you ask, in language that is striking for its conversational quality. It even uses humor.
See more: Here’s a mesmerizing (and somewhat disturbing) tutorial on what ChatGPT does and how it works. I’ve bookmarked it at the parts most relevant to writers, but the whole thing is fascinating.
This past weekend the hubs and I spent some time sitting in front of the program while I asked it to do a variety of things: create an outline for an article I was pitching to a publication; write an essay in the style of a fourth grader; and—most pertinent to us as creatives and writers—answer some hard-hitting questions about itself and its potential to render authors obsolete. Will AI replace writers?
The latter is, of course, my secret fear—and maybe yours too.
The outline was damned usable and freakishly competent—I could easily have written a pitch from what the bot gave me, if it had been more relevant to the topic I wanted to propose. That had more to do with the way I posed the question than with ChatGPT’s response, but as I mentioned, the program is conversational, so I could easily have rephrased my request more specifically and it would have generated something closer to what I was looking for.
But at that point it felt a little lazy at best, unethical and dishonest at worst, so I chalked it up to an experiment and created my own pitch.
The fourth-grade essay is for an exercise I am designing for a presentation at a young authors’ conference in a few weeks, and I needed some raw material to work with to demonstrate how to edit on a level that kids this age could relate to and understand. The program turned out a short essay about holiday vacations that does indeed sound as if a ten-year-old might have written it. When I asked ChatGPT to add another paragraph, it built on the story.
Read more: If you want to see the results of these experiments you can find them on my website here.
These results touched uncomfortably close to what’s been percolating in my mind since the hubs first showed me this program: If machine-learning programs like ChatGPT can draw on a dazzling array of data—i.e., other stories by human authors—and use it to turn out credible writing…how long before they get so good at it that they render real-live creatives moot?
So I asked it.
What follows is my “interview” with ChatGPT about its abilities and ours as authors, and what the future of storytelling may hold. It’s one continuous conversational thread—meaning the bot builds on our previous exchanges in its subsequent answers—and the machine’s responses are unedited.
Interview with the Machine
TYM: What makes human stories and art different from AI-generated stories?
ChatGPT: There are many factors that can contribute to the differences between human-generated and AI-generated stories and art. Here are a few possible differences:
- Creativity: Human creators may bring their own unique perspectives, experiences, and emotions to their work, which can result in more original and creative content. AI systems, on the other hand, are generally limited to the patterns and structures that they have been trained on, and may not be able to generate truly novel ideas or concepts.
- Emotional depth: Human creators may be able to imbue their work with deeper emotions and feelings that are derived from their own personal experiences. AI systems may not have personal experiences to draw upon, and may be limited in their ability to create emotionally rich or complex content.
- Personal touch: Human creators may put their own personal stamp on their work, whether it be through their writing style, artistic techniques, or other personal touches. AI systems may not have a personal identity or style, and their output may be more uniform and lacking in individual flair.
- Human error: Human creators are fallible and may make mistakes or produce work that is not up to their own standards. AI systems, on the other hand, do not make mistakes in the same way that humans do and may produce more consistent output.
Overall, the main difference between human-generated and AI-generated stories and art is the source of the creativity and originality. While humans are able to draw upon their own unique experiences and perspectives to create new and original content, AI systems are limited to the patterns and structures that they have been trained on.
TYM: Will AI replace authors in writing stories?
ChatGPT: It is possible that AI systems could be used to assist with the writing process or to generate content in certain circumstances, but it is unlikely that AI will fully replace human authors in the foreseeable future.
There are a number of factors that make it difficult for AI systems to fully replace human authors. For example, AI systems may not be able to generate truly novel ideas or concepts, and they may not be able to imbue their writing with the same depth of emotion and feeling that human authors can. AI systems also do not have personal experiences or perspectives to draw upon, which can make it difficult for them to create content that is relatable or resonates with readers in the same way that human-generated content can.
Additionally, writing is more than just stringing words together in a grammatically correct way. It requires creativity, storytelling skills, and an understanding of what makes a story engaging and compelling. While AI systems may be able to generate text that is technically proficient, it may be difficult for them to produce work that is truly engaging or that stands the test of time.
Overall, while AI systems may be able to assist with certain aspects of the writing process or generate content in specific contexts, it is unlikely that they will fully replace human authors in the near future.
TYM: How can authors use ChatGPT AI to help their writing?
ChatGPT: There are a number of ways that authors might be able to use ChatGPT or other AI language models to assist with their writing:
- Idea generation: Authors could use ChatGPT or other AI language models to generate ideas for stories or plot points. The model could be provided with a prompt or theme, and it could generate a list of potential ideas for the author to consider.
- Character development: ChatGPT or other AI language models could be used to help authors flesh out their characters by generating descriptions, dialogue, or personality traits.
- World building: Authors could use ChatGPT or other AI language models to generate descriptions of settings or to help build out the details of their fictional worlds.
- Editing and proofreading: ChatGPT or other AI language models could be used to assist with editing and proofreading by identifying grammar and spelling errors or suggesting alternative wording.
It’s important to note that while AI language models like ChatGPT can be useful tools for authors, they are not a replacement for human creativity and judgement. The ideas and suggestions generated by the model should be used as a starting point and should be carefully evaluated and edited by the author before being included in their work.
TYM: Is it ethical for authors to do that?
ChatGPT: There is no one-size-fits-all answer to this question, as the ethics of using AI to assist with writing will depend on a variety of factors, including the specific context in which the AI is being used and the expectations of the audience.
One concern that might arise in the context of using AI to assist with writing is the issue of authorship and ownership of the work. If an AI system is significantly involved in the creation of a work, it could raise questions about who should be credited as the author. This is a complex issue that will depend on the specific roles and contributions of both the human and AI authors, and it may be necessary to carefully consider and address these issues as they arise.
Another ethical concern that might arise is the potential for AI systems to perpetuate or amplify biases that are present in the data that they are trained on. If an AI system is used to assist with writing and is trained on biased data, it may produce output that reflects those biases. This could be problematic if the output is used in a public-facing context, as it could perpetuate harmful stereotypes or misrepresentations.
Overall, it is important for authors who are considering using AI to assist with their writing to carefully consider the ethical implications of their choices and to take steps to ensure that their work is fair, accurate, and respectful of the audience.
TYM: Is ChatGPT or AI a better writer than humans?
ChatGPT: It is difficult to say whether ChatGPT or AI in general is a “better” writer than humans, as writing is a complex and multifaceted activity that depends on a wide range of factors. There are certain tasks that AI systems may be able to perform more efficiently or accurately than humans, such as identifying grammar and spelling errors or suggesting alternative wordings. However, there are many other aspects of writing that AI systems may struggle with, such as generating truly novel ideas, creating complex and believable characters, or imbuing their work with emotion and feeling.
It is also worth noting that writing is more than just stringing words together in a grammatically correct way. It requires creativity, storytelling skills, and an understanding of what makes a story engaging and compelling. While AI systems may be able to generate text that is technically proficient, it may be difficult for them to produce work that is truly engaging or that stands the test of time in the same way that human-generated content can.
Overall, it is likely that AI systems like ChatGPT will be able to assist with certain aspects of the writing process, but it is unlikely that they will fully replace human authors in the near future.
[Note the repeated verbiage and ideas here from the previous answers—TYM]
TYM: Is it okay that you freak me out a little bit?
ChatGPT: I’m sorry if I have freaked you out. As a language model, I don’t have feelings or emotions, so it is not possible for me to be aware of your feelings. My primary function is to generate human-like text based on the prompts given to me, and I do not have the ability to sense or perceive the world around me. I exist solely to provide information and assistance to users, and I do not have any personal desires or motivations.
So Will AI Replace Writers?
Honest to god, friends, I’ve had less engaging conversations than this with actual Homo sapiens.
As ChatGPT admitted in some “icebreaker” questions I asked it about how it works, it’s important to remember that AI like this is not actually creating or even conversing—it’s mimicking, drawing on thousands, maybe millions, of other sources and churning out something based on what’s been written before—not “thinking” so much as collating and regurgitating existing knowledge.
But in some ways, isn’t that what we as authors might do early in our writing efforts? We start out mostly aping the styles and stories of authors we have read, before we expand our knowledge base and begin to discover our more original thoughts and voice.
Right now the creative efforts of AI have been a bit clumsy (this excellent New Yorker piece illustrates that well), but as it expands its database it “knows” more…and may very well hone its creative output to make it smoother, more effective, better. Isn’t that how we also improve our skills as writers?
I’m no expert in AI, but plenty of movies, books, and scary news articles have led me to wonder whether, eventually, the machines will start “thinking” for themselves—and if they do might they one day cut into the story-consuming market…or even replace creators entirely?
Some years ago, when CGI began to get so good that it could fabricate a human on-screen, Hollywood wondered whether eventually computer-generated “people” might replace actors.
That hasn’t happened (although CGI can do some pretty freaky work de-aging actors and even bringing them back to life)…but might it one day? Might computer technology edge out human creators in the world of books and story as well? Music? Visual art?
We can’t predict doomsday scenarios like that any more than we can know whether humanity will fail to stop global warming in time to prevent a mass extinction event, or whether the current extreme partisanship will put an imminent end to the American democracy experiment. These are just some of the many questions I ponder as we move into this futuristic new territory.
But maybe that’s why machines will never fully replace the full humanity of art. Perhaps what makes a story—and a life—meaningful isn’t having all the answers, as artificial intelligence may one day achieve, but undertaking the quest for them. AI may beat us to the destination, but humans understand the value of the journey.
I definitely welcome your thoughts on this complex and somewhat worrisome topic, friends. Have you experimented with ChatGPT or other AI writing engines? What do you think of the results? Do you worry that AI might one day replace creators—or people entirely—and we’ll all wind up living in the Matrix? Weigh in!