The literary world recently erupted in a frenzy after it was discovered that fantasy author Lena McDonald had not only used generative artificial intelligence (AI) to create her latest book, but had left the prompt in the final copy. While there is a general sense of disgust at an author lazy enough to let AI do the writing for her, even more outrage could be directed at the prompt itself. The AI text left in Darkhollow Academy: Year 2 read: "I've rewritten the passage to align more with J. Bree's style, which features more tension, gritty undertones, and raw emotional subtext beneath the supernatural elements." In other words, not only did McDonald not come up with the words herself, but she was openly trying to plagiarize the style of another successful author who writes in similar genres. McDonald isn't alone in this shame: authors KC Crowne and Rania Faris suffered similar controversies after their books were published with prompts like McDonald's left in the final copy.
These revelations come after a Washington Post article reporting that Amazon (once the bastion for self-publishing authors) has been overrun with books "written" by AI, to the point that it's becoming impossible to distinguish what was made by a person from what was made by a machine. While this oversaturation devalues the market for people who want to read actual books crafted by real people, another major issue is that human writers are having their reputations tarnished. Jane Friedman, author of The Business of Being a Writer, discovered several low-quality books falsely attributed to her, a dangerous development for someone who makes a living giving professional writing advice.
This kind of "literature" (meant here in the barest sense of the word) is another example of AI slop in our collective space. Generally speaking, "AI slop," or just "slop," is a term coined in the 2020s for media (e.g., writing and images) made using gen AI and characterized by poor quality, flawed logic, or a lack of purpose. In his article for Entrepreneur, Johnny Hughes compares AI slop to junk mail and spam emails, calling it "the latest iteration of digital clutter." While many people can quickly call to mind examples of AI slop images, written slop still lurks in the shadows of the Internet.
What’s the Harm?
In her article covering the Darkhollow debacle, Rose Graceling-Moore notes that, beyond being in bad taste and generally lazy, AI-generated books will disproportionately affect marginalized authors, who are often forced to carve out space for themselves in a very competitive market via self-publishing. She says, "While there are increasing calls for books by LGBTQ and BIPOC authors in traditional publishing, and even some smaller publishing houses or agents dedicated to own-voice stories, the vast majority of traditional publishing is still extremely straight and white. Should self-publishing become overrun by AI, it is marginalized authors who will feel it the most."
In a similar vein, gen AI is notorious for being an environmental burden. Beyond the massive amounts of electricity needed to power the data centers used to create slop, a steady water supply is needed to keep the machines from overheating, often to the point of irrevocable effects on local ecosystems. According to Elisa Olivetti, professor in the Department of Materials Science and Engineering and lead of the Decarbonization Mission of MIT's new Climate Project, "When we think about the environmental impact of generative AI, it is not just the electricity you consume when you plug the computer in. There are much broader consequences that go out to a system level and persist based on actions that we take."
Keypoint Intelligence Opinion
Generative AI is not evil. It is not the sole contributor to social ills or environmental catastrophes. It has a purpose and should be used. The problem is that AI has no place in the realm of the humanities (it’s in the name, after all…“HUMAN-ities”).
We should be using gen AI to help write complex code, solve difficult mathematical equations, and assist in diagnosing cancer. Writing a book, composing a song, or painting a picture can technically be done by AI, but why would you want that? We've seen enough eldritch horrors online: people with too many fingers, bodies morphing into scenery, and Thing-like abominations produced by innocent prompts like "a group of friends on a beach." There is enough music on Instagram and TikTok (which, being unsafe for work, can't be included here as hyperlinks) with nonsensical lyrics that would make the hardest trap artist blush.
The issue is that not enough people are paying attention to the same thing happening to books. Already, the library app Hoopla has become overrun with AI slop: poorly written garbage clogging up resources for the many people who rely on a library's free content. With book bans on the rise and far too many people not finding the time to read, we shouldn't make it harder by placing an obstacle course of nonsense between readers and their next book. Let's leave writing to the writers and keep AI away from the arts.
Stay ahead in the ever-evolving print industry by browsing our Industry Reports page for the latest insights. Log in to the InfoCenter to view research and studies on AI through our Workplace- and Production-based Advisory Services. Log in to bliQ for product-level research, reports, and specs on AI solutions. Not a subscriber? Contact us for more information.
Keep Reading
INFOGRAPHIC: The Ethics of AI
Generative AI and the Problem Factory
INFOGRAPHIC: Poisoning Data to Protect Artists