Before the mighty printing press, monastic scribes mass-produced text by hand. Before the car, people saddled a horse and headed for the hills. Before AI, people thought for themselves.
As a writer, I find inspiration in the world around me and articulate the fantasies I see in my mind onto the page. It is a process I have worked hard to refine over decades — brainstorming, outlining, drafting, and editing. I’d be remiss not to mention the distractions, the constant second-guessing, and the writer’s block. Years ago, when I first started writing, my tools were a typewriter, a dictionary-thesaurus combo, and my imagination. Today, people type a sentence into an AI text generator, and it spits out a novel. Where is the fun in that? Better yet, where is the integrity?
AI systems scrape vast amounts of information to assemble their datasets. Among that information are published works, unsubstantiated claims, and outdated facts. A recent report by NewsGuard found that the world’s top AI generators are currently producing fake news, news that is consumed by people all over the world. AI is in our phones, our laptops, and the search engines we use every day. It has become so common that when a person asks AI something, the answer is usually met with blind faith and a naivety that makes Forrest Gump look like a skeptic.
An article by the Editorial Board of The Washington Post reported that most companies argue that the copyrighted material in AI training datasets falls under “fair use.” I agree with that, provided the information is used in a new iteration that is transformative and offers a creative change from the original. I believe that most of that creative change needs to originate from the human author, not the machine.
What’s the adage? There’s nothing new under the sun? There are plenty of writers who are inspired by other writers’ works. It is one thing to be inspired by and something else to steal another person’s work, whether you know it or not. AI poses huge risks for copyright infringement. If an industry is going to support AI-generated books, then I recommend that they only be available through certain outlets or vendors that specialize in AI products, at significantly reduced prices, and the producers should not be recognized as authors. There should be as little influence on the publishing industry as possible.
With the surge of AI-produced books in the market comes another risk — oversaturation. It limits the exposure that hard-working authors need. The value of their work falls, and their ability to share it and earn a living declines. Will Oremus reported that author Chris Cowell spent more than a year writing his technical how-to book, only to find an AI-generated version for sale on Amazon three weeks before his own was released. Cowell searched the internet for the listed author and could find no information. The Mumbai-based company that published the book had dozens of additional tech books listed on Amazon — some appeared to contain screenshots taken directly from ChatGPT. The books shared five-star reviews from the same reviewers.

Until there are parameters in place to protect copyrighted materials from being exploited by people using AI as a shortcut to authorship, the publishing industry and book retailers should boycott AI-generated books.
I recognize that the evolution of AI is moving fast and that its incorporation into every facet of our lives is inevitable. Every good writer does research, and the more thorough that research is, the better. This is where the use of AI as an assistant makes the most sense. The writer still bears the burden of fact-checking, but AI provides another tool.
People are naturally drawn to the path of least resistance. Everyone wants immediate gratification and that next dopamine hit. As more people overindulge the desire to use AI as a creative tool, their own creativity will wither — creative atrophy.
It is that overreliance on AI that puts humans at risk of losing their problem-solving skills. We evolved those skills for survival, and now our own progress threatens to extinguish them as we hand over the reins for convenience. AI is used and its output is consumed, and much like social media, sugary foods, alcoholic beverages, and nicotine, it has the potential to be harmful. Don’t people have a right to know if they are at risk? Harmful products should be required to bear a disclaimer warning people of those risks. AI should be no exception.

Writing can be an intimate, grueling, exhausting, fulfilling, and sometimes therapeutic process. When you create a thing, you leave a part of yourself behind in that creation. It is deeply personal. Hemingway’s “The Old Man and the Sea,” Dickens’ “Great Expectations,” and King’s “Carrie” all share the DNA of their authors. When AI-produced works are passed off as authentic, it is an insult to the integrity of genuine writers and an assault on the publishing industry.