LET'S GET STRATEGIC
ChatGPT: Opportunities and Risks Related to AI-Generated Content
by Linda Pophal
AI-driven content-generation tools are nothing new. But if the hype around the release of ChatGPT is to be believed, this new offering is something special. It is purported to have a heightened ability to create quality content for a variety of applications, including social media posts, marketing copy, business reports, customer-facing documents, and a variety of other written materials. As OpenAI, the creator of the ChatGPT model, notes on its website, “The dialogue format makes it possible for ChatGPT to answer [follow-up] questions, admit its mistakes, challenge incorrect premises, and reject inappropriate requests.”
POTENTIAL AND SOME POSSIBLE PERIL
I tried ChatGPT in December 2022, and I was impressed with what it delivered when I asked the question, “What is blockchain?” I’ve often been stymied myself when attempting to explain what blockchain is, so I appreciated the simple and, presumably, accurate explanation I received within seconds:
A blockchain is a decentralized, distributed database that is used to store data in a way that is secure and transparent. It is made up of a series of interconnected blocks, each of which contains a record of multiple transactions. Each block is linked to the previous block, forming a chain of blocks that cannot be altered or tampered with. The decentralized nature of the blockchain means that it is not controlled by any single entity, making it difficult for any one party to manipulate the data stored on the blockchain. This makes it an attractive technology for a wide range of applications, including financial transactions, supply chain management, and voting systems.
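The structure that explanation describes, blocks linked to the previous block so the chain cannot be quietly altered, can be illustrated in a few lines of Python. This is a toy sketch for readers curious about the mechanics, not a real blockchain implementation; all names and the sample transactions are invented for the example:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents; sort_keys keeps the hash stable
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transactions):
    # Each new block records the hash of the block before it
    prev_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev_hash, "transactions": transactions})
    return chain

def is_valid(chain):
    # Altering any earlier block changes its hash and breaks
    # every later link, which is what makes tampering evident
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
add_block(chain, ["alice pays bob 5"])
add_block(chain, ["bob pays carol 2"])
print(is_valid(chain))   # True
chain[0]["transactions"][0] = "alice pays bob 500"  # tamper with history
print(is_valid(chain))   # False
```

Real blockchains add decentralization and consensus on top of this linking, but the tamper-evidence the quoted explanation mentions comes from exactly this chain of hashes.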
When I tried to access the tool again in late January, though, I found that its popularity had clogged up its systems—or that the message I received was just a clever way of building an email database of interested users. Here’s what the site had posted when I visited: “Get notified when ChatGPT is back.” Below that, it stated, “We’ve had a lot of people come by to check out ChatGPT, and our systems are currently at capacity. If you want to be notified when we’ve resolved these issues, please submit your email [into the box provided on the site]. In the meantime, feel free to keep checking back at chat.openai.com.”
My limited experimentation with this tool made me feel that it could certainly be a promising resource for writers of all kinds, but it raised the following questions:
- If multiple people are using the same prompt, what’s the potential for duplication of content or the tool plagiarizing itself?
- Who owns the content—the person asking the question or the AI?
- If there are errors in the content, who’s responsible or liable for any negative implications the errors may cause?
In the hope of finding the answers, I reached out to some experts to get their perspectives.
May Habib is the co-founder and CEO of Writer, an AI writing platform. “Regarding plagiarism, ChatGPT presents an interesting situation as the content produced by ChatGPT should be original and unique in the sense that the same prompt should provide a different output each time,” Habib says. But, she adds, “as with any writing, there’s always the possibility that similar or exact phrases already exist in the larger body of written language, and writers need to be mindful of how this impacts their work.” She notes that the real implication for plagiarism “is that content produced by ChatGPT can be directly shared or published as one’s own thoughts or ideas.” This has already created some concerns—and potential solutions—in academic circles as instructors have become aware that some students are using ChatGPT to create papers and reports (per U.S. News & World Report). To combat the potential for its misuse, a student from Princeton tweeted that they developed an app, GPTZero, that “can quickly and efficiently detect whether an essay is ChatGPT or human written.”
GPT, says Nik McFly, co-founder of the AI school Hybrain Academy, “is a language model, and its main goal is not to produce copied or plagiarized content.” But, he acknowledges, “if multiple people enter the same prompt into GPT, it’s possible that they might get similar or even identical content.” That makes it crucial for users to ensure that they’re properly citing and attributing content generated through GPT, he says.
In terms of content ownership, Habib says that “there is no definitive answer at this point.” But, she adds, “ChatGPT makes no claims of ownership for any of the output.” McFly notes that “copyright law protects original works of creativity like literature, drama, music, and art from being reproduced or shared without the permission of the copyright owner.” It’s possible, he says, that the tool could be viewed as the “creator” of the content. But also possible, he adds, is that the “user who entered the prompt could be considered to be the creator.”
Daniel Armstrong, co-founder of Eleven, a content writing agency, says that one of the things that can potentially ease this concern is that ChatGPT users have the ability to ask follow-up questions or instruct the tool to adjust the content—to “make it less fluffy” or “more formal,” for example, he says. The more detailed the instructions, the less likely the answer will be to mirror content created by other users.
Obviously, ChatGPT and other tools are in their early days of adoption, so time will tell where sentiment falls in terms of responsibility for the accuracy of content created. However, Habib suggests that while there is no clearly established precedent for liability, “the expectation could be that the onus is on the person who used ChatGPT and then shared or published the content to review it for accuracy.” That seems logical and reasonable. It’s important, says McFly, for users to carefully review and fact-check any content created by AI tools to ensure accuracy. “If errors are found and the content causes harm or damage, the user or the agency that used the content could be held responsible,” he notes.
Habib points to a couple of additional concerns about ChatGPT that deserve consideration:
- Opting in—The opt-in concept, she states, “centers around allowing individuals and organizations to choose to have their work included in machine learning models.”
- Transparency—There needs to be transparency related to requirements for organizations and individuals to disclose when content was created with AI. “China has already begun to address the concept of an AI-generated watermark in their latest legislation,” she says.
Todd Stearn founded The Money Manual website in 2017 to help people save money, make money, and feel confident about their finances. He and his team have “done a lot of digging into ChatGPT,” Stearn says. The bottom line is, he asserts, “The buck stops when you put your name on it. This is [why] it’s so important to know the sources of information. It’s also vital to cross-check facts and sources. Accepting everything an unvetted anonymous source tells you is at best naive and at worst incredibly dangerous.”
To ensure the appropriate use of these tools, McFly says content creators should follow best practices and seek guidance from legal professionals as needed to respond to any questions or concerns. “With proper use and caution, these tools can be a valuable resource for generating content in a variety of contexts,” he concludes.
In the near term—the next few years—Armstrong and his team expect that some additional features will emerge to aid in the content creation process, such as:
- AI will automatically check content against a specific style sheet and make corrections to ensure it matches that style.
- AI will match writing against specific style sheets from specific publishers, editors, or clients.
- These tools will be able to “read the internet.” Currently, Armstrong notes, “ChatGPT is only trained on data fed to it up until 2021.”
While Armstrong says that he doesn’t expect ChatGPT or similar tools to replace writers—especially top-tier writers—any time soon, it is important for all content creators to understand how ChatGPT or similar tools could become a part of their workflows. It’s clear, he states, that “most if not all content writers will at some point have to adapt or be left behind,” and AI “will transform our industry whether we like it or not.” The future, Armstrong says, “is both scary and exciting.”