Wyoming Journalist Caught Using Artificial Intelligence to Create Fake Quotes and Stories

Quotes from the Wyoming governor and a local prosecutor were the first things that struck Powell Tribune reporter CJ Baker as a little odd. Then some of the sentences in the stories struck him as almost robotic.

But what confirmed that a reporter at a competing outlet had been using generative AI to write his stories was a June 26 article about comedian Larry the Cable Guy being chosen as grand marshal of a local parade. The article ended with an explanation of the inverted pyramid, the basic approach to writing a news story.

“The 2024 Cody Stampede parade promises to be an unforgettable celebration of American independence, led by one of humor’s most beloved figures,” the Cody Enterprise reported. “This structure ensures that the most important information is presented first, making it easier for readers to grasp the main points quickly.”

After doing some research, Baker, who has been a journalist for more than 15 years, met Aaron Pelczar, a 40-year-old who was new to journalism and who Baker says admitted to using AI in his reporting before resigning from the Enterprise.

The publisher and editor of the Enterprise, the newspaper co-founded in 1899 by Buffalo Bill Cody, have since apologized and promised to take steps to ensure it does not happen again. In an op-ed published Monday, Enterprise editor Chris Bacon said he had “failed to detect” the AI text and the fake quotes.

“It doesn’t matter that the false quotes were the obvious mistake of a rushed rookie journalist who trusted AI. That was my job,” Bacon wrote. He apologized for the fact that “AI was allowed to insert words that were never spoken into articles.”

Journalists have derailed their careers by fabricating quotes or facts in their stories long before AI came along. But this latest scandal illustrates the pitfalls AI poses for many industries, including journalism, as chatbots can produce fake but somewhat plausible stories from just a few prompts.

AI has found a role in journalism, particularly in automating certain tasks. Some newsrooms, including the Associated Press, are using AI to free up reporters to focus on more important work, but most AP staffers are not allowed to use generative AI to create publishable content.

The AP has been using technology to write financial results stories since 2014, and more recently for some sports stories. It is also experimenting with an artificial intelligence tool to translate some stories from English to Spanish. At the end of each story, a note explains the role of technology in its production.

Being transparent about how and when AI is used has proven important. Sports Illustrated came under fire last year for publishing AI-generated online product reviews that were presented as being written by journalists who didn’t actually exist. After the story was published, Sports Illustrated announced it was firing the company that produced the articles for its website, but the incident damaged the reputation of the once-powerful publication.

In his Powell Tribune article announcing Pelczar’s use of AI in his articles, Baker wrote that he had an uncomfortable but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, “Obviously, I never intentionally tried to misquote anyone” and promised to “correct them, apologize, and say that these are inaccuracies,” Baker wrote, noting that Pelczar insisted that his mistakes should not reflect on the Cody Enterprise editors.

After the meeting, the Enterprise launched a comprehensive review of all the articles Pelczar had written for the newspaper in the two months he worked there. They found seven articles that included AI-generated quotes from six people, Bacon said Tuesday. He is still reviewing other articles.

“These are very credible quotes,” Bacon said, noting that the people he spoke to during his review of Pelczar’s articles said the quotes sounded like something they would say, but that they had never actually spoken to Pelczar.

Baker reported that seven people told him they had been quoted in articles written by Pelczar but had not spoken to him.

Pelczar did not return a phone message the AP left at a number listed as his, seeking to discuss what happened. Bacon said Pelczar declined to discuss the matter with another Wyoming newspaper that contacted him.

Baker, who reads the Enterprise regularly because it is a competitor, told the AP that a combination of sentences and quotes in Pelczar’s articles raised his suspicions.

Pelczar’s story about a shooting in Yellowstone National Park included the following sentence: “This incident is a stark reminder of the unpredictable nature of human behavior, even in the most serene situations.”

Baker said the sentence sounded like the story summaries a certain chatbot seems to generate, in that it tacks a sort of “life lesson” onto the end.

Another story, about a poaching conviction, included quotes from a wildlife official and a prosecutor that appeared to come from a press release, Baker said. However, there was no press release and the agencies involved did not know where the quotes came from, he said.

Two of the questioned stories included false quotes from Wyoming Gov. Mark Gordon, which his staff only learned about when Baker called them.

“In one instance, (Pelczar) wrote an article about a new OSHA rule that included a quote from the governor that was entirely fabricated,” Michael Pearlman, a spokesman for the governor, said in an email. “In a second instance, he appeared to fabricate part of a quote and then combined it with part of a quote that was included in a press release announcing the new director of our Wyoming Game and Fish Department.”

The most obvious AI-generated copy appeared in the Larry the Cable Guy story.

Creating stories with AI isn’t hard. Users could feed a crime report into an AI program and ask it to write a story about the case, complete with quotes from local officials, said Alex Mahadevan, director of a digital media literacy project at the Poynter Institute, a prominent journalism think tank.

“These generative AI chatbots are programmed to give you an answer, no matter if that answer is completely bogus or not,” Mahadevan said.

Megan Barton, the Enterprise’s publisher, wrote an op-ed calling AI “the new advanced form of plagiarism, and in the media and writing world, plagiarism is something every media outlet has had to correct at one point or another. It’s the ugly part of the job. But a company that’s willing to right (or quite literally, write) these wrongs is a reputable company.”

Barton wrote that the newspaper had learned its lesson, had put a system in place to recognize AI-generated articles, and would “have longer discussions about why AI-generated articles are not acceptable.”

The Enterprise didn’t have an AI policy, in part because it seemed obvious that journalists shouldn’t use it to write stories, Bacon said. Poynter has a template from which news outlets can build their own AI policies.

Bacon expects to have one in place by the end of the week.

“It will be a topic of discussion before hiring,” he said.

___

Hanson reported from Helena, Montana.