Open(AI) and shut: What ChatGPT deals with media outlets mean for the future of news

By Joe Arney
Photo of Patrick Ferrucci, below, by Kimberly Coffin (CritMedia, StratComm’18)

Pat Ferrucci talks to two students using computers in a CMCI classroom.
Patrick Ferrucci has seen this movie before.

A former reporter and current chair of the journalism department at the University of Colorado Boulder’s College of Media, Communication and Information, Ferrucci studies the institutions, businesses and technologies that are rapidly reshaping the discipline.

So when he learned The Atlantic and Vox Media agreed last week to license their journalism to ChatGPT creator OpenAI, he thought back to agreements traditional publishers once signed with Facebook, Google and Twitter—deals that augmented audiences while wrecking revenue.

“I don’t get it,” he said.

“Maybe they see a monetary infusion at what’s undeniably a difficult financial time for the media. But we’ve seen this before, and each time, that financial infusion doesn’t benefit the actual journalism.”

ChatGPT was hailed as a breakthrough when it arrived in the winter of 2022, able to answer questions and generate content in ways that promised real value for businesses and individuals. Some of the shine has since worn off as creators and artists have accused the company of stealing their work to train the chatbot to write more convincingly. The large language models behind ChatGPT are trained on enormous amounts of data drawn from novelists, poets, journalists, even everyday users of social media platforms who post content and comments.

  “We’ve seen this before, and each time, that financial infusion doesn’t benefit the actual journalism.”
Patrick Ferrucci, chair, journalism

While he criticized trading long-term viability for short-term benefits, Ferrucci said there could be other advantages for media companies that sign up with OpenAI.

“It could allow the journalism industry to get an understanding of what those tools can do,” he said. “And if they get a head start with those tools, and learn to implement them into their processes early on, it will give them a leg up on companies that fought against it.”

A different tactic: See you in court

Representing those companies fighting against it is The New York Times, which last year sued OpenAI after changing its terms of service to prevent A.I. systems from scraping its work. At the time, Robin Burke, a professor of information science at CMCI, called ChatGPT’s honeymoon period “a free ride, because nobody was paying attention to what they were doing. Now, I think it makes sense that the organizations producing content are thinking, ‘Do I really agree with this as a usage of my work?’”

It’s a fair question, but Ferrucci said he expects we’ll see more deals like this going forward.

“There are companies that can do investigative journalism because it doesn’t matter if you sue them,” he said. “And there are others who essentially self-censor because the threat of a lawsuit, no matter how frivolous, could destroy the business. If you show these news companies some money, I don’t think all of them can afford to look away.”

Casey Fiesler, an associate professor of information science at CMCI and an expert in ethical and legal issues surrounding technology, said of all the copyright suits against OpenAI, the Times may have the most compelling case, since the paper was able to show examples where ChatGPT appeared to respond to user prompts with copyrighted material from the newspaper.

But for her, the most interesting issue isn’t copyright.

“I think the more profound thing is this idea that you used my work to build a technology that will replace me,” she said. “That’s why so many people are upset. It feels like a violation—you’re using my art to build this technology so that you don’t have to pay artists anymore.”