December 27, 2024

Back in the good old days, a president could be brought down by telephone calls, exceptional “Deep Throat” sourcing in underground parking lots, and, of course, extensive and dogged reporting.

Now, apparently, all we must do is type in a few words, and stories can be written in a matter of seconds by robots. Woodward and Bernstein this is not.

(Illustration by Tatiana Lozano / Washington Examiner; Getty Images)

Some of the more egregious examples of inaccurate or downright untrue artificial intelligence-generated stories are often the ones that make the headlines.

A German magazine, for example, claimed to have conducted an in-person interview with stricken former Formula One driver Michael Schumacher and his family, and the interview turned out to be entirely AI-generated. The Schumacher family successfully sued the magazine for 200,000 euros, or about $215,000.

Hoodline, a San Francisco-based news outlet, has passed off content as the work of real journalists when everything, including the reporters themselves, was generated by artificial intelligence. And NewsBreak, one of the most popular news aggregation apps in the United States, one with suspicious ties to the Chinese government, published an entirely fictional, AI-generated story last Christmas about a mass shooting in a random New Jersey town.

OpenAI and News Corp. ink content agreement

Extreme examples aside, it may be more instructive to focus on the likes of a recent agreement between OpenAI and News Corporation, the owner of the Wall Street Journal and the Times, among other publications. Such agreements (OpenAI has struck similar deals with other publishers, including the Financial Times) are more likely to shape the future of journalism than clearly inaccurate fake stories generated by bots.

The agreement was greeted with large amounts of fanfare, at least by the two companies involved.

“We believe an historic agreement will set new standards for veracity, for virtue and for value in the digital age,” claimed Robert Thomson, chief executive of News Corp. “We are delighted to have found principled partners in Sam Altman and his trusty, talented team who understand the commercial and social significance of journalists and journalism.”

“Our partnership with News Corp. is a proud moment for journalism and technology,” added Sam Altman, CEO of OpenAI. “Together, we are setting the foundation for a future where AI deeply respects, enhances, and upholds the standards of world-class journalism.”

Under the terms of the deal, OpenAI will have access to News Corp. material across a wide variety of the group’s publications for use in training its chatbots and to help answer user questions.

Such an approach has not, however, been universally cheered. The New York Times, in late 2023, for example, sued OpenAI and Microsoft, accusing the companies of effectively stealing the work of its journalists for use in training chatbots.

Federal regulators, too, are taking note of such content agreements and beginning to investigate whether they are legal.

Handmade versus AI

As AI increases its influence in journalism, there could be a distinction between what Ed Watal, CEO of strategy consultancy Intellibus, calls “handmade news” and AI-generated content.

While AI-generated content will reach the page faster, "handmade news," which journalists write from scratch, as in the old days, without using AI to generate any of the content, will take longer to produce. That said, such from-scratch reporting may come much closer to one of the fundamentals of journalism: getting at the truth.

“The future of truth in journalism may be ‘handmade news,’ it’s a day late, but it’s true,” Watal said. “AI-generated content may over time drown out good journalism unless the AI algorithms are trained to make the distinction between AI-generated news versus handmade news.”

As well as possibly eventually drowning out traditional journalism, AI-generated news may not adhere to the same ethical and investigative standards of journalism generated by actual humans, warned Sean Vosler, founder of MovableType.ai, a book publishing company focused on AI. Bias and misinformation may also be greater with AI-generated content, he said.

However, such possible dangers also come with clear benefits, Vosler said. AI-generated content can reach a wider audience and help journalists by providing better tools for fact-checking and data analysis.

In an AI-driven age, human journalists will continue to play a highly significant role, Watal of Intellibus said. Their jobs may simply shift toward editing and curation.

“The responsibility of the journalist, however, stays with them as they continue to deal with the trade-off of speed versus accuracy,” he said. “Journalists will still be the guardians of truth, reviewing and approving the content that AI generates and doing handmade or AI-based research to seek the truth.”

The human touch

However, AI will never be able to replace the inherent value of human journalists who care about the communities and societies they serve, said Mary Beth West, a senior strategist at Knoxville, Tennessee-based Fletcher Marketing PR and a business consultant and ethics expert.

“Such care matters. Its relevance is evergreen,” she said. “Human judgment and accountability are irreplaceable. Likewise, news media must also be accountable to the editorial choices they make — because those choices have consequences, too.”

AI-generated content is still a relatively new reality, and it may just be too early and even foolhardy to say exactly what effects it will have on the future of journalism. Much remains unknown.

The Times has been in print since 1785. Who knows whether it will still exist in anything approaching its current form in 2085? Vosler of MovableType.ai, though, is confident it will.

“Institutions like the Times will endure by integrating these technologies responsibly and preserving the core principles of journalism,” he said. “By 2085, we can expect a symbiotic relationship where AI supports and enhances the work of human journalists, rather than replacing them entirely.”

Perhaps the safest thing to say is that much will depend on bad actors and good ones countering them. It will be a societal decision as to what eventually wins out.


“The primary driver of whether the Times and other news media will still be around in 2085 will be whether society’s intellectual quality and discernment will economically allow proper news media to exist,” West said. “The onus is on us, society at large.”

For the record, this article was completely handmade with no use of any artificial intelligence and written by a real human.

Nick Thomas is a writer based in Denver, Colorado.
