
AI can have a place in news, but it will never be a journalist

How will the news and journalism industries handle generative artificial intelligence?

That’s the kind of broad, far-reaching question that’s too soon to answer, mostly because we’re just seeing the tip of the iceberg when it comes to what AI can do.

Google recently approached several large news organizations, including The New York Times, The Washington Post and The Wall Street Journal’s owner, News Corp, about using a “tool” Google has been calling Genesis. The generative AI (artificial intelligence that can create “new” written or visual content) is marketed as a personal assistant of sorts, meant to help journalists condense vast amounts of information or to help them take notes at long meetings. But that’s not all it can be used for.

We can give you an example that showcases many of the advantages and disadvantages of generative AI: Meet “Paul,” the creation of Microsoft employee and media start-up founder Mark Talkington.

Talkington started the Palm Springs Post as a newsletter he wrote in his spare time. He eventually expanded enough to hire two reporters — and to create Paul, with the help of friend and former Microsoft VP Peter Loforte.

Paul was designed to listen in on public meetings. It can then generate anything from a very simple summary to a more detailed article on a specific topic from the meeting. After being fed samples of Talkington’s writing and archived stories, Paul could write a passable article: at best, on par with a first-year journalism student’s work, but generally lacking nuance.

But for some news organizations — or sites masquerading as news organizations — “passable” is more than enough. That’s where much of the concern among journalists lies: At a time when conglomerates and hedge funds are snapping up local outlets and laying off staff in droves, a “passable” article produced virtually for free by a program is more financially attractive than keeping humans on staff.

As smart as artificial intelligence can be, generative AI programs can have difficulty distinguishing fact from fiction. AI can regurgitate blatant falsehoods and present them with an illusion of authority. These programs treat all the information they take in as roughly the same — unless a programmer makes the AI cite its information and limits its intake to reliable sources. This is something Talkington did after an early version of Paul “hallucinated” a mass shooting from a prompt about a weapon-brandishing incident.

Generative AI programs like OpenAI’s ChatGPT are already being used to craft news articles. As early as May, NewsGuard — an organization that identifies and fights disinformation — identified almost 50 “news” websites where most, if not all, of the content had been written by AI. And much of what these robot reporters produce ranges from the questionable to the outright false.

As scary as all that sounds, AI can have a place in journalism and news production. Talkington has used Paul to listen in on hourslong meetings that he can’t attend, then generate a summary. For a media start-up with only three humans, this makes Paul a valuable tool, not only for taking notes, but also for generating drafts and headlines.

Paul can take good notes, but it can’t foster relationships within the community or make the sometimes-difficult editorial decisions that produce the fairest, most nuanced stories.

AI has a place in journalism — but a program can never replace a person, no matter how much you train an AI to sound like one.