In the era of AI, truth is not free. To counter the proliferation of fake news, newspapers should pivot towards content authentication and quality assurance.

Fake image of Prime Minister Fumio Kishida circulated on social media. (Photo Source: DWANGO Co., Ltd. (nicovideo), image partially altered)

We are now entering the era of fake news driven by generative artificial intelligence (AI). Recently, AI-generated videos of Prime Minister Fumio Kishida and a famous newscaster saying things they never said went viral. These videos have sparked controversy over AI technology and carry profound implications for media outlets.

The generative AI technology that allows the creation of these fake videos is evolving at an ever-increasing rate. It has now reached the point where the human eye cannot distinguish between what is real and what is not. Although researchers are investigating methods for detecting fakes, they cannot keep pace with the constantly evolving quality of these videos. Eventually, it will become technically impossible to tell from the video alone whether it is fake. 

When that happens, individuals will be left entirely defenseless against disinformation. They will have no choice but to assume that all videos are fake. 

Even more shocking is the recent emergence of entirely AI-generated news programs. Everything from the scripts to the newscasters, the sets, and the teleprompters is created using AI. The ability to produce high-quality news programs at a fraction of the cost marks a significant turning point for news reporting.

The Nature of the Press

About 16 years ago, I wrote a column titled "The Future of Newspapers" for Seiron magazine. In this column, I described how the Internet would determine the future of newspapers. It would force newspapers, reporters, publishers, newsstands, retailers, and newspaper advertisers to part ways and consider their respective futures separately.

However, the current developments in the news world have exceeded my predictions. Organizational reform has now become inevitable for all media outlets. The essential functions of news reporting, namely gathering, reporting, and disseminating information, are now divisible. 

Moreover, the Internet has dramatically reduced the need for large printing plants, truck delivery networks, and local newsstands to distribute the news. Even employing large numbers of paper carriers is no longer necessary. Gathering information is now also a low-cost process thanks to the Internet. The development of many new forms of investigative reporting, from individual journalists to Internet-based teams like Bellingcat, is a testament to this.

And now, with the advent of generative AI, turning news into articles has also become less expensive. As generative AI develops, it will not only translate between languages but also take background information and cultural differences into account. It will even be able to produce continuously updated articles from firsthand information as needed.

In addition, AI will generate news commentary tailored to each reader's level of understanding. There are now plenty of smaller online news channels that are happy to take over the biased reporting of certain newspapers. In light of these developments, there is now very little need for large news corporations.

The author of this article is Ken Sakamura, dean of the Faculty of Information Networking for Innovation and Design at Toyo University.

Protecting the Truth

In this age of fakery, the falling cost of gathering, reporting, and relaying news has shifted the focus. The question of quality assurance is now garnering considerable attention. Newspapers should consider changing their role to that of content authentication.

At the G7 Hiroshima summit in 2023, leaders discussed Originator Profile (OP), a technology that aims to ensure the reliability of information. If this technology were introduced, it would act as a blockchain-based digital certificate certifying the authenticity of online content. This new framework would revolve around companies that verify the reliability of information. 
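To make the idea concrete, here is a minimal sketch, in Python with the widely used cryptography library, of how a verification company could attach a digital certificate to an article and how a reader's software could check it. This is an illustration under my own assumptions, not the actual Originator Profile specification; the function names are hypothetical, and the blockchain step is reduced here to a single published public key.

# Illustrative sketch only: NOT the Originator Profile specification.
# A verification company signs a hash of the article; readers verify
# the signature against the company's published public key.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

def certify(article: str, signing_key: Ed25519PrivateKey) -> bytes:
    # The verification company signs the SHA-256 digest of the article text.
    digest = hashlib.sha256(article.encode("utf-8")).digest()
    return signing_key.sign(digest)

def check_credential(article: str, certificate: bytes,
                     company_key: Ed25519PublicKey) -> bool:
    # A reader's browser re-hashes the article and checks the signature.
    digest = hashlib.sha256(article.encode("utf-8")).digest()
    try:
        company_key.verify(certificate, digest)
        return True
    except InvalidSignature:
        return False

if __name__ == "__main__":
    company_key = Ed25519PrivateKey.generate()   # held by the verification company
    article = "Prime Minister announces new AI policy ..."
    certificate = certify(article, company_key)

    print(check_credential(article, certificate, company_key.public_key()))        # True
    print(check_credential(article + "!", certificate, company_key.public_key()))  # False: content altered

Even this toy version shows where the real cost lies: the signature only attests that the verification company vouched for this exact text, and the expensive human work of checking the facts must happen before the key is ever used.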

Guaranteeing the truth is far more laborious and costly than creating articles and reports with AI. Because of this imbalance, new and larger organizations with specialized personnel will be needed: companies that routinely cross-examine information, draw on various sources, including personal relationships, and even conduct on-site inspections when necessary.

While the government could take on this role, a future with multiple such companies competing would be more desirable. Naturally, if a company were to certify incorrect content, its own credibility would suffer.

News now flows from diverse sources, exposing people to far more fake than reliable news. It will become incumbent upon responsible members of society to always check the electronic credentials of the content they are reading. 


The Truth is Not Free

Initially, the costs may be borne by governments and companies that want the information they transmit to be believed. Of course, it may be difficult for many smaller companies to afford the cost. 

However, covering the expenses of such businesses would be in the interest of the content verification companies themselves. By doing so, they would inherit the social position that newspapers have traditionally occupied. There should also be a public fund to ensure a healthy Internet environment.

Digital certificates of authenticity would be managed on a blockchain. Each user would be charged a small fee every time content is checked, much like the cost of a newspaper subscription. In a world where generative AI makes all news suspect, all parties should bear the cost of preserving the truth.

With generative AI, we live in a world where the truth is not free. 


(Read the article in Japanese.)

Author: Ken Sakamura
