In the current media environment, technology is a double-edged sword. While technology, coupled with social media, has transformed how journalists gather, process and share information, it has also empowered new forms of propaganda and fake news, which experts at an event earlier this week called a threat to journalism and media freedom, as well as to democracy.
The session, titled “Getting to the Truth: Journalism, Technology and the (Un)reliability of Information,” was held at the Thomson Reuters Times Square headquarters in New York and convened policymakers, NGOs, reporters, TV news producers, editors and other journalism professionals. The event was presented by Reuters along with the consuls general of the U.K. and Canada. Speakers included representatives from The Washington Post, CNN and the Committee to Protect Journalists, among others.
Stephen J. Adler, editor in chief of Reuters, said there is urgency in the industry and concern across organizations as fake news grows and is “used as a tool of deception.” He noted that laws against fake news are being implemented globally, which also threatens legitimate news and, with it, press freedom itself.
There are other concerns as well. Beyond outright fake news, some legitimate news sites offer misleading content for a variety of reasons, making it challenging for consumers to sift through the news and “find out the truth.” But there are solutions, panelists noted.
During a session on disinformation, Natalia Antonova, editor of Bellingcat, and Harleen Kaur Jolly, chief executive officer and cofounder of Ground News, discussed the growth of “open source” news sites and how they can combat misinformation. Kaur Jolly sees Ground News, a news aggregator where users can see the same story from multiple sources, “as a way to help consumers get all the news on a topic so they can decide for themselves what the truth is.”
Regarding how technology fuels fake news, Hazel Baker, global head of user-generated content at Reuters, said, “technology opens the door to manipulation.” She then quizzed attendees to see if they could tell the difference between true and false stories based on user-generated content. The examples included clips of an oil tanker explosion (true), a sinkhole in London (false, created with CGI) and a missile attack in Pakistan (false, a doctored video game clip).
Baker then showed examples of “deep fake,” or synthetic, news, including digital face swaps, facial reanimation and synthetic voiceovers. She said that while many of the examples require expertise, “there are off-the-shelf apps and filters that can be used to create this content.”
For retailers and brands, it’s important to note that deep fakes, when used maliciously, could pose a threat to brand equity and reputation. As for how the media industry can protect itself and maintain its integrity in this environment, the overall takeaway was to identify what is fake and misinforming, and to fight it with legitimate, balanced news and information.