
Deepfake videos during Russian invasion of Ukraine could undermine trust

Deepfake masks may impact the future of misinformation online. Credit: John Noonan, Unsplash, CC0 (creativecommons.org/publicdomain/zero/1.0/)

A new study explores themes in Twitter discussions of deepfake videos related to the Russian invasion of Ukraine, highlighting the potential for real videos to be mistaken for deepfakes and for deepfakes to fuel conspiracy theories. John Twomey of University College Cork, Ireland, and colleagues present these findings in the open-access journal PLOS ONE on October 25, 2023.

Created using artificial intelligence, deepfake videos typically feature a person saying and doing things they never actually did in real life. Deepfake technology has advanced considerably, sparking concerns about its potential for misuse. Deepfakes related to the Russian invasion of Ukraine represent the first instances in which deepfakes have been used in attempts to influence a war.

To better understand the potential harms of deepfakes, Twomey and colleagues analyzed Twitter discussions about deepfakes related to the invasion. They used a qualitative approach known as thematic analysis to identify and understand patterns in the discussions, which included a total of 1,231 tweets from 2022.

The researchers found that many of the tweets expressed reactions to news about deepfakes. For instance, some tweets expressed worry, shock, or confusion about news related to a deepfake that falsely depicted Ukrainian President Volodymyr Zelensky surrendering to Russia. However, some tweets overlooked potential harms or reacted positively to deepfakes directed against political rivals, especially deepfakes created as satire or entertainment.

Some tweets warned of the need to prepare for increased use of deepfakes, discussed how to detect them, or highlighted the role of the media and government in rebutting them. However, some tweets suggested that deepfakes had undermined users' trust to the point that they no longer trusted any footage of the invasion.

Some tweets linked deepfakes to users' apparent belief in conspiracy theories, such as claims that deepfakes of world leaders were being used as cover while the leaders were actually in hiding, or that the entire invasion was fake, anti-Russian propaganda.

This analysis suggests that efforts to educate the public about deepfakes may unintentionally undermine trust in real videos. The authors note that their findings and future research could help inform efforts to mitigate the harms of deepfakes.

The authors add, "Much previous research on deepfakes has been concerned with potential future harms of the technology. However, we have focused on how deepfakes are already impacting social media, as we have seen during Russia's invasion of Ukraine. Our research shows how deepfakes are undermining faith in real media and are being used to evidence deepfake conspiracy theories."

More information: John Twomey et al, Do deepfake videos undermine our epistemic trust? A thematic analysis of tweets that discuss deepfakes in the Russian invasion of Ukraine, PLoS ONE (2023). DOI: 10.1371/journal.pone.0291668

Journal information: PLoS ONE

Citation: Deepfake videos during Russian invasion of Ukraine could undermine trust (2023, October 25) retrieved 28 April 2024 from https://phys.org/news/2023-10-deepfake-videos-russian-invasion-ukraine.html
