A new study published in PLOS ONE has found that deepfake videos are no more likely to create false memories of movie remakes than simple text descriptions.
In the study, participants were exposed to fictitious movie remakes that were never made, such as Will Smith starring as Neo in The Matrix. These fake remakes were presented either as deepfake videos showing Smith’s face convincingly inserted into Matrix scenes, or simply as text descriptions saying he had starred in a remake.
The researchers found an average false memory rate of 49% for the fabricated remakes. However, deepfakes were no more effective than text at creating false memories. For example, The Shining was remembered at a 40% rate when presented as a deepfake and a 39% rate when presented as text.
The findings suggest deepfakes may not have a uniquely powerful ability to distort memories, challenging assumptions in some previous literature. According to the researchers, we may be underestimating how easily memories can be manipulated with even basic misinformation.
While deepfakes did not increase false memories relative to text, the rates were still high, highlighting how susceptible memory is to distortion. The study also surveyed participants’ attitudes toward using deepfakes to recast movies. Most expressed reservations, citing concerns about artistic integrity, the shared experience of watching films, and the prospect of too much choice.
Some participants did see positive creative applications, provided the technology was used consensually. Overall, the research indicates we should gather more evidence before regulating deepfakes as an imminent threat. It also suggests that technical dazzle alone may not explain deepfakes’ cultural impact, which may owe more to broader social anxieties about misinformation.