In the endlessly escalating war between those striving to create flawless deepfake videos and those developing automated tools that make them easy to spot, the latter camp has found a very clever way to expose videos that have been digitally modified by looking for literal signs of life: a person’s heartbeat.
If you’ve ever had a doctor clip a pulse oximeter to the tip of your finger, you’ve already experienced a technique known as photoplethysmography: subtle color shifts in your skin, as blood pumps through in waves, allow your pulse to be measured. It’s the same technique the Apple Watch and other wearable fitness trackers use to measure your heartbeat during exercise, and it’s not limited to your fingertips and wrists.
Though not apparent to the naked eye, the skin of your face exhibits the same phenomenon, subtly shifting in color as your heart pumps blood through the arteries and veins beneath it, and even a basic webcam can spot the effect and measure your pulse. The technique has enabled contactless heart-rate monitors for infants, requiring nothing more than an unobtrusive camera pointed at them while they sleep, and now it’s being leveraged to root out fake news.
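The core of the webcam trick is simpler than it sounds: average the green channel over the face (blood absorbs green light most strongly), then find the dominant frequency in that signal. Here’s a minimal sketch in Python with numpy; the function name and the synthetic “video” are our own illustration, not code from any product or paper:

```python
import numpy as np

def estimate_pulse_bpm(frames, fps):
    """Estimate heart rate from a stack of face-ROI frames, shape (T, H, W, 3)."""
    # Mean green-channel brightness per frame: blood absorbs green light most.
    signal = frames[..., 1].mean(axis=(1, 2))
    signal = signal - signal.mean()  # remove the DC offset
    # Look for the strongest frequency in a plausible heart-rate band (0.7-4 Hz,
    # i.e. roughly 42-240 beats per minute).
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[band][np.argmax(power[band])]
    return peak_hz * 60.0

# Synthetic demo: 10 seconds of 30 fps "video" whose green channel
# pulses at 1.2 Hz, i.e. 72 beats per minute.
fps = 30
t = np.arange(300) / fps
frames = np.full((300, 8, 8, 3), 128.0)
frames[..., 1] += 2.0 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
print(estimate_pulse_bpm(frames, fps))  # prints 72.0
```

A real system would first detect and track the face, and fight noise from lighting changes and head motion, but the signal path is exactly this: skin color in, beats per minute out.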
Researchers from Binghamton University in Binghamton, New York, worked with Intel to develop a tool called FakeCatcher, and their findings were recently published in a paper titled, “FakeCatcher: Detection of Synthetic Portrait Videos using Biological Signals.” Deepfakes are typically created by matching individual frames of a video against a library of headshots, often containing thousands of images of a particular person, and then subtly adjusting and tweaking the swapped-in face to match the original perfectly. Invisible to the naked eye, those source images still contain the telltale biological signs of a pulse, but the machine learning tools used to create deepfakes don’t account for the fact that when the final video is played back, the moving face should still exhibit a measurable, consistent pulse. Because each frame is generated more or less independently, a deepfake video yields an unstable pulse measurement when photoplethysmography detection techniques are applied to it, making it easier to spot.
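That instability is itself measurable. A toy version of the idea (not Intel’s actual FakeCatcher pipeline, just an illustration of the principle) is to measure the dominant frequency of the color signal in successive windows and see how much it wanders: a real face gives a steady value, while frame-by-frame synthesis scatters it. A minimal sketch, assuming the 1-D color signal has already been extracted:

```python
import numpy as np

def dominant_freq(sig, fps):
    """Strongest frequency in the heart-rate band (0.7-4 Hz) of a 1-D signal."""
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
    power = np.abs(np.fft.rfft(sig - sig.mean())) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return freqs[band][np.argmax(power[band])]

def pulse_instability(signal, fps, win=150):
    """Std. dev. of the dominant frequency across non-overlapping windows.
    A steady heartbeat gives a value near zero; incoherent frames don't."""
    doms = [dominant_freq(signal[i:i + win], fps)
            for i in range(0, len(signal) - win + 1, win)]
    return float(np.std(doms))

fps = 30
t = np.arange(600) / fps
real = np.sin(2 * np.pi * 1.2 * t)             # steady 72 bpm pulse
fake = np.random.default_rng(0).normal(size=600)  # no coherent pulse
print(pulse_instability(real, fps))  # prints 0.0: the pulse never wavers
```

A detector would then threshold that instability score (or, as in the paper, feed richer biological-signal features to a classifier) to label a video as genuine or synthetic.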
In their testing, the researchers found that FakeCatcher was not only able to spot deepfake videos more than 90 percent of the time, but with comparable accuracy it could also determine which of four different deepfake tools (Face2Face, NeuralTex, DeepFakes, or FaceSwap) was used to create the deceptive video. Of course, now that the research and the existence of the FakeCatcher tool have been revealed, those developing deepfake creation tools have the opportunity to improve their own software and ensure that as deepfake videos are generated, those subtle shifts in skin color are included to fool photoplethysmography tools as well. But it’s a good trick while it lasts.