Federal guidance released by NIST aims to help detect and prevent digital manipulation of facial images
The US National Institute of Standards and Technology (NIST) has published a new report titled Face Analysis Technology Evaluation (FATE) MORPH 4B: Considerations for Implementing Morph Detection in Operations (NISTIR 8584). The report, authored by NIST researchers, focuses on helping organizations optimize their efforts to detect face morphs: images produced by morphing software, a type of deepfake technology that blends photos of two people into a single image.
The report offers an introduction to the topic and to key detection methods, focusing mainly on the pros and cons of various investigative techniques. Mei Ngan, the report's author, said that detection tools have improved significantly in recent years.
The report differentiates between two morph detection scenarios. In the first, called "single-image morph attack detection," examiners have only the photo in question. Single-image detection tools can achieve 100% accuracy in flagging morphs, but only if they were trained on examples from the software that generated the morph; otherwise, accuracy can fall below 40%.
In the second scenario, called "differential morph attack detection," examiners have that photo alongside a second, trusted image of the same person, such as a live capture. Differential detectors consistently reach 72% to 90% accuracy in detecting morphs created with open-source and proprietary software, but they require that additional genuine photo for comparison.
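To make the operational difference between the two scenarios concrete, here is a minimal Python sketch of the two interfaces. Everything in it (the function names, signatures, scoring stubs, and the 0.5 decision threshold) is an illustrative assumption rather than anything specified in the NIST report or drawn from a real detection library; the only point is that a single-image detector sees one photo, while a differential detector also requires a trusted reference image.

```python
from dataclasses import dataclass


@dataclass
class DetectionResult:
    morph_score: float  # higher means the detector considers a morph more likely
    is_morph: bool


def score_single_image(photo: bytes) -> float:
    # Stub standing in for a trained single-image detector, which would
    # look for morphing artefacts in the submitted photo alone.
    return 0.0


def score_differential(photo: bytes, trusted_photo: bytes) -> float:
    # Stub standing in for a trained differential detector, which would
    # compare the submitted photo against the trusted reference capture.
    return 0.0


def single_image_detection(photo: bytes, threshold: float = 0.5) -> DetectionResult:
    """Scenario 1: only the submitted photo is available."""
    score = score_single_image(photo)
    return DetectionResult(score, score >= threshold)


def differential_detection(photo: bytes, trusted_photo: bytes,
                           threshold: float = 0.5) -> DetectionResult:
    """Scenario 2: the submitted photo is checked against a trusted image
    of the same person, for example a live capture at a border crossing."""
    score = score_differential(photo, trusted_photo)
    return DetectionResult(score, score >= threshold)
```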
Ngan emphasized that it is important to be aware that morphing attacks are happening and that there are ways to mitigate them. The most effective mitigation, according to the report, is to prevent applicants from submitting manipulated photos for ID credentials in the first place.
The report also discusses ways to prevent morphs from entering operational systems in locations such as passport application offices and border crossings. As early as 2022, Europol warned that face morphing could be used in document fraud, such as applying for a passport or passing identity checks.
The report does not discuss the implications of face morphing for individuals or society at large. Rather, Ngan said, it aims to guide operational staff in determining whether an investigation is necessary and what steps it might involve.
The report also does not detail the specific detection methods it recommends. Nonetheless, it offers valuable insight into the current state of face morphing detection and the importance of staying vigilant against this type of deepfake technology.