Not fake: The CyberSec Research and Transfer Center (FTZ) at HAW Hamburg is working together with Chainstep GmbH on a technology that makes it possible to detect manipulated videos. The researchers have now successfully presented the project in the BMBF funding program “DATI Innovation Sprints”.

Videos faked with AI can hardly be distinguished from real ones. But how can a deepfake be detected and proven to be one? Researchers at the CyberSec Research and Transfer Center (FTZ) at HAW Hamburg are working on such a technology together with their partner Chainstep GmbH. “Robust signature of audiovisual media against (deep) fakes (SaM-fake)” is the name of the project that the FTZ has now successfully presented in the BMBF funding program “DATI Innovation Sprints”. The FTZ is thus entitled to submit a funding application in Module 1 of the “DATIpilot” funding guideline.

It is almost impossible to completely remove the watermark.
What exactly is it about? The “Trusted Cam” software embeds invisible watermarks throughout the video as it is being recorded. A manipulating AI would at least partially destroy these watermarks. “It is almost impossible to completely remove the watermark,” explains Volker Skwarek, Scientific Director of FTZ CyberSec and Professor of Computer Engineering. “Parts of it will always remain recognizable. From this, manipulation of a video that has already been secured can be inferred.”
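The article does not disclose how “Trusted Cam” embeds its watermarks, but the detection principle Skwarek describes can be illustrated with a deliberately simplified sketch: a key-derived bit pattern is hidden across a frame, and the fraction of bits that survive indicates whether the frame was altered. All names and the least-significant-bit embedding here are illustrative assumptions, not the project's actual scheme (which would need to be far more robust):

```python
import hashlib
import random

def watermark_bits(key: str, n: int) -> list[int]:
    """Derive a pseudo-random bit pattern from a secret key (illustrative only)."""
    rng = random.Random(hashlib.sha256(key.encode()).digest())
    return [rng.randint(0, 1) for _ in range(n)]

def embed(frame: list[int], key: str) -> list[int]:
    """Hide one watermark bit in the least significant bit of each pixel value."""
    bits = watermark_bits(key, len(frame))
    return [(p & ~1) | b for p, b in zip(frame, bits)]

def survival_ratio(frame: list[int], key: str) -> float:
    """Fraction of watermark bits still intact; a low ratio suggests tampering."""
    bits = watermark_bits(key, len(frame))
    intact = sum((p & 1) == b for p, b in zip(frame, bits))
    return intact / len(frame)

# A pristine frame verifies fully; overwriting a region degrades the ratio,
# yet the surviving part of the watermark is still detectable.
random.seed(0)
frame = embed([random.randrange(256) for _ in range(1000)], "trusted-cam-demo")
assert survival_ratio(frame, "trusted-cam-demo") == 1.0
tampered = frame[:500] + [0] * 500  # simulate an AI overwriting half the frame
assert survival_ratio(tampered, "trusted-cam-demo") < 0.9
```

The point of the sketch is the asymmetry Skwarek describes: an attacker who edits part of the video destroys only part of the pattern, so the remainder both proves the original protection and exposes the edit.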

Simple image processing such as rotating, cropping, scaling or even compressing files does not destroy the embedded watermarks. The technology is also versatile: it can be used, for example, in messenger applications or when sharing via social media. In messenger applications, the videos and audio recorded within them can be signed directly by the software; when a messenger app receives such signed files, it verifies them automatically.

Thanks to C4T transfer funding from the Free and Hanseatic City of Hamburg, FTZ CyberSec has already been able to extend the original research software to protect audio recordings as well, and to create a smartphone app. By participating in the BMBF’s “DATI Innovation Sprints” funding program, the scientists are pursuing the goal of further developing and professionalizing the technology – towards open source software that can be integrated into a messenger app. Skwarek: “Such an open community version can be used for proof of authenticity, for example when reporting from crisis areas.”