
Project Description:
Photo-montage and counterfeit images are not a recent phenomenon. Ever since images have been used in political or economic contexts, the reality and authenticity of the recorded scenes have been a legitimate question. The emergence of digital media has not fundamentally changed this basic situation.
The modification of digital images is now commonplace, in particular in the domain of cyber-criminality. Modifications can be benign (slight retouching to remove blemishes from a facial picture), controversial (removal of a product's visual defects on an online commercial website), or may have strong societal impact (a fabricated meeting between two prominent politicians).
This project falls within the domain of image forensics. The main goal is to certify whether an image is clean or doctored. The associated decision process must be as reliable as possible, because digital proof of falsification is credible only if the detection method produces a very low error rate.
As a first step, the project will develop methods to detect malicious modifications of digital images using two complementary approaches: the first based on modelling the digital image acquisition process, and the second based on machine learning. Because the two approaches are scientifically complementary, they will then be fused to form a single detector of digital image integrity.
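The fusion step described above could take many forms. As a minimal illustration, the sketch below combines the scores of two hypothetical detectors by a weighted average and applies a decision threshold. The function name, the weighting scheme, and the threshold value are illustrative assumptions, not part of the project description.

```python
def fuse_scores(model_score: float, ml_score: float,
                w_model: float = 0.5, threshold: float = 0.6) -> bool:
    """Fuse two suspicion scores in [0, 1] into a single decision.

    model_score: score from the (hypothetical) acquisition-model detector.
    ml_score:    score from the (hypothetical) machine-learning detector.
    Returns True if the fused score flags the image as doctored.
    """
    # Weighted average of the two complementary detectors.
    fused = w_model * model_score + (1.0 - w_model) * ml_score
    # A high threshold keeps the false-positive rate low, which the
    # project identifies as essential for credible digital proof.
    return fused > threshold


# Example: both detectors are suspicious -> image flagged as doctored.
print(fuse_scores(0.9, 0.8))   # True
# Example: both detectors report low suspicion -> image accepted as clean.
print(fuse_scores(0.1, 0.2))   # False
```

In practice the fusion rule, weights, and threshold would be learned or calibrated on labelled data rather than fixed by hand.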