      Deep Signature - Hidden and Robust Digital Signatures

      Today, it’s easier than ever to create and manipulate realistic-looking images. To rebuild trust in digital media, we've developed a better way to verify what's real and what's not.

Illustration of image-recognition software identifying objects

Digital images, audio, and videos are easy to copy, manipulate, and distribute. This ease comes with drawbacks: artists and creators struggle to enforce their copyright, and fake news circulates on social media. Deep fakes are eroding trust.

Many technologies already exist to prove that a file comes from a reliable source and hasn’t been tampered with on the way. So-called active methods require an image to be prepared before it can be tampered with, for example by signing it with a cryptographic signature or by adding a watermark. In contrast, passive methods require no preparation; instead, they use forensic tools to recognise whether an image has been edited at some point.

      Passive methods lag behind, as manipulations become more and more sophisticated. Active methods can be more reliable, but their weakness is their lack of flexibility: the file is deemed untrustworthy if even one pixel changes, or if uploading it onto a platform strips away the metadata. This limits the situations where such active methods can be used to verify content.
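This fragility is easy to see with a minimal sketch: conventional signing hashes the exact bytes of a file, so changing a single value changes the digest and invalidates the signature, no matter how harmless the edit.

```python
import hashlib

# A conventional signature scheme hashes the exact bytes of the file;
# the hash (and hence the signature) changes if even one bit differs.
original = bytes([10, 20, 30, 40])   # stand-in for image pixel data
edited = bytes([10, 20, 31, 40])     # one "pixel" changed by 1

digest_original = hashlib.sha256(original).hexdigest()
digest_edited = hashlib.sha256(edited).hexdigest()

print(digest_original == digest_edited)  # False: verification fails
```

Even a one-unit brightness change in a single pixel is enough for the digests to diverge completely, which is exactly why exact-match verification breaks down for everyday image handling.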

      The technology we created at the FHNW Institute for Data Science is an active method that gives more leeway to benign edits. Unlike methods that require a signature to be delivered alongside the image, our technology hides the verification information within the image itself. Using our method, the creator adds the message to prove the origin of an image file, and the recipient retrieves it to check that the delivered file is similar enough to the original image. This means that an image can be verified even after adjustments to brightness or saving in another format, for example.
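The tolerant verification step can be illustrated with a minimal sketch (not the FHNW implementation): the recipient compares the received image against the reference copy recovered from the hidden message and accepts small, benign deviations. The function name and threshold below are illustrative assumptions.

```python
# Minimal sketch of tolerant verification: accept the received image
# if it stays close enough to the reference recovered from the hidden copy.

def is_similar(received, reference, tolerance=5.0):
    """Accept the image if the mean absolute pixel difference is small."""
    diff = sum(abs(a - b) for a, b in zip(received, reference))
    return diff / len(reference) <= tolerance

reference = [100, 150, 200, 250]         # pixels recovered from the hidden copy
brightened = [p + 3 for p in reference]  # benign edit: slight brightness change
tampered = [100, 150, 60, 250]           # content actually altered

print(is_similar(brightened, reference))  # True: accepted
print(is_similar(tampered, reference))    # False: rejected
```

The key design choice is that verification becomes a similarity judgement with a threshold rather than an exact byte-for-byte comparison, which is what allows brightness adjustments or format conversions to pass while substantive manipulations fail.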

      We used two separate pairs of deep-learning neural networks to build the technology. On the one hand, we implemented a method to compress and decompress an image significantly while retaining a high level of detail. On the other hand, we found a way to hide the compressed information within the original image and then give the recipient a way to decode that information with a public key.
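The underlying idea of an image carrying a hidden message can be illustrated with a classic least-significant-bit (LSB) scheme. This is not the deep-learning embedding described above, just a toy stand-in showing how information can ride inside pixel values with barely visible changes.

```python
# Toy least-significant-bit (LSB) steganography -- an illustration only,
# not the deep-learning embedding used in the project.

def embed(pixels, bits):
    """Overwrite each pixel's lowest bit with one message bit."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def extract(pixels):
    """Read the lowest bit of each pixel back out."""
    return [p & 1 for p in pixels]

cover = [120, 121, 200, 55, 90, 17]
message = [1, 0, 1, 1, 0, 0]

stego = embed(cover, message)
print(extract(stego))                                 # [1, 0, 1, 1, 0, 0]
print(max(abs(a - b) for a, b in zip(cover, stego)))  # 1: pixels barely change
```

A scheme like this can hide only one bit per pixel and is easily destroyed by re-encoding, which is why the project's learned embedding and the strong compression described below matter: they make room for a detailed, robust copy of the image inside itself.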

      This idea of hiding a copy of the original within the file itself has not been widely explored yet. What made the idea work was the level of compression we achieved, allowing us to hide a detailed copy that takes very little space inside an image.

      We worked with digital images during this project, but the same concept can also be used to authenticate audio and video. We are planning to publish our technology, so that anyone can use our work to develop verification tools for their own use cases. Our goal is to continue to enjoy digital art, trust the news, and deliver verifiable information to anyone online.

      Information

      School / Institute

      FHNW School of Computer Science / FHNW Institute for Data Science

      Funding

      InnoSuisse

      Project timeline

      April 2022 – September 2023

      Project lead

      Michael Graber (FHNW Institute for Data Science)

      Project team

      Marco Willi (FHNW Institute for Data Science)

      Mathias Graf (FHNW Institute for Data Science)

      Melanie Mathys (FHNW Institute for Data Science)

      Michael Aerni (ETH Zurich)

      Christian Schwarzer (Kortikal AG)

      Martin Melchior (FHNW Institute for Data Science)

      More information

      Melanie Mathys
