Image Authentication Without Using Watermarks and Signatures
Image authentication without using watermarks and signatures (called the passive or blind approach) is regarded as a new direction in image forensics and requires no explicit prior information about the image. The decision about the trustworthiness of an analyzed image is based on fusing the outcomes of separate image analyzers. Here, we provide an overview of some of the methods (analyzers) employed to analyze digital images.
- Detection of interpolation and resampling. When two or more images are spliced together to create a high-quality, consistent image forgery, geometric transformations are almost always needed. These transformations typically resample a portion of an image onto a new sampling lattice, which requires an interpolation step. Interpolation introduces specific statistical changes into the signal, and detecting these changes may signify tampering.
- Detection of near-duplicated image regions. Detection of duplicated image regions may signify copy-move forgery. In copy-move forgery, a part of the image is copied and pasted into another part of the same image, typically with the intention of hiding an object or a region.
- Detection of noise inconsistencies. The noise in an authentic digital image is typically uniformly distributed across the entire image and invisible to the human eye. Additive noise is a commonly used tool to conceal the traces of tampering and is a main cause of failure of many active and passive authentication methods. Creating a digital image forgery often leaves the noise inconsistent across the image; therefore, detecting regions with different noise levels may signify tampering.
- Detection of double JPEG compression. To alter an image, one typically loads it into photo-editing software and re-saves it after the changes are made. If the image is in the JPEG format, the newly created image is JPEG compressed two or more times. This introduces specific correlations between the discrete cosine transform (DCT) coefficients of image blocks, so knowledge of an image's JPEG compression history can help find traces of tampering.
- Detection of inconsistencies in color filter array (CFA) interpolated images. Here, hardware features of digital cameras are used to detect the traces of tampering. Many digital cameras are equipped with a single charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) sensor overlaid with a color filter array, so only a single color sample is captured at each pixel location. The missing color values are computed by an interpolation process called CFA interpolation. This process introduces specific correlations between the pixels of the image, which can be destroyed by tampering.
- Detecting inconsistencies in lighting. Different photographs are taken under different lighting conditions, so when two or more images are spliced together to create a forgery, it is often difficult to match the lighting conditions of the individual photographs. Detecting lighting inconsistencies therefore offers another way to find traces of tampering.
- Detecting inconsistencies in perspective. When two or more images are spliced together, it is often difficult to maintain correct perspective. Applying principles from projective geometry to problems in image forgery detection can therefore also reveal traces of tampering.
- Detecting inconsistencies in chromatic aberration. Optical imaging systems are not perfect and introduce various aberrations into an image. One such aberration is chromatic aberration, caused by the failure of an optical system to focus light of different wavelengths to the same point. In a tampered image, this aberration can become inconsistent across the image, which offers another way to detect forgeries.
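The resampling point above can be illustrated with a toy sketch (plain NumPy; the 1-D signal stands in for an image row, and the neighbor-average predictor is a deliberately simple stand-in for the learned predictors real detectors use). Upsampling by a factor of two with linear interpolation makes every interpolated sample exactly the average of its neighbors, so a linear predictor's error vanishes periodically:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.standard_normal(64)

# Upsample by a factor of 2 with linear interpolation, as a
# resampling step in a forgery might.
x_old = np.arange(64)
x_new = np.arange(0, 63.5, 0.5)
up = np.interp(x_new, x_old, signal)

# A simple linear predictor: each sample predicted as the
# average of its two neighbors.
pred_error = np.abs(up[1:-1] - 0.5 * (up[:-2] + up[2:]))

# The error is exactly zero at every interpolated (odd) position --
# the periodic statistical fingerprint resampling detectors look for.
print(pred_error[:6])
```

In practice the resampling factor is unknown, so detectors estimate the probability of each sample being a linear combination of its neighbors and look for periodicity in that probability map.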
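The copy-move idea can be sketched as exhaustive block matching (plain NumPy; the function name, block size, and the synthetic forgery are our illustrative choices): two identical blocks at different locations of the same image betray the cloned region.

```python
import numpy as np

def find_duplicate_blocks(img, block=8):
    """Slide a block x block window over the image and report
    coordinate pairs whose pixel content is identical."""
    h, w = img.shape
    seen = {}
    matches = []
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            key = img[y:y+block, x:x+block].tobytes()
            if key in seen:
                matches.append((seen[key], (y, x)))
            else:
                seen[key] = (y, x)
    return matches

# Simulate a copy-move forgery: paste one region over another.
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
img[20:28, 20:28] = img[2:10, 2:10]   # the "cloned" region

print(find_duplicate_blocks(img))     # reports the pasted pair
```

Practical detectors match on robust per-block features (e.g., quantized DCT coefficients) rather than raw pixels, so that matches survive recompression, noise, and slight retouching; hence "near-duplicated" rather than identical regions.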
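The noise-inconsistency idea can be sketched as a per-tile noise estimate (a rough sketch in plain NumPy; the tile size and the MAD-based estimator are our choices, not a standard method). A synthetic flat image gets extra noise in one patch, mimicking a pasted region, and the tile map flags it:

```python
import numpy as np

def local_noise_map(img, tile=16):
    """Rough per-tile noise estimate: median absolute deviation of a
    simple high-pass residual (pixel minus horizontal neighbor),
    scaled to approximate the noise standard deviation."""
    resid = img[:, 1:] - img[:, :-1]
    h, w = resid.shape
    rows, cols = h // tile, w // tile
    out = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            t = resid[r*tile:(r+1)*tile, c*tile:(c+1)*tile]
            # MAD -> sigma for Gaussian noise; the difference of two
            # noisy pixels has sqrt(2) times the per-pixel sigma.
            out[r, c] = 1.4826 * np.median(np.abs(t - np.median(t))) / np.sqrt(2)
    return out

# Flat image with sigma = 2 noise everywhere, except a noisier
# patch in the top-left corner standing in for a pasted region.
rng = np.random.default_rng(2)
img = 128 + 2.0 * rng.standard_normal((64, 64))
img[:16, :16] += 8.0 * rng.standard_normal((16, 16))

noise = local_noise_map(img)
print(noise[0, 0], noise[-1, -1])   # tampered tile vs. clean tile
```

The high-pass residual suppresses image content so that mostly noise remains; tiles whose estimate deviates sharply from the rest of the image are candidates for pasted or denoised regions.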
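The double-JPEG effect can be demonstrated without any actual JPEG files by double-quantizing synthetic DCT-like coefficients (a sketch with assumed quantization steps 5 and 3; real detectors read per-subband histograms from the JPEG file itself). Quantizing with one step and then re-quantizing with another leaves periodic empty bins that single compression does not:

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-in for one DCT subband: Laplacian-distributed coefficients.
coeffs = rng.laplace(scale=20.0, size=100_000)

def jpeg_quantize(x, q):
    """Quantize/dequantize with step q, as JPEG does per coefficient."""
    return np.round(x / q) * q

single = jpeg_quantize(coeffs, 3)                     # compressed once
double = jpeg_quantize(jpeg_quantize(coeffs, 5), 3)   # compressed twice

# Histograms of the final quantization bins: double compression
# leaves periodic peaks and gaps in the bin counts.
bins_single = np.bincount(np.abs(np.round(single / 3)).astype(int))[:10]
bins_double = np.bincount(np.abs(np.round(double / 3)).astype(int))[:10]
print(bins_single)
print(bins_double)
```

With steps 5 then 3, no multiple of 5 lands in bins 1, 4, 6, or 9, so those bins are exactly empty in the double-compressed histogram; detectors look for precisely this periodic pattern in DCT-coefficient histograms.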
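The CFA point can be made concrete with a small simulation (plain NumPy; the checkerboard green lattice and four-neighbor bilinear interpolation are simplifying assumptions about the camera pipeline). After interpolation, every interpolated pixel is exactly the average of its neighbors; overwriting a patch destroys that correlation:

```python
import numpy as np

rng = np.random.default_rng(4)
true_green = rng.uniform(0, 255, size=(32, 32))

# Bayer-style sampling of the green channel on a checkerboard
# lattice, then bilinear interpolation of the missing sites
# (np.roll wraps at the borders, applied consistently throughout).
mask = (np.indices((32, 32)).sum(axis=0) % 2) == 0   # sampled sites
green = np.where(mask, true_green, 0.0)
neighbor_avg = (np.roll(green, 1, 0) + np.roll(green, -1, 0) +
                np.roll(green, 1, 1) + np.roll(green, -1, 1)) / 4
green = np.where(mask, green, neighbor_avg)

# "Tamper" with a patch: replace it with values that never
# went through CFA interpolation.
green[10:14, 10:14] = rng.uniform(0, 255, size=(4, 4))

# Residue of the same four-neighbor predictor at interpolated sites:
# zero where the CFA correlations survive, nonzero where tampering
# destroyed them.
pred = (np.roll(green, 1, 0) + np.roll(green, -1, 0) +
        np.roll(green, 1, 1) + np.roll(green, -1, 1)) / 4
residue = np.where(~mask, np.abs(green - pred), 0.0)
print(residue[20:, 20:].max(), residue[10:14, 10:14].max())
```

Real cameras use more elaborate demosaicing algorithms, so practical detectors estimate the interpolation weights from the image itself before mapping where the correlations hold.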