Graphics Technology

DARPA Program Targets Image Doctoring (networkworld.com) 41

coondoggie writes: It isn't hard for just about anyone to change or alter an image these days — and that can be a problem. It's an issue researchers at the Defense Advanced Research Projects Agency want to put to rest with a new program called Media Forensics, or MediFor, which looks to build an algorithm-based platform that can detect image manipulation. "The forensic tools used today lack robustness and scalability and address only some aspects of media authentication; an end-to-end platform to perform a complete and automated forensic analysis does not exist. Although there are a few applications for image manipulation detection in the commercial sector, they are typically limited to a yes/no decision about the source being an 'original' asset, obtained directly from an imaging device. As a result, media authentication is typically performed manually using a variety of ad hoc methods that are often more art than science, and forensics analysts rely heavily on their own background and experience," DARPA states.
This discussion has been archived. No new comments can be posted.

DARPA Program Targets Image Doctoring

Comments Filter:
  • Frosty (Score:5, Funny)

    by Hognoxious ( 631665 ) on Thursday October 22, 2015 @02:05AM (#50779225) Homepage Journal

    Just check if some of the pixels are wrong. It helps if you've seen a few shops.

    • Re:Frosty (Score:5, Insightful)

      by invid ( 163714 ) on Thursday October 22, 2015 @08:11AM (#50780305)
      If you can create an algorithm that can detect pixels that have been modified in a picture, you can create an algorithm that can modify the pixels to hide the fact that they've been modified.
      • Didn't Captain Kirk use this idea to cause an evil computer to explode?
      • by dcw3 ( 649211 )

        If you can create an algorithm that can detect pixels that have been modified in a picture, you can create an algorithm that can modify the pixels to hide the fact that they've been modified.

        So, are you able to reverse https://en.wikipedia.org/wiki/... [wikipedia.org]? I'm sure it's possible, as with MD5, but probably not for anyone outside groups like the NSA.

    • by Anonymous Coward

      Is society trying to eliminate anyone who has specialized training or knowledge, or is that just a side-effect of its downward spiral?

  • I don't know about identifying things after the fact, but an idea I've tossed around for a few years now is a forensic digital camera. Basically, the hardware would sign/watermark all the photos it takes with some sort of digital signature unique to the camera. The private key would be buried in silicon in such a way as to destroy it if attempts are made to discover it. I'm not a security/encryption expert by any means, so I don't know how feasible this is (or does it already exist?), but it sounds plausible to me.
    • by Anonymous Coward

      That functionality already exists and has done for some time.

      However, the implementations so far haven't stood up to attack - Canon [petapixel.com], Nikon [computerworld.com].

      There's no reason that should be the case though. It should be something that can be done securely, only falsifiable if you can either crack the key or find a hash collision (which'd likely mean making enough changes to the image to make it obvious that it's been modified).
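
      A minimal sketch of the scheme these comments describe, assuming a device-held Ed25519 key and the third-party Python cryptography package; the key handling and the data below are hypothetical stand-ins, not any camera vendor's actual design. The camera signs a SHA-256 digest of the captured bytes, and anyone holding the matching public key can later check that those exact bytes came from that key.

      # Sketch only: a hypothetical "forensic camera" signing scheme, using
      # the third-party 'cryptography' package. Not any vendor's real design.
      import hashlib
      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

      # On a real device this key would live in tamper-resistant silicon;
      # it is generated in software here purely for illustration.
      device_key = Ed25519PrivateKey.generate()
      device_pub = device_key.public_key()

      def sign_image(image_bytes: bytes) -> bytes:
          """Camera side: sign the SHA-256 digest of the captured bytes."""
          return device_key.sign(hashlib.sha256(image_bytes).digest())

      def verify_image(image_bytes: bytes, signature: bytes) -> bool:
          """Verifier side: do these exact bytes match the signature?"""
          try:
              device_pub.verify(signature, hashlib.sha256(image_bytes).digest())
              return True
          except InvalidSignature:
              return False

      photo = b"...raw sensor data..."           # stand-in for a real capture
      sig = sign_image(photo)
      print(verify_image(photo, sig))            # True
      print(verify_image(photo + b"\x00", sig))  # False: any edit breaks it

      As the Canon and Nikon links above illustrate, this only proves the bytes were signed by whatever holds the key; if the key can be extracted, or the sensor can be fed a staged or pre-made image, the signature says nothing about the scene itself.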

    • You can't mandate what cameras people use, but you can have people submit images to a digital signing service to prove the image existed in that form at that point in time; this would prevent anyone from using it as a source component of a forgery.

      The problem is that I can otherwise create a virtual camera in software that signs the forgery after it is created. I can even use wavelets to extract the grain/noise/high frequency artefact layers from an image before I modify it, then return them to obscure "identity difference" in parts of the final image. You can even calculate the lighting model from an existing scene to ensure the added components match perfectly.
      • I can otherwise create a virtual camera in software that signs the forgery after it is created. I can even use wavelets to extract the grain/noise/high frequency artefact layers from an image before I modify it, then return them to obscure "identity difference" in parts of the final image. You can even calculate the lighting model from an existing scene to ensure the added components match perfectly.

        Is that an "I can" or a "One can"?
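
      A rough sketch of the signing-service idea a couple of comments up, assuming (purely hypothetically) a service that holds its own Ed25519 key and the third-party Python cryptography package: the submitter sends only a SHA-256 digest, and the service signs the digest together with the current time, proving the image existed in exactly that form at that moment without the image itself ever leaving the submitter.

      # Sketch only: a hypothetical trusted-timestamp service for images,
      # using the third-party 'cryptography' package.
      import hashlib
      import json
      import time
      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

      service_key = Ed25519PrivateKey.generate()  # held by the service, not the user
      service_pub = service_key.public_key()

      def timestamp_image(image_bytes: bytes) -> dict:
          """Service side: sign (digest, time) so the pair can't be forged later."""
          record = {
              "sha256": hashlib.sha256(image_bytes).hexdigest(),
              "unix_time": int(time.time()),
          }
          payload = json.dumps(record, sort_keys=True).encode()
          record["signature"] = service_key.sign(payload).hex()
          return record

      def check_timestamp(image_bytes: bytes, record: dict) -> bool:
          """Anyone: confirm the image matches the digest the service vouched for."""
          if hashlib.sha256(image_bytes).hexdigest() != record["sha256"]:
              return False
          payload = json.dumps(
              {"sha256": record["sha256"], "unix_time": record["unix_time"]},
              sort_keys=True,
          ).encode()
          try:
              service_pub.verify(bytes.fromhex(record["signature"]), payload)
              return True
          except InvalidSignature:
              return False

      As the rest of this exchange points out, this only pins down when the bytes existed; it says nothing about whether they had already been doctored, or rendered by a "virtual camera", before submission.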

  • if ( CanTell(somePixels) && EnoughShopsSeen( time )) {
    printf("THIS LOOKS SHOPPED");
    }
  • The one where "endtoend" appears as one word, but "ad hoc" appears as two - in the same quote.
  • I'm still waiting for the "Enhance Button".
  • I have noticed that if you use Photoshop to do enhancement of contrast/brightness etc., the color histogram will have lots of blanks and will look like the skyline of Manhattan, not the ridge line of the Rocky Mountains. Especially if I use something called "gamma correction" (hope I remembered that term right). Is that a characteristic of all image processing, or just an implementation "feature" of Photoshop?
    • by spitzak ( 4019 )

      Error diffusion will help (calculate the floating point value and then store one of the two nearest integers, chosen with a probability based on the value) but not if the correction is extreme. Blurring the image to a floating point value will fix it but, of course, introduces blur.
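
      A small NumPy sketch of both halves of this exchange (the image and numbers are made up for illustration): gamma-correcting 8-bit data and rounding back to integers leaves empty "comb" gaps in the histogram, while the probabilistic rounding described above leaves far fewer gaps, though, as noted, not zero when the correction is strong.

      # Sketch: why 8-bit gamma correction leaves histogram gaps, and how the
      # probabilistic rounding described above reduces them. Illustration only.
      import numpy as np

      rng = np.random.default_rng(0)
      img = rng.integers(0, 256, size=(512, 512)).astype(np.float64)  # fake image

      gamma = 2.2
      corrected = 255.0 * (img / 255.0) ** (1.0 / gamma)  # floating point result

      # Naive path: round to nearest integer -> many output levels go unused.
      naive = np.round(corrected).astype(np.uint8)

      # Probabilistic rounding: keep floor(x) or floor(x) + 1, with the
      # fractional part as the probability, so levels stay statistically populated.
      frac = corrected - np.floor(corrected)
      stochastic = (np.floor(corrected) + (rng.random(corrected.shape) < frac)).astype(np.uint8)

      def empty_levels(a: np.ndarray) -> int:
          """Count 8-bit levels that never occur (the histogram 'gaps')."""
          return int(np.sum(np.bincount(a.ravel(), minlength=256) == 0))

      print("empty levels, naive rounding:        ", empty_levels(naive))
      print("empty levels, probabilistic rounding:", empty_levels(stochastic))

      This combing is also why a gap-toothed histogram is one of the classic hints that an image has been through a tone adjustment.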

  • by invid ( 163714 ) on Thursday October 22, 2015 @08:22AM (#50780365)

    Warning: Shameless Self-Promotion

    I've written a science fiction novel, The NPC [amazon.com], that deals with the ramifications of this sort of thing. The solution in the novel is extreme: all recording devices are required to stream their data to a trusted 3rd party (in this case, a corporation called VuDyne) in real time with an encrypted certificate. Otherwise the digital data is not trusted to represent reality. As you can imagine, this gives VuDyne a great deal of power.

  • If you can use an algorithm to detect tampering, you can use that same algorithm to alter your image so the algorithm no longer detects the manipulation.

    • by dcw3 ( 649211 )

      If you can use an algorithm to detect tampering, you can use that same algorithm to alter your image so the algorithm no longer detects the manipulation.

      See my earlier response regarding https://en.wikipedia.org/wiki/... [wikipedia.org] and tell me how you'd get around it.

      • Your previous comment is the same as this one. SHA-2 has nothing to do with it - you could simply encrypt your altered photo. If you wanted to do it the (physically) hard way (assuming you're talking about a camera with internal encryption hardware), you could wire something up to the CCD inputs so the camera doesn't realize it's being fed a pre-made image instead of a view of the real world.

        • by dcw3 ( 649211 )

          Okay, I think we're misunderstanding each other.

          My point: if I take an original image and apply encryption to it (this doesn't necessarily have to be done by the camera), that becomes my original image, which I post for the world to see. Someone else can't necessarily come along and mess with that.

          Yours...Sure, if you're the originator of some photo, and you've doctored it, I agree with you.
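
    For what it's worth, the evade-the-detector argument upthread can be sketched in a few lines. Everything below is hypothetical: detector() is a made-up stand-in (it just flags images with many saturated pixels), not any real forensic tool, and the loop is plain greedy random search rather than a practical attack.

    # Sketch only: the "use the detector to beat the detector" loop described
    # in the comments above. The detector is a made-up stand-in.
    import numpy as np

    rng = np.random.default_rng(0)

    def detector(image: np.ndarray) -> float:
        """Hypothetical manipulation score in [0, 1]: here, simply the
        fraction of near-saturated pixels. A real tool would go here."""
        return float(np.mean(image >= 250))

    def evade(image: np.ndarray, threshold: float = 0.01, steps: int = 2000) -> np.ndarray:
        """Greedy random search: keep a small perturbation only if it lowers
        the detector's score, and stop once the score drops below threshold."""
        best = image.astype(np.float64)
        best_score = detector(best)
        for _ in range(steps):
            if best_score < threshold:
                break
            candidate = np.clip(best + rng.normal(0.0, 2.0, size=best.shape), 0, 255)
            score = detector(candidate)
            if score < best_score:
                best, best_score = candidate, score
        return best.astype(np.uint8)

    forged = np.zeros((64, 64), dtype=np.uint8)
    forged[10:30, 10:30] = 255                    # a crudely pasted-in bright patch
    print(detector(forged), detector(evade(forged)))

    It's the thread's counter-argument in code: whatever signal a detector keys on, a forger with access to that detector can iterate against it.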

  • ...the doubts about the reality of any kind of imagery cannot be overcome, and we need to abandon the idea that images (moving or not) can be trusted as evidence? Would the world stop spinning? I highly doubt it.

    Perhaps there can be an exception in cases where the entire chain of taking and handling an image can be verified in one way or another?

    An unalterable checksum produced by the camera, perhaps? I know that we can already do this with GPS flight logs (track/altitude) coming from certified flight recorders.
