Security

Detecting Video & Audio Tampering (39 comments)

Beatles-Beatles writes "Dartmouth professor Hany Farid has already devised software tools to detect when someone has tampered with digital photos. His next challenge: determining whether video or audio files have been retouched."
This discussion has been archived. No new comments can be posted.

  • Umm... (Score:4, Funny)

    by dduardo ( 592868 ) on Monday September 11, 2006 @12:03PM (#16082062)
    How about just doing an md5sum?
    • I was assuming he had the original picture/video/audio in hand, but he doesn't.
    • Depending on who is doing the tampering, that doesn't necessarily work; MD5 collisions can now be generated in practice.

      You can thank the Chinese researchers who broke MD5 for blabbing it to everyone.
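
A minimal sketch of that hash-check idea, for illustration only: the file names are hypothetical, and a matching digest only means something if the reference value was recorded before anyone had a chance to tamper. Since MD5 collisions are practical, SHA-256 is used as the default here.

```python
# Minimal file-integrity check, assuming you hold a trusted copy of the
# original (or a previously recorded digest) -- which, as noted above,
# you usually don't. MD5 is collision-prone, so SHA-256 is the default.
import hashlib

def file_digest(path: str, algo: str = "sha256") -> str:
    """Hash a file in chunks so large video files don't need to fit in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical file names: a mismatch proves the copies differ; a match only
# helps if the reference digest predates any opportunity to tamper.
# print(file_digest("original.avi"), file_digest("suspect.avi"))
```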

    • Reuters, an international news wire service, caught heat by publishing a Beirut battle photo that contained an extra plume of smoke for dramatic effect. (Farid's software helped reveal that enhancement.)

      they needed software to detect this? the 'enhancements' were so blatantly obvious that anyone who's ever used Photoshop would have been able to see them...

      hilarious
  • I'm thinking of those home movies of 'Bat Boy'; now we will be able to find out the truth.
    • by diggum ( 769740 )
      I believe Bat Boy is the sole editorial property of Weekly World News. Not that I've ever read it or anything...
  • I see/hear this all the time. It's called a DUB: American companies taking a perfectly good foreign film or cartoon (anime) and butchering it. The latest example would be Bleach...
    • by firebus ( 49468 )
      the problem, for a violent or bloody (or sexy) title like bleach or naruto or (insert sexy title here) is that you can't sell it at walmart or put it on tv without censoring it. and if you can't sell it at walmart or put it on tv, it's not worth the money you have to pay to japan for the rights to the title.

      i'd love it if american anime publishers would plan on an "uncut DVD" version from the start so that, once there's a big enough fan base, they can sell the real thing through niche markets and online...
      • the problem, for a violent or bloody (or sexy) title like bleach or naruto or...

        Good point; however, it's merely a symptom of another problem: the perception of 'animation = for kids' on this side of the Pac. To be fair, some projects (such as Cool World) have attempted to break out of that mold...

        i'd love it if american anime publishers would plan on an "uncut DVD" version from the start...

        Yeah, why don't they? This is already done with quite a few movies, as I recall. And the popularity of fansubs...

  • by TubeSteak ( 669689 ) on Monday September 11, 2006 @12:47PM (#16082466) Journal
    Most likely, distribution will be limited: Photo editors, but not freelance photographers, at mainstream media outlets may get the software.

    "You do diminish the power the software if you make it completely, widely available," he said.
    Since they mention JPEG quantization tables as one of the main methods of identifying what program/camera a photograph came from... How hard would it be to replace the quant table with your own? Or even just tweak it enough that the program can't ID it.

    It just doesn't strike me as a terribly reliable way to ID a picture's origin. Might as well rely on the EXIF data.
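
For anyone curious what those tables look like: the sketch below uses Pillow (my choice for illustration, not the software from the article), with a hypothetical file name. Re-saving the image is already enough to swap the tables for the new encoder's own, which is roughly the "tweak it so it can't be ID'd" scenario above.

```python
# Peek at a JPEG's quantization tables with Pillow -- illustrative only,
# not the forensic tool discussed in the article. "photo.jpg" is hypothetical.
from PIL import Image

img = Image.open("photo.jpg")

# For JPEG files Pillow exposes the quantization tables as a dict of
# table id -> 64 coefficients; cameras and editors ship different tables,
# which is what makes them usable (imperfectly) as an origin fingerprint.
for table_id, table in img.quantization.items():
    print(f"table {table_id}: {list(table)[:8]} ...")

# Re-encoding replaces the tables with the new encoder's own, so a simple
# re-save already changes this particular "fingerprint".
img.save("resaved.jpg", quality=90)
```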
    • Hence the limited distribution. He's well aware that his mechanism is not foolproof. The underlying assumption is that even though you can defeat this detection mechanism, you won't.
    • by plover ( 150551 ) *
      Quantization is only one facet of many means of detecting manipulation. If you're trying to claim a certain picture is legitimate, you may be asked to submit the camera as evidence. If anything but an image completely consistent with the camera's capabilities shows up, the image could be disregarded, and you may be liable for perjury.

      A real world image will have a lot of artifacts tying it back to a particular camera make and model, and a base noise level that is a virtual fingerprint identifiable to a specific camera. Are you sure you can manipulate an image that still defeats ALL of these checks, even though you don't know what they all are?
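
As a rough sketch of that noise-fingerprint idea (a crude stand-in for real PRNU-style forensics, assuming numpy/scipy and grayscale float arrays; this is not the parent's or Farid's actual tooling):

```python
# Crude sensor-noise fingerprinting: denoise, subtract to get the noise
# residual, and correlate against a fingerprint averaged from frames known
# to come from the camera. Real forensic methods are far more careful about
# denoising, demosaicing and statistics.
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(img: np.ndarray) -> np.ndarray:
    """What is left after mild denoising -- mostly sensor noise."""
    return img - gaussian_filter(img, sigma=1.5)

def camera_fingerprint(reference_frames: list[np.ndarray]) -> np.ndarray:
    """Average the residuals of several known-good frames from one camera."""
    return np.mean([noise_residual(f) for f in reference_frames], axis=0)

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized correlation; values near zero suggest a different sensor."""
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Hypothetical usage: a low score for the whole image, or for a suspect
# region, hints that it was not captured by this particular camera.
# score = similarity(camera_fingerprint(reference_frames),
#                    noise_residual(questioned_image))
```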

      • by mcmonkey ( 96054 )

        A real world image will have a lot of artifacts tying it back to a particular camera make and model, and a base noise level that is a virtual fingerprint identifiable to a specific camera. Are you sure you can manipulate an image that still defeats ALL of these checks, even though you don't know what they all are?

        Yes.

        Just take a picture with the same camera. There may well be signs the image has been manipulated, but in terms of tying elements of a photograph back to an individual camera, why not just use the same camera?

        • by plover ( 150551 ) *

          in terms of tying elements of a photograph back to an individual camera, why not just use the same camera?

          Even that might not be good enough. The camera will have to be in the same orientation, as that will affect both JPEG quantization and Moiré artifacts. You may need the "paste-in" subject to be at the same part of the image sensor as he'll be in the final image. And it still has to be edited to match the other attributes -- depth of field, lighting, and focus, all of which are features experts...

          • by mcmonkey ( 96054 )

            Let's say that he found a green cell was out at (1000,500), leaving a single tiny purple pixel, and a blue cell was weak at (1010,550), leaving a yellow pixel. Next, he'd look at your photographic "evidence". If the cell at (1000,500) has any green component or the cell at (1010,550) has more than 25% blue, he'd suspect the image of having been manipulated. Similarly, if he finds an anomalous dead green pixel in your output image at (1500,750) and a weak blue pixel at (1510,800) he might question that as well.
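
A toy version of that defect-map check, assuming numpy and an 8-bit RGB array; the coordinates and thresholds are the hypothetical ones from the comment above, not real calibration data:

```python
# Toy defect-map check: if known-dead sensor cells show normal signal in a
# photo that supposedly came from that camera, the region is suspect.
# Coordinates/thresholds are the hypothetical values from the comment above.
import numpy as np

CHANNEL = {"red": 0, "green": 1, "blue": 2}

# (x, y) -> (channel, maximum fraction of full scale the defect can produce)
KNOWN_DEFECTS = {
    (1000, 500): ("green", 0.0),   # dead green cell: no green at all
    (1010, 550): ("blue", 0.25),   # weak blue cell: at most ~25% blue
}

def suspicious_pixels(img: np.ndarray) -> list[tuple[int, int]]:
    """Return defect coordinates where the image shows more signal than the sensor could produce."""
    flagged = []
    for (x, y), (channel, max_fraction) in KNOWN_DEFECTS.items():
        value = img[y, x, CHANNEL[channel]] / 255.0
        if value > max_fraction:
            flagged.append((x, y))
    return flagged

# Any flagged coordinate suggests the pixels there were pasted in from
# elsewhere -- or that the defect map simply belongs to a different camera.
```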

  • Simple. (Score:5, Funny)

    by Rob T Firefly ( 844560 ) on Monday September 11, 2006 @01:19PM (#16082829) Homepage Journal
    If Paris Hilton is fully dressed, seems fully aware of her surroundings, and/or is singing well, it's been tampered with.

    This formula can also be adapted to Lindsay Lohan, but hasn't been tested on Tara Reid or others yet.
  • A story on detecting photomanipulation apparently wasn't interesting in and of itself, so the author felt the need to drag in one of the Four Horsemen of the Infocalypse [wikipedia.org]: "Child pornographers also employ photo retouching to skirt felony laws."

    No amount of photo retouching makes sexual abuse of a child legal. The only way I can see to "skirt" the law would be to transform the images so they plausibly look artificial. (Court rulings have upheld that as long as no children are involved, such images are protected by the First Amendment.)

    • Different way: edit a pornographic adult picture to look like a child, like by cutting and pasting innocent child pictures onto a legal pornographic picture. Of course, it's not as simple as cutting and pasting, but you get the idea.

      As sick as these people are, I don't see why we should throw them in jail for that...

      Melissa

  • Reuters, an international news wire service, caught heat by publishing a Beirut battle photo that contained an extra plume of smoke for dramatic effect. (Farid's software helped reveal that enhancement.)

    Like we really needed the software to show that a Photoshop clone tool was used. Nearly every person who saw it said it looked fake; even people who don't know how to use Photoshop said it looked "wrong".