New frontiers in self-incrimination
Just read a recent brief in Scientific American, which serves as another reminder of how powerful Big Data is getting:

Apparently, a characteristic of digital cameras (not anything intentionally built in, just a byproduct of how the sensors are manufactured) is that each one winds up with subtle "noise" in the images it generates. That is, if your camera and mine both shoot the exact same picture, they might *look* the same, but at the fine-grained digital level there are slight differences. Those differences amount to a "fingerprint" for that camera, and can be extracted to serve as a fairly accurate identifier.
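
The idea can be sketched in a toy simulation. This is *not* the real forensic technique (which reportedly relies on the sensor's photo-response non-uniformity, extracted with much fancier denoising); it's a minimal illustration with invented numbers, where each "camera" is just a fixed per-pixel gain pattern and the "fingerprint" is the average noise residual over many flat shots:

```python
import numpy as np

rng = np.random.default_rng(0)

def box_blur(img):
    """Crude 3x3 mean-filter "denoiser" -- a stand-in for the fancier
    denoisers real fingerprinting work uses."""
    p = np.pad(img, 1, mode="edge")
    return sum(p[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

def residual(img):
    # The "noise" part of an image: the image minus a denoised copy.
    return img - box_blur(img)

def shoot(scene, sensor):
    # Toy camera: each pixel's response is scaled by a fixed, unique
    # per-sensor pattern, plus random shot noise that changes every frame.
    return scene * (1.0 + sensor) + rng.normal(0, 0.5, scene.shape)

def corr(a, b):
    a = a.ravel() - a.mean()
    b = b.ravel() - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

N = 64
camera_a = rng.normal(0, 0.02, (N, N))  # camera A's fixed pattern
camera_b = rng.normal(0, 0.02, (N, N))  # a different camera

# Estimate camera A's fingerprint: average the residuals of many
# flat, featureless shots (think: photos of a plain gray wall).
flats = [np.full((N, N), rng.uniform(50, 200)) for _ in range(40)]
fingerprint = np.mean([residual(shoot(f, camera_a)) for f in flats], axis=0)

# Correlate the fingerprint against a fresh shot from each camera.
wall = np.full((N, N), 120.0)
match = corr(fingerprint, residual(shoot(wall, camera_a)))
nonmatch = corr(fingerprint, residual(shoot(wall, camera_b)))
print(match > nonmatch)  # True: same-camera residual correlates far better
```

Even in this crude model, the fixed pattern survives averaging while the frame-to-frame noise washes out, which is the whole trick.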

The implication? In principle, at least, this means that your photographs are all digitally signed to your camera. On the positive side, this may prove lovely for law enforcement: if you have used your camera for nefarious purposes (eg, child porn, terrorist activities, etc), and also use the same camera to take innocuous pictures to post on Facebook, those pictures can be correlated at least well enough to make you a suspect. (The brief says that the accuracy rate is about 90%, with a 2% false-positive rate: not enough to hold up in court, but enough to define an initial suspect pool, especially if correlated with other data.)
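
The "suspect pool, not courtroom proof" point is easy to quantify with a little base-rate arithmetic. The 90% and 2% figures come from the brief; the size of the comparison pool below is invented purely for illustration:

```python
# Base-rate arithmetic using the brief's figures (90% detection,
# 2% false positives). The pool size is a made-up example.
hit_rate = 0.90
false_positive_rate = 0.02
innocent_cameras = 100_000          # hypothetical comparison pool

expected_false_matches = false_positive_rate * innocent_cameras  # 2000.0

# Probability that any one flagged camera is the actual culprit,
# assuming exactly one real culprit in the pool:
p_culprit = hit_rate / (hit_rate + expected_false_matches)
print(expected_false_matches, p_culprit)
```

Against a large pool, the false positives swamp the one real hit, so a match narrows the field dramatically without coming close to identifying anyone on its own.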

The downside, of course, is that this is yet another difficulty in trying to maintain distinct and private identities online. You might have very well-separated Facebook identities for your work life and your private life, but if you are posting pictures on both using the same camera, that may someday wind up giving away your identity. Take due notice thereof...

Taking a picture with the lens cap (or your finger) on is supposed to be a way to check for the base signature. RAW format probably necessary, but perhaps not?

I have often wondered how sensitive this pattern is to, say, a 3x3 Gaussian blur, possibly with a tiny bit of noise thrown in before the blur to mess things up even more. Doesn't even have to be a 100% strength blur, so you don't lose too much image quality.
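
That recipe is easy to try. Here is a minimal sketch of it: add a little noise, apply a 3x3 Gaussian blur (the classic 1-2-1 binomial kernel), and blend at partial strength. The strength and noise values are invented; nothing here demonstrates whether the fingerprint actually survives:

```python
import numpy as np

rng = np.random.default_rng(1)

def conv3x3(img, k):
    """Convolve an image with a 3x3 kernel, using edge padding."""
    p = np.pad(img, 1, mode="edge")
    return sum(k[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3))

# 3x3 Gaussian kernel: the 1-2-1 binomial approximation, normalized.
g = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 16.0

def scrub(img, strength=0.7, noise_sigma=0.5):
    """The commenter's recipe: a tiny bit of noise first, then a
    partial-strength blur. Parameter values are for illustration only."""
    noisy = img + rng.normal(0, noise_sigma, img.shape)
    blurred = conv3x3(noisy, g)
    return (1 - strength) * img + strength * blurred

img = rng.uniform(0, 255, (64, 64))
out = scrub(img)
```

Whether this defeats the correlation in practice is exactly the open question the thread goes on to discuss.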

FB could offer this as a service when you upload your images! It sort of already does by re-encoding your image as you upload...

Don't know -- the brief asserts that there is no known technique for altering the signature without altering the image (specifically, they say that you can't separate the content from the noise, change the noise, and add it back to the content again), but I don't know what the limits of that assertion are...

To the extent that noise can be accurately subtracted, even on a probabilistic basis, camera makers already have reason to do that, within the limits of the camera's ability. After all, noise is by definition stuff you didn't want in your picture to begin with. For example, some cameras will already take a pre-exposure with the shutter closed, and subtract off the artifacts from the dark frame (this won't help with all of the non-random noise sources in a digital camera, but it's noticeable under some conditions).
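
Dark-frame subtraction is simple to sketch. In this toy model the sensor adds a fixed per-pixel pattern plus random read noise; a shutter-closed exposure captures the fixed part, which then cancels out. Note this only removes *additive* artifacts, which is one reason it wouldn't scrub every noise source:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 64

fixed_pattern = rng.normal(0, 3.0, (N, N))  # additive "dark current" pattern
scene = rng.uniform(50, 200, (N, N))

def expose(signal):
    # Toy sensor: signal + fixed additive pattern + fresh random read noise.
    return signal + fixed_pattern + rng.normal(0, 1.0, (N, N))

photo = expose(scene)
dark_frame = expose(np.zeros((N, N)))       # shutter closed: no scene
corrected = photo - dark_frame

# The fixed pattern cancels; only the random read noise remains.
err_before = np.abs(photo - scene).mean()
err_after = np.abs(corrected - scene).mean()
print(err_after < err_before)  # True
```

A multiplicative per-pixel response pattern (the kind the fingerprinting exploits) would not cancel this way, since it scales with the scene rather than adding to it.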

I don't think something as simple as a deterministic transform would avoid leaving a fingerprint. At best you'd increase the computational power needed to match the fingerprint (rotating and cropping the picture would be good ways to do that also). That way lies an arms race.

