Anonymizing data is already quite difficult, as shown in this 2015 paper on the reidentifiability of scrubbed credit card metadata. Beyond ineffective anonymization, another disturbing aspect is the rate at which AI and ML are improving at image recognition. In particular, face recognition is approaching practicality for general-purpose use (see Amazon Rekognition, for example). While these technologies aren't quite there yet, they will inevitably reach that point. Once coupled with data sets that are already publicly available, large public image repositories like Imgur will become petri dishes for face recognition data.
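To get a sense of how low the barrier already is, here's a minimal sketch using Amazon Rekognition through boto3. The collection name and file name are made up, and it assumes faces scraped from public photos have already been indexed into the collection with index_faces; the point is just how few lines it takes to turn a single photo of a face into a list of every indexed image that appears to contain the same person.

```python
import boto3

# Hypothetical setup: a Rekognition collection named "scraped-public-photos"
# already populated via index_faces() with faces pulled from public images.
rekognition = boto3.client("rekognition")

with open("target_face.jpg", "rb") as f:  # any photo of the person of interest
    probe = f.read()

# One API call: search the whole collection for faces matching the probe photo.
response = rekognition.search_faces_by_image(
    CollectionId="scraped-public-photos",
    Image={"Bytes": probe},
    FaceMatchThreshold=90,  # only return fairly confident matches
    MaxFaces=100,
)

for match in response["FaceMatches"]:
    face = match["Face"]
    # ExternalImageId is whatever label was attached at indexing time,
    # e.g. the URL of the source photo.
    print(f"{match['Similarity']:.1f}%  {face.get('ExternalImageId')}")
```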
These technologies affect existing data retroactively. What is now an unlabeled morass of anonymous pictures could conceivably become a treasure trove for data brokers once the cost of picking out pictures of someone's likeness from billions of images becomes easily affordable. This can and should be concerning to anyone who's put up anything incriminating on the internet. At this point in history, that's pretty much everyone.
Toss in unintentional location tagging, and a list of pictures becomes not just a timeline, but also a trace of your history of existence.
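That location data usually rides along in a photo's EXIF metadata. Here's a rough sketch, assuming the Pillow library and a JPEG straight off a phone (the filename is hypothetical), of how easily the GPS coordinates can be pulled back out:

```python
from PIL import Image
from PIL.ExifTags import GPSTAGS

def gps_coordinates(path):
    """Return (latitude, longitude) from a photo's EXIF GPS tags, or None."""
    exif = Image.open(path)._getexif() or {}
    # Tag 34853 is the GPSInfo IFD; map its numeric keys to readable names.
    gps = {GPSTAGS.get(k, k): v for k, v in exif.get(34853, {}).items()}
    if "GPSLatitude" not in gps or "GPSLongitude" not in gps:
        return None

    def to_degrees(dms, ref):
        # EXIF stores degrees/minutes/seconds as rationals; convert to decimal.
        deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
        return -deg if ref in ("S", "W") else deg

    return (to_degrees(gps["GPSLatitude"], gps.get("GPSLatitudeRef", "N")),
            to_degrees(gps["GPSLongitude"], gps.get("GPSLongitudeRef", "E")))

print(gps_coordinates("IMG_1234.jpg"))  # hypothetical filename
```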
It won't just be the FBI that can access a large database of facial recognition data. It'll be just about anyone, from a potential employer to a stalker.
Automatic License Plate Recognition and its kin are already a good source of side income for anyone who can point a camera at a street. Without due regulation in place, face recognition has the potential to be even more lucrative. The best part is that even if facial recognition itself is regulated, collecting unlabeled video footage of public spaces will likely remain viable for a long time to come.
What does all this mean? Soon it will become practical to start with someone's face and locate any and all pictures featuring them, whether those pictures were made available to the public by that person or by others, including ones where the subject was photographed without their knowledge.