Who would do this?
Back in November, Facebook tested a feature that sounded like the ultimate scam perpetrated by computer geeks. Now they're taking that feature global. The feature was first tested in Australia and promised to help fight revenge porn. How? By asking users to send in their nudes so Facebook could block those images from being uploaded to the platform.
I know. It sounds like a really bad idea.
The program promised to hash these images, with AI used to identify matching uploads, and further promised that the submitted images would not be stored on Facebook's servers. Those promises aren't enough to shake the creeping feeling that there could be a Facebook engineer walking around with a special thumb drive on their keychain.
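Facebook hasn't published the exact matching algorithm, but the basic hash-and-blocklist idea the program describes can be sketched roughly as follows. This is a simplified illustration, not Facebook's implementation: it uses a plain cryptographic hash, whereas a production system would use a perceptual hash so that resized or re-encoded copies of an image still match. The `blocklist` and the sample byte strings are hypothetical.

```python
import hashlib

def image_fingerprint(image_bytes: bytes) -> str:
    # Hash the raw image bytes. A real system would use a perceptual
    # hash (e.g. PhotoDNA-style) so near-duplicates also match;
    # SHA-256 only catches byte-for-byte identical copies.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical blocklist built from user-submitted images.
# Only the fingerprints need to be retained, not the images.
blocklist = {image_fingerprint(b"private-photo-bytes")}

def is_blocked(upload: bytes) -> bool:
    # Check an incoming upload against the stored fingerprints.
    return image_fingerprint(upload) in blocklist

print(is_blocked(b"private-photo-bytes"))  # exact copy is caught
print(is_blocked(b"some-other-upload"))    # unrelated upload passes
```

The point the sketch makes is the one in Facebook's promise: once the fingerprint is computed, the original image is no longer needed for matching, so in principle it never has to be stored.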
The pilot was apparently a success, with no leaks of private images reported, and Facebook is now introducing the program in the US, UK, and Canada. CBC reports a little more background about the program than was originally disclosed during the Australian pilot: specifically, that these photos will be reviewed by a Facebook employee.
“According to Facebook, the review will be carried out by one specifically trained member of their community operations safety team; however, the post doesn’t describe what their training will consist of, or what the review will entail,” reports CBC.
With all of the privacy concerns that have kept Facebook at the top of headlines worldwide over the last year, I wonder how many people would actually take part in this program.