Humans rather than algorithms will view the naked images voluntarily sent to Facebook in a scheme being trialled in Australia to combat revenge porn. If the program works, it would mean that same naked picture will never show up on Facebook, even if a hacker or an ex tries to upload it.
Facebook says it will keep the blurred image for some time to ensure the technology is working correctly before deleting it.
"With its billions of users, Facebook is one place where many offenders aggress because they can maximize the harm by broadcasting the nonconsensual porn to those most close to the victim," Carrie Goldberg, a lawyer who specializes in sexual privacy, told The Guardian.
Australian e-Safety Commissioner Julie Inman Grant said Australia is one of four countries - the others being the USA, the United Kingdom and Canada - participating in the test program, but Facebook told "Today" that it is still in talks with the other three nations about expanding there. Facebook claims it won't store images or videos and will only track a digital fingerprint, known as a hash, to prevent the content from being uploaded again by someone else.
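Facebook has not published implementation details, so the following is only a minimal sketch of the general idea: a fingerprint (hash) of a reported image is stored instead of the image itself, and future uploads are checked against that fingerprint. The sketch uses a cryptographic hash for simplicity; a real system would more likely use a perceptual hash, which also matches re-encoded or slightly altered copies. All names here (`blocked_hashes`, `report_image`, `allow_upload`) are hypothetical.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Cryptographic hash as a stand-in fingerprint. A production system
    # would likely use perceptual hashing so that resized or re-compressed
    # copies of the same image still match.
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist holding only hashes, never the images themselves.
blocked_hashes = set()

def report_image(data: bytes) -> None:
    # Retain only the fingerprint; the image bytes are not stored.
    blocked_hashes.add(image_hash(data))

def allow_upload(data: bytes) -> bool:
    # Reject any upload whose fingerprint matches a reported image.
    return image_hash(data) not in blocked_hashes

report_image(b"fake-image-bytes")
print(allow_upload(b"fake-image-bytes"))   # False: matches a reported hash
print(allow_upload(b"other-image-bytes"))  # True: no match, upload allowed
```

The point of the design is that the blocklist is useless to an attacker who steals it: a hash cannot be reversed into the original image.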
"We're using image-matching technology to prevent non-consensual intimate images from being shared," the company said. "It removes control and power from the perpetrator who is ostensibly trying to amplify the humiliation of the victim amongst friends, family and colleagues," Inman Grant said.
For those who have become victims of revenge porn, Facebook and the Cyber Civil Rights Initiative have created a booklet detailing what actions to take, including reporting the image or video, seeking support from a friend or therapist, calling the CCRI Helpline, documenting everything, and blocking the culprit.