Facebook Unveils Tools to Prevent the Distribution of Revenge Porn

Facebook is finally attempting to automate a system for taking down nude images or videos shared without users’ consent, and for guarding against their redistribution, a crime that’s disturbing in its regularity.

The BBC reports that the new tools will be featured on Facebook, Messenger, and Instagram. Facebook’s head of global safety, Antigone Davis, announced the changes, telling the Guardian they were developed in partnership with “safety experts” and are an example of how technology can “help keep people safe.”

Some of the new tools sound like the system Facebook already has in place. If an image is flagged, it’s sent to “specially trained representatives” who review it before deciding whether it violates community standards. (It’s not specified where these representatives are, but Wired’s excellent piece from 2014 on social media content-review farms in the Philippines springs to mind.)

More innovative is the “photo-matching technology,” which would smack down any attempt to re-share an image that has already been reported and removed. Laura Higgins, the founder of the Revenge Porn Helpline, released a supportive statement on the changes:

“We are delighted with the announcement made by Facebook today,” she said.
“This new process will provide reassurance for many victims of image-based sexual abuse, and dramatically reduce the amount of harmful content on the platform,” Higgins added. “We hope that this will inspire other social media companies to take similar action and that together we can make the online environment hostile to abuse.”

Davis told the BBC that the company is working toward a way to prevent such imagery from being posted in the first place, but in the meantime it’s still depending on users to flag images. As usual.
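
Facebook hasn’t published how its photo-matching works, but systems like this are typically built on perceptual hashing: every removed image is reduced to a compact fingerprint, and a new upload whose fingerprint lands close enough to a known one gets blocked. Below is a minimal, illustrative average-hash sketch in Python using Pillow; the file names, threshold, and banned-hash set are hypothetical, and Facebook’s real pipeline is certainly more sophisticated.

```python
# Illustrative only: a perceptual (average) hash approach to photo matching,
# not Facebook's actual, unpublished algorithm. Requires Pillow.
from PIL import Image

HASH_SIZE = 8  # 8x8 grayscale grid -> 64-bit fingerprint


def average_hash(path: str) -> int:
    """Reduce an image to a 64-bit fingerprint that survives resizing and re-encoding."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    average = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (px > average)  # 1 if the pixel is brighter than average
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits where two fingerprints differ."""
    return bin(a ^ b).count("1")


def is_previously_banned(path: str, banned_hashes: set[int], threshold: int = 5) -> bool:
    """Flag an upload whose fingerprint is within `threshold` bits of any removed image."""
    h = average_hash(path)
    return any(hamming_distance(h, banned) <= threshold for banned in banned_hashes)


if __name__ == "__main__":
    # Hypothetical paths: fingerprints of images that were already reported and removed.
    banned = {average_hash(p) for p in ("removed_01.jpg", "removed_02.jpg")}
    print(is_previously_banned("new_upload.jpg", banned))
```

The reason to hash rather than compare files byte for byte is that a cropped, resized, or recompressed copy still lands within a few bits of the original fingerprint, which is what would let a system like this catch re-uploads of previously banned media.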

 