
Facebook launches AI to find and remove 'revenge porn'


Tool trained to recognize 'nearly nude' photos coupled with derogatory text

Facebook says it will start using AI to automatically detect revenge porn. It has largely relied on people proactively reporting the content up until now. (Dado Ruvic/Reuters)

Facebook is rolling out technology to make it easier to find and remove intimate pictures and videos posted without the subject's consent, often called "revenge porn."

Currently, Facebook users or victims of revenge porn have to report the inappropriate pictures before content moderators will review them. The company has also suggested that users send their own intimate images to Facebook so that the service can identify any unauthorized uploads. Many users, however, balked at the notion of sharing revealing photos or videos with the social-media giant, particularly given its history of privacy failures.

The company's new machine learning tool is designed to find and flag the pictures automatically, then send them to humans to review.

Facebook and other social media sites have struggled to monitor and contain problematic posts uploaded by users, from violent threats to conspiracy theories to non-consensual intimate photos.

Facebook has faced harsh criticism for allowing offensive posts to stay up too long, for failing to remove posts that violate its standards and, at times, for removing images with artistic or historical value. The company has said it is expanding its moderation efforts and hopes the new technology will help catch more of these posts.

Trained using confirmed 'revenge porn'

The technology, which will be used across Facebook and Instagram, was trained using pictures that Facebook has previously confirmed were revenge porn. It is trained to recognize a "nearly nude" photo (a lingerie shot, perhaps) coupled with derogatory or shaming text that suggests someone uploaded the photo to embarrass or seek revenge on someone else.
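Facebook has not published details of the model, but the description suggests a pipeline that scores both the image and any accompanying text, then routes likely matches to human reviewers. The sketch below is a minimal illustration of that flag-then-review flow only; the scoring functions, weights and threshold are hypothetical placeholders, not Facebook's actual system.

```python
# Illustrative sketch of an automated flag-then-review pipeline.
# The scoring functions, weights and threshold are hypothetical placeholders;
# Facebook has not published its actual model or review workflow.

from dataclasses import dataclass

REVIEW_THRESHOLD = 0.8  # assumed cutoff for sending a post to human review


@dataclass
class Post:
    image_bytes: bytes
    caption: str


def score_image_nudity(image_bytes: bytes) -> float:
    """Placeholder for a 'nearly nude' image classifier (returns 0.0-1.0)."""
    return 0.0  # a real system would run a trained vision model here


def score_text_shaming(caption: str) -> float:
    """Placeholder for a derogatory/shaming text classifier (returns 0.0-1.0)."""
    shaming_terms = {"exposed", "revenge", "shame"}  # toy keyword list
    hits = sum(term in caption.lower() for term in shaming_terms)
    return min(1.0, hits / 2)


def flag_for_review(post: Post) -> bool:
    """Combine image and text signals; flag the post for human review if high."""
    combined = (0.6 * score_image_nudity(post.image_bytes)
                + 0.4 * score_text_shaming(post.caption))
    return combined >= REVIEW_THRESHOLD


if __name__ == "__main__":
    post = Post(image_bytes=b"...", caption="exposed at last")
    if flag_for_review(post):
        print("queued for human moderator review")
    else:
        print("no automatic flag")
```

The key design point reflected in the article is that the model only flags content automatically; the removal decision still rests with human moderators.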


At least 42 states have passed laws against revenge porn. Many of those laws were enacted in the past several years as the posting of non-consensual images and videos proliferated. New York's law, which passed in February, allows victims to file lawsuits against perpetrators and makes the crime a misdemeanour.

Since 2015, Canada has had a cyberbullying law criminalizing the non-consensual distribution of "intimate images," and some provinces have their own laws as well.

Facebook has been working to combat the spread of revenge porn on its site for years, but until now it has largely relied on people proactively reporting the content. That means by the time an image is reported, someone else has already seen it, chief operating officer Sheryl Sandberg said in an interview with The Associated Press. It is also often difficult and embarrassing for a victim to report a photo of themselves.

"This is about using technology to get ahead of the problem," Sandberg said.

Facebook still sees user-contributed photos as one way to address the problem, and says it plans to expand that program to more countries. It allows people to send in, through encrypted links, photos they fear might be circulated. Facebook then creates a digital code of the image so it can tell if a copy is ever uploaded, and deletes the original photo from its servers.
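Facebook's production fingerprinting system is proprietary, but the idea of keeping a compact digital code instead of the photo itself can be illustrated with an off-the-shelf perceptual hash. The sketch below uses the Python imagehash and Pillow packages as stand-ins; the file names and the matching threshold are assumptions for illustration, not Facebook's actual parameters.

```python
# Minimal sketch of hash-based re-upload detection, assuming the Pillow and
# imagehash packages (pip install Pillow imagehash). Facebook's matching
# technology is proprietary; this only illustrates the general idea of
# storing a compact fingerprint rather than the image itself.

from PIL import Image
import imagehash

MATCH_DISTANCE = 5  # assumed Hamming-distance threshold for "same image"


def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash; the original image need not be retained."""
    return imagehash.phash(Image.open(path))


def is_reupload(stored_hash: imagehash.ImageHash, upload_path: str) -> bool:
    """Compare a new upload's hash against a stored fingerprint."""
    return (fingerprint(upload_path) - stored_hash) <= MATCH_DISTANCE


if __name__ == "__main__":
    # Hypothetical file names for illustration only.
    reference = fingerprint("reported_image.jpg")
    print(is_reupload(reference, "new_upload.jpg"))
```

Because only the hash is stored, the service can recognize a copy of a reported image at upload time without keeping the sensitive photo itself on its servers, which is the trade-off the program described above is built around.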

The company does not expect the new technology to catch every instance of revenge porn, and said it will still rely on users reporting photos and videos.