Facebook Launches Photo-Matching Tool to Combat Revenge Porn

Revenge porn is pervasive, and Facebook wants to do its part to stop it from spreading on its platforms.

The term refers to non-consensual pornography that’s distributed online to shame, exploit or extort its victims.

And on Wednesday, the company said it would apply photo-matching to ensure that intimate, non-consensual images reported once can't be uploaded again across Facebook's properties, including Messenger and Instagram.

Facebook (FB, Tech30) said that once an image is reported, it is reviewed by the company's community operations team, and then photo-matching is applied.

From there, “if someone tries to share the image after it’s been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it,” Facebook head of global safety Antigone Davis said in a company blog post.

A study from the Data & Society Research Institute found that one in 25 people has been threatened with, or been the victim of, revenge porn posts. The phenomenon is emotionally distressing and has even been linked to publicized suicides stemming from the shame and bullying that often follow.

“It’s wrong, it’s hurtful, and if you report [revenge porn] to us, we will now use AI and image recognition to prevent it from being shared across all of our platforms,” said CEO Mark Zuckerberg in a Facebook post Wednesday afternoon. “We’re focused on building a community that keeps people safe.”


SOURCE: CNN Money, Sara Ashley O’Brien