With the new "revenge porn" tool, users will be able to flag offending images, which will then be reviewed by the content operations team. If the photo is in violation of Facebook’s community standards, it will be removed. The account that shared the photo will also be subject to review for removal.
The photo-matching tool can also detect removed images and block them from being shared again.
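Facebook has not published the details of its matching technology. Conceptually, systems like this compare a fingerprint ("hash") of each newly uploaded image against a blocklist of hashes of images that moderators have already removed. A minimal sketch of that idea, using a simple perceptual "average hash" (the class names, threshold, and tiny in-memory blocklist here are illustrative assumptions, not Facebook's implementation):

```python
# Sketch of hash-based re-upload blocking (illustrative only; Facebook has
# not disclosed its algorithm). An "average hash" thresholds each pixel
# against the image's mean brightness; comparing hashes by Hamming distance
# lets near-duplicates (recompressed or slightly edited copies) still match.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

class RemovedImageIndex:
    """Blocklist of hashes of images already taken down by moderators."""
    def __init__(self, max_distance=2):
        self.hashes = set()
        self.max_distance = max_distance

    def add(self, pixels):
        self.hashes.add(average_hash(pixels))

    def is_blocked(self, pixels):
        h = average_hash(pixels)
        return any(hamming(h, known) <= self.max_distance
                   for known in self.hashes)

# After moderators remove an image, index it; later uploads of the same
# (or a slightly altered) image are caught before they can be shared.
removed = [[200, 10, 30, 220], [15, 240, 250, 5],
           [0, 255, 128, 60], [90, 30, 210, 200]]
index = RemovedImageIndex()
index.add(removed)

reupload = [[198, 12, 30, 221], [15, 239, 250, 6],
            [0, 254, 128, 61], [92, 30, 209, 200]]  # near-duplicate copy
print(index.is_blocked(reupload))  # prints True
```

In production such a fingerprint would be a robust perceptual hash computed over real image data, but the matching logic, hash a removed image once and compare every new upload against the blocklist, follows the same shape.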
"These new tools are a huge advancement in combatting nonconsensual pornography and I applaud Facebook for their dedication in addressing this insidious issue, which impacts the lives of individuals and their loved ones across the country and around the world," said Rep. Jackie Speier (D-Calif.), who has called for legislation against nonconsensual porn being disseminated.
The social network giant has been developing the technology over the past year, and has consulted with the Revenge Porn Helpline in the United Kingdom, the National Network to End Domestic Violence, the Center for Social Research, the Cyber Civil Rights Initiative and other groups.
In a July 2016 press release, Speier said, “Technology today makes it possible to destroy a person’s life with the click of a button or a tap on a cell phone. That is all anyone needs to broadcast another person’s private images without their consent. The damage caused by these attacks can crush careers, tear apart families, and, in the worst cases, has led to suicide … What makes these acts even more despicable is that many predators have gleefully acknowledged that the vast majority of their victims have no way to fight back.”
The issue of revenge porn on Facebook was highlighted earlier this year when a 30,000-member private Facebook group called Marines United was discovered, in which servicemen would post photos of their female counterparts in sexual and compromising positions, often without their knowledge or consent.
Members of the group scattered once their behavior became public, with some moving on to other groups with similar aims. As a result of the scandal, the Marine Corps updated its social media conduct regulations and encouraged leadership to report incidents immediately and support victims of revenge porn.
Facebook’s global head of safety Antigone Davis told the BBC, "We are constantly looking to build and improve the tools that we offer and it became very apparent to us that this was a problem occurring across many regions that created unique harm."
Since the tool relies on users flagging photos, it isn’t clear how effective it will be at curbing the spread of intimate images in private groups like Marines United, where members join expressly to share such photos.