In a controversial trial, Facebook is asking users in Australia to send the company naked photos and videos of themselves, so it can block the images if they're later uploaded as revenge porn by ex-partners — and the test-run will soon be extended to the UK, US and Canada.
Facebook's software would create a "hash" — a digital fingerprint of the photo — so it can be recognized if it's uploaded, and automatically blocked. The social media giant hopes pre-emptive action will be more effective than deleting images only after they are reported, by which time the damage will likely have already been done.
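To make the idea concrete, here is a minimal sketch in Python of the fingerprint-and-block approach. It uses a plain SHA-256 digest as the "hash" purely for illustration; Facebook's actual system relies on perceptual matching (the Microsoft-developed technology mentioned below), which still recognizes resized or re-encoded copies of an image, something a cryptographic digest cannot do. Every function and variable name here is an assumption, not Facebook's API.

```python
import hashlib

# Illustrative blocklist of fingerprints; in the real system these are
# stored server-side. Names are assumptions, not Facebook's API.
blocked_hashes = set()

def fingerprint(image_bytes: bytes) -> str:
    """Return a fixed-length digest of the image bytes.

    A production system would use a perceptual hash so that resized or
    re-encoded copies still match; SHA-256 only catches byte-identical files.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def register_image(image_bytes: bytes) -> str:
    """Store only the fingerprint; the image itself is never retained."""
    digest = fingerprint(image_bytes)
    blocked_hashes.add(digest)
    return digest

def is_blocked(upload_bytes: bytes) -> bool:
    """Check an incoming upload against the blocklist before accepting it."""
    return fingerprint(upload_bytes) in blocked_hashes

if __name__ == "__main__":
    original = b"raw image bytes"          # stand-in for a real photo file
    register_image(original)
    print(is_blocked(original))            # True: an exact copy is blocked
    print(is_blocked(original + b"edit"))  # False: any alteration defeats SHA-256
```

Because the digest is one-way, it also cannot be decoded back into the original picture, which is the point the e-Safety commissioner makes below.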
Facebook has been data mining your personal information from the first moment you created an account. Now they are asking users to send them nude photos of themselves to "fight revenge porn." You can't make this stuff up!
Stop using #Facebook. You've been warned. pic.twitter.com/5VEwwfbEZW
— NMWD 📰 (@RealNMWD) November 8, 2017
Facebook and other technology companies use this photo-matching technology to tackle other forms of banned content, including child sex abuse and extremist imagery; the underlying technique was first developed by Microsoft in 2009.
Julie Inman Grant, Australia's e-Safety commissioner, said Facebook would not permanently store the images, only their digital fingerprints, which are capable of blocking further attempts to upload the pictures but cannot be decoded to produce the images themselves.
"We see many scenarios where maybe photos or videos were taken consensually at one point, but there was not any sort of consent to send the images or videos more broadly," she told Australian broadcaster ABC.
The same hashing technology has been used for years to prevent the spread of child porn, and is also used by internet companies to share and block terrorist images.
Making a Hash
Facebook began using the hashing technique in April 2017 to identify explicit images that had been reported and to prevent them from being re-shared; the trial takes the approach a step further by attempting to stop the photos from being posted in the first place.
During the trial, anyone worried that their images may be posted as revenge porn first contacts Australia's e-Safety commissioner through an online form; the commissioner's office may then suggest providing the images to Facebook. The user then sends the images to themselves over Facebook's Messenger app.
Did any Facebook lawyers vet this plan? Did they understand the tech? Seems like a failure of risk assessment to me.
— Paul Ohm (@paulohm) November 8, 2017
Facebook's customer support team will then review a blurred version of the image to confirm it is explicit, "hash" it, and delete it. Further attempts to upload the image to Facebook will then be automatically blocked. The company will, however, retain the blurred version for some time to ensure the technology is working correctly.
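As a rough sketch of how that review pipeline could be wired together, the code below mirrors the steps described above; the data structures, the 30-day retention window, and every name in it are assumptions for illustration, again using SHA-256 in place of the perceptual fingerprint.

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical review pipeline mirroring the steps described above;
# none of these names reflect Facebook's internal systems.

@dataclass
class Submission:
    image_bytes: bytes       # original image the user sent via Messenger
    blurred_bytes: bytes     # obscured copy shown to the review team
    received_at: datetime

RETENTION = timedelta(days=30)   # assumed length of the temporary retention window

def process_submission(sub: Submission,
                       reviewer_confirms_explicit: bool,
                       blocked_hashes: set,
                       blurred_store: dict) -> None:
    """Hash the original, delete it, and keep only the blurred copy for a while."""
    if not reviewer_confirms_explicit:
        return                                   # not explicit: nothing is hashed or blocked
    digest = hashlib.sha256(sub.image_bytes).hexdigest()
    blocked_hashes.add(digest)                   # future matching uploads are rejected
    sub.image_bytes = b""                        # the original is deleted after hashing
    blurred_store[digest] = (sub.blurred_bytes,  # blurred copy kept temporarily
                             sub.received_at + RETENTION)

def purge_expired(blurred_store: dict, now: datetime) -> None:
    """Drop blurred copies once their retention window has passed."""
    for digest in [d for d, (_, expiry) in blurred_store.items() if expiry <= now]:
        del blurred_store[digest]
```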
Revenge porn can result in prison sentences in many countries, but it remains a major problem on the social network, with an average of 54,000 cases a month. Roughly four percent of US internet users have been victims of revenge porn, according to a 2016 Data & Society Research Institute report.
The proportion rises to 10 percent among women under the age of 30, and internet users who identify as lesbian, gay, or bisexual are far more likely than heterosexual users to face threats of, or actual, nonconsensual image-sharing.
In all, 15 percent report receiving threats to post an explicit image of them online, and seven percent have actually had such images circulated digitally.