I Used PimEyes, Software That Claims It Can Find Anyone’s Pictures. Here’s What I Found

AI software has caused concern in the media before. Deepfakes are perhaps the most visible example, but each new technology brings new worries. AI facial recognition software now has privacy advocates alarmed, but the tools are not likely to disappear.
A facial recognition tool newly available to the public has become a source of controversy after a recent New York Times article described it as a “potentially dangerous superpower from the world of science fiction.”
The software, a website called PimEyes, allows users to upload a photo of a face and find the pictures its algorithm determines show the same person. The New York Times tested it on dozens of journalists, with their permission, and found photos of all of them, some of which they had never seen before.
Contrary to the article, which states users must pay $29.99 for the service, the search itself can be run for free. Users only have to pay if they want to find out which websites those photos came from or if they want to search for more than three pictures a day.
The New York Times’ experience was far different from my own. We should start with theirs. They took photos of dozens of reporters and, with their permission, uploaded them to the site. PimEyes found photos of all of them, even when the subjects were looking away from the camera, with or without glasses, or wearing a mask.

My PimEyes Experience

My experience was far different. When I uploaded a picture of myself, PimEyes returned 10 results. One was of someone else, six were duplicates of a single picture and two more were duplicates of another. All told, the site found three unique pictures of me, along with around a dozen “lower quality results” that were of someone else.
So I uploaded a picture of my fiancée, with her permission, and PimEyes gave us one result from roughly ten years ago and dozens of false positives. I decided to try again with my future brother-in-law and was given three results, none of which were of him.
The difference between myself, the people I ran through PimEyes, and the reporters the New York Times searched for is obvious: online presence. While I have a healthy online presence, it does not compare to that of reporters working for the nation’s largest newspaper. My fiancée and brother-in-law have far smaller online footprints than even mine.
PimEyes did not even find my most embarrassing photos, like the mugshots from a DUI arrest 12 years ago. A few years ago, while I was writing an investigation into a cryptocurrency project that was shady at best, its creators attempted to discredit me by posting those mugshots in public forums. They found the pictures; PimEyes could not.
The Times notes that its search results came from news articles, wedding photography pages, review sites, blogs and pornography sites. Most matches were correct, though the paper notes that, in its reporters’ case, the ones from pornography sites were not.

Concerns Over AI Facial Recognition

The New York Times does mention a particular case that is cause for concern. Cher Scarlett, a computer engineer and labor rights advocate, tried out for a pornography company when she was 19. Though she found the experience too degrading to continue, her photos from that audition made their way onto the internet, and PimEyes found them. Until then, she had no idea those photos were online. She says she sent DMCA notices to the sites demanding the pictures be removed, but she was ignored.
What Scarlett went through, both during her initial ordeal and in discovering the pictures, sounds horrible. I do not mean to diminish either experience; both sound incredibly traumatic.
Those images are publicly online, with or without PimEyes. Anyone can look at them, and an untold number of people likely already had before PimEyes found them. The fear, of course, is that online stalkers could take social media pictures of someone they want to track down and search for them.
That is an understandable concern. While PimEyes tells its users to only upload photos of themselves or of someone who has given permission, there is nothing stopping cyberstalkers from ignoring that rule.

Is PimEyes Intrusive or Just the Future?

However, what PimEyes is doing is not much different from what Google does with websites. It looks for similarities to the search parameters and spits out the results. What Scarlett went through says more about the porn industry, the ineffectiveness of DMCA takedowns and the permanence of the internet than it does about facial recognition software like PimEyes.
That PimEyes is more technologically advanced than Google, and does more than just look for identical pictures, is a result of the continual march of technology. Search engines will keep getting better, and AI software is likely to be incorporated into more of the technologies we use daily.
PimEyes could certainly run their business in a more ethical way. For starters, they could make opting out of their results far easier. Multiple users, not just Scarlett, have publicly complained about their opt out requests being denied or failing to actually remove their photos from PimEyes’ results. Charging between $89.99 and $299.99 for their premium “PROtect” plans likewise feels skeevy and more than a little like extortion.
But the alternative is to remain completely in the dark, ignorant of what parts of your past are readily available online, and subject to the new tools that are surely coming down the pipeline. There are already several free, open-source AI facial recognition options out there, and it is only a matter of time before someone besides PimEyes connects one to a web crawler, if they haven’t already.
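To make that point concrete, here is a minimal sketch of what such a pipeline could look like, using the open-source face_recognition library as one example of the free tools mentioned above. The file paths and the crawling step are hypothetical, and this illustrates the general technique rather than describing how PimEyes itself works.

```python
# A rough illustration only: matching a reference face against a handful of
# downloaded images with the open-source face_recognition library. This is not
# how PimEyes works internally; it simply shows how freely available pieces
# (a face-matching library plus a separate crawler/downloader) could be combined.

import face_recognition  # open-source wrapper around dlib's face embeddings

# The photo of the person being searched for (yourself, in this sketch).
reference_image = face_recognition.load_image_file("my_face.jpg")  # hypothetical path
reference_encodings = face_recognition.face_encodings(reference_image)
if not reference_encodings:
    raise SystemExit("No face found in the reference photo.")
reference_encoding = reference_encodings[0]

# Candidate images, e.g. saved by a separate web-crawling step (paths are hypothetical).
candidate_files = ["crawled/page1_photo.jpg", "crawled/page2_photo.jpg"]

for path in candidate_files:
    image = face_recognition.load_image_file(path)
    # A page may contain several faces; compare each one to the reference.
    for encoding in face_recognition.face_encodings(image):
        # Lower distance means more similar; ~0.6 is the library's usual match threshold.
        distance = face_recognition.face_distance([reference_encoding], encoding)[0]
        if distance < 0.6:
            print(f"Possible match in {path} (distance {distance:.2f})")
```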
The difference is that PimEyes has made the task more accessible, and as first movers in the market, they have the ability to charge exorbitant fees. That is not ideal, but it is reality.
Hoping your embarrassing photos stay hidden like a needle in the internet’s haystack may be a semi-viable strategy for now, but it will not be for long. Better to know what is out there than to wait for someone else to find it, whether through AI or their own searching.

AI Facial Recognition is Already Here

Law enforcement, as the Times article points out, is already using AI facial recognition software. And unlike PimEyes, which ignores social media websites, video content and thumbnails, Clearview AI, the tool used by law enforcement, looks through everything in its never-ending goal of knowing exactly what we are doing and have done through time and space.
Accessible AI facial search allows us to use that same technology against them. While you would have to break PimEyes’ terms of service to do so, it could easily be used to identify a police officer who attacks protesters, or to dig up photos of politicians. Had someone run Representative Madison Cawthorn through PimEyes before voters cast their ballots for him, he might never have been elected to office.
The internet and AI are going to create more complex and scary tools in the future. Open-source software means those tools will be available to those who want them, regardless of regulations or what happens to a company like PimEyes. The key is not to try in vain to stop the continual march of technology; it is to find ways to use that technology to protect yourself. A hammer can be used to build a house, or it can be used to forge a sword.
PimEyes engages in some business practices I disagree with, but the company is only the tip of the AI iceberg. There will be others. It is better that the public be given access to these tools than for access to be limited to law enforcement and the technologically advanced. Restricting them would not protect us from the tools’ side effects; it would only create an uneven playing field.
Our digital lives are already stored on the internet, and it is only a matter of time before someone finds them. PimEyes does not change that; at worst, it makes it a little easier, a little sooner.