Facial Recognition Developer Told Cops To Test Out The Software By Running Searches On Friends And Family

from the inappropriate-access-is-the-baseline dept

More information about Clearview, the super-sketchy facial recognition software developer, is being made public… and none of it is good.

Clearview — an app created by a developer (Hoan Ton-That) whose previous software put Trump’s hair on people’s photos — is now perhaps the most infamous name in the loaded field of facial recognition tech developers. Ton-That’s app links to a proprietary database of face photos that’s several times larger than any the US federal government has managed to create.

But the only thing proprietary about it is the access. The photos that make up the supposed 3 billion images stored on Clearview servers all come from other companies’ websites. Any site that encourages users to upload photos of themselves has been scraped by Clearview, allowing it to amass a few billion photos with very little effort or expenditure on its part.

Those being scraped aren’t happy. While it’s not clearly illegal to violate sites’ terms of service by scraping photos for resale, it’s certainly discouraged. As long as the law remains unsettled, Clearview is free to scrape sites for photos and sell access to its database to cops.

Clearview may or may not be selling a lot of access to a lot of cops. I don’t normally use that mostly-meaningless (and wholly-useless) phrase, but Clearview hasn’t exactly been honest about its law enforcement partnerships. The company has claimed in marketing materials it has been instrumental in identifying wanted criminals and providing lists of suspects for cold cases. Conversations with the agencies named in Clearview’s sales pitches have pointed out the company’s claims are exaggerated, if not completely false.

Fudging facts is just part of the business model, apparently. Ever since Kashmir Hill’s investigation of the company for the New York Times went public, multiple journalists have dug up even more details about the company. Every new story makes the company look even shadier.

Clearview insists it’s a powerful tool that should only be used responsibly by government agencies. But it actually says something else to the law enforcement agencies it pitches its product to, according to documents obtained by BuzzFeed.

[I]n a November email to a police lieutenant in Green Bay, Wisconsin, a company representative encouraged a police officer to use the software on himself and his acquaintances.

“Have you tried taking a selfie with Clearview yet?” the email read. “It’s the best way to quickly see the power of Clearview in real time. Try your friends or family. Or a celebrity like Joe Montana or George Clooney.

“Your Clearview account has unlimited searches. So feel free to run wild with your searches,” the email continued.

The email chain also encouraged the department to freely use its “Invite User” function to give access to as many people as possible, with Clearview stating it would “immediately” open demo accounts for each invited user. “Running wild” must have worked for the Green Bay PD. It ended up signing a $3,000 contract with Clearview.

Clearview’s insistence that this irresponsibility is being handled responsibly by the company doesn’t make things much better. It asks the police to police themselves, something they often seem incapable of doing.

“As as [sic] safeguard we have an administrative tool for Law Enforcement supervisors and administrators to monitor the searches of a particular department,” Ton-That said. “An administrator can revoke access to an account at any time for any inappropriate use.”

Run wild… in moderation. It’s unclear how many law enforcement agencies have been offered the same pitch: an invitation to run inappropriate searches to test drive the product. But it could be in the hundreds. Or it could only be dozens. Clearview’s claims continue to shift, moving upwards with each new iteration of its website, which is now being scrubbed a bit cleaner as the company faces more scrutiny. At one point, Clearview said it was being used by 600 law enforcement agencies, putting it up there with Amazon’s Ring. Now, it says it’s working with “over a thousand independent law enforcement agencies.”

“Working with” may mean nothing more than “sent marketing materials to.” Or it could be Clearview is abusing its own code of conduct by running unsolicited searches on criminal suspects and forwarding this information to agencies. It has made bogus claims in the past about its contributions to investigations and it will likely continue to do so until it feels the endless rebuttals are hurting its ability to close sales.

The sales pitches are already light on facts. Going forward they might also include very questionable assertions about citizens’ rights and freedoms, like the one Clearview investor David Scalzo made on Dilbert creator Scott Adams’ podcast. Apparently, the thing that makes America great is the ability to scrape social media websites for photos and sell access to this database to law enforcement officers.

“The reason why America is the greatest and most prosperous is because of our Bill of Rights, and the First Amendment … says that we do not have to be hidden to be free,” he said. “We do not have to be hidden to say what we want, to share ideas, to share information, and to be with people.”

Ah. The 28th Amendment: “If you’ve got nothing to hide, you have nothing to fear.” I’m sure plenty of states would be willing to ratify this endorsement of unchecked domestic surveillance. While there is little expectation of privacy in public social media posts, the expectation that these won’t be harvested in bulk by third parties and sold to the government remains unchanged. Given the many flaws of facial recognition tech (built-in bias, high false positive rates), having nothing to hide should still give most people plenty to fear — especially when a private company is encouraging cops to “run wild” with unproven software.
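The base-rate math behind that fear is easy to sketch. The numbers below are illustrative assumptions, not published Clearview or NIST figures, but they show why even a seemingly tiny false positive rate produces mostly-wrong matches when one face is searched against billions of photos:

```python
# Base-rate sketch: why a "highly accurate" face search against a huge
# database still returns mostly wrong matches. All rates here are
# illustrative assumptions, not Clearview's actual figures.

database_size = 3_000_000_000   # Clearview's claimed photo count
true_matches = 1                # one person actually being sought
false_positive_rate = 1e-7      # generous: 1 bad hit per 10M comparisons
true_positive_rate = 0.99       # assume the real match is almost always found

expected_false_hits = (database_size - true_matches) * false_positive_rate
expected_true_hits = true_matches * true_positive_rate

# Probability that any given returned "match" is the right person:
precision = expected_true_hits / (expected_true_hits + expected_false_hits)

print(f"expected false hits: {expected_false_hits:.0f}")   # ~300 wrong faces
print(f"chance a hit is correct: {precision:.1%}")          # well under 1%
```

Under these assumptions a single search surfaces hundreds of innocent look-alikes for every genuine match, which is why "nothing to hide" is cold comfort to anyone whose photo got scraped.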

Companies: clearview, clearview ai


Comments on “Facial Recognition Developer Told Cops To Test Out The Software By Running Searches On Friends And Family”

11 Comments
Anonymous Coward says:

Re: What about the CFAA?

I Am Not A Lawyer, but it seems to be a matter of legal contention as to what counts. The level of barricades helps in terms of forced proof. It doesn’t have to be impermeable, but it needs actual gating off to prove they meant to "trespass". Content behind a login gate that isn’t deliberately permeable to search results is far more enforceable than, say, publicly linkable LinkedIn profiles. The courts wound up pretty much telling LinkedIn to stick their TOS where the sun doesn’t shine when they tried to sue over harvesting.

Stephen T. Stone (profile) says:

“We do not have to be hidden to say what we want, to share ideas, to share information, and to be with people.”

Have to? No. Want to? Possibly. Yes, sharing pictures of yourself on social media is a far cry from “be[ing] hidden”, but that shouldn’t justify scraping social media and building a facial recog database that cops can use to potentially put innocent people behind bars.

Anonymous Coward says:

Re: But, but copyright...

I do not recall assigning any copyright to anyone; perhaps you are talking about a limited-use license to duplicate?

For example, (afaik) Facebook is not given the latitude to sell your image to someone who is going to use it in some disgusting advertisement that will cause you some questions from family.

Anonymous Coward says:

Re: Re: Re: But, but copyright...

Yes. But they’d have to be sued by the copyright holders, not Facebook. To do that the copyright holders would have to prove their photos are being used. Which they can’t do because the database isn’t accessible to the public.

That being said, perhaps they really shouldn’t be reminding the law enforcement officers who do have access to the database that they could search for unlicensed use of their own photos.

ECA (profile) says:

Time for a bit of knowledge..

How to create a 3D image that can be identified:
Pictures have no depth.
There are tricks to lighting and how this all works.

  1. You need a square grid to project onto a face to show changes in the surface.
  2. Use multi-spectrum lighting.. not only visible, but UV and IR to scan the person. There are changes under the skin that can be seen, as well as being able to see through most makeup..
  3. It all has to be sampled at the same angles.. taking pics from above a person’s head only shows the bald spots.
  4. All of this would require special cameras for each spectrum used, and special lighting, as people go in and out without being bothered.

It can be done, but everyone wants this done on the cheap. And would love to have it work in seconds, not hours (depending on how many pics are stored and compared).

For all of that, I don’t think you can get past 50-60%.
