Cleveland, Ohio police may have botched their chances of convicting an alleged murderer by using the controversial Clearview AI facial recognition tool as the sole evidence justifying a search of the suspect’s home.
In February, the department arrested Qeyeon Tolbert and charged him with the murder of Blake Story, who was shot twice in the back after leaving a blood plasma donation center. Investigators homed in on Tolbert after they sent a CCTV video of a suspect to the Northeast Ohio Regional Fusion Center—a group that pools the surveillance capabilities of local, state, and federal agencies—and “received an identification” of the man in the video, according to court records.
Based on the identification, CPD obtained a search warrant for Tolbert’s home, where officers allegedly recovered a firearm and other evidence. But a Cuyahoga County judge ruled earlier this month that none of that evidence is admissible at trial. The Cleveland Plain Dealer was the first to report the ruling.
Tolbert’s attorneys argued that CPD’s vague account of receiving an identification from the fusion center left out a crucial fact: The only evidence pointing to Tolbert was a report from Clearview AI, which explicitly said in a disclaimer at the bottom that the results “are to be treated as investigative leads and should not be solely relied upon for making an arrest.”
At least eight people around the country have been wrongfully arrested after facial recognition tools incorrectly identified images of suspects, according to a recent Washington Post investigation that found that more than a dozen police departments made arrests based on facial recognition matches without any other corroborating evidence.
In cases where police have made false arrests based on facial recognition matches, departments have ended up paying hundreds of thousands of dollars to settle the resulting lawsuits.
While facial recognition tools often achieve high accuracy metrics in lab testing, they can be less effective in real-world settings where humans can introduce errors, such as using poor-quality images or images of the wrong person.
In the Cleveland case, the CCTV video that the fusion center ran through Clearview AI and that produced a match for Tolbert was from six days after the shooting, which occurred on February 14. In an affidavit, a Cleveland detective wrote that on February 20 he observed a person enter a store who had “the same build, hair style, clothing, and walking characteristics” as the man seen in surveillance footage shooting Story. The detective downloaded the February 20 CCTV footage from the store and sent it to the fusion center.
As a result, Tolbert’s attorney pointed out, the Clearview facial recognition match wasn’t even derived from footage of the crime itself but simply footage of a man police thought looked like the shooter.
The Cuyahoga County prosecutor’s office has appealed the trial court’s ruling suppressing the evidence police allegedly recovered from searching Tolbert’s home.