The American Civil Liberties Union (ACLU) is sounding a warning about the use of AI to create police reports, saying the tech could produce errors that affect evidence and court cases. The nonprofit highlighted the dangers in a white paper, following news that police departments in California are using a program called Draft One from Axon to transcribe body camera recordings and generate a first draft of police reports.
One police department in Fresno said that it’s using Draft One under a pilot program, but only for misdemeanor reports. “It’s nothing more than a template,” deputy chief Rob Beckwith told Industry Insider. “It’s not designed to have an officer push a button and generate a report.” He said that the department hasn’t seen any errors with transcriptions and that it consulted with the Fresno County DA’s office in training the force.
However, the ACLU noted four issues with the use of AI. First, it said that AI is “quirky and unreliable and prone to making up facts… [and] is also biased.” Second, it said that an officer’s memories of an incident should be memorialized “before they are contaminated by an AI’s body camera based storytelling.” It added that if a police report is just an AI rehash of body camera video, certain facts might be omitted, and it may even allow officers to lie about anything illegal they did that wasn’t captured on camera.
The third point concerned transparency: the public needs to understand exactly how these systems work, based on analysis by independent experts, according to the ACLU. Defendants in criminal cases also need to be able to interrogate the evidence, “yet much of the operation of these systems remains mysterious.” Finally, the group noted that the use of AI transcriptions might remove accountability around the use of discretionary power. “For these reasons, the ACLU does not believe police departments should allow officers to use AI to generate draft police reports,” it said.