Montana’s Economic Affairs Interim Committee met Tuesday to discuss a draft bill that would regulate the use of facial recognition technology in criminal investigations. According to Seaborn Larson’s reporting for the Helena Independent Record, local police agencies in Montana are not currently using facial recognition, but they urged lawmakers not to restrict their ability to use the technology to investigate serious crimes. The legislation, currently in draft form, would allow police to use facial recognition for certain crimes, including assault with a deadly weapon and deliberate homicide. Facial recognition software is a known contributor to wrongful arrests, and innocence advocates are concerned that it will lead to wrongful convictions if lawmakers fail to prohibit its use as a suspect identification tool.
Facial recognition technology consistently fails to accurately identify people of color, and especially women of color. A study published in the scientific journal IEEE Transactions on Information Forensics and Security found these technologies are least accurate at identifying subjects who are female, Black, and 18–30 years old. Another peer-reviewed article, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” compared the accuracy of three commercial algorithms across four groups: darker-skinned females, darker-skinned males, lighter-skinned females, and lighter-skinned males. Every algorithm was least accurate at identifying darker-skinned females, with error rates up to 34 percentage points higher than for lighter-skinned males. Similarly, the National Institute of Standards and Technology evaluated 189 algorithms and found facial recognition to be least accurate at identifying women of color.
In January 2022, the Innocence Project responded to the White House Office of Science and Technology Policy’s request for information on Public and Private Sector Uses of Biometric Technologies, which include facial recognition software. The Project warns against using these technologies in investigations because once an innocent person becomes a person of interest, tunnel vision often leads investigators to ignore exculpatory evidence.
This happened to Nijeer Parks in February 2019. Based on a facial recognition identification, he was falsely accused of shoplifting and of attempting to hit a police officer with his car. The real perpetrator had left a fake ID at the scene, and facial recognition software flagged Nijeer as a match to the photo on the ID. He was arrested and charged with assault, theft, and eluding arrest, charges that together carried a potential 25-year prison sentence. He seriously considered taking a plea deal if one were offered. About six months later, he found a photo of a receipt, taken at the time of the crime, that proved he was 30 miles away. The case was dismissed for lack of evidence in November 2019. He is now suing the police department, local officials, and the maker of the facial recognition software that falsely identified him. The lawsuit describes how police failed to collect DNA or fingerprint evidence from the scene that would have eliminated him as a suspect.
“I could be talking to you from prison right now trying to explain my innocence,” Nijeer told Wired. “I just don’t want that to happen to anybody else.”
Nijeer is not the only Black man to be falsely identified and arrested based on facial recognition alone. Other notable cases include those of Robert Williams and Michael Oliver. Robert was arrested in January 2020 for allegedly stealing $3,600 worth of watches. His charges were dropped after an Instagram Live video of him singing proved he was 50 miles away.
“And as any other Black man would be, I had to consider what could happen if I asked too many questions or displayed my anger openly, even though I knew I had done nothing wrong,” Robert told Wired.
Like Nijeer, Robert strongly considered the possibility that he would have to take a plea deal. This illustrates a recurring innocence problem: defendants pleading guilty to crimes they did not commit to avoid the cost of a trial and the risk of a longer sentence if convicted. Because of this, we may never know how many people have pleaded guilty to accusations based on facial recognition software.
Michael was arrested in July 2019 for allegedly grabbing a smartphone from a teacher who was recording a fight at school and throwing it to the ground. Facial recognition software identified him as the man in the video. Prosecutors later dropped the charges after Michael’s public defender pointed out that the man in the video did not resemble him; among other differences, Michael’s arms are covered in tattoos, while the man in the video had none. Although he proved his innocence, Michael lost his job because of the arrest.
“I’ve got a son, I’ve got my family, I’ve got my own little house, paying all my bills, so once I got arrested and I lost my job, it was like everything fell, like everything went down the drain,” Michael told Wired.
Montana’s draft bill currently allows facial recognition technology to be used only in investigations of serious crimes. These crimes may be the most challenging to investigate, but they are also the crimes that carry the longest sentences, and therefore the gravest consequences for anyone wrongfully arrested or convicted of them. Using unreliable technology to solve crime does not support public safety; in fact, it jeopardizes public safety, because when the wrong person is arrested and convicted, the actual perpetrator is not held accountable.
In their response to the White House’s request, the Innocence Project states: “In order to narrow the entry point for innocent people into a criminal legal system, it is the Innocence Project’s position that investigative biometric technologies must meet the same standards of accuracy and reliability expected of court admissible evidence and must further demonstrate their capacity for just and equitable application prior to their implementation in the criminal legal system. To require anything less is tantamount to facilitating the experimentation of these technologies on society. This is a painful and intolerable risk. The narrative that policing strategies and due process will weed out innocent people prior to conviction has been disproven by thousands of wrongful convictions.”
Click here to follow the progress of Montana’s Economic Affairs Interim Committee’s draft bill on facial recognition technology. Click here to submit a public comment for the committee’s consideration.