Microsoft claimed a breakthrough in its facial recognition software, whose inaccuracy at identifying Black people has echoed the well-documented difficulty white eyewitnesses have telling African-Americans apart.
The software giant announced on Tuesday that it had reduced the error rates of its facial recognition technology for men and women with darker skin by as much as 20 times. Is that enough, given how poorly the software has performed?
This comes against the backdrop of an MIT study that dealt a damaging blow in February to the commercial facial recognition software industry. The systems, to varying degrees, were highly accurate at identifying white male faces (99 percent in some cases) but least accurate at identifying Black women, with error rates of roughly 35 percent.
The problem is that Silicon Valley software engineers are predominantly white men, and the software was largely trained to recognize people who look like them, building that bias into the results.
That is especially troubling because law enforcement agencies, which are already racially biased, are using the software.
The technology “is not ready for use by law enforcement,” argued Brian Brackeen, CEO of the facial recognition software developer Kairos, in a TechCrunch opinion piece.
“As a result, I (and my company) have come to believe that the use of commercial facial recognition in law enforcement or in government surveillance of any kind is wrong — and that it opens the door for gross misconduct by the morally corrupt,” he stated.
Indeed, the software appears to be no better than eyewitnesses in the criminal justice system. In December, New York’s highest court told judges across the state that defendants are now entitled to have jurors instructed about the “cross-racial identification problem,” joining four other states that have made similar declarations.
The criminal justice system has long known that eyewitnesses too often misidentify suspects of a different race. Stated another way, decades of research confirm that white people have trouble telling Black people apart. At least 247 of the 353 convictions overturned on DNA evidence stemmed from misidentification, the Innocence Project reported. Black men were the defendants in more than 200 of the exonerations that the organization handled.
Civil rights groups are also dubious about the use of facial recognition software in law enforcement. A group of 52 organizations sent an open letter to the Department of Justice in 2016 underscoring, based in part on a Georgetown University study, that the technology disproportionately harms communities of color.