US police discontinue Amazon's facial recognition technology
The Orlando police department’s controversial pilot of facial recognition technology has ended following public outcry.
The department trialled Amazon's Rekognition by comparing photographs against a large database of images; reportedly, no images of members of the public were used during the pilot.
The technology, which is powered by artificial intelligence, can detect, track and analyse up to 100 faces in a single image, estimating attributes such as emotion and age range.
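For illustration only, the sketch below shows how this kind of analysis can be requested through Amazon's publicly documented Rekognition API using the boto3 Python SDK. The image file name is hypothetical, configured AWS credentials are assumed, and this is not the code used in the Orlando pilot.

# Minimal sketch: face analysis via Amazon Rekognition's detect_faces call.
# "crowd.jpg" is a hypothetical local image file.
import boto3

client = boto3.client("rekognition")

with open("crowd.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request estimated age range, emotions, etc.
    )

# Rekognition returns the largest faces found (up to 100 per image),
# each with estimated attributes and confidence scores.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f"Age {age['Low']}-{age['High']}, emotion: {top_emotion['Type']}")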
The pilot was condemned by civil rights group the American Civil Liberties Union (ACLU), which obtained and published emails showing the technology had been sold to US law enforcement agencies in Washington County, Oregon, and Orlando, Florida.
Nicole Ozer, Technology and Civil Liberties Director for the ACLU of California, said: "Rekognition marketing materials read like a user manual for authoritarian surveillance."
The ACLU was particularly concerned by research showing that facial recognition algorithms discriminate against black faces. A study by Furl, Phillips and O'Toole (2002) demonstrated that algorithms recognise minority-race faces less accurately than majority-race faces, indicating that the technology may be fundamentally flawed.
Amazon staff and shareholders have now contacted the company’s chief executive, Jeff Bezos, stating that they “refuse to contribute to tools that violate human rights”.
The letter also called for greater “transparency and accountability measures, that include enumerating which law enforcement agencies and companies supporting law enforcement agencies are using Amazon services, and how”.
An Amazon spokesperson told Police Professional: “We did a professional services engagement with the City of Orlando that was a pilot and had a discernible end date. That this engagement ended was expected and is not news.”
In a similar case, the Metropolitan Police Service (MPS) confirmed it would not be using facial recognition technology at this year’s Notting Hill Carnival. The system, which is ostensibly still on trial, has been used at the event each year since 2016 and was expected to be used at the carnival along with seven further deployments over the coming months.
The decision was taken following an investigation by campaigning group Big Brother Watch, which found that the equipment used by the MPS had a 98 per cent false positive rate.
There is currently no legal guidance covering police use of automated facial recognition to identify people in crowds or from CCTV footage, despite fears that such use amounts to illegal mass surveillance.