In December 2017, Amazon boasted that it had perfected new face-recognition software for crowds, which it called Rekognition. It explained that the product was intended, in large part, for use by governments and police forces around the world. The ACLU quickly warned that the product was “dangerous” and that Amazon “is actively helping governments deploy it.”

“Powered by artificial intelligence,” wrote the ACLU, “Rekognition can identify, track, and analyze people in real time and recognize up to 100 people in a single image. It can quickly scan information it collects against databases featuring tens of millions of faces.” The group warned: “Amazon’s Rekognition raises profound civil liberties and civil rights concerns.” In a separate advisory, the ACLU said that Amazon’s “marketing materials read like a user manual for the type of authoritarian surveillance you can currently see in China.”
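To give a sense of how directly these capabilities are exposed to any paying customer, the sketch below is a minimal, hypothetical example of calling Amazon's publicly documented Rekognition API through the boto3 SDK: detecting the faces in a single photo and searching one of them against a previously indexed face collection. The collection name and file path are placeholders of my own, not anything drawn from Amazon's or the ACLU's materials, and this is not how any police deployment is actually configured.

```python
# Minimal sketch of the documented, customer-facing Rekognition API (boto3).
# "my-watchlist-collection" and "crowd_photo.jpg" are hypothetical placeholders.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are already configured

with open("crowd_photo.jpg", "rb") as f:
    image_bytes = f.read()

# Detect every face in the photo; Rekognition returns up to 100 faces per image.
detected = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["DEFAULT"])
print(f"Faces detected: {len(detected['FaceDetails'])}")

# Search the largest detected face against a collection of known faces,
# i.e. a database previously built with index_faces().
matches = client.search_faces_by_image(
    CollectionId="my-watchlist-collection",  # hypothetical collection name
    Image={"Bytes": image_bytes},
    MaxFaces=5,
    FaceMatchThreshold=80.0,
)
for match in matches["FaceMatches"]:
    face = match["Face"]
    label = face.get("ExternalImageId", face["FaceId"])
    print(f"{label}: {match['Similarity']:.1f}% similar")
```

The real-time video tracking the ACLU describes runs through the related Rekognition Video service, which processes live camera streams rather than single images, but the still-image flow above captures the basic pattern of matching faces against large pre-built databases.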

BuzzFeed obtained documents detailing Amazon’s work to implement the technology with the Orlando Police Department, documents that “reveal the accelerated pace at which law enforcement is embracing facial recognition tools with limited training and little to no oversight from regulators or the public.”