Mathematicians want to break with police, stop working on "predictive policing"

We’ve all heard about the national movement to defund the police. But what if some of our country’s premier eggheads decided to “de-data” the cops? That’s what’s been suggested in a letter endorsed by more than 1,500 mathematicians and statisticians that was sent to the august journal Notices of the American Mathematical Society. (I hear their swimsuit issue really knocks it out of the park.) As Popular Mechanics recently reported, the letter calls on colleagues in academia to “sever ties” with police departments across the country. The signatories want everyone to halt all work on “predictive policing” software and other high-tech tools that law enforcement uses to target and prevent crime more efficiently.


“Given the structural racism and brutality in U.S. policing, we do not believe that mathematicians should be collaborating with police departments in this manner,” the authors write in the letter. “It is simply too easy to create a ‘scientific’ veneer for racism. Please join us in committing to not collaborating with police. It is, at this moment, the very least we can do as a community.”

The signatories include Cathy O’Neil, author of the popular book Weapons of Math Destruction, which outlines the very algorithmic bias the letter rails against. There’s also Federico Ardila, a Colombian mathematician currently teaching at San Francisco State University, who is known for his work to diversify the field of mathematics.

“This is a moment where many of us have become aware of realities that have existed for a very long time,” says Jayadev Athreya, associate professor at the University of Washington’s Department of Mathematics.

Predictive policing is defined in the article as “the use of mathematical analytics by law enforcement to identify and deter potential criminal activity.” Sounds basic enough, right? If the police have a lot of ground to cover, it’s helpful to know where crimes tend to happen most so you can deploy your resources intelligently. Software developed for this purpose compiles police incident records into a database and generates predictions about where and when crimes are most likely to take place. After all, if you’re running the NYPD, you’re probably not going to put as many officers on Wall Street as you will in East New York, because the crimes being committed in the financial sector don’t tend to involve guns, drugs or sexual assaults, with the criminals generally preferring insider trading or fraud.
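For the technically curious, here is a minimal sketch of the kind of analysis such software performs, assuming nothing more than a hypothetical log of past incidents. The neighborhoods, offenses, and ranking scheme below are purely illustrative; no actual vendor’s product works exactly this way.

```python
# A minimal, illustrative sketch of "hotspot" analysis: count historical
# incidents per area and rank areas by recent crime frequency to suggest
# where patrols might be deployed. The incident log is hypothetical.
from collections import Counter
from datetime import datetime

# Hypothetical incident log: (timestamp, neighborhood, offense type).
incidents = [
    (datetime(2020, 6, 1, 23, 15), "East New York", "shooting"),
    (datetime(2020, 6, 2, 1, 40), "East New York", "robbery"),
    (datetime(2020, 6, 2, 14, 5), "Wall Street", "fraud"),
    (datetime(2020, 6, 3, 22, 50), "East New York", "assault"),
    (datetime(2020, 6, 4, 9, 30), "Upper East Side", "burglary"),
]

def hotspot_ranking(log, since):
    """Rank neighborhoods by number of recorded incidents after `since`."""
    counts = Counter(area for timestamp, area, _ in log if timestamp >= since)
    return counts.most_common()  # most incident-heavy areas first

# Suggest sending more officers to areas with the most recorded incidents.
for area, n in hotspot_ranking(incidents, datetime(2020, 6, 1)):
    print(f"{area}: {n} recorded incidents")
```

The point of the sketch is simple: the software goes wherever the recorded incidents are. Feed it a log dominated by East New York and it will keep sending cops to East New York.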


Unfortunately for the social justice warriors in the mathematics community, this means that the software will generally predict a higher propensity for crime in low-income neighborhoods that tend to be more heavily populated by minorities. This is seen by some on the left as evidence of racism in the software. In reality, it’s evidence of where the majority of violent and property crimes are taking place. If most of the shootings were happening on the Upper East Side of Manhattan, that’s where the software would send the cops.

The article also notes how frequently predictive policing software is paired with facial recognition technology. But the latter is most often used after a crime takes place, not in a predictive fashion. Still, if the software keeps picking out suspects who are persons of color, then that algorithm must be racist too. Or at least that seems to be the theory.

What we’re seeing here is yet another example of virtue signaling from the left, this time from the segment of academia that deals with mathematics and coding. And the default response is to figure out a way to make it harder for the police to do their jobs. Sooner or later they’ll manage to hinder the police to a measurable degree. Then they’ll get a sense of what America looks like without the thin blue line maintaining order in the streets. Good luck with that, math geeks.
