The use of automated technology in crime fighting

Elon Law Professor Michael Rich explored the implications of new technology in the identification of likely criminals at the 2015 Association of American Law Schools (AALS) Annual Meeting.

Michael Rich, associate professor of law, Elon University School of Law

Presented as part of the AALS Annual Meeting’s joint panel of the Section on Internet and Computer Law and the Section on Defamation and Privacy, Rich’s talk was titled “Automated Suspicion Algorithms: The Potential and Pitfalls of the Automated Classification of Likely Criminals.”

Forum moderator Michael Froomkin of the University of Miami School of Law said that automated decision making was “one of the most important issues in technology and law today,” noting that it cuts across the laws of war, tort, contract, regulatory compliance, privacy and discrimination.

Rich focused on criminal law, examining the use of algorithms and machine learning techniques to identify crimes ranging from drug dealing to auto theft to securities fraud. As examples of the current use of such technology, Rich said that the U.S. Securities and Exchange Commission “uses an automated computer system to analyze trading data that then alerts them of likely insider trading activity for further investigation.”

He said some video security systems can identify suspicious behavior, such as detecting when people stop to look into cars at airports, a possible sign of an attempted break-in.

“We are seeing automated decision making infiltrate and become more important in day-to-day police work than it has been historically,” Rich said.

Rich focused on two areas of concern with regard to the use of algorithms for the identification of likely criminals: secrecy and the possible negative effects of the use of algorithms on policing strategies.

“Just as folks who are using automated systems in other areas will over-rely on them, I think police are highly likely to over-rely on algorithms that identify suspicious individuals as well,” Rich said.

Extending this point, Rich said that overreliance on algorithms would reduce the likelihood of achieving the goals of automated technology in the criminal justice system.

“If all police are doing is investigating people who already fit profiles established from historical data, the algorithms will be unable to learn about new instances, new changing techniques or strategies used by criminals, which means that those same algorithms will become less useful over time,” Rich said. “Ultimately the algorithm becomes self-confirming. It identifies a certain class of criminal, it finds them, and then it comes to believe that that is the only kind of activity that is indicative of that type of criminal conduct, when in fact criminals may be adjusting or simply evolving even if not in response to the algorithm itself.”
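The self-confirming dynamic Rich describes can be illustrated with a simple simulation. The sketch below is hypothetical and not drawn from his presentation: it assumes an algorithm that flags only profiles already confirmed in its training data, while police investigate only what the algorithm flags, so the training data never reflects criminals who adopt a new tactic.

```python
# Hypothetical sketch of the feedback loop Rich describes: an algorithm
# trained only on the outcomes of its own referrals never observes new
# criminal behavior and becomes less useful over time.
import random

random.seed(0)

def truly_suspicious(profile, new_tactic_active):
    """Ground truth: the 'old' profile is criminal; once criminals adapt,
    a 'new' profile also becomes criminal."""
    return profile == "old" or (new_tactic_active and profile == "new")

def algorithm_flags(profile, training_profiles):
    """The algorithm flags only profiles it has previously seen confirmed."""
    return profile in training_profiles

training_profiles = {"old"}           # built from historical arrest data
for year in range(1, 6):
    new_tactic_active = year >= 3     # criminals change tactics in year 3
    population = [random.choice(["old", "new", "innocent"]) for _ in range(1000)]

    # Police investigate only what the algorithm flags, so the training data
    # is refreshed only with already-known profiles: the loop self-confirms.
    flagged = [p for p in population if algorithm_flags(p, training_profiles)]
    training_profiles |= {p for p in flagged
                          if truly_suspicious(p, new_tactic_active)}

    total_crime = sum(truly_suspicious(p, new_tactic_active) for p in population)
    detected = sum(truly_suspicious(p, new_tactic_active) for p in flagged)
    print(f"Year {year}: detected {detected}/{total_crime} offenses")
```

In this toy model, detection is complete until criminals adopt the new tactic, after which the algorithm misses roughly half of all offenses while still appearing to confirm its own profile.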

Rich is scheduled to present scholarship in this area at Cumberland School of Law at Samford University in March and at Stetson University College of Law in April.

An audio cast of the entire automated technology panel at the 2015 AALS Annual Meeting is available here.

More information about Elon Law Professor Michael Rich is available here.