MPD is looking for a new early intervention program — have they gotten any better?
The Department of Justice reported an earlier MPD system missed the mark
The city of Minneapolis will soon be shopping for new early intervention software for police. The programs are designed to identify officers who are struggling before misconduct arises. They're also not new; they've been around since the 1980s and have often been criticized as ineffective.
But a new program has caught the attention — and dollars — of several large police departments over the past year and could be in the running here in Minneapolis. It's called First Sign and is based on technology developed by the University of Chicago.
Rayid Ghani, now a professor at Carnegie Mellon University, was part of the team that worked on First Sign. He joined All Things Considered guest host Steven John to talk about how it differs from past programs.
The following is an edited transcript of the interview. Listen to the full interview with the audio player above.
Tell us a little bit about what this program is trying to do and how it works.
We partnered with a set of police departments through the White House to initially evaluate their existing systems. We found that they were not at all effective. So then we started looking at how do we actually improve on them? How do we build something that works?
The idea was to take machine learning and the data that departments are already collecting about these officers — their training, any stops they make, any complaints against them, any investigations from internal affairs — and use that to predict whether an officer is at risk of doing one of those things we don't want them to do. And then use those predictions to inform their supervisors, so that they could prioritize those officers for different types of interventions.
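As a rough sketch of the approach Ghani describes, here is a minimal, hypothetical example: a model trained on historical officer records that scores active officers by predicted risk. All column names and data below are invented for illustration; the real First Sign system is not public and is surely more sophisticated.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical per-officer features aggregated from records a department
# already collects: training, stops, complaints, internal affairs cases.
records = pd.DataFrame({
    "officer_id":        [101, 102, 103, 104, 105, 106, 107, 108],
    "training_hours":    [40, 12, 35, 8, 50, 20, 15, 45],
    "stops_last_year":   [120, 300, 90, 410, 60, 250, 330, 80],
    "complaints":        [0, 4, 1, 6, 0, 2, 5, 1],
    "ia_investigations": [0, 1, 0, 2, 0, 1, 2, 0],
    # Label: did an adverse incident occur in the following year?
    "adverse_incident":  [0, 1, 0, 1, 0, 0, 1, 0],
})

features = ["training_hours", "stops_last_year", "complaints", "ia_investigations"]
X = records[features]
y = records["adverse_incident"]

# Fit on historical outcomes, then score every active officer.
model = GradientBoostingClassifier(random_state=0)
model.fit(X, y)
records["risk_score"] = model.predict_proba(X)[:, 1]

# The highest-scoring officers are surfaced to supervisors for supportive
# intervention (counseling, retraining), not automatic discipline.
print(records.sort_values("risk_score", ascending=False)[["officer_id", "risk_score"]])
```

The key difference from the older systems he goes on to describe is that the ranking comes from a model fit to past outcomes, not from a fixed counter.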
Is it just more data input into the system that makes it better than similar such programs?
You would hope that better and more data would be the differentiator. It turns out that the older systems didn't use any data. So this is more, but it's more compared to zero.
That was the shocking part to us. They set thresholds for things like use of force, and once you crossed that, it raised a flag. So it's not really an early warning indicator, it's basically a late warning. They're basically saying, if you've done horrible things in the past, you will do them again. Any intervention at that point is basically punitive.
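By contrast, the threshold-based logic Ghani criticizes amounts to little more than a counter with a cutoff. A minimal sketch, with invented threshold values:

```python
# Threshold-style "early warning" logic: a flag is raised only after an
# officer has already crossed a fixed count, so it reacts to past behavior
# rather than predicting future risk. Thresholds are invented for illustration.
USE_OF_FORCE_THRESHOLD = 3
COMPLAINT_THRESHOLD = 5

def late_warning_flag(use_of_force_count: int, complaint_count: int) -> bool:
    """Flag an officer only once a fixed threshold has been crossed."""
    return (use_of_force_count >= USE_OF_FORCE_THRESHOLD
            or complaint_count >= COMPLAINT_THRESHOLD)

# An officer with two use-of-force reports and four complaints stays
# invisible to this system until after the next incident.
print(late_warning_flag(use_of_force_count=2, complaint_count=4))  # False
```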
Culture is a big part of the discussion around the Minneapolis police department. A lot of people want to completely reimagine the force and say efforts like this, or body cameras, are Band-Aids on a system that was flawed from the beginning. What would you say to them?
They are absolutely correct that putting in some of these technologies without actually thinking about the overall culture is a mistake. I'll give you an example. The way these systems are built is they are given historical data about the officers, and they're given historical data about incidents that we would like to detect in the future.
Now, who decides what that incident is? In a typical department, whenever a use of force happens, the officers on the scene have to report it. So one, we're relying on reports that are coming from the officers themselves. Then internal affairs investigates these cases and decides what's justified and what's unjustified. So if you've got an internal affairs unit that's totally corrupt and says everything is justified, these systems are not going to have any data to find things that are unjustified.
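This labeling problem is easy to see in miniature: if training labels are derived from internal affairs rulings and every incident is ruled justified, the model has no positive examples to learn from. A hypothetical sketch:

```python
# Hypothetical illustration of the label problem: training labels come from
# internal affairs rulings. If every reported incident is ruled "justified,"
# the positive class disappears and the model has nothing to learn.
incidents = [
    {"officer_id": 101, "ia_ruling": "justified"},
    {"officer_id": 102, "ia_ruling": "justified"},
    {"officer_id": 103, "ia_ruling": "justified"},
]

labels = [1 if inc["ia_ruling"] == "unjustified" else 0 for inc in incidents]
print(labels)       # [0, 0, 0]
print(any(labels))  # False: no "unjustified" examples to train on
```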
The MPD is currently down some 300 officers. Do you think they've got the bandwidth to take on something new like this, with so few officers on staff?
It is an undertaking. It does require people to manage it, to maintain it, to customize it and then, most importantly, to actually act on it.
A different way of using the technology would be to make its outputs public. So when it raises an alert, the public can be involved in that. I don't know how the department feels about it. I'm not speaking on their behalf or on any company's behalf. But I think implementing this technology internally without the right resources, the right people and the right culture isn't going to change what we're trying to change.