Monday, November 7, 2016

How Algorithms Could Help Keep People Out of Jail

STEVE LEIFMAN KNEW Miami-Dade’s courts had a problem. Ten years ago the longtime jurist realized that his county was putting too many people with mental health problems in jail. So he set up a psychiatric training program for 4,700 police officers and a new system to send people to counseling. The incarcerated population plummeted; the county shut down an entire jail.

But Leifman thought they still weren’t doing enough. So he asked the Florida Mental Health Institute to look at intake data for the county’s jails, mental health facilities, and hospitals and figure out who was using the system. It turned out that over five years, just 97 people with serious mental illnesses—5 percent of the jail population—accounted for 39,000 days in jails and hospitals. By themselves they had cost Miami-Dade $13 million. “This population was really hitting the system hard, without any good outcomes for them, society, or anybody,” Leifman says.
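To make the arithmetic concrete, here is a minimal sketch of the kind of high-utilizer analysis the institute ran: aggregate days across facilities per person, then rank. The record format, field names, and per-day cost are assumptions for illustration, not details of the county's actual data (the per-day figure simply back-solves $13 million over 39,000 days).

```python
from collections import defaultdict

# Hypothetical intake records: (person_id, facility_type, days_held),
# aggregated over a five-year window. Schema and values are illustrative.
intake_records = [
    ("p001", "jail", 120),
    ("p001", "hospital", 45),
    ("p002", "jail", 3),
    # ... thousands more rows in a real dataset
]

COST_PER_DAY = 333  # assumed blended daily cost: $13M / 39,000 days is roughly $333

def find_high_utilizers(records, top_n=97):
    """Rank people by total days spent in jails and hospitals."""
    days_by_person = defaultdict(int)
    for person_id, _facility, days in records:
        days_by_person[person_id] += days
    return sorted(days_by_person.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

high_utilizers = find_high_utilizers(intake_records)
total_days = sum(days for _, days in high_utilizers)
print(f"Top {len(high_utilizers)} utilizers: {total_days} days, "
      f"~${total_days * COST_PER_DAY:,} estimated cost")
```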
Across the country, jails and prisons have become repositories for people living with mental health issues. More than half of all prisoners nationwide face some degree of mental illness; for 20 percent of people in jails and 15 percent of those in state prisons, that illness is serious. Local criminal justice systems have to figure out how to care for these potentially complex patients—and how to pay for it.
Leifman’s team set up a more intensive system of care. Today, 36 health care providers in South Florida have access to a database of people in clinics or shelters to determine who they are and what help they need. Privacy laws keep its use limited, but the idea is to eventually widen the database’s scope and availability to other providers.
Cities across the country are starting to follow Miami-Dade’s example, trying to use data to keep low-level offenders out of jail, figure out who needs psychiatric help, and even set bail and parole. In the same way that law enforcement uses data to deploy resources—so-called predictive policing—cities are using techniques borrowed from public health and machine learning to figure out what to do with people after they get arrested. The White House’s Data-Driven Justice initiative is working with seven states and 60 localities, including Miami-Dade, to spread the ideas even further.
Eventually anyone moving through the justice system in Miami-Dade will have medical and family history, past arrests, and more entered into the database, built in partnership with the Japanese pharmaceutical company Otsuka—which, according to Leifman, has spent $70 million on the project. An algorithm will help predict what kind of help a person needs before they actually need it. Let’s say you have a 30-day prescription for bipolar medication but never get it refilled. This new system would flag the lapse and notify your case manager. (All this will have to comply with federal privacy regulations; the county is now figuring out who will have access—a public defender, a representative from the county mental health project, etc.) “If we can treat mental illness using more of a population model or disease model, not a criminal justice model, we’re going to get much better outcomes,” Leifman says.
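The refill scenario amounts to a simple rule-based flag. Below is a minimal sketch of how such a check might work, assuming a hypothetical prescription schema, grace period, and notification step; nothing here reflects the actual Otsuka-built system.

```python
from datetime import date, timedelta

# Hypothetical prescription record schema; field names are assumptions.
prescriptions = [
    {"person_id": "p001", "medication": "bipolar-med",
     "filled_on": date(2016, 9, 1), "days_supply": 30, "refilled": False},
]

GRACE_PERIOD = timedelta(days=7)  # assumed buffer before flagging a lapse

def lapsed_prescriptions(records, today=None):
    """Flag prescriptions that ran out and were never refilled."""
    today = today or date.today()
    flagged = []
    for rx in records:
        runs_out = rx["filled_on"] + timedelta(days=rx["days_supply"])
        if not rx["refilled"] and today > runs_out + GRACE_PERIOD:
            flagged.append(rx)
    return flagged

for rx in lapsed_prescriptions(prescriptions, today=date(2016, 10, 15)):
    # In a real system this would notify the assigned case manager,
    # subject to the privacy rules on who may see the record.
    print(f"Notify case manager: {rx['person_id']} lapsed on {rx['medication']}")
```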
This algorithmic approach is going way beyond mental health care. It all depends on what you put into the database. Some places use predictive software to help determine how likely people are to ­reoffend—which in turn influences their jail sentences and parole determinations. This is controversial, because the risk factors some algorithms take into consideration, like lack of education or unemployment, often disproportionately tag poor people and minorities. A ProPublica investigation found that Compas, an assessment tool used in Broward County, Florida, was 77 percent more likely to rate African American defendants as high risk. “Algorithms and predictive tools are only as good as the data that’s fed into them,” says Ezekiel Edwards, director of the ACLU’s criminal law reform project. “Much of that data is created by man, and that data is infused with bias.”
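One way to surface the kind of disparity ProPublica reported is a group-wise audit of false positive rates: among people who did not go on to reoffend, how often was each group still labeled high risk? The sketch below assumes a small, made-up labeled dataset; it is not Compas data or ProPublica's exact methodology.

```python
from collections import defaultdict

# Hypothetical audit rows: (group, labeled_high_risk, reoffended).
# Values are invented to illustrate the calculation.
records = [
    ("A", True, False), ("A", True, False), ("A", False, False), ("A", True, True),
    ("B", True, False), ("B", False, False), ("B", False, False), ("B", True, True),
]

def false_positive_rates(rows):
    """For each group, the share of non-reoffenders who were
    nevertheless labeled high risk."""
    flagged = defaultdict(int)
    did_not_reoffend = defaultdict(int)
    for group, high_risk, reoffended in rows:
        if not reoffended:
            did_not_reoffend[group] += 1
            if high_risk:
                flagged[group] += 1
    return {g: flagged[g] / did_not_reoffend[g] for g in did_not_reoffend}

for group, fpr in sorted(false_positive_rates(records).items()):
    print(f"Group {group}: false positive rate {fpr:.0%}")
```

A two-to-one gap like the one in this toy output is exactly the sort of signal that would prompt scrutiny of the features and training data feeding the tool.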
That’s why these predictive systems need oversight and transparency if they’re going to work. Leifman won’t use them in sentencing considerations, for example. “I want to make the decision, not leave it to a machine,” he says. “You don’t want a technology that takes away from using our own brains.” Still, even with more work to be done on training the algorithms, no one can argue with the potential to improve lives, save money, and create a more compassionate and just justice system.
