
For criminal sentencing, data-driven policy in action


David Alan Soulé

Jinney Smith

Data-driven policy is all the rage, but it is much easier to find an advocate of the practice than it is a practitioner. So what happens when a state agency decides to use a data-driven approach utilizing information from multiple sources to address a policy question?

The Maryland State Commission on Criminal Sentencing Policy (MSCCSP) confronted that question when it addressed an issue raised at its annual public comments hearing. An attendee said that discrepancies in how juveniles are committed to the Maryland Department of Juvenile Services may lead to inconsistent, and potentially unfair, guidelines calculations if those youth re-offend and are sentenced as 18-to-22-year-old adults. Specifically, variations in the definition of juvenile commitment across local jurisdictions might produce disparities in the scoring of the juvenile delinquency component of an adult offender’s prior criminal record under the Maryland sentencing guidelines.

The juvenile delinquency component currently uses a combination of juvenile adjudications and commitments to assign a point value to the offender’s juvenile record, indicating whether he or she is at low, medium, or high risk of recidivism; that value then contributes to a measure of his or her overall prior record. Prompted by the public hearing, the MSCCSP collaborated with the Maryland Data Analysis Center (MDAC) to evaluate the juvenile score.
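To make the mechanics concrete, a point-based component of this kind can be sketched as a small function. Everything below is illustrative: the point values, thresholds, and risk-category cutoffs are invented assumptions, not the actual values in the Maryland guidelines.

```python
def juvenile_score(adjudications, commitments):
    """Assign a point value and risk category from a juvenile record.

    Point values and cutoffs here are hypothetical, for illustration only;
    the real scoring rules are set by the MSCCSP guidelines.
    """
    points = 0
    if adjudications >= 2:
        points += 1
    if commitments >= 1:
        points += 1
    if commitments >= 2:
        points += 1
    # Map the point total onto a low/medium/high risk category.
    category = ["low", "medium", "high"][min(points, 2)]
    return points, category
```

The key design feature the article describes is visible even in this toy version: because commitments contribute points, jurisdictions that define "commitment" differently would produce different scores for otherwise identical juvenile records.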

For the juvenile score study, data were collected for the more than 50,000 adults sentenced under the criminal sentencing guidelines in Maryland between 2008 and 2012. In addition to the MSCCSP’s own sentencing worksheet data, data were also provided by the Department of Juvenile Services (DJS) and the Department of Public Safety and Correctional Services (DPSCS). The DJS data allowed the MDAC to conduct an in-depth audit of the juvenile component score as captured on sentencing worksheets and to evaluate its efficacy. The combination of linked DJS and DPSCS data allowed the MDAC to design and test several juvenile delinquency scores, and to validate the scoring alternatives across five different measures of adult recidivism — from being rearrested for any offense to being reconvicted for a violent felony. (Without the financial support of the Laura and John Arnold Foundation to establish the MDAC and fund projects helpful to Maryland agencies, this study would have been impossible.)
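The validation step the study describes — checking each candidate score against several definitions of recidivism — can be sketched as a simple tabulation: for a given score and outcome measure, compare recidivism rates across score levels. The field names and records below are invented for illustration; the study's actual methods are not reproduced here.

```python
from collections import defaultdict

def recidivism_rate_by_level(records, score_field, outcome_field):
    """Return {score_level: share of offenders with the given outcome}."""
    counts = defaultdict(lambda: [0, 0])  # level -> [recidivated, total]
    for r in records:
        level = r[score_field]
        counts[level][1] += 1
        if r[outcome_field]:
            counts[level][0] += 1
    return {lvl: hit / total for lvl, (hit, total) in counts.items()}

# Hypothetical linked records: one candidate score, two recidivism measures.
records = [
    {"adj_score": 0, "rearrest_any": False, "reconvict_violent": False},
    {"adj_score": 0, "rearrest_any": True,  "reconvict_violent": False},
    {"adj_score": 2, "rearrest_any": True,  "reconvict_violent": True},
    {"adj_score": 2, "rearrest_any": True,  "reconvict_violent": False},
]
rates = recidivism_rate_by_level(records, "adj_score", "rearrest_any")
```

A score with good predictive ability would show recidivism rates rising monotonically with the score level, and would do so consistently across all five outcome measures.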

The MDAC study linked data from three unique sources, providing a detailed picture of an offender’s criminal justice involvement from juvenile record through post-sentence recidivism. Although these three data sources provide a comprehensive view of one’s involvement in the Maryland criminal justice system, it was also necessary to get insight from the individuals who score the sentencing guidelines in order to identify any obstacles or inconsistencies in measuring one’s juvenile involvement. Accordingly, the MSCCSP surveyed state’s attorneys and parole and probation agents. The survey results indicated that few had access to detailed data regarding juvenile commitments (e.g., length of commitment, type of facility, seriousness category of offense). This additional information was instrumental in guiding decisions about revising the scoring system.
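The linkage of the three sources can be sketched as a join keyed on a shared person identifier, with the sentencing worksheets as the base population. This is a simplification under stated assumptions: the field names are hypothetical, and real cross-agency data often lack a common identifier, requiring probabilistic matching on name and date of birth instead.

```python
def link_sources(worksheets, djs, dpscs):
    """Join sentencing worksheets with juvenile (DJS) and recidivism (DPSCS)
    records on a shared person_id. Field names are illustrative only."""
    djs_by_id = {r["person_id"]: r for r in djs}
    dpscs_by_id = {r["person_id"]: r for r in dpscs}
    linked = []
    for w in worksheets:
        pid = w["person_id"]
        row = dict(w)
        row["juvenile"] = djs_by_id.get(pid)      # None if no juvenile record
        row["recidivism"] = dpscs_by_id.get(pid)  # None if no follow-up record
        linked.append(row)
    return linked

# Tiny hypothetical extracts from each source.
worksheets = [{"person_id": "A1", "offense": "theft"},
              {"person_id": "B2", "offense": "assault"}]
djs = [{"person_id": "A1", "commitments": 1}]
dpscs = [{"person_id": "B2", "rearrested": True}]
linked = link_sources(worksheets, djs, dpscs)
```

Keeping the worksheets as the base population matters: an offender with no DJS match (a `None` juvenile record) is still part of the study, scored as having no juvenile involvement.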

Several alternative scoring methods were discussed by a commission subcommittee, which ultimately voted last year in favor of one based solely on juvenile adjudications, thus reducing the disparities associated with juvenile commitments and improving the score’s predictive ability. The full commission deliberated on the recommendation at its September and December 2017 meetings, ultimately adopting the subcommittee recommendation. The new juvenile score takes effect July 1.

More than a model

In criminal justice policymaking, the terms “data-driven” and “evidence-based” are frequently invoked, but the fact is that the work needed to support such standards for policymaking is both time-consuming and expensive. Collecting the data needed for the study took nearly a year, and another year was devoted to analyzing and presenting the data to the commission. We learned in the course of this project that, without existing linked agency datasets, periodically refreshed for research purposes, “data-driven” done right can only occur at a very slow speed — generally too slow to keep pace with the policymaking process.

We also learned how any criminal justice agency’s data, studied in isolation, will rarely produce a complete answer to the complex questions policymakers face. Finally, moving beyond the data, we learned how important it is to find clear ways to present complex statistical evidence to policymakers, and for academic researchers to be cognizant of the policymaking audience and its concerns.

We hope to build on this experience and encourage greater data-sharing and linking in support of evidence-based criminal justice policies in Maryland. When policymakers have questions, assembling and analyzing the necessary data typically takes time and resources; too often, questions go unaddressed because linked data are not quickly available for analysis. As with most things, data-driven policy is easier the second time around: data already collected can be reused, or added to, to address new questions.

In designing a new juvenile history score, we completed a model project for evidence-based policy-making. We hope our experience does not just remain a model, but rather, the start of a habit in Maryland.

Jinney Smith, Ph.D., is associate director of the Maryland Data Analysis Center at the University of Maryland. She can be reached at [email protected]. David Alan Soulé, Ph.D., is executive director of the Maryland State Commission on Criminal Sentencing Policy. He can be reached at [email protected].