How the Justice System Can Learn From ‘Frequent Flyers’

For some Americans, health care and criminal justice are not two separate systems, but components of one big system that too often fails them. Frustrated cops call them “frequent flyers” because they regularly cycle between jail and hospital. So why do we think we can fix one without the other?

If a patient commits suicide within 72 hours of discharge, the Joint Commission (the accrediting body for hospitals) requires that a hospital conduct a “sentinel event” review.

That review will include a root cause analysis designed to uncover any mistakes or latent system weaknesses that contributed to the death.  It tries to learn whatever can be learned, and to report on steps aimed at preventing repetition.

But suppose the suicide is a “suicide-by-cop”: A distraught former patient succeeds in forcing the police to shoot him to protect their colleagues, a hostage, or a bystander.

Then, the familiar mechanisms of the criminal justice system will automatically activate too.

There will be a homicide investigation and a coroner’s report.

The performance of the officers who pulled their triggers will be examined by prosecutors and department officials. Did the cops have another choice? Did they act in self-defense? Was the shooting “within policy”? Is prosecution called for? Discipline?

Currently, the National Institute of Justice (NIJ) and the Bureau of Justice Assistance (BJA) are providing support (through a technical assistance grant to the Quattrone Center for the Fair Administration of Justice at the University of Pennsylvania Law School) for state and local criminal justice systems that want to go beyond the typical punishment-oriented reviews of the practitioners on the sharp end of the system and, when something goes wrong, conduct “sentinel event” reviews of their own, derived from the medical model. Jurisdictions are being recruited.

This will be a new thing in criminal justice.

These inquiries focus not only on the choices of the individual cops, forensic scientists, or lawyers implicated in a surprising outcome, but on the whole constellation of system factors.

Like the hospital reviews, they ask not only “What happened?” but also “Why did it happen?” The goal is safety, and not just the safety of suicidal people, but also of the police who are forced to confront them—officers who, even if they survive their encounters, are traumatized by the experience.

So, in a criminal justice “sentinel event” review of a suicide-by-cop, the role of training and supervision could be examined. Was the cop trained in de-escalation techniques and equipped with non-lethal options? Did the department have a crisis intervention capacity?

Had the 911 dispatcher gathered, and then conveyed, the useful information? To the right people? If not, why not? How was this fatal situation created? How can it be avoided?

These are system-oriented event reviews, not personnel-focused performance reviews; they look forward, and they aim at prevention, not at blame.

Are two reviews (or even three, if your city or state has opted into the NIJ/BJA Sentinel Event effort) better than one?

An innovative recent study in Camden, N.J. (reported in The Crime Report by two of its leaders), casts doubt on that proposition.

In fact, the Camden findings (in my opinion, anyway) argue that we don’t need multiple parallel studies, but rather unified, collaborative learning reviews that enlist not only medical and criminal justice stakeholders working together, but also members of the communities and the sub-groups they serve.

We need these learning reviews not only for the spectacular officer-involved fatalities, but for the “high frequency/low impact” missteps characteristic of daily criminal justice life in what a recent book called “Misdemeanorland.”

See: The Crime Report’s Q&A with Issa Kohler-Hausmann, author of “Misdemeanorland.”

The Camden study, conducted by the Camden Coalition of Healthcare Providers with support from the Laura and John Arnold Foundation, broke down the walls of multiple data silos in health care and criminal justice in Camden and used the numbers to illuminate the lives of a specific group of individuals entangled with both sets of practitioners.

To pull just one telling statistic from among the study’s many: 67 percent of the people who cycled through Camden’s emergency departments over the course of the study also cycled through its criminal justice system.

What I think the members of this group—“super-utilizers” to the public health practitioners, “frequent flyers” to the cops—could tell us is that, from their perspective, health care and criminal justice are not two systems. For frequent flyers, health care and criminal justice constitute one big system that dominates their daily efforts to survive.

It is pointless to think of a jail or an emergency room as “upstream” or “downstream.”  Each is simultaneously upstream and downstream of the other.

People who think about safety in other contexts draw a contrast between a complicated system and a complex system. A jet airplane is a very complicated machine, but it can still be thought of in linear, sequential terms: if component x fails, then y will happen. So, find and fix the component.

But jet airliners in operation “become complex because they are opened up to influences that lie way beyond engineering specifications and reliability predictions.”   This is true of hospitals, police departments, prosecutors’ offices, courts, and correctional institutions too.

It looks even more true once you realize that these complex entities are themselves only elements of a larger, still more complex systems environment.

Decisions made in one part of this swirl are seldom automatic “causes” of effects in other parts; usually they are “influences” that affect the probabilities, not switches that turn things on or off.

Look at the problem of how to launch “Abe,” one of the patient/defendants described by the Camden study’s leaders, into a safe, healthy, law-abiding life (one that drains fewer public resources), and you can see that you are not dealing with a simple mechanical challenge.

Over five years, Abe was treated in emergency departments two dozen times and arrested more than fifteen times: “A seemingly unbreakable cycle of hospital stays and arrests and incarceration, punctuated by periods of housing instability and homelessness, all of which appear to be driven largely by untreated substance abuse and lack of social supports.”

Some framework for collecting and disseminating cross-sector data will be an important step in breaking the cycle.  The authors of the Camden study are certainly right when they say that their work shows that there is “enormous value in fostering collaborative data sharing among agencies.”

But we should probably remember that data measuring outputs casts only an oblique light on processes. These processes involve not just the “lived realities of the people in the criminal justice system”; they also include the “lived realities” of the frontline emergency room nurse, patrol officer, sheriff, prosecutor, and judge who make the decisions that keep Abe on his treadmill.

It would be surprising if these frontline practitioners greeted the Camden study’s findings with astonished shouts of “Eureka!”  The overlap of homelessness, medical issues, and criminal contacts is something they confront all day, every day.

(Innovative efforts such as the Wraparound Project, a violence prevention initiative at Zuckerberg San Francisco General Hospital, have recognized the criminal/hospital nexus as a vital point of entry for community safety.)

Data-derived policies, even very good ones, won’t dispense with the people who have to execute them.  The reality is that the work the frontline workers actually do will seldom be identical to the work that policy wonks are able to describe in advance.

There is, as the Camden study notes, tremendous variety in frequent flyers’ experiences. That variety requires innovation, improvisation, choices between conflicting rules, and sometimes even rule-breaking—in short, workmanship—from practitioners.

With all of these actors involved, deciding how to rescue “Abe” is a complex socio-technical riddle, not a straightforward mechanical repair like mending a clock.

At 4:30 on some Friday afternoon, with the docket list still bulging, it made sense to each member of the “courtroom workgroup” of prosecutor, defender, and judge to offer Abe a plea to a greatly reduced charge and a sentence of “time served.”

It made sense to Abe to accept the offer and walk out the door.

That this was a mistake becomes clear only later (and in a different place, to different people), when it turns out that the record of conviction gets Abe booted out of the family home in public housing that provided social support and allowed for medical continuity, or when it disqualifies him from a job or a program.

One of the things we can learn from looking at the general safety literature is that all of the decisions that we now deplore in hindsight as choices that kept Abe cycling were “locally rational” when they were made.

They may not have been heroic, prescient, or admirable, but they made sense to frontline people who were trying to get through their days.  Going “down and in” to focus tightly on one practitioner’s decision won’t be enough; we also have to go “up and out” to see why that decision was made. Leave the same inducements in place and the next practitioner may do the same thing.

When Diane Vaughan looked at the disastrous space shuttle Challenger launch decision she rejected the conventional view of amoral NASA administrators overriding safety concerns to meet the budgetary and political pressures driving the launch schedule.

Vaughan found that the decision was “a mistake embedded in the banality of organizational life.” It had roots in the “normalization of deviance”: the accumulated drift, through small workarounds, informal work rules, and locally rational adaptations, into accepting dangerous risks.

It was supported by a kind of structural secrecy:  that is, by “the way that patterns of information, organizational structure, processes, and transactions, and the structure of regulatory relations systematically undermines the attempt to know.”

Some part of this pattern as it applies to a frequent flyer is made up of formal confidentiality regulations:  the nurse is bound by HIPAA rules, the public defender by attorney-client privilege.  But more derives from mutual unfamiliarity: from a lack of insight into what counterparts in medicine (or public safety) are trying to do, why they are trying to do it, and how they are constrained by their environments.

Look at an avoidable suicide-by-cop, or at a re-entry failure or medical crisis dooming a frequent flyer like Abe to another downward loop on his spiral, and you’re likely to see something of the kind.

Collecting and marshaling the data is crucial, but something like Vaughan’s ethnographic approach—complementing the data with the narratives of individual events and the “thick data” those narratives can provide—is crucial too, if we want to renovate a system that is currently keeping secrets from itself.

There is no reason to choose between, say, the Arnold Foundation’s data analysis and Jennifer Gonnerman’s rich narratives of Kalief Browder’s story.  They inform each other.

But to do this we need everyone’s perspective, on a constantly shifting variety of events. Narratives confined in silos are no better than data confined in silos.

If everyone is doing “sentinel event” reviews anyway, why not do some together?

Editor’s Note: For another perspective on the Learning Review Process, see Ivan Pupulidy in The Crime Report, “Making Sense of Justice Tragedies.”

James M. Doyle is a Boston defense lawyer and author, and a frequent contributor to The Crime Report. He welcomes readers’ comments.