Should I Be Concerned About Bail Algorithms in Broward County?
Algorithmic tools are now commonly used at various points throughout the criminal justice system - including the bail process in Broward County.
As our Fort Lauderdale criminal defense lawyers can explain, this predictive technology promises to provide courts with accurate risk assessments of defendants based on statistical analysis of factors such as the charges, the defendant's criminal record, their age, and any history of failing to appear in court.
Proponents of algorithmic tools in the criminal justice system say they can be wielded to combat mass incarceration. Critics say they are prone to perpetuating racial bias.
The algorithms used by the courts promise evidence-based results, gleaned from tracking the recidivism rates of large groups of prior offenders and identifying which factors correlate with future recidivism. But we can't overlook the fact that the criminal justice system in the United States is, in many ways, inherently biased.
For instance, substantial research has shown that while there’s little difference in the rates at which white and Black individuals consume cannabis, Black individuals are far more likely to be arrested and convicted for possession of it - and more likely to face stiffer sentencing upon conviction. So any algorithmic system that reaches conclusions about future recidivism risk based on statistical determinations of prior arrests/convictions/sentencing would need to specifically correct for such inequalities. Unfortunately, there is evidence that they fall short on this front.
This is where having an experienced Fort Lauderdale criminal defense lawyer to advocate on your behalf may be highly beneficial. While it's true that prosecutors and judges will take algorithmic conclusions into account, those scores are not the only thing considered. Having an attorney who is knowledgeable about state law and local practices, and skilled in presenting persuasive arguments on your behalf, can go a long way toward protecting your rights and pushing back against the potential bias of predictive technology.
How Are Florida Bail Algorithms Supposed to Help?
One in every five prisoners in the world is housed in an American jail or prison. In 2020, researchers at Duke Law School concluded that science-informed risk tools may help alleviate excessive imprisonment rates by persuading decision-makers that more defendants can be safely supervised in their communities, as opposed to being held in jail pending trial.
And there is evidence that such tools can work to reduce jail populations. In New Jersey, for instance, a pretrial risk assessment algorithm called the Public Safety Assessment, or PSA, was used to determine whether someone should be detained pending trial. It measured a defendant's risk level - on a scale of 1 to 6 - to ascertain how likely they were to fail to appear in court and/or to be rearrested if released, with elevated scores when the underlying charge involved violence. This process was used in conjunction with the removal of the cash bail system. Since the new approach took effect, the ACLU of New Jersey reports, defendants have continued to appear in court at similar rates while spending less time in jail.
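To make the mechanics concrete, here is a minimal sketch of how a points-based pretrial tool of this general kind might map a defendant's history onto a 1-to-6 scale. The factors, weights, and cutoffs below are hypothetical illustrations only; they are not the PSA's actual formula.

```python
# Hypothetical sketch of a points-based pretrial risk score (1-6 scale).
# All factors, weights, and cutoffs are invented for illustration;
# this is NOT the actual Public Safety Assessment formula.

def pretrial_risk_score(age_at_arrest: int,
                        prior_convictions: int,
                        prior_failures_to_appear: int,
                        pending_charge_at_arrest: bool,
                        current_charge_violent: bool) -> int:
    points = 0
    if age_at_arrest < 23:                        # youth treated as a risk factor
        points += 2
    points += min(prior_convictions, 3)           # capped contribution
    points += 2 * min(prior_failures_to_appear, 2)
    if pending_charge_at_arrest:
        points += 1
    if current_charge_violent:                    # violent charge elevates the score
        points += 2
    # Map raw points onto the 1-6 reporting scale.
    return min(6, max(1, 1 + points // 2))

# Example: a 19-year-old with one prior conviction, one prior failure
# to appear, and a violent current charge.
print(pretrial_risk_score(19, 1, 1, False, True))  # -> 4
```

Note that nothing in a formula like this asks *why* a prior arrest or conviction occurred - which is exactly how biased enforcement patterns can flow straight into the score.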
Perhaps part of the reason for successes like this is that historically, such forecasting would have been based on the personal experience or intuition of the prosecutor, magistrate, or judge responsible for making decisions about bail or sentencing.
However, the most significant concern is that the algorithms are largely ineffective at eliminating racial bias.
Concerns About Racial Bias With Florida Bail Algorithms
In 2016, the non-profit journalism outlet ProPublica published an in-depth investigation titled "Machine Bias" that explored how these algorithms were unfairly skewed against Black people.
That investigation focused heavily on criminal defendant cases out of Fort Lauderdale. In one example, two individuals were arrested on the same charge - petty theft. The first was a middle-aged white man with a prior history of theft-related crimes, including robbery. The second, an 18-year-old Black woman, had a prior record too, but for misdemeanors allegedly committed as a juvenile. Yet the predictive risk assessment algorithm used by the court in setting bail pegged the white man as being at low risk of committing future crimes; he scored a 3. The young Black woman scored an 8. Two years later, it turned out the computer system had it backward: the white man was serving eight years for a subsequent felony theft charge, while the Black woman hadn't been in trouble since.
Journalists at ProPublica reviewed some 7,000 Broward County arrests over a two-year period. What they found was that the scores were "remarkably unreliable" in predicting future violent crime: only 20 percent of those forecast to commit future violent crimes actually did so. The formula was somewhat more accurate in predicting outcomes in misdemeanor cases, but it skewed substantially against Black defendants, wrongly labeling them as high-risk future criminals at approximately twice the rate of white defendants.
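The "twice the rate" finding refers to the false positive rate: among defendants who did *not* go on to reoffend, the share who were nonetheless labeled high risk. Here is a minimal sketch of that calculation, using invented records rather than ProPublica's actual Broward County data:

```python
# Compare false positive rates across groups: among defendants who did
# NOT reoffend, what share were wrongly labeled "high risk"?
# These records are invented for illustration; they are not
# ProPublica's Broward County dataset.

records = [
    # (group, labeled_high_risk, reoffended_within_two_years)
    ("white", False, False), ("white", True, True),
    ("white", False, False), ("white", True, False),
    ("black", True, False),  ("black", True, False),
    ("black", False, False), ("black", True, True),
]

def false_positive_rate(records, group):
    """Share of non-reoffenders in `group` who were labeled high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

for group in ("white", "black"):
    print(group, false_positive_rate(records, group))
# white: 1 of 3 non-reoffenders flagged -> ~0.33
# black: 2 of 3 non-reoffenders flagged -> ~0.67 (twice the rate)
```

A tool can look "accurate" on average while still distributing its mistakes unevenly - which is precisely the disparity the investigation flagged.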
The company that created the algorithm, which remains in use in Florida as well as in other states' criminal courts, disputed ProPublica's findings and criticized its methodology. However, evidence of this bias isn't limited to a single report.
Hire a Defense Lawyer Who Understands The Algorithm
At The Ansara Law Firm, we're closely familiar with this technology, how it's used by the courts, and its potential blind spots. We can help our clients effectively advocate for more favorable outcomes and offset any unfair bias these systems may yield.
If you have been charged with a violent crime in South Florida, contact the Fort Lauderdale Criminal Defense Lawyers at The Ansara Law Firm by calling (954) 761-4011.