How Racial Bias Contributes to Wrongful Conviction
From police algorithms to the death penalty, the Innocence Project works to tackle the racial injustice ingrained in the criminal legal system and the modern policing methods that perpetuate the inequity.
Special Feature 07.17.21 By Daniele Selby
When the Constitution and the Bill of Rights were written, they protected only the rights of white men — in particular white men who owned property. These documents, upon which our government and legal system were founded, did not extend equal rights to women or to Black people, most of whom were enslaved. And the driving force for the creation of formalized police forces in the U.S. was the desire to monitor and control the movements and actions of slaves — to catch runaways and quash revolts.
The disproportionate incarceration of Black people today is born directly out of that dark history and the brutality of the Jim Crow era. Black people account for 40% of the approximately 2.3 million incarcerated people in the U.S. and nearly 50% of all exonerees — despite making up just 13% of the U.S. population. This is, in large part, because they are policed more heavily, often presumed guilty, and frequently denied a fair shot at justice.
At the Innocence Project, we’ve seen that the majority of wrongly convicted people are those who are already among the most vulnerable in our society — people of color and people experiencing poverty. Two-thirds of the 232 people whose release or exoneration we have helped secure to date are people of color, and 58% of them are Black.
That’s why we are working to tackle the racial injustice ingrained in the criminal legal system and the modern policing methods that perpetuate the inequity.
Biased policing tools
As technology has advanced, law enforcement agencies have explored its applications to policing, resulting in the use of algorithms that attempt to predict who might be more likely to commit a crime and facial recognition tools that aim to identify the perpetrator of a crime using image databases. These might sound like tools that could make communities safer, but the reality is that they encroach on people's privacy rights and endanger Black and brown people.
The algorithms used to predict future crimes are based on historical crime data, and research has found that racial bias is ingrained in this data because it reflects police activity and arrests rather than actual crimes committed. And those biases get amplified by these algorithms.
For example, a risk assessment algorithm may rate a person a high risk for committing another crime — likely garnering them a higher bail — because of high arrest rates in their neighborhood. However, the algorithm doesn’t take into consideration that poor communities and communities of color are typically policed more heavily and, as a result, see more arrests. It also does not factor in the “false positives” — cases in which the person was later determined to be innocent — produced by these large numbers of arrests.
A 2016 ProPublica investigation found that the risk assessment algorithm used in Broward County, Florida, for example, incorrectly predicted that Black defendants would commit future crimes at twice the rate of white defendants. The tool also underestimated white defendants' risk of committing future crimes.
Already, tools like facial recognition software have come under fire for racial bias. Studies have found that facial recognition tools are relatively successful at identifying the faces of white males, but falsely identify Black and Asian faces 10 to 100 times more often than they misidentify white faces. Facial recognition algorithms also struggle to accurately identify women’s faces. Research shows that the software identifies young Black women with the poorest accuracy.
So far, at least three men — each of whom is Black — are known to have been wrongfully arrested based on misidentifications by facial recognition software.
Currently, no framework exists to inform the use of these predictive algorithms. In addition to supporting bans or regulations of certain technology used for policing, such as facial recognition software, the Innocence Project supports state-based legislation like New York Senate Bill S79. The bill would halt the use of biometric surveillance technologies until a regulatory task force is set up to approve existing and new biometric surveillance technologies, including predictive policing tools and investigative systems such as gang databases.
Unregulated databases ensnare innocent people
When developing or identifying suspects, police may rely on blanket surveillance methods, like predictive policing or databases, including gang or DNA databases, for leads. However, who is added to these databases is subject to the unfair discretion of law enforcement officials. The criteria for being added to a gang database vary from place to place, but police can decide to include people in databases based on things like tattoos and past arrests. They can also choose to add people to gang databases based on the color of their clothing, their neighborhood, family, and friends, and even for “staying out late.”
And people of color tend to be overrepresented in these databases.
In Portland, where Black people make up just 6% of the population, 64% of people included in the city’s gang database were Black, while members of white supremacist gangs were underrepresented. In Chicago, 95% of people who police identified as gang members during arrests were people of color. The New York City chief of detectives testified that 99% of the 17,200 people in the NYPD’s gang database in 2018 were people of color.
Police are generally not required to inform people that they have been added to a database, and there is little transparency around how these databases are managed and used. But we do know that innocent people are swept into these databases and wrongly arrested because of them. In 2016, the NYPD arrested 120 people in the Bronx on gang-related charges, but it later turned out that dozens of them were not in gangs. Still, 115 of them entered guilty pleas — with only two people choosing to fight their charges at trial.
Similarly, law enforcement’s use of sweeping — and sometimes unregulated — DNA databases lacks transparency and oversight. Investigative systems like these, which are shrouded in secrecy, can entrap innocent people and lead to wrongful convictions and harsh sentences.
For instance, New York City’s illegal DNA index contains profiles from children as young as 12, people who have never been charged with a crime or prosecuted, potentially undocumented people, and innocent people of color who were specifically targeted based on their race. The legislation that allows such databases to exist was not intended to enable cities to collect and store data in this way.

Oversight over these databases should be established and, where appropriate, the databases should be eliminated altogether. Legislation like New York Senate Bill 1347 would end the practice of genetic stop-and-frisk enabled by unregulated DNA databases and would purge illegally obtained profiles.
40% of the approximately 2.3 million incarcerated people in the U.S. are Black. (Prison Policy Initiative)
Two-thirds of the 232 people whose release or exoneration we have helped secure to date are people of color, and 58% of them are Black. (Innocence Project)
99% of the 17,200 people in the NYPD's gang database in 2018 were people of color. (Center on Media, Crime & Justice at John Jay College)
More than 40% of people currently on death row are Black. (Death Penalty Information Center)
Charging, sentencing, and the use of the death penalty
Explicit and unconscious racial biases don’t just play a role in policing. They can also influence charging decisions and trial outcomes. Black people are seven times more likely to be wrongly convicted of murder than white people, according to the National Registry of Exonerations.
Prosecutors tend to charge people of color — particularly Black people — at higher rates and with more serious crimes. As charges add up, the pressure to accept a guilty plea becomes more intense. Overall, about 12% of exonerees confessed to crimes they did not commit, according to the National Registry of Exonerations.
Whether or not a person of color accepts a plea agreement or goes to trial, they are likely to face a harsher punishment. A review of 220,000 cases in Manhattan, for instance, found that Black people were 19% more likely to be offered a plea deal that included time in jail or prison than white people, according to the Vera Institute of Justice. And when convicted at trial, Black people tend to receive more severe punishments.
We know from the 185 people who have been exonerated from death row since 1973 that the death penalty is unfair and unreliable. But, in particular, the death penalty poses a threat to innocent Black people. Historically, Black people have been sentenced to death at disproportionate rates. Today, more than 40% of people currently on death row are Black, and the states that sentence the most people to death are the same ones that previously carried out the most lynchings.
The race of the victim also impacts the likelihood of a death sentence. Overall, people convicted of killing white people are executed at 17 times the rate of those convicted of killing Black people.
Innocence Project clients Rodney Reed and Pervis Payne are just two examples of Black men in the South convicted of killing white women and sentenced to death. Both have maintained their innocence for decades. The Innocence Project remains committed to fighting for justice for Mr. Reed, Mr. Payne, and the many others whose wrongful convictions have been tainted by racial bias. And we continue to fight for reforms that advance equity and fairness in the criminal legal system by tackling injustice at every level of the system.