Cognitive Factors in Forensic Science
By Vanessa Meterko, research analyst at the Innocence Project
In the spirit of the beginning of a new school year, here’s a quiz for you: count the Fs in the following excerpt.
FINISHED FILES ARE THE RE
SULT OF YEARS OF SCIENTI
FIC STUDY COMBINED WITH
THE EXPERIENCE OF YEARS…
How many did you count? Many people say three or four, some say five, but rarely do people report the correct answer: six. The reason that so many of us get this simple exercise wrong is not because we’re stupid or lazy; it’s because we’re all experts in reading. That’s according to cognitive neuroscientist Dr. Itiel Dror, who studies expert decision-making at University College London. Dr. Dror explains that over time, we’ve learned that words like “of,” “the,” and “a” are of little importance when it comes to comprehension, so our brains have actually learned to ignore them, allowing us to read more efficiently without losing valuable content.
Earlier this summer, I attended a workshop about cognitive factors in forensic decision-making (i.e., the mental processes involved in determining things like how to analyze and interpret crime scene evidence) that was led by Dr. Dror and hosted by the Houston Forensic Science Center, an independent forensic laboratory. During this workshop, Dr. Dror presented this “count the Fs” example to introduce the audience—primarily forensic examiners and administrators—to the idea that the human brain has limited resources, and these limitations have implications when it comes to the search for truth and fairness in criminal justice.
Following this introduction, a steady stream of examples illustrated how cognitive factors like our expectations and selective attention can influence our perception and interpretation of the world around us. There was the study in which the smell of white wine—dyed red with an odorless dye—was described with words typically associated with red wines. And then there was the test of our attention that surprised many in the room (you can try it for yourself here).
Through these and other examples, it became strikingly clear that what matters is not what's actually "out there" but how our brains process the information. Our limited mental resources necessitate adaptive strategies that are generally useful (e.g., in daily life, automatically gathering contextual clues helps us make quick, accurate decisions) but can also undermine the scientific goal of objectivity.
However, merely being aware of the potential for mistakes is not enough to protect against errors that could deprive an innocent person of his or her liberty (47% of the 330 DNA exonerees to date had problematic forensic science analysis or testimony involved in their cases). We can't simply will ourselves to be unbiased. Instead, Dr. Dror proposed a variety of practical solutions to protect against cognitive bias.
One reform is based on the idea that there is some information a forensic analyst never needs. For instance, a fingerprint analyst does not need to know the race of the victim to do her job of analyzing a print recovered from the crime scene; likewise, a hair analyst never needs to know whether the suspect confessed in order to perform his job. This type of information is irrelevant, and analysts should be insulated from it. Of course, sometimes an analyst does need to see potentially biasing information (e.g., a fingerprint analyst may need to compare an unidentified print with a known suspect's print, and seeing the suspect's print could color her analysis). In situations like these, Dr. Dror advocated for a technique called "Linear Sequential Unmasking." Essentially, this means providing analysts with the information they need, but doing so as late in the process as possible. Ideally, an analyst would view the unidentified print on its own, document its notable characteristics and features, and only then compare it to the suspect's print, rather than looking at the two simultaneously.
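The sequencing described above can be made concrete in software. Below is a minimal, hypothetical sketch (not any real case-management system) of how a case file might enforce Linear Sequential Unmasking: the reference print stays hidden until the analyst has documented the latent print independently.

```python
from dataclasses import dataclass, field

@dataclass
class LSUCaseFile:
    """Hypothetical case file enforcing Linear Sequential Unmasking."""
    latent_print: str                 # unidentified crime-scene print
    _reference_print: str             # suspect's print, masked at first
    documented_features: list = field(default_factory=list)

    def document_feature(self, feature: str) -> None:
        # Phase 1: independent analysis of the latent print alone.
        self.documented_features.append(feature)

    def reveal_reference(self) -> str:
        # Phase 2: the reference print is unmasked only after the
        # independent analysis is on record, so the comparison cannot
        # retroactively shape the initial description.
        if not self.documented_features:
            raise RuntimeError("Document the latent print before comparing.")
        return self._reference_print

case = LSUCaseFile("latent-001", "suspect-A")
case.document_feature("whorl pattern, upper left")
case.document_feature("bifurcation near core")
reference = case.reveal_reference()  # allowed only after documentation
```

The design choice here mirrors Dr. Dror's point: the workflow, not the analyst's willpower, is what sequences the exposure to potentially biasing information.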
He also recognized the importance of matching the solution to the situation, acknowledging that we may not need to protect against bias in every case. If a fingerprint is perfectly clear, it may be a waste of time for several different people to evaluate it independently. Alternatively, if a print is smudged and the analysis is open to more interpretation, it may warrant an extra degree of protection from potential bias.
Finally, Dr. Dror continually emphasized that when we talk about biases, it’s not a criticism of forensic scientists—it’s an education in cognitive psychology. Typically, errors are not due to a lack of proper motivation or overt misconduct (those problems are actually easier to identify and eradicate); rather, errors are the result of the limitations of our human minds and should be used as learning opportunities.
So we can keep skipping the “of”s when we read, but we must recognize our potential for similar mistakes with much more substantial consequences. Supporting laboratory reforms that capitalize on our understanding of cognitive psychology will help prevent predictable human errors in the criminal justice system.