News 01.13.23

‘Big Tech’ Regulation Must Address Data Use in Criminal Investigations

President Biden recently called for more regulation of “Big Tech” — we think Congress can do more.

By Sarah Chu

Surveillance cameras. (Image: Jürgen Jester/Unsplash)

Whether it’s scrolling on your smartphone, sharing content on social media, or using facial scanners at travel checkpoints, every digital interaction generates data. What many people don’t realize is that this data — which can include information about their location, relationships, and even physical features — is turned over to private companies and the government without their knowledge.

“Big Tech” can use this data to profit off our private information or make us vulnerable to manipulation, exploitation, or abuse. Citing these vulnerabilities, President Biden called for Congress to take action in a Wall Street Journal op-ed on Wednesday.

“We’ve heard a lot of talk about creating committees. It’s time to walk the walk and get something done,” he wrote.

It’s true that regulation to prevent exploitation by “Big Tech” is overdue. But we should be concerned not just with “Big Tech” itself, but also with how its technologies are applied. It’s crucial to consider how law enforcement and the government can use our data without our consent in ways that increase the risk of wrongful accusations, arrests, and convictions.

The Problem With Big Data Technologies

Big data technologies can create serious risk of wrongful conviction when applied as surveillance tools in criminal investigations. These technologies are often deployed before being fully tested and have already been proven to have disparate impacts on people of color. For example, the use of facial recognition technology has been increasing, despite being known to misidentify people of color at higher rates. Such technology has led to the wrongful arrests of at least four innocent Black people.

Surveillance technology that uses algorithmic tools may weaponize information about a person’s identity, behavior, and relationships against them — even when that information is inaccurate. Cristian Diaz Ortiz, a Salvadoran teenager awaiting asylum, was arrested and slated for deportation after he was wrongly labeled a member of the international criminal gang MS-13 and included in a gang database. Law enforcement categorized him as a gang member based on algorithmic inferences because he had been “hanging out with friends around his neighborhood.”

Even if a surveillance technology is accurate, it can still increase the risk of wrongful arrest by distorting suspect development. By their nature, big data-driven tools cast a wide net and can generate a pool of potential suspects that includes innocent people.

In doing so, they can lead law enforcement to focus their investigations on innocent people. In 2018, Jorge Molina was arrested for a murder he did not commit after a new technology described as a “Google dragnet” found that Mr. Molina had been logged into his email on a device near the location of the murder. The device belonged to someone else and had been near the murder location, though Mr. Molina never was.

Once an innocent person is singled out and becomes a person of interest, tunnel vision can set in to the point where even powerful exculpatory evidence won’t shake an investigator’s belief in an innocent person’s guilt. The day after Mr. Molina’s arrest, a detective told the district attorney’s office that it was “highly unlikely” that he had committed the murder, yet Mr. Molina was not released for several more days.

This kind of investigatory tunnel vision has serious real world implications. For example, exoneration data shows that pre-trial exculpatory DNA results were explained away or dismissed in nearly 9% of the 325 DNA exonerations in the United States between 1989 and 2014.

Investigative technologies like these remain unregulated in the United States. Not only are there no requirements for how rigorously they must be tested before being deployed, there are also no rules ensuring full disclosure about their use.

This means that people charged with a crime might not be told what technologies police used to identify them. And even if they do know which technologies were used, they may not have access to information about how the tool works or what data was used in their case. Because so many of these technologies are proprietary, defendants mounting a legal defense are denied access to the source code, or even basic information about how their data was used and processed.

Congress Must Take Action

We agree with President Biden that it’s time to set limits. And while the president emphasized the need for “clear limits on how companies can collect, use and share highly personal data — your internet history, your personal communications, your location, and your health, genetic and biometric data,” we believe Congress must go a step further.

Congress must make explicit in its anticipated bill that it will regulate how investigative tools are used in criminal investigations to protect people’s data and prevent wrongful convictions, including how data may or may not be collected, used, or stored in those investigations. Doing so would ensure the just application of algorithmic technologies far more efficiently than piecemeal regulation of individual technologies — especially given the constant proliferation of new tools.

Once a company or a government agency extracts data about your physical traits, location, or identity, that information is theirs to use in perpetuity. Without regulation, we can’t fully protect people — and in particular, vulnerable and historically criminalized communities — from data harms.

President Biden is right about this: We must take action to protect our data. We look forward to working with Congress to advance equity in data privacy and protections in the criminal legal system, so that both contribute to public safety, stronger communities, and the just and equitable administration of justice.
