The activist dismantling racist police algorithms
Hamid Khan has been a community organizer in Los Angeles for over 35 years, with a consistent focus on police violence and human rights. He talked to us on April 3, 2020, for a forthcoming podcast episode about artificial intelligence and policing. As the world turns its attention to police brutality and institutional racism, we thought our conversation with him about how he believes technology enables racism in policing should be published now.
Khan is the founder of the Stop LAPD Spying Coalition, which has won many landmark court cases on behalf of the minority communities it fights for. The coalition is perhaps best known for its advocacy against predictive policing. On April 21, a few weeks after this interview, the LAPD announced an end to all predictive policing programs.
Khan is a controversial figure who has turned down partnerships with groups like the Electronic Frontier Foundation (EFF) because of its emphasis on reform. He doesn’t believe reform will work. The interview has been edited for length and clarity.
Tell us about your work. Why do you care about police surveillance?
The work that we do, particularly looking at the Los Angeles Police Department, looks at how surveillance and the gathering, storing, and sharing of information have historically been used to really cause harm: to trace, track, monitor, and stalk particular communities, communities who are poor, who are black and brown, communities who would be considered suspect, and queer and trans bodies. So on various levels, surveillance is a process of social control.
Do you believe there is a role for technology in policing?
The Stop LAPD Spying Coalition has a few guiding values. The first one is that what we are looking at is not a moment in time but a continuation of history. Surveillance has been used for hundreds of years. Some of the earliest surveillance processes go back to lantern laws in New York City in the early 1700s. If you were an enslaved person, a black or an indigenous person, and you were walking out in public without your master, you had to carry an actual literal lantern, with the candle wick and everything, to identify yourself as a suspect, as the “other.”
Another guiding value is that there’s always an “other.” Historically speaking, there’s always a “threat to the system.” There's always a body, an individual, or groups of people that are deemed dangerous. They are deemed suspect.
The third value is that we are always looking to de-sensationalize the rhetoric of national security. To keep it very simple and straightforward, [we try to show] how the information-gathering and information-sharing environment moves and how it’s a process of keeping an eye on everybody.
And one of our last guiding values is that our fight is rooted in human rights. We are fiercely an abolitionist group, so our goal is to dismantle the system. We don’t engage in reformist work. We also consider any policy development around transparency, accountability, and oversight a template for mission creep. Any time surveillance gets legitimized, then it is open to be expanded over time. Right now, we are fighting to keep the drones grounded in Los Angeles, and we were able to keep them grounded for a few years. And in late March, the Chula Vista Police Department in San Diego announced that they are going to equip their drones with loudspeakers to monitor the movement of unhoused people.
Can you explain the work the Stop LAPD Spying Coalition has been doing on predictive policing? What are the issues with it from your perspective?
PredPol was location-based predictive policing, in which a 500-by-500-foot area was identified as a hot spot. Its companion program, Operation Laser, was person-based predictive policing.
In 2010, we looked at the various ways that these [LAPD surveillance] programs were being instituted. Predictive policing was a key program. We formally launched a campaign in 2016 to understand the impact of predictive policing in Los Angeles with the goal to dismantle the program, to bring this information to the community and to fight back.
Person-based predictive policing claimed that for individuals who are called “persons of interest” or “habitual offenders,” who may have had some criminal history, a risk assessment tool could establish that they were going to recidivate. So it was a numbers game. If they had any gun possession in the past, they were assigned five points. If they were on parole or probation, they were assigned five points. If they were gang-affiliated, they were assigned five points. If they’d had interactions with the police, like a stop-and-frisk, they were assigned one point. The result was that individuals who were on parole or probation, or who were minding their own business and rebuilding their lives, were placed in what became known as the Chronic Offender Program, unbeknownst to many of them.
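To make the arithmetic concrete: the scoring Khan describes is a simple additive tally. Below is a minimal sketch in Python, using only the point values from his account; the function and field names are hypothetical illustrations, not drawn from any actual LAPD or Palantir system.

```python
# A minimal sketch of the additive point system Khan describes.
# Point values come from the interview; all names here are
# illustrative assumptions, not an actual LAPD or Palantir API.

def chronic_offender_score(person: dict) -> int:
    score = 0
    if person.get("prior_gun_possession"):
        score += 5  # any past gun possession: five points
    if person.get("on_parole_or_probation"):
        score += 5  # parole or probation status: five points
    if person.get("gang_affiliated"):
        score += 5  # alleged gang affiliation: five points
    # Each prior police interaction, such as a stop-and-frisk: one point.
    score += person.get("police_interactions", 0)
    return score

# Example: someone on probation with two prior stops tallies 7 points.
print(chronic_offender_score({"on_parole_or_probation": True,
                              "police_interactions": 2}))
```

Note that under a tally like this, every routine police contact nudges the score upward, so heavier surveillance of a person produces the very “risk” that justifies more surveillance, which is part of what Khan objects to.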
Then, based on this risk assessment, with Palantir processing all the data, the LAPD created a list. They started releasing bulletins, which were like Most Wanted posters with these individuals’ photos, addresses, and history, and put them in patrol cars. [They] started deploying license plate readers, StingRay IMSI catchers, CCTV, and various other tech to track their movements, and then creating conditions on the ground to stop, harass, and intimidate them. We built a lot of grassroots power, and in April 2019 Operation Laser was formally dismantled. It was discontinued.
And right now we are going after PredPol and demanding that PredPol be dismantled as well. [LAPD announced an end to PredPol on April 21, 2020.] Our goal for the abolition and dismantlement of this program is not just rooted in garbage in, garbage out; racist data in and racist data out. Our work is really rooted in that it ultimately serves the whole ideological framework of patriarchy and capitalism and white supremacy and settler colonialism.
We released a report, “Before the Bullet Hits the Body,” in May 2018 on predictive policing in Los Angeles, which led to the city of Los Angeles holding public hearings on data-driven policing, the first of their kind in the country. We demanded a forensic audit of PredPol by the inspector general. In March 2019, the inspector general released the audit, and it said that PredPol couldn’t even be audited. It’s just not possible. It’s so, so complicated.
Algorithms have no place in policing. I think it’s crucial that we understand that there are lives at stake. This language of location-based policing is by itself a proxy for racism. They’re not there to police potholes and trees. They are there to police people in the location. So location gets criminalized, people get criminalized, and it’s only a few seconds away before the gun comes out and somebody gets shot and killed.
How do you ensure that the public understands these kinds of policing tactics?
Public records are a really good tool to get information. What is the origin of this program? We want to know: What was the vision? How was it being articulated? What is the purpose for the funding? What is the vocabulary that they’re using? What are the outcomes that they’re presenting to the funder?
They [the LAPD] would deem an area, an apartment building, as hot spots and zones. And people were being stopped at a much faster pace [there]. Every time you stop somebody, that information goes into a database. It became a major data collection program.
We demanded that they release the secret list that they had of these individuals. The LAPD fought back, but we won that public records lawsuit. So now we have that formerly secret list of 679 individuals, whom we’re looking to reach out to. And these are all young individuals, about 90% to 95% black and brown.
Redlining the area creates conditions on the ground for more development, more gentrification, more eviction, more displacement of people. So the police became protectors of private property and protectors of privilege.
What do you say to people who believe technology can help mitigate some of these issues in policing, such as biases, because technology can be objective?
First of all, technology is not operating by itself. From the design to the production to the deployment to the outcome, there is constantly bias built in. It’s not just the biases of the people themselves; it’s the inherent bias within the system.
There are so many points of influence that, quite frankly, our fight is not for cleaning up the data. Our fight is not for an unbiased algorithm, because we don’t believe that, even mathematically, there could be an unbiased algorithm for policing at all.
What are the human rights considerations when it comes to police technology and surveillance?
The first human right would be to stop being experimented on. I’m a human, and I’m not here for you to unpack me, experiment on me, and package me back up. There’s so much datafication of our lives that has happened. From plantation capitalism to racialized capitalism to now surveillance capitalism, we are subject to being bought and sold. Our minds and our thoughts have been commodified. It has a dumbing-down effect on our creativity as human beings, as part of a natural universe. Consent is being manufactured out of us.
With something like coronavirus, we certainly are seeing that some people are willing to give up some of their data and some of their privacy. What do you think about the choice or trade-off between utility and privacy?
We have to really look at it through a much broader lens. Going back to one of our guiding values: not a moment in time but a continuation of history. So we have to look at crises in the past, both real and concocted.
Let's look at the 1984 Olympics in Los Angeles. That led to the most massive expansion of police powers and militarization of the Los Angeles Police Department and the sheriff’s department under the guise of public safety. The thing was “Well, we want to keep everything safe.” But not only [did] it become a permanent feature and the new normal, but tactics were developed as well. Because streets had to be cleaned up, suspect bodies, unhoused folks, were forcibly removed. Gang sweeps supposedly started happening. So young black and brown youth were being arrested en masse. This is like 1983, leading to 1984.
By 1986 and 1987 in Los Angeles, gang injunctions became a permanent feature. This resulted in massive gang databases, with children as young as nine months old going into them. That became Operation Hammer, where SWAT teams had gotten tanks and armored vehicles for serving warrants on low-level drug offenses, going in and breaking down people’s homes.
Now we are again at a moment. It’s not just the structural expansion of police powers; we have to look at police now increasingly taking on roles as social workers. It’s been building over the last 10 years. There’s a lot of health and human services dollars attached to that too. For example, in Los Angeles, the city controller came out with an audit about five years ago, and they looked at $100 million for homeless services that the city provides. Well, guess what? Out of that, $87 million was going to LAPD.
Can you provide a specific example of how police use of technology is impacting community members?
Intelligence-led policing is a concept that comes out of England, out of the Kent Constabulary, and started about 30 years ago in the US. The central theme of intelligence-led policing is behavioral surveillance. People’s behavior needs to be monitored, and then be processed, and that information needs to be shared. People need to be traced and tracked.
One program called Suspicious Activity Reporting came out of 9/11, in which several activities that are completely constitutionally protected are listed as potentially suspicious. For example: taking photographs in public, using video cameras in public, walking into infrastructure and asking about hours of operation. The standard is observed behavior “reasonably indicative of preoperational planning of criminal and/or terrorist activity.” So you’re acting on what somebody’s behavior supposedly indicates; there is no probable cause. It creates not a fact, but a concern. That speculative, hunch-based policing is real.
We were able to get numbers from LAPD’s See Something, Say Something program. And what we found was that there was a 3:1 disparate impact on the black community. About 70% of these See Something, Say Something reports came from predominantly white communities in Los Angeles. So now a program is being weaponized and becomes a license to racially profile.
The goal is always to be building power toward abolition of these programs, because you can’t reform them. There is no such thing as kinder, gentler racism, and these programs have to be dismantled.
So you really think there’s no way to reform the use of these technologies in policing?
I can only speak about my own history of 35 years of organizing in LA. It’s not a matter of getting better, it’s a matter of getting worse. And I think technology is furthering that. When you look at the history of reform, we keep on hitting our head against the wall, and it just keeps on coming back to the same old thing. We can’t really operate under the assumption that hearts and minds can change, particularly when somebody has a license to kill.
I’m not a technologist. Our caution is for the technologists: you know, stay in your lane. Follow the community and follow their guidance.