When the Black Lives Matter movement gained global momentum, Data for Good felt it couldn’t stay silent: we believed in our hearts that data could help fight the good fight against systemic injustice. We wanted to find a way to lend our expertise.
We interviewed activists, community members, policy-makers, public defenders, police officers, government leaders, community health experts, for-profit companies that work with government data, researchers who study police brutality through data, and even victims themselves. They taught us that data plays a role that is both helpful and hurtful.
Data was helpful when activists and researchers used it to make cogent points that could inform real interventions. Campaign Zero’s #8CantWait is an excellent example of how police data can inform effective policy-making. In another example, a researcher of police brutality used data on shift lengths and shift assignments to show that deployment and staffing patterns are meaningfully correlated with violence.
We also saw the ways in which data wasn’t being used for good. We learned that Palantir, a tech company, collects massive amounts of data on civilians through surveillance. We learned that when predictive policing algorithms are built on datasets that carry the racist biases of our culture, they reify the deeply embedded cultural mechanisms of systemic inequality. When we build predictive analytics without recognizing and removing those mechanisms, our machines perpetuate systemic racism by producing racist predictions.
Lastly, we learned that the gap between the amount and quality of data available to activists and the data available to those with a stake in keeping the system as it is was deeply alarming. The sophistication of democratic data tools is embarrassingly low, while a whole industry exists to put sophisticated data technologies in the hands of the oppressors. We also learned how corruption can disrupt even our most altruistic goals of transparency in data collection, and circumvent laws put in place to ensure data transparency around police brutality incidents.
We finished this project with both a sense of heaviness and a sense of hope. The problem is severe; it is systemic, difficult to solve, and real. The way money, resources, and usable data flow through this ecosystem makes fully visible the power imbalance that keeps the problem from being solved.
But we also ended this project with hope: data will have a massive impact on systemic injustice and police brutality. It is up to us, those of us who work with data, to care about how data is used and the impact it has, and to lend our talents to ensuring it is used to end systemic injustice rather than to automate injustice’s darkest impulses by embedding them in our country’s digital infrastructure.