Rate Your Latest Police Encounter

During his time as an army engineer, Brandon Anderson handled intelligence sent back from field soldiers and base commanders, seeing firsthand the impact that data had on governmental decision-making. When he returned to civilian life, Anderson decided to use his knowledge of data gathering and dispersal to create a tool that could accomplish a goal close to his heart: protecting communities from police brutality.

That tool, an app called Raheem.Ai, hosted on Facebook Messenger, allows people to report their interactions with police officers. It has received funding from Google and Barack Obama’s My Brother’s Keeper initiative, and last week Anderson was named a 2018 Echoing Green Fellow.

The app prompts users to enter the time and location of the incident, how they felt about it, and demographic data about the officer and about themselves.

Some of the information is then shared on an interactive map. One red dot in Oakland states, “A 36-year-old Black genderqueer person was pulled over and felt disrespected.” Not all of the reports are negative; a green dot in San Francisco reads, “A 28-year-old intersex Asian gender nonconforming person got stopped on foot and felt relieved.”
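To make the shape of these reports concrete, here is a minimal sketch, in TypeScript, of how a report and its map summary could be modeled. The interface, the field names, and the `mapDotSummary` helper are illustrative assumptions based on the article’s description of what the app asks for and displays, not Raheem.Ai’s actual code or schema.

```typescript
// A minimal, illustrative model of a self-reported police encounter.
// All field names and types are assumptions inferred from the article,
// not Raheem.Ai's real data model.
interface PoliceEncounterReport {
  time: Date;                                   // when the stop happened
  location: { lat: number; lng: number; city: string };
  encounter: string;                            // e.g. "was pulled over", "got stopped on foot"
  feeling: string;                              // e.g. "disrespected", "relieved"
  reporter: { age: number; race: string; genderIdentity: string }; // self-reported demographics
  officerDescription?: string;                  // demographic details about the officer, as observed
}

// Compose the kind of one-line summary the article quotes from the map, e.g.
// "A 36-year-old Black genderqueer person was pulled over and felt disrespected."
function mapDotSummary(r: PoliceEncounterReport): string {
  const who = `A ${r.reporter.age}-year-old ${r.reporter.race} ${r.reporter.genderIdentity} person`;
  return `${who} ${r.encounter} and felt ${r.feeling}.`;
}

// Example built from the Oakland report quoted above (coordinates are approximate).
const example: PoliceEncounterReport = {
  time: new Date(),
  location: { lat: 37.8044, lng: -122.2712, city: "Oakland" },
  encounter: "was pulled over",
  feeling: "disrespected",
  reporter: { age: 36, race: "Black", genderIdentity: "genderqueer" },
};

console.log(mapDotSummary(example));
// -> "A 36-year-old Black genderqueer person was pulled over and felt disrespected."
```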

While he was building the app, Anderson spoke with police departments—rank-and-file officers as well as police chiefs—and said the response he often got was that, given all the interactions between police officers and members of the community, the percentage of people killed is relatively low.

“How, then, are you measuring impact in the community?” Anderson asked. “Is it solely by the number of people we don’t kill? That’s probably not a good metric. I’ve had my own experiences with police—I’m alive, but it doesn’t mean they didn’t impact me in traumatic ways.” Anderson’s own partner was killed by police, a driving force behind Raheem.Ai’s creation.

The app gathers experiences that are already being told: “These stories are what every police chief hears,” said Chris Burbank, Director of Law Enforcement Engagement at The Center for Policing Equity and the former Chief of Police at the Salt Lake City Police Department. “When someone gets stopped you hear ‘My friend gets stopped, [or] my sister.’ The idea of having the data available, having more feedback on how the officers interact with people, I think is very significant.”

Anderson wants to bring those stories out of the shadows. He started digging to understand how people were experiencing policing in their neighborhoods and learned that most people didn’t report their experiences with police officers—especially when those experiences were negative.

“People don’t trust the system,” Anderson said, which means they rarely give feedback about police encounters. Anderson wants Raheem.Ai to make it easier for people to document and share their experiences with the police, to create a data-driven way for communities and cities to measure the impact of policing.

The importance of the app, said Burbank, is that it takes the information one step further. “It’s more formalized, so it starts to take on a better tone and the richness of data because there’s some accountability in it—it’s not just the arbitrary re-telling of a story.”

Anderson has three goals for the app: to reduce underreporting, to use the data to advance policy solutions to end police violence, and to arm communities with tools to engage in participatory budgeting. That last aim may seem disconnected from the rest, but Anderson sees it as key to re-thinking the way policing is conceived and funded in cities. Money currently spent on the police force could be diverted to mental health services, he said.

“The reason that police in Ferguson had gas masks, tanks, Humvees—it’s because they had a program that invested in that gear, and that gear was kept because they said, ‘We’ll need this some day,’” said Anderson. “We need to find new ways—better ways—to invest that money, that will provide opportunities that will ultimately solve crime.”

In his book The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement, University of the District of Columbia law professor Andrew Ferguson wrote that once police misconduct can be viewed as a system failure, rather than an individual problem, it becomes easier to address. “Normally, when you think about what police do and how they interact, it feels human: that every incident is its own unique incident,” said Ferguson. “But the more you quantify, you see police are doing similar things across the country. Once you see that repetition in data, you start seeing that these are systemic issues.”

“The reality of an app that can reveal these patterns is another data point to show that this is a structural problem, and needs a structural response,” he said. Ferguson likens the community-generated data to Yelp reviews: the strength in numbers of individual voices.

“We start trusting those things in part because the numbers support the intuition that there may be something going wrong there,” he said. “There’s a sense in many communities that there’s something broken in the police-community relationship, but it’s an extra validation if you can see over and over again citizens giving negative reviews to interactions with police.”