The killing of George Floyd and violent crackdowns on protests across the country have sparked a debate over how much police reform is necessary at this moment in American history. On one end—embodied by the protesters in the streets—are calls for dramatic change, such as defunding or disbanding police departments. Others, like former Vice President Joe Biden, seek a more modest approach, endorsing the continuation of policies to discipline officers for misconduct and train them to recognize their own racial biases.
Implicit bias trainings are one of the most widely adopted police reforms, even though the evidence for their effectiveness is limited. Some research has shown these programs can potentially reinforce the stereotypes that they are meant to combat.
University of South Florida criminology professor Lorie Fridell is one of the preeminent police implicit bias trainers in the country. Her company, Fair and Impartial Policing, has offered trainings—using courses developed in part with funding from the Department of Justice—at hundreds of police departments throughout the United States and in Canada, including Dallas, Miami, Milwaukee, and Philadelphia. Notably, Fair and Impartial Policing conducted trainings with the Minneapolis Police Department—which until recently employed Derek Chauvin, the officer who killed Floyd—as part of a wider $300,000 city contract in 2015. Those trainings, conducted by retired and current police officers, include day-and-a-half-long sessions at the command level and shorter sessions at the officer level using videos, small-group discussions, large-group discussions, scenarios, applications, and sometimes role playing to teach officers how to practice impartial policing. Commanders are also instructed on the importance of “recruitment, hiring, operations, culture, leadership message, and measurement.” Fridell’s overarching message to the officers she trains is that while stereotypes are based partly in fact, policing based on these stereotypes makes “you unsafe, ineffective, and unjust.”
Fridell’s company just finished its biggest contract yet, a two-year, $4.5 million deal to do trainings with the New York Police Department. She has stressed in interviews that some disparities in policing are inevitable and are not necessarily a sign of bias. In her trainings, she states that while some amount of disparity might result from biased policing, it also reflects different crime patterns; that is, people of color are arrested or stopped by police more often because they simply commit more crimes. She also argues that black people may be resisting during police interactions more often and interact with the police more than white people, making them more prone to experiencing police force. It’s true that reported rates of violent offending have been disproportionately higher among black Americans, but, as Wesley Lowery noted in 2016, researchers have found that “there is no correlation between violent crime and who is killed by police officers.” Further, multiple studies have demonstrated that racial disparities in police interactions and use of force cannot be fully explained away by crime rates. Finally, police reform advocates like Michelle Alexander have noted that a far greater number of black Americans have been put through the criminal justice system for nonviolent crimes than for violent crimes, indicating that violent crime rates are a red herring when it comes to the question of racially skewed policing. In two phone calls with me over the past week, Fridell acknowledged that some uneven policing patterns might indicate bias, while also explaining why she thinks there’s some truth to the “black crime implicit bias,” why she considers it necessary to tell officers that it’s a “myth” that black people are shot more than white people, and why she believes there’s no measurable way to determine if bias trainings are actually working.
Our conversations have been condensed and edited for clarity.
Jeremy Stahl: Lately I’ve been reviewing my own reporting on past excessive force incidents in places like Las Vegas, St. Louis, Gwinnett, Chicago, Ferguson, Milwaukee, Cleveland, New York, Los Angeles, Cincinnati, Fort Worth, Minneapolis—and it felt a little bit like Groundhog Day, like nothing has really changed in the six years since I’ve been reporting on this subject. In your work, you train police officers to recognize and, as best as possible, control their own implicit biases. Why do you think that implicit bias in policing has remained such a persistent problem?
Lorie Fridell: I’m thinking. So, first of all it’s very hard to measure the extent to which bias impacts on policing because if we only look at disparity, we cannot conclude that disparity is the same as bias. … So, when we look at, for instance, any agency, and look at use of force stats, arrest stats for robbery, stop and search, we’re going to find that people of color are disproportionately represented in those groups. That disproportionality could be due to several factors. Most notably it could be due to bias on the part of police, but it also could be due to differential criminal behavior or resistance behavior on the part of demographic groups, and we can go there if you want to. But it’s very hard to measure—your question made it sound like we can measure the level of bias in policing over time. We really can’t measure it.
We’re going to have disparity, and as a social scientist, it’s easy to measure disparity, Jeremy, but it’s really hard to parse out the causes. So, that was a long way of not answering your question. But I will say this about, you know, how come we have no impact, and—or, actually, I don’t think we have no impact—how come we’re not seeing the signs of it. For us, we know we’re not going to impact everybody in the room. We are going to impact with our training at the line level the officers who are motivated to do the right thing.
Your organization has done implicit bias trainings with police departments around the country, including departments that have their own cases and reckonings over excessive use of force. The Minneapolis Police Department is obviously the most prominent recent example (and the NYPD). What do you think when you see news of police misconduct in a department where your organization has done a training?
I would consider ourselves necessary, but not sufficient. … The backdrop has to be that this agency has hired people who want to serve their communities. And in the old days we would hire people for policing who wanted adventure—to kick down doors. So, we’ve got to hire in the spirit of service. And then that person has got to be trained in the mindset of a guardian and not a warrior.
So when you get a news report like what happened to George Floyd, or you see an image of that, in a department where you’ve held training, you think to yourself, Well, obviously more needs to be done … and what we did was important, but not sufficient, and there has to be more going forward?
I don’t need to see a picture of an officer in one of our agencies doing that to know that. I mean we knew that when we developed the curriculum. We are not the answer to bias in policing. We are a necessary component. … Culture is really important. There are some agencies that clearly—and I’m not going to name them for you, sorry—brought us in too soon. The agency was one where there’s a culture of “the community is the enemy,” “we need to be aggressive,” all of the things that you might associate with a problematic law enforcement culture.
Do you think implicit bias training works?
I do think that implicit bias training works, but we have to put it in context. I think a lot of the criticism that we’re getting of late is due to frustrations around society, that we’ve been working on implicit bias training, and deescalation training, and use of force reforms and hey, we still have problems. There are going to be people with implicit biases who are made aware of them and because they want to do the right thing, that will give them motivation to implement the skills. Of course, I actually think that that’s most cops, that they want to do the right thing, but obviously that’s not all of the cops. … No one should believe that just bringing in trainings is going to turn the agency around. I’m so glad that we’re not just talking about the biases and racists in policing; we are talking about society because that’s where we get these cops. So unless we’re talking about long-term change in society, this is not a quick fix.
What data do you have that your programs have been effective, specifically, or that the concept of implicit bias training for police officers works?
There’s a science about what individuals can do to manage and reduce their biases. If you’ve been doing your reading, a way to reduce your biases is positive contact with people who are different from you. It’s not quick. It’s not easy. It might take years of interacting with certain populations. … So, yes, the first basis for my faith in my curriculum is that it is based on science. However, having said that, there are no published controlled evaluation results. We actually expect some out later in the summer, or in the fall. A controlled study means that we have to compare people who’ve had the training to people who have not had the training. And I expect to see changes in attitudes, knowledge, and skills.
What would measurable results of improvement look like to you?
We’re not trying to reduce bias. I’ve had people come to me and say, “Well, should we measure bias before the training and after the training” and I’m absolutely aghast. That is the No. 1 journalistic error that I’ve been reading about. We are trying to raise recognition of the implicit biases that we all have and give people skills for reducing and managing that. …
I want them to leave thinking, “Oh, well, bias in policing is an issue. It is not just about racist officers. It’s also about good cops. It turns out I am a part of the problem, but I also have the skills and techniques that I need to be part of the solution.” So that’s the important “attitudinal” part. “Knowledge,” that one’s a bit easier: Do they know more about implicit bias after the training than before the training? And actually it’s interesting, many cops come into implicit bias training knowing more certainly than they did when we started this in 2007 or 2008. And “skills,” I would want them to have an evaluation to show that they know the skills that we’ve provided them with and they are committed to using them. … I will say this about the challenge of evaluation, including behavior, and I say this as a social scientist: People think that social science can measure everything we want it to measure.
From what I’ve read of the data more generally, you’ll see an effect of implicit bias training in attitudes for a short period of time—and you’ll also see a much stronger effect amongst people who are already entering the program motivated—as you’ve said. But the effectiveness generally has not been measured over longer periods of time, particularly for individuals who are not motivated.
Right. And, so, one of the keys is you said, “It hasn’t been measured at longer levels.” But, you are right about—the motivation, or the knowledge, or the skills, they have to be reinforced. … Again, we don’t claim that we’re going to impact everybody in the room. In fact, the evaluation that’s coming out is just looking at patrol and supervisors. Even though I think that we impact behavior, I do not believe that the evaluation will identify or detect impact on behavior, for several methodological reasons I can share. But in any agency, you can’t assume that the implicit bias training is going to impact on behavior unless the leadership is doing what they have to do. Think about this: You’ve trained your personnel in implicit bias and then the directive from the top is go out and make aggressive traffic stops and find the guns, the drugs, and the crooks. Well: Who do you think human beings are going to focus on? They’re going to focus on people of color, they’re going to focus on men, they’re going to focus on low-income.
So, there’s something that you just said that kind of struck me. It sounds like one of the techniques you use is perhaps to reinforce positions that officers already hold. In one interview you said, “One thing that’s important to remember is that stereotypes are based in part on fact. And that’s true even of the black crime implicit bias.” In another, you said, “An important message in our training is that stereotypes are based in part on fact. And we have to recognize this because in our country, people of color are disproportionately represented amongst the people who commit street crime.” Leaving aside for the moment the premise, is there a chance that having this message be such an “important part” of your training, that it could validate those stereotypes and biases among police?
We have to acknowledge that to maintain our credibility in the classroom. They’re expecting someone to come in and shake their fingers at them and tell them that “everybody commits crime at the same rate, but you are arresting blacks more, so you guys have got to stop.” We would have no credibility in the classroom. So, there’s no way that we can get to that class without recognizing that fact—and it is a fact, and I’m a criminologist, so we’ve studied it. But what’s real important and doesn’t always get into the interview is we follow that, we say, “Even though stereotypes can be based in part on fact, we err when we treat the individual as if they fit the stereotype.” … We don’t get pushback on that at all. And then we follow up by saying, “You know, most men don’t commit crime, most blacks don’t commit crimes, most Muslims are not terrorists.”
Do you think that some officers might just hear the first part of it? I’m thinking of this in the context of some of the other research I looked at: One of them was a study in the Journal of Applied Psychology in 2015 titled “Condoning Stereotyping? How Awareness of Stereotyping Prevalence Impacts Expression of Stereotypes.” And another was a 2011 study from the Association for Psychological Science titled “Ironic Effects of Anti-Prejudice Messages.” And to summarize both of them, they just make the case that, in their research, there is a risk of reinforcing stereotypes when that is part of your message.
You’ve really done your homework, haven’t you? Let me see if this is relevant to some of that research you just cited. Because there’ve also been studies that raise the issue of “Ok, if you tell everybody that everybody had biases, then they’re going to forgive themselves and go on with their lives.” Which is why we have to go beyond and we have to provide additional motivation. Which again in our classes is: Policing based on these stereotypes makes you unsafe, ineffective, and unjust. So, I don’t know if that overlaps in terms of the science, but boy you’re really impressive in terms of your research you’ve done.
Thank you for that. I guess an immediate follow-up to that is that you’ve also said that “Police very often use a lesser level of force even when they’re justified at a higher level.” Did you see the report about Tulsa Police Department Maj. Travis Yates, who said systemic racism in policing “just doesn’t exist,” but—more importantly for the purpose of this question—that “we’re shooting African-Americans about 24 percent less than we probably ought to be, based on the crimes being committed.”
I had not seen the quote. I bet you he’s looking at the demographics of people against whom police use force and police would want to benchmark that against the people who commit crime, or maybe violent crime, and so that’s probably what he’s comparing, but I can’t tell you.
This seems to be an example of somebody who has had it reinforced to them that there is some basis in numbers for a stereotype, and their response is we should be shooting African-Americans about 24 percent more. [Editor’s Note: Fridell did not train the Tulsa police; the department contracted with a different implicit bias training company.]
I think what you’re trying to say is that we shouldn’t talk, or—I would still say: The risk of not saying to cops that we recognize differential behavior on the part of demographic groups, we’d get thrown out of the room as not being credible.
One of the points you’ve made on this subject is to say that disparities in police use of force might be explained by disparities in resistance to arrest. A writer I worked with did an analysis of Chicago Police Department data. What he found was that “when faced with a white subject deemed to present a deadly threat”—so this is by the officers’ own assessment of the situation—“officers used lethal force in just 28 percent of cases. Meanwhile, officers fired upon black subjects in 43 percent of similar situations.” So, there is some data out there that shows even accounting for what police officers are describing as their own assessment of the risk of a given situation, they’re firing more upon black Americans, which would in part possibly explain disparities.
That’s very interesting. And the fact that that data was produced is why we’re doing what we’re doing. Because that data would indicate that those decisions are not being based only on level of resistance, which is the legal foundation for using force. Those data would indicate that there is basically a different standard for using force for whites versus blacks. Even though social science is imperfect, that is some data that would indicate bias.
Similarly, though, in terms of your messaging on disparity: In one interview, you said that “one myth is that blacks are shot by police more than Caucasians.” I guess: Do you still view that as a myth given the data we just discussed, and is that a message you feel like officers need to hear?
There’s a difference between looking at absolute numbers and looking at rates or proportions. So, in most of the Washington Post data [on police use of lethal force], I hope that I get this right, Jeremy, but for the last few years if you look at absolute numbers of who was shot by police, the absolute number would be more Caucasians than blacks. But what they go on to say is that it is disproportionately blacks. … It’s interesting that you pull that up, because I hope you know that we’re doing what we’re doing because we do believe that there’s bias in policing because there is in all professions. I feel like you’re thinking that I’m saying the opposite somehow of what we’re doing.
I guess that if you’re acknowledging that there are greater numbers of black people who are shot by police than white people in terms of relative rates to the populations, leading with that topline number and calling it a “myth” might detract from the more important number.
One thing we committed to do from the get-go is we’re always going to tell the truth. … Sometimes people ask me, “How do you know there’s bias in policing?” And my answer is “Well, I can’t look at all of the disparity statistics because I know as a social scientist that it’s easy to measure disparity, but it’s not easy to parse out how much of that is bias and how much of that is differential criminal activity.” So, I can’t look at, for instance, vehicle stop data, and tell you whether that is biased. The science that speaks to me is the science of implicit bias, that basically says we all have implicit biases. … I have been one to exercise and train in a great deal of caution in terms of drawing conclusions from disparity stats. I wrote a 500-page book on analyzing vehicle stop data, because people would collect the data, look at the demographics of people who were stopped, benchmark it against the census, and declare an agency biased or not. And it made me crazy. It is inappropriate because again there are other factors that can be producing it.
One response to thinking about those traffic stops specifically, which I think is a very important way of looking at this instead of just focusing entirely on incidents of use of force—
And so that is hit-rate data. Indeed what we’re doing in looking at hit-rate data is if we’ve analyzed the correct subset of searches—it’s been so long since I’ve talked about this, Jeremy—we are actually applying the outcome tests from economic theory, and this outcome test in economic theory would say that lower hit rates for minorities is a red flag for biased searches.
What does evidence of bias look like to you?
That is really challenging, Jeremy, I’ve spent about 25 years trying to figure out how to measure bias. … Here’s my mantra: It’s very easy for social scientists to measure disparity. It is incredibly difficult to parse out the causes or the sources of that disparity.
Is there any evidence that you can point to that you would acknowledge was evidence of bias? We talked about the contraband hit-rate during car searches being higher for white people than black people.
That is one of the better measures. It’s not foolproof, but a lower hit rate for people of color is at least enough evidence for a police department to sit up and take notice and try to examine it further.
That to you is evidence of bias then?
I would call it a red flag, can I put it that way? Because it’s social science, we don’t prove anything. It’s a red flag, not a bias.
There are multiple studies showing that white people and black people use drugs at around the same rates, and other studies showing that white people are, even according to self-reporting, dealing drugs at higher rates than black people, yet black people are arrested and incarcerated at much higher rates for drug offenses. Is that to you another red flag in terms of bias?
That is a red flag. When we look at self-reported drug use across the population, it does not match the police intervention for those crimes.
What are your feelings about the “defund the police” debate?
When I hear a discussion of greater budgets for education in low-income communities, role models, job opportunities, that is a way to spend money on reducing crime, which by definition then would reduce the load on police departments. Now, my concern of course would be taking away the funds police departments use to respond to crime before we reduce it, because I’m concerned that the effects will be on the already marginalized communities. So: I’m excited about that discussion, but it would need to go—first we would want to address the factors that produce criminality and hopefully have an impact on crime, and then at that time we could reduce police budgets.
I want to know if you heard about this story from the other week. Derrick Sanderlin was an implicit bias trainer for years in San Jose, California. He took part in the protests in that city and attempted to deescalate a confrontation between fellow protesters and police officers, presumably using his own background and experience, and he was shot with rubber bullets in his groin.
Wait, is this a police officer or a community member?
This was a community member who was an implicit bias trainer for years in San Jose.
Ok, when you say implicit bias trainer, I was thinking law enforcement, so that was my own bias. Oh, I did see the story about the trainer who got hit.
Yeah. He’s 27. He may never be able to have children again. What did you think when you saw that story?
[Pause.] What did I think of that story? I mean. Jeremy, I don’t know what I thought of that story. I remember thinking I was very sad that somebody who was in a protest that I think was not involved in violence was harmed at all. I can’t remember: Was the implication that the officers knew who he was and was shooting him intentionally? Remind me.
No, there was no implication of anything like that, I don’t think.
So what—so I don’t know if there is relevance to the fact that he was an implicit bias trainer, so I’m not sure that I’m an expert on responding to that scenario.
I guess the relevance is if a police department that has been trained by a person to try to reduce or manage their implicit bias ends up shooting that person and potentially disabling them: One, how does it speak to the effectiveness of the training? Two, how does it speak to the ability of an implicit bias trainer, somebody who knows this stuff to their core, to enter a situation in a conflict between police and protesters and seek to deescalate if police are motivated to use force?
Let me ask you this. Was there some indication that bias was involved? Or was this just a use of force issue? What was the evidence that there was bias?
They were protesting police brutality and excessive force. And in videos we’ve seen across the country throughout the last couple of weeks, there have been hundreds of videos—there’s actually a Google spreadsheet where somebody is compiling them—demonstrating police officers using force on protesters who are specifically asking for reforms of police and specifically protesting the notion of excessive force. In some way at least, there may be potential bias against that protester based on what he was protesting.
So, you’re interpreting it as the excessive force could be due just to frustration of being in that situation, but you’re saying that it could be retribution.
I would also note that the community activist is black.
Well, just because somebody is on the receiving end of a police activity and they’re black doesn’t mean it’s bias, you know that.
Again, I will just state that the issue is that he was a black person who was trained on counseling officers on implicit bias, he was attempting to deescalate a situation, and still he was shot.
I don’t know that we do a really good job helping officers deal when people are pushing their buttons, when they’re really frustrated, on how to control themselves. … But I don’t think I have any expertise for commenting on that particular situation.
But your instinct is that no bias played a part in that?
I try to be a fact-based social scientist. And I also recognize that I cannot look at a situation—I can’t even look at Chauvin, even though I’ve got my own ideas, I can’t look at that and tell you that that was bias. In my own mind, do I think of it in that term? I do. If I could look at a situation and tell you if it was bias or not, I could retire as a professor and be an expert witness.
So, in the Derek Chauvin incident as well, just in seeing that image, you did not infer bias?
No, let me answer it two ways. When I processed that, I perceived bias. As a social scientist, I can’t tell you whether or not it occurred, because I don’t read his mind. When I looked at that incident, the bias that came to my mind is outgroup bias. Because, he wasn’t fearful. Outgroup bias has to do with the “we” and “they” that we all have. And in our training, we talk about the fact that if you had a continuum and you went all the way down the “they” end of the continuum, you find dehumanization, where a person has pushed that group so far down in their continuum that they don’t even see them as human anymore. I could not picture Chauvin doing that in the business district of Minneapolis with a white man with a collared shirt. So, yeah, as a social scientist, you can’t put me on the witness stand and say, “Tell me about the biases of this person.” I can’t do that. Did I perceive it? Yes, I did. I can’t prove it.
You are at the end of a two-year, $4.5 million contract to do regular trainings with the NYPD. Do you have any idea if the contract is going to be renewed?
I do not. We have developed what we call booster training, because this can’t be a one-off. This has to be something that after somebody gets the basic training, we need to come back in and reinforce the messages. So we do have that booster training and it would be great if NYPD decides to do that. I think a lot’s going to depend on budget. We know COVID budgets are going to be slim.
And you’re saying you’ll have data on effectiveness, you’re hoping by some point this summer.
So, there is going to be an evaluation result that comes out summer or early fall. It is an independent investigation, so I’ve not seen the final report.
You were skeptical, though, it sounded like, that you might see outcomes in terms of behavior.
I think it would be very difficult to detect behavioral change, even with a high-quality study.