Was This Google Ethicist Fired for Doing Her Job?

S1: Hey, What Next listener? It's Mary. Just wanted to give you a quick heads-up: today's show is going to sound a little bit different. That's because our friends over at What Next: TBD had a great interview that just couldn't wait until Friday. So we're going to switch things up a bit for the rest of the show. You're going to be hearing from Lizzie O'Leary. I'll be back tomorrow.

S2: Earlier this fall, Timnit Gebru submitted a paper for consideration at an academic conference. Not something that's particularly unusual for her: Timnit studies ethics and artificial intelligence, so this is the kind of thing she does. But since she was working at Google, there were some standard pre-submission hoops to jump through. Google wanted to review the paper, which Timnit wrote with several of her colleagues, and it signed off on it. And that's when she says she started getting pushback.

S3: So this was the week before Thanksgiving, at 4:30 p.m. Randomly, there's like a meeting on my calendar. Nobody's telling me what it's about.

S2: She was told by senior managers that the paper didn't meet Google's publication bar and that she should retract it or take the names of Google employees off of it. Timnit wanted more clarity on who objected to the paper and why they wanted it retracted, and said that if Google couldn't provide that information, she would resign. This kicked off a few days of wrangling and several intense emails. Then one night, December 1st, Timnit was chatting with a woman on her team, one of her direct reports.

S3: I actually was thinking, I was like, oh, you know what? I haven't told her that this paper thing is escalating so quickly. I probably should tell her about it. Poor woman, she's not on this paper. She doesn't know anything about it.

S2: But before Timnit could tell her what was going on, her direct report got an email. It was from a manager saying they had accepted Timnit's resignation, a surprise to her.

S4: So I texted my other direct report and I said, did you also get the same message?

S2: She did. So Timnit texted her boss, and at first he wasn't answering.

S3: It's like eight o'clock or something like that. And I was like, OK, I think this is fair game. I can just call him, whatever, I can interrupt at this point. I call him. I was like, do you know that my direct reports got this email from Megan saying that she accepted my resignation?

S2: Megan is Megan Kacholia. She's a VP at Google Research. He's like, what? It was clear to Timnit that whatever happened, it had come from someone higher up in the company than her direct boss. That night, when she tried to go to bed, she couldn't sleep.

S3: I was thinking, what are they going to do? What are they trying to do? Did they think that I had so little value that there would really be like no backlash whatsoever? I would just go quietly.

S2: That was two weeks ago, and the time since has been anything but quiet. Timnit says Google fired her. Google says she resigned.

S3: My team calls it being resignated. Resignated, right? I'm glad that they came up with this new word to designate someone.

S2: Thousands of people, both inside Google and out, have signed a petition on her behalf. The company's CEO has had to answer questions about what happened. And Timnit is trying to figure out how she, one of the few Black women who study ethics in A.I. and known almost universally as a star in her field, ended up here.

S3: To be honest, I still haven't processed it entirely, because, like I was saying, I've been on adrenaline. But if I were to take a step back and really think about what they did, how they did it, how disrespectful and devaluing it was, how they treated me like a discardable object, and what kind of message they sent to my entire community: it's extremely hurtful. It's extremely painful. And what they did kind of shakes my fundamental understanding of human beings and what they can and cannot do.

S5: Today on the show, a conversation with Timnit Gebru about Google, about A.I., about racism in tech. I'm Lizzie O'Leary, and you're listening to What Next: TBD, a show about technology, power and how the future will be determined. Stick with us.

S2: Timnit Gebru is known in the tech world for a lot of things, but one of them is starting the affinity group Black in A.I., which exists to mentor and support Black researchers in artificial intelligence. And in the days after she left Google, there was a big conference, virtual, of course, and she heard from a lot of people who are part of Black in A.I. there. I'm wondering if you could tell me about that program and the type of stories you were hearing, what people were saying to you, because you tweeted that it was therapeutic to be talking to people about what had just happened.

S6: Really, it was very therapeutic to be in a space where everybody sort of understood what we were going through, what I was going through. There was no need to explain. And we were all feeling a similar level of exasperation in many ways.

S2: Black in A.I. was born at this same conference four years ago, when Timnit looked around the room at a crowd of five thousand people and saw just six Black faces.

S3: It was just literally a moment of panic. And I came back home and I wrote this Facebook post that was shared by a lot of people. And that's when we decided to intensify our efforts for Black in A.I. So before that, I had this little mailing list where I would just, like, say hi to any Black person I'd see in the field, like, hey, so, you know, my name is Timnit, you know. And so we added a lot of people and we decided to have a workshop. And that's sort of how we started the organization as Black in A.I.

S2: As Black in A.I. grew, so did Timnit's reputation, particularly when she was part of a now renowned study that showed how facial recognition technology is biased. Timnit and her colleagues Joy Buolamwini and Deborah Raji, both Black women, showed that commercial facial recognition software performed worse on darker skin, especially on darker-skinned women. The project was called Gender Shades. Well, it seems to me like there are two interrelated problems of diversity in A.I. One, which is something that you explored in your paper Gender Shades, is that data is often not diverse, which leads to outcomes like facial recognition software not performing well on Black women. But the other is the diversity of researchers. Gender Shades was created by Black women. And I guess I wonder, did it take Black women researchers to ask those questions?

S3: Absolutely. Absolutely. I was a Ph.D. student when I advised Joy; she was a master's student. It's highly unusual to have two women who are both students, just the two of them, writing papers. Right. And we faced a lot of backlash because of it. So we had to constantly support each other and watch out for each other and push through to do this work.

S2: I'd like to ask you why you went to work at Google in the first place.

S7: Well, there were two things. One was that at the time I was at Microsoft Research, in New York, for a one-year postdoc, and I wanted to go back home to the Bay Area. And the other was that they had actually specifically said that they wanted to start a Google Brain office in Ghana. It was supposed to be the first office in Africa. So I wanted to help with that.

S2: Timnit ended up working with a team at Google that wrestled with the ethics and effects of artificial intelligence. One paper she wrote showed how to document and audit the data used to train an algorithm, so if something went wrong, a company could unwind the process and see what happened. This latest paper, the one that led to Timnit leaving Google, is about large language models. Basically, those are algorithms that predict language. You probably use them all the time, when your phone autofills your texts or when you talk to Alexa.

S8: This paper is drawing from a lot of different works. We were drawing on many people’s expertise and disciplinary traditions and prior work.

S3: But, you know, I think our prioritization of the risks and harms probably would be different from other people's, because of our background and because of our disciplinary traditions. So we discussed what, in our view, the risks and harms are and what, in our view, should be done.

S2: Timnit and her collaborators named a few issues they saw with large language models, including the environmental cost of the computing power required to train them, the possibility that the models would suck up racist and sexist language in their attempts to learn, and the chance that these models could be used to write language that seemed human and spread disinformation. Has anyone at Google said to you, this is where we had a problem, or, this thing that you have put in the paper undercuts one of our products?

S3: The first conversation we had was you have to retract the paper.

S4: Some of the product leads believe that the flaws are too much, and so you have to retract the paper.

S2: Then she heard from Jeff Dean, the head of Google Research.

S4: He said, you know, I just skimmed the paper, and on a quick skim, this environmental section, you know, is dependent on this flawed paper. But I was like, OK, so I wrote a whole thing. I'm like, where do you think it overestimates, and why do you think it overestimates? What kind of changes are you suggesting? What is the issue? So on Thursday, Thanksgiving, I spent my whole day literally writing this document, because they told me to retract it by Friday. Friday, I sent this document and I said, I hope, you know, this can be a basis for a conversation or some sort of back and forth. Because if I'm going to retract a paper, I at least want to understand what is going to happen after. Are we going to try to rewrite it, or are you just trying to kill this line of work? Like, what are you trying to do? What are your goals?

S8: Monday, I get an email from Megan responding to that. It says: Can you confirm that you have either retracted the paper or taken your name off of it?

S3: And I'm like, are you kidding me? I wrote this whole thing, and you're not even acknowledging that I wrote anything, or that, you know, I'm asking all these questions. She's not even acknowledging it.

S2: So that's kind of how it went. But what really seems to have angered Google is an email Timnit wrote to an internal group at the company called Google Brain Women and Allies. There, she vented her frustration with what happened with the paper and with what she saw as lip service to diversity.

S3: I have written so many documents. I mean, like I wrote a billion documents, I had a billion meetings. They just tire you out. They meet with you over and over again. They feel good about themselves for meeting with you. They don’t do anything.

S8: And then if you try to push them on it or tell them they're doing something wrong, they tone police you. This has happened to me so many times. There is nothing in place here right now that incentivizes them to do something different, so we can write as many documents as we want, but as long as there's, like, no incentive for the leaders to do anything differently, this document is not going to help. Meetings are not going to help. Nothing's going to help. So that's why I would say you should focus on leadership accountability.

S2: Do you think it was the email that got you dismissed, or the paper?

S8: Their reasoning for wanting to terminate me, quote unquote, immediately was the email. And people on that email list are terrified now. They're just, like, terrified to say anything. Because, mind you, this email list, it's called Brain Women and Allies, was created for women and their allies to discuss the problems in this department with respect to diversity and inclusion.

S2: To push back slightly, I feel like any company reading this kind of, you know, semipublic internal discussion might say, OK, if that's the way you feel, you're done here.

S8: No, that doesn't make sense to me. Like, this email list is to have an internal discussion about what to do better. And what to do better, after I've seen all of these women spending their own time, their own free time? They're not hired to write these documents. This is not their job. Their job is to be research scientists. They're spending their own free time writing these documents, pointing out issues, and nobody's listening to them. I didn't make this email public. I didn't leak it to the press. I didn't go say, here is an indictment of Google. This is an internal mailing list created specifically for the purpose of talking about the issues related to women. Right. If leaders are not held accountable, nothing is going to change.

S2: This is not the first time that venting on internal Google message boards has ruffled management's feathers. Last year, the company created a new policy cracking down on political discussions in internal groups, and Google employees have repeatedly staged walkouts to protest sexual harassment, lack of diversity, and the company's work on what they see as unethical projects. Where do you think the line is at Google between the intellectual and collegial freedom to have these kinds of discussions and maybe the corporate culture of not saying them too publicly or too loudly?

S8: I don't even think this is corporate culture. We had a research all-hands after the George Floyd protests where people were crying, people were so emotional, pleading with them to do something different, because we're so exhausted. We outlined a bunch of principles that we called Nothing About Us Without Us. The number one thing we said was psychological safety. We need to have psychological safety in order to talk about the issues. If you don't even have the psychological safety to discuss what you're facing, then there's no way to even move forward. There's no way to fix your company's culture.

S2: Speaking of leaders and accountability, what do you make of Sundar Pichai's apology, or maybe let's call it a statement, where he says, we need to accept responsibility for the fact that a prominent Black female leader with immense talent left Google unhappily?

S3: Yeah, it feels like there's so much gymnastics there. Well, this is what I make of it: they would have looked bad if they didn't make a statement. They still look bad after making that statement. They're basically saying, we apologize for the backlash, because we're not happy about the backlash, because the backlash means, oh, you're questioning whether you still have a place at Google. The Google walkout showed how toxic HR was at Google. Right. A lot of women said that their number one issue was the department itself. They're not there at all to support women. They're there basically to ensure that there are the least number of losses for the company. They're not going to try to make the company culture better.

S2: The other part of Sundar Pichai's statement says that it's important to me that our Black women and underrepresented Googlers know that we value you and you do belong at Google. And I wonder what would have made you feel valued, because it's so clear when I listen to you that you feel like you were held up externally as sort of a beacon of diversity at Google, and yet you feel like you were undercut internally.

S3: It was so clear that they weren't even treating me as a person, because you would discuss things with a person. You wouldn't just order them around, let alone a world-renowned expert.

S6: You know, I was constantly devalued. I mean, constantly. And actually, people coming into Google told me this, that they could not reconcile the way in which I was viewed externally with the way in which I was treated internally.

S2: Why do you think that is? Why do you think Google undervalued you?

S6: I think it's mostly racism and sexism, even when it's about issues of ethics. So they have all of these responsibility initiatives with, like, literally almost no Black people. And the Black people in them are just infuriated all the time. They keep on talking about us as if we're, like, some, as Joy says, caged curiosities.

S2: Joy Buolamwini, who's one of your collaborators?

S6: Yeah, she says caged curiosities. And I always talk about what's called parachute research in my research, where this group of people looks at you as a subject of study or something like that. Like, oh, yeah, imagine the Black person, the marginalized Black person, in their natural habitat. It's like, I don't know, like a National Geographic or something, you know, how they talk about you, and you're just sitting there like, oh my God. And they get promoted and they publish papers and they don't deal with the consequences.

S2: We reached out to Google to ask them about Timnit's experience at the company and the circumstances of her departure, but didn't hear back by recording time. Have you heard from anyone at Google since this happened? Officially?

S8: Officially, no.

S3: I haven't even gotten my instructions on how to return Google assets. Wow, they were so fast, I think. I believe they've mailed my check, but I've no idea how to return the computer, my work computer, or anything like that. I've been checking my email; I don't see the instructions on how to do it.

S3: No, I certainly have not heard from anybody officially.

S5: For now, Timnit is just left with Sundar Pichai's statement, in which he says that Google will, quote, begin a review of what happened to identify all the points where we can learn, considering everything from de-escalation strategies to new processes we can put in place.

S2: What do you want people to take away from this interview?

S9: That I'm a human, and I laugh and I talk, you know. I can be pretty outgoing. And when you're painted as this unreasonable, angry person who needs to be de-escalated, it's dehumanizing. It doesn't tell a story of what you've gone through and what you have tried to overcome to get through.

S5: Thank you so much for your time. Thank you for having me. Timnit Gebru is an A.I. ethics researcher and the co-founder of Black in A.I. That's our show for today. TBD is produced by Ethan Brooks and edited by Allison Benedikt and Torie Bosch. Our executive producer is Lisa Montgomery. TBD is part of the larger What Next family. And it's also part of Future Tense, a partnership of Slate, Arizona State University, and New America. Mary Harris will be back in your feed tomorrow. Mary, thank you for letting me grab hold of the show today. I'm Lizzie O'Leary. Thanks for listening.