S1: Hey, there. Before we get started, I’ve got a quick question for you. Have you quit your job recently, or are you thinking about it? Maybe you run a business and suddenly all your employees have evaporated. If any of this sounds familiar, we want to hear all about it. The way to tell us is to record a voice memo on your phone and email it to whatnext@slate.com. Or if you want, you can give us a ring. Leave a message. Pour out all your feelings. Our number is 202-888-2588. All right. On with the show. For the last few weeks, as Facebook has dominated the news, there’s been this question: where’s Mark? As in Mark Zuckerberg, the founder and CEO of Facebook. Last week, we found out he’s been in the metaverse.
S2: Hey, and welcome to Connect. Today, we’re going to talk about the metaverse.
S1: At the end of a monthlong news cycle that’s included a whistleblower named Frances Haugen, her cache of secret documents, and hours of testimony on the floor of the Senate, Mark Zuckerberg released a video about his company’s future.
S2: We’ve gone from desktop to web to phones, from text to photos to video. But this isn’t the end of the line.
S1: That future apparently includes a new name, Meta, as you’ve probably heard by now, and a utopian vision of virtual reality for everyone.
S2: An embodied internet where you’re in the experience, not just looking at it. And we call this the metaverse, and you’re going to be able to do almost anything you can imagine: get together with friends and family, work, learn, play, shop, create.
S1: What the future does not include is a vigorous investigation into the many ways Facebook has harmed its users.
S3: I think it’s a giant public relations mistake to hide from this.
S1: Steven Levy has what he calls a Ph.D. in Facebook. He’s written a book about the company and interviewed both Zuckerberg and his number two, Sheryl Sandberg. Like a lot of observers, he gave the company’s response here a big thumbs down. But what’s shocked Steven most about the way Zuckerberg and Sandberg have approached this scandal is that he thought they’d have learned by now. Steven had a front-row seat during the company’s last big crisis, when Cambridge Analytica was able to use Facebook to mine for user data in a bid to elect Donald Trump.
S3: And for five days, Mark and Sheryl sort of went to a bunker and didn’t even talk to their own employees. You know, they sent out one of their lawyers to talk to them. And when I was writing my book, I quizzed both Mark and Sheryl on that, and they told me that was a mistake. You know, they should have been more upfront. Sheryl got very emotional when she described this to me.
S1: Steven says the bunker mentality is back, only this time the bunker is a virtual reality utopia.
S3: And here they are again. At one point, on the day that Frances Haugen appeared on 60 Minutes, something that Mark knew was going to happen, he posted a video of him and his wife sailing on the bay.
S1: as if nothing was happening.
S3: Yeah, I mean, it’s worse. You’re saying, you know, let them eat cake, right? I’m not bothered by this. I’m on my boat.
S1: If to you, the ebb and flow of news about Facebook over the last few weeks feels kind of repetitive, Steven sees it a little differently. He sees a reckoning that Facebook can’t really avoid.
S3: If you look at it, take a step back, it might seem tidal. But I think actually it’s building. You know, the waves get bigger.
S1: It’s interesting you’re seeing that. It feels like the waves are getting stronger and stronger. Do you think that means this time around there might be some kind of consequence?
S3: Oh, it’s hard to say. I mean, you know, when Mark Zuckerberg goes to Congress, even some of the people who criticize him say, wow, I really admire what you’ve built. I think really the biggest threat to Facebook, if you ask me, and I think this becomes clear looking through these documents, is that working for Facebook is not something you could be proud of now the way you could be five, 10 years ago.
S1: And you can see the workers who are here are complaining openly.
S3: Yeah, they’re complaining. And a lot of them are leaving.
S1: Today on the show: Mark Zuckerberg. He can’t escape to the metaverse forever. We’ll talk about why. I’m Mary Harris. You’re listening to What Next. Stick around. Each day at noon, Steven Levy gets a big load of documents sent his way. That’s because he’s part of a reporting consortium that’s digging through everything Frances Haugen, that Facebook whistleblower, took with her when she left the company earlier this year. Haugen’s even provided a little glossary of search terms so journalists can better understand what they’re looking at. And Steven says that kind of attention to detail is part of the reason this Facebook scandal doesn’t seem to be going away, because there have been whistleblowers before, but none of them have been quite this meticulous.
S3: She’s approached this like a job. You know, I actually know Frances, weirdly. Years ago, I was embedded on an overseas trip with young Google product managers who were pegged to be the future leaders of the company. We spent 16 days together, and Frances was one of that group. You know, you get to know a person when you have that intense period of travel. And of all the people in that group, if I’d had to say who would be a whistleblower in the future, it would have been her, because she took things at work very personally. She had a pretty firm version of what was right and what was not right.
S1: Yeah, I’m struck by just the procedure as it’s played out: going to the SEC, going to the Wall Street Journal, going to Congress, and then releasing this trove of information to a variety of news outlets, including you, so that then there’s a whole other wave of scoops. And then she testifies in front of British lawmakers. So she’s sort of creating her own weather system, you know?
S3: Right, right. No, I mean, look, if you’re going to be a whistleblower, you want maximum impact. And to me, that’s the case here. Facebook can do ad hominem attacks on Frances all they like, but ultimately, it’s these documents that are irrefutable.
S1: Yeah. Can you just lay out what are the biggest revelations from those documents in your opinion, as someone who’s covered Facebook for so long?
S3: Well, I think there are a lot of details that stand out as, you know, kind of shocking, like how anger rates so much higher as a signal to boost the distribution of a post than anything that’s benign or likable. But ultimately, it’s many of the things that people were already thinking: Facebook does this and it poisons the conversation, it divides us, et cetera, et cetera, it makes teenage girls feel bad, people get bullied. We were all writing about that. And the people in Congress were complaining about it. The regulatory agencies were complaining about it. But this is overwhelming proof of how deeply Facebook knew this internally and didn’t take the aggressive steps it needed to take to minimize the damage it causes. To me, that’s the big thing. It’s not so much that buried in this document or that document is a scoop that totally changes the way we see Facebook. It’s that it’s kind of worse than we thought, and Facebook should have done more. And there were these groups of people, hundreds of people, really, doing research at Facebook who were super concerned about this and kept pushing Facebook to be more aggressive in the actions it
S1: took, including Frances Haugen.
S3: Yeah, yeah. I’m not saying it didn’t do anything. I mean, generally what happened was a study would come up saying, you know, we’re failing here, we should do X, Y, and Z. And Facebook might do X or Y, or they might try something else, but it turned out to be not enough. Some of these failures weren’t due to nefarious motivations. They were because of the way Facebook is structured: it’s much easier to kill an innovation which protects people than it is to institute it. Because if you want to make a change in the news feed, many, many groups are involved, and all those groups get to weigh in. If someone’s really against it, that improvement probably will get nixed.
S1: Yeah, I think you said that a lot of times, changes that may have made the platform more user-friendly, kinder, whatever, required work from a lot of different teams collaborating, but it only took one person to say no. Right? Part of why I wanted to talk to you is that your book really centers Mark Zuckerberg as the personality behind a lot of Facebook’s business decisions. I wonder how you see that play out in these documents that you’ve been poring over.
S3: Well, Mark has ultimate control over Facebook. You know, he literally controls 56 percent of the voting stock. He can’t be fired. One person has the ultimate sign-off. He doesn’t make every decision at Facebook, but people think of him when they make decisions, regular employees, absolutely. It’s like, what
S1: would Mark
S3: do? Yeah. Well, yeah. What’s Mark’s vision here? You know, it very much is Mark’s company. Even on content decisions, sometimes he’s the last person to sign off. If people can’t agree on how to handle a hot potato in terms of content, like, for instance, whether Donald Trump should be suspended or, you know, banned, that goes up to Mark. Mark is super engaged, and he’s a stubborn fellow. I mean, that’s one thing that I learned about him from talking to him so much: he’ll dig his heels in. Sometimes, when the evidence is overwhelming, he’ll give up his stance. He won’t flip on a dime like Steve Jobs would. Steve Jobs would be adamant about a subject, but as soon as he saw there was a convincing argument the other way, he would act like he’d been on the other side all along. But Mark isn’t that way. You know, it takes a lot to make him change his mind, but he can change his mind. He’s a very data-driven person. But the data which reinforces him is that Facebook is still making money hand over fist. Let’s not forget that.
S1: Yeah, that’s what I was going to say. The company is still profitable, and you can have a big media problem, but if Wall Street still loves you and your board still loves you, I don’t know if it matters.
S3: Right, right. And Mark makes sure the board loves him, because he’s gotten rid of all of the people on the board who’d been questioning him. If something breaks, goes down, you could, you know, make a fix, and it’s like, hey, no harm, because you can have the fix up within an hour and be right back where you were. As opposed to, if you’re using something like Microsoft Word and there’s a bug, getting that not only fixed but out, updating the program on millions of computers, that’s a long process, a more serious process. So it used to be a badge of honor among Facebook engineers to do something on the edge, to break the program
S1: because you’re always updating.
S3: Yeah. And that sort of characterized the move to introduce new products. It became sort of a metaphor. But when it interferes with the company’s main drive to grow, that’s when things get slowed down. I pointed out, when Facebook did this Oversight Board, the quasi-independent board with the power to overrule Facebook’s content decisions, whether to take down a post or leave it up, and question the policy behind that, it took three years from the idea to the point where there was a board and they were making decisions. So in that case, Facebook moved really, really slow, because this wasn’t something that was going to increase the number of users on Facebook. Hmm.
S1: And you see in these documents how “move fast and break things” is really a growth mindset, and growth is more important than anything else.
S3: Sure. I mean, this is something that’s been pointed out a lot. I certainly wrote about it in my book. You see it in action in the documents: how Facebook, to this day, despite the billions of dollars it makes in profits, has not invested in having people moderate the content as thoroughly in foreign languages as it does in North America and, you know, the other countries that speak major languages. In Myanmar, for instance, years ago, in the early part of the 2010s, 2013, 2014, the system was being used to foment riots and to attack political opponents with misinformation. And Facebook was told about this and did almost nothing. And that’s totally growth. It wants to be everywhere, but it doesn’t want to spend the money to make it safe everywhere.
S1: When we come back: what’s going on inside Facebook now? I want to talk about this internal dissent that you’ve found as you’ve looked through the Facebook documents. You focused on badge posts. Can you explain what a badge post is for someone who’s outside of Facebook culture and may not know?
S3: Well, just like in any organization, when people leave, quite often they’ll write an email to the staff saying, you know, I’m off on my next adventure, great working with you. At Facebook, they’re called badge posts because there’s a custom of taking a picture of the badge you hand in when you leave, the badge that you swipe when you go into the building and, you know, have to wear at all times. These badge posts that I was writing about, they all universally say, I love the people I worked with, and as an employer, yeah, Facebook was great. But they point out, to different degrees, that they came to feel that Facebook was not good for the world, and they felt that the decisions made to improve it weren’t aggressive enough, that they were quite often tainted by political considerations. And the badge posts point that out.
S1: Yeah, it’s funny because the main question I’ve had looking at the coverage over the last few weeks of Facebook is, Gosh, I wonder what it’s like to work at Facebook right now, where every day if you look at the newspaper, your employer is splashed all over the front page. Do these posts give you any insight into that?
S3: Well, not only the posts, but the interviews I’ve done. It’s clear that if you work at Facebook, you’re much less likely, say, to wear company swag when you go out with your family.
S1: What are the workers afraid of? Like, why not?
S3: People might point them out and say, why do you work there? They might confront them.
S1: I mean, the thing that struck me, looking at your reporting about these Facebook employees and their posts when they leave, is this: I think the Facebook response to Frances Haugen has been, well, you’re just looking at a small slice of our research, and it’s biased, and this is one person. But then you look at these badge posts and you realize, first of all, there are so many more people out there who have something to say, who’ve worked at Facebook and are just saying it internally. They’re not doing what Frances Haugen did. But then also there’s this nuance to it, where the employees clearly appreciate Facebook and think of it not as, you know, an all-encompassing bad. There was a reason they went to go work there. And so to not see those people and their nuanced reactions feels like a real miss to me on the company’s part.
S3: Definitely. I mean, these are people who wanted to save Facebook, a lot of them, in their badge posts, and a lot of people I spoke to whose badge posts aren’t in this cache of documents. And, you know, they’re still releasing documents to us, so who knows. But they told me it was just too frustrating. They got burned out, and they were too tired to keep fighting.
S1: Yeah. I mean, one released his badge post as a video.
S4: For those who don’t know me, my name is Max.
S1: This was this engineer, Max Wang, and it was a long video, first of all. And he was like, I’m doing this because I want you to see my face.
S4: I don’t want to just be text in a box. I don’t want to be a little circle with my badge photo in it. I want to be a flesh-and-blood person saying these things,
S1: and I think we’re failing and we’ve enshrined that failure in our policies.
S4: And I want to criticize openly, and I want to criticize loudly, and I think we have to keep doing those things, even if we get an explanation. Because there have been plenty of bad decisions all throughout history with all of the greatest justifications and processes and execution. None of it is enough to change the fact that these things cause people harm,
S1: which is pretty damning.
S3: These badge posts prove that the system isn’t working. If it was working, these people would be saying, Hey, press, you’re using these things wrong. You know, Congress, don’t look at it this way. Instead, they’re saying, you know, not only is the harm there, but we were hoping that Facebook would aggressively address it and take measures to minimize the harm. And those measures aren’t being taken to the degree that they have to be taken.
S1: You’ve taken issue with this comparison that a lot of lawmakers and pundits have made, where they’ve said this is Facebook’s big tobacco moment.
S5: Facebook and Big Tech are facing a big tobacco moment, a moment of reckoning.
S1: And I’m hoping you can just tease out why you think that analogy doesn’t work here, because I thought your thinking was really interesting.
S3: Right. I mean, tobacco, I don’t see any good that comes from tobacco. It’s, you know, a chemical that poisons your body and can kill you. Facebook is different. Billions of people use it for a reason. Most people who use it get some value out of it. But there are way too many cases where Facebook does harm that has to be addressed. Again, it’s not at the level that everything in Facebook is wrong. If you wiped Facebook off the face of the Earth, wiped social media off the face of the Earth, we wouldn’t necessarily be better off.
S1: Yeah. I mean, you mentioned how former employees talked about how they’d seen people connect using Facebook in ways that were really positive for them. People who are, say, trans, who found, you know, people online who they may not find in their own specific community, and it was powerful for them. And it really sets up this idea that Facebook, while doing many things wrong, is just not all bad in the way that tobacco was. But I think about that, and I think it’s kind of interesting to play with the tobacco analogy, because you can also look at it as: how is Facebook responding in this moment?
S3: Well, yes. In terms of what a company could do to rehabilitate itself and change itself, I went back to the 1982 Tylenol poisonings. A lot of people weren’t alive then.
S1: For people who don’t remember it: Johnson & Johnson, this is in the 1980s, and news comes out of people being poisoned by Tylenol. There’s cyanide in the capsules. Exactly how did the company respond?
S3: Well, the company immediately took 30 million bottles of Tylenol off the shelves. They stopped selling Tylenol. They paid the families of the victims, even though they could have argued that, you know, apparently after the bottles were on the shelves of the drugstores, someone bought them, put cyanide in the capsules, and put them back on the shelves. So they could have argued, it wasn’t us, the drugstore was responsible. They didn’t do that. No questions asked, they paid the families and talked to them personally. The chairman and CEO went out there to 60 Minutes, Nightline, all these other places and said what he personally was doing, and he made it personal.
S1: It’s funny to me that you’re flagging this as analogous to Facebook, because I don’t see Facebook doing anything like that. No one’s considering putting a halt to anything.
S3: Well, that’s the difference, because within a year, Tylenol sales were as strong as they ever were. It’s still one of the most popular, you know, analgesics in America. That’s the thing. I’m not saying that Facebook necessarily should be shut down. But compare the public reaction. They’re attacking a whistleblower. They’re, you know, making insinuations about publications like mine for writing on actual documents and backing it up with outside reporting. Mark and Sheryl are nowhere. Mark wrote one blog post about this whole whistleblower thing a few weeks ago. It seems to me that the two of them should be everywhere, talking about how concerned they are about the harm that the company does, as it’s laid out in these documents. And even though it’s a small percentage of people possibly affected by this, you could have an argument about that, they should be saying they’re going to do everything they can and make a bigger effort to change that, even if it does mean slowing down growth, or maybe, you know, having people spend less time on Facebook.
S1: Yeah. No one’s talking about slowing down growth. It’s interesting, because you’ve interviewed both of them. It sounds like you’re frustrated. You must have tried to interview them again in this moment.
S3: Yeah, I did try to interview Mark again in the last couple of weeks. And it’s funny: as soon as I mentioned that one thing I wanted to talk about was, you know, the moral aspect of what he was doing, that sort of shut down negotiations right away. I mean, he wants to talk about the metaverse, but this subject can’t be changed so easily.
S1: The thing that’s interesting about your reporting, though, is that you realize that the employees may be getting more and more unhappy, and that will become a problem for the company at some point.
S3: Well, I think it’s a problem now, and I think that really is the key problem for him. His ambitions are not only to dominate in this era but to dominate in the next era. That’s why he bought Oculus, the virtual reality company, in 2014. He saw it as, maybe in 10 years, we would be using virtual reality or augmented reality the same way we use phones. That was overoptimistic, but he still believes in that vision. He’s talking about the metaverse, but he can’t build a metaverse unless he gets the best people in the world, the best engineers, the best scientists, to build it for him. And Facebook’s handicap in building the metaverse, compared to a lot of other companies interested in this mixed reality vision, is that people don’t trust Facebook. When Facebook tried to roll out a cryptocurrency, even though it tried to structure it so it would be quasi-independent, people didn’t buy it, because would you trust Facebook with money, with currency? The answer largely was no. Would you trust Facebook with reality? That’s a question we’re going to have to ask.
S1: Steven Levy, thank you so much for joining me.
S3: Thank you, I enjoyed it.
S1: Steven Levy is an editor at large at Wired. He’s also the author of numerous books, including, most recently, Facebook: The Inside Story. And that’s our show. What Next is produced by Davis Land, Danielle Hewitt, Elaina Schwarze, Carmel Delshad, and Mary Wilson. We are led by Allison Benedikt and Alicia Montgomery. And I’m Mary Harris. You can track me down on Twitter. I’m at Mary’s desk. Thanks for listening. Talk to you tomorrow.