The Chaos That Made YouTube a Juggernaut

Lizzie O’Leary: Why don’t you guys go ahead and introduce yourself?

Mark Bergen: My name is Mark Bergen. I’m a reporter with Bloomberg and the author of a new book, Like, Comment, Subscribe: Inside YouTube’s Chaotic Rise to World Domination.

Claire Stapleton: I’m Claire Stapleton, and I worked at Google from 2007 to 2019. I was a marketing manager there, at Google and YouTube, and I was a co-organizer of the Google Walkout. I subsequently left the company, and now I write a newsletter about being a formerly disillusioned tech worker.

Lizzie O’Leary: I brought Mark and Claire on the show this week to talk about YouTube. Mark’s fascinating new book about the company is out this week, and Claire was one of his sources.

Speaker 4: Little teenage girls.

Speaker 4: Well, bow before me.

Lizzie O’Leary: One key story in the book, a story that says a lot about YouTube as a platform, unfolded in 2017.

Speaker 4: They will buy all my merchandise.

Speaker 4: They will click.

Speaker 4: Like on all my videos.

Lizzie O’Leary: You’re hearing the voice of PewDiePie, a.k.a. Felix Kjellberg, YouTube’s number one creator. What you can’t see is that he’s interspersed his video with images of Nazi salutes. It was an in-your-face response to accusations that he was flirting with white supremacy and a story in the Wall Street Journal about anti-Semitism on his YouTube channel, something Kjellberg said he did as boundary-pushing satire. YouTube never figured its biggest star would be beloved by neo-Nazis.

Mark Bergen: It was right after Trump had put out a statement about Holocaust Remembrance Day that didn’t mention Jews. There was a lot of conversation then about the significance of that and how it was being received on the far right. A Wall Street Journal reporter went and looked at the Daily Stormer, the neo-Nazi website, and saw at the top of the site that this neo-Nazi outlet was billing itself as the number one fan of the world’s biggest YouTuber. That’s how the story began. And then he noticed that the site had flagged a lot of videos that they thought were sort of coded fascism, supporting their message.

Mark Bergen: So at the time, PewDiePie had been the most-subscribed YouTuber since 2012. At that point, he was close to 50 million subscribers. That kind of stardom was beyond anything the company had ever really conceived of. Certainly when they invented this concept of subscribers, they didn’t think of it going that large. He was incredibly commercially successful, this sort of new form of performance art and new media, the vanguard of what YouTube saw as transforming Hollywood and entertainment. And even then, he was a great example of a star they couldn’t control, who often pushed the boundaries of respectability and did things that I think made people in the company cringe, and some of them pretty uncomfortable. But up until that point, it had never gone that far.

Lizzie O’Leary: Claire, one of the things that comes across in the book is that you were very uncomfortable with PewDiePie and pushed back. And I wonder, was anyone listening to you? When you saw him and his rise to fame, what did you think?

Claire Stapleton: Well, I think, you know, what was really awkward was that my job was basically values marketing. I was monitoring the conversation around YouTube. We were trying to steer clear of anything that could attract negative brand attention in what we put out and promoted every single day. And I also worked on marketing campaigns around International Women’s Day or Pride or, you know, all these different things. I was being told we need to be out there as a brand like Patagonia or Teen Vogue, you know, standing up in the Trump era for the things we care about. So my discomfort with PewDiePie’s content was almost secondary to saying, hello, there’s a huge contradiction here. I was just trying to get clarity and trying to get the leadership to take a look at what was an existential tension.

Lizzie O’Leary: Today on the show, a remarkable look inside YouTube: how the company created a new kind of algorithm-driven entertainment, the darkness that unleashed, and how, despite it all, YouTube manages to fly under the radar, even as regulators breathe down the necks of Facebook, Twitter, and TikTok. I’m Lizzie O’Leary, and you’re listening to What Next: TBD, a show about technology, power, and how the future will be determined. Stick with us.

Lizzie O’Leary: There’s so much video on YouTube now that it’s almost hard to remember that the company only started in 2005. Google bought it in 2006. The sheer amount of content on YouTube virtually guarantees that no one person’s experience is the same. You might use it for tutorials about how to fix a sink or do your makeup. Your neighbor might be watching PewDiePie. Your spouse could be sharing old clips of Saturday Night Live sketches. The company says 500 hours of video are uploaded every minute.

Mark Bergen: I mean, I think it’s something like on the order of 5,000 creators in the U.S. with a million-plus subscribers. So not only is there huge demand, since 2016 there’s been a billion hours watched every day, there’s also this gigantic economy it created, of people whose livelihoods and professional lives are on YouTube as well.

Lizzie O’Leary: How does the split break down between video from creators or kind of more organic video versus things like clips from major TV networks?

Mark Bergen: Susan Wojcicki, the CEO of YouTube, has said it’s about half what YouTube calls, sorry, endemic creators, which I think is a hilarious term. So there’s record labels, traditional media, Jimmy Kimmel, John Oliver. They have certainly been a bigger part of YouTube’s business in the past few years, because YouTube has had to adjust the dials for the endemic creators to avoid brand-advertising catastrophes. It’s unclear, though. YouTube came out last year and said, we’ve given $30 billion to creators. And you press them, like, oh, how much of that actually goes to independent YouTubers, what we think of as a creator, versus how much went to Taylor Swift’s record label? And they don’t have a clear answer for that.

Lizzie O’Leary: One of the things that sets YouTube apart from other places on the Internet is how they pay creators. And I wonder, Mark, if you could walk me through that and explain why that is important.

Mark Bergen: Yeah. So they were really early, in 2007, when they started what’s called the partner program, which was just sharing revenue with, at that point, about 30 popular creators, some of whom are still around. Philip DeFranco is a YouTuber who does a daily news show, with an audience probably on par with CNN’s.

Speaker 4: You beautiful bastards. Hope you’re having a fantastic Wednesday.

Speaker 4: Welcome back to the Philip DeFranco Show. And let’s just jump into this.

Mark Bergen: So these were incredibly influential people in youth culture and YouTube culture. At the onset they kept it relatively limited, and then around 2011, 2012, they opened up the floodgates. It was a very naive, very Googly vision of the world. Google has the same system for bloggers, this product called AdSense: if you start a blog, you can pretty easily run banner ads and monetize. They applied the same sort of logic and system to YouTube, this massive self-governing system with millions of creators making money and very few rules in place. It was basically: don’t break copyright rules, don’t post graphic violence, don’t put up porn. Everything else was fair game, both to run on YouTube and then to run ads against and share money with Google.
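
To make the revenue-sharing mechanics concrete, here is a minimal sketch of the kind of ad-revenue split the partner program describes. The 55 percent creator share is the commonly cited figure for YouTube’s program, but here it is treated as an adjustable assumption, and every dollar amount is hypothetical.

```python
# Illustrative sketch of an ad-revenue split, not YouTube's actual accounting:
# a video's gross ad revenue is divided between the creator and the platform.
# The default 55% creator share is the commonly cited figure, treated here as
# an assumption; all amounts are hypothetical.
def split_ad_revenue(gross_ad_revenue: float, creator_share: float = 0.55) -> dict:
    creator_cut = gross_ad_revenue * creator_share
    platform_cut = gross_ad_revenue - creator_cut
    return {"creator": round(creator_cut, 2), "platform": round(platform_cut, 2)}

# e.g. $1,000 of ad revenue on a monetized video:
print(split_ad_revenue(1000.0))  # {'creator': 550.0, 'platform': 450.0}
```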

Lizzie O’Leary: What truly powers YouTube, what has helped it hook viewers, is its recommendation algorithm: that little section on the right side of your screen that suggests another video, and then another, and another.

Mark Bergen: It was initially a pretty rudimentary recommendation system. It wasn’t until really 2014, 2015 that it became what it is now, and YouTube at the time, it’s remarkable, went out and talked about it as this fantastic recommendation system that can, still to this day, optimize to feed you a video that you’re most likely going to click on and watch for the full length of the clip.
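
To make that objective concrete, here is a minimal sketch, not YouTube’s actual system, of a recommender that scores candidate videos by predicted click probability times expected watch time and surfaces the top results. Every field name and number here is hypothetical.

```python
# Minimal sketch of a watch-time-weighted ranker, NOT YouTube's real system.
# Candidates are scored by predicted click probability times expected watch
# time, and the highest-scoring videos are surfaced as "Up next" suggestions.
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    p_click: float              # model's predicted probability of a click
    expected_watch_secs: float  # predicted watch time if clicked, in seconds

def rank_recommendations(candidates: list[Candidate], k: int = 5) -> list[str]:
    """Return the k video ids with the highest click-weighted watch-time score."""
    scored = sorted(
        candidates,
        key=lambda c: c.p_click * c.expected_watch_secs,
        reverse=True,
    )
    return [c.video_id for c in scored[:k]]

if __name__ == "__main__":
    pool = [
        Candidate("howto_fix_sink", p_click=0.30, expected_watch_secs=240),
        Candidate("pewdiepie_clip", p_click=0.55, expected_watch_secs=600),
        Candidate("snl_sketch", p_click=0.40, expected_watch_secs=180),
    ]
    print(rank_recommendations(pool, k=2))  # ['pewdiepie_clip', 'howto_fix_sink']
```

The optimization target, not the candidate pool, is what pushes longer, more engrossing videos to the top, which is the dynamic the conversation turns to next.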

Lizzie O’Leary: Claire, did anyone at YouTube realize the other side of that coin? Even in prepping this show, my producer’s algorithmic suggestions, after he pulled a number of PewDiePie clips and other clips, started pushing him into, frankly, misogynist content, or things that got darker and darker.

Claire Stapleton: I don’t think people talk about it enough, because the algorithm is YouTube. The system is so totalizing there that all these other measures to mitigate harm are at the margins, essentially, because, as Mark described, the system is set up around ads, which means it’s around clicks and eyeballs.

Claire Stapleton: The main metrics that YouTube, you know, holds itself to every year are pretty much all about growth, all about bringing billions and billions of eyeballs and views to the site. And you can’t mess with that too much. The algorithm is incredibly adept at optimizing clicks and watch time, keeping people on the site as long as possible. So unless you fundamentally change those goals, or fundamentally change the way the entire Internet system works, how can you address that? Ultimately, you’re leading people into rabbit holes, getting deeper and deeper into content that can be radicalizing.

Lizzie O’Leary: And the object here is watch time, right? It’s not just clicks, but keeping me on the site for as long as possible. Yeah.

Claire Stapleton: I mean, it logically makes sense that the more people form community or connect with creators, the deeper they get in. I think this is why someone like Jordan Peterson is so popular: he offers almost a self-help, parasocial dad connection. And there are so many different kinds of content that bridge off of that. The further you get into that worldview, the more time you’re likely to spend in a place like YouTube, and the more connection you’re going to feel with other people on the site. It’s too good for the bottom line to resist, really.

Lizzie O’Leary: YouTube now has a section called YouTube Kids, which launched in 2015. It gives parents a series of controls to limit what their kids can see. But the company wasn’t always so deliberate when it came to what children could find. In fact, it resisted that for years. I’m thinking about kids, about YouTube being the preferred social media network, at least in 2018, for kids ages 13 to 17. TikTok has now taken that mantle. But Mark, there was a lot of deliberation within the company about kids, or maybe a better way to say it is looking the other way, because kids under 13 weren’t supposed to be on YouTube. Considering how impressionable kids are and how likely they are to keep clicking, how was that conversation unfolding internally about what the algorithm was serving children?

Mark Bergen: That was, I think, one of the most profound wake-up moments, and it actually came even earlier than when we think a lot changed for YouTube, starting in 2018. Back even a decade ago, we saw this explosion of kids’ material that hadn’t existed before. And a lot of it was toy unboxing.

Speaker 5: Hey, guys, DisneyCollector here with this brand-new package from Disney Infinity. This is from the movie Frozen, and it comes with Anna and Elsa. Let’s open it here.

Mark Bergen: It was just massive. At one point the most popular channel on all of YouTube was this anonymous channel called DisneyCollectorBR, just doing toy unboxing. And at that time, I think it’s important to remember, there were people working at the company in leadership positions who were just having young kids and having this reckoning personally. They tried something, this early system they called “nutritious and delicious,” where they attempted to basically program quality into the algorithm. I go into this in the book a little bit. That was dropped, and they kind of picked up the pieces later on, after some crises, in part because YouTube and Google just have this resistance built into their DNA to doing any heavy-handed editorializing. They didn’t want to dictate, you know, what was kids’ content, what was quality.

Mark Bergen: Right. Like, who are we to determine what’s quality? Who are we to determine what’s kids’ content? And then the major issue there was, as you mentioned, COPPA, the law that says you can’t serve targeted advertising to children under 13. YouTube was found to have violated that, and arguably, I think, the most effective U.S. regulation of a big tech company was when the FTC fined YouTube in 2019. And kids’ content has changed pretty dramatically on the platform since.

Lizzie O’Leary: Claire, what level of awareness was there within the platform about young children using it, and how few clicks away something can be that maybe looks like a kids’ video but gets really dark really fast?

Claire Stapleton: Yeah, I think there was a recognition of that, and that’s why they put so much into the YouTube Kids app, which was an environment they could cordon off and have more moderation around the videos. I think it is an area they have had to take seriously because of both press attention and then actual regulation. But again, it’s an absolutely huge segment of YouTube usage. It’s a wild, creative jungle of stuff. I mean, it’s the kid vloggers, the family vlogging. We’re getting a window into what people want to see and how people are consuming content in this next generation. And it’s terrifying, frankly.

Lizzie O’Leary: When we come back: advertisers are not fans of having their products next to questionable content, and guess what YouTube is full of? Mark, the flip side of being the place where everyone can upload their content is the kind of awful, dark stuff that gets up there, the stuff we have hinted at. Hate speech. Terrorism. How and when did the company start thinking about policing that?

Claire Stapleton: I think.

Mark Bergen: Fairly early. Google acquired them about 18 months into their existence, but they’d hired a team that had worked in and had experience with web culture, a lot of them from the early Internet and the EFF and these groups that were...

Lizzie O’Leary: The Electronic Frontier Foundation.

Mark Bergen: Yeah. YouTube very much saw the Internet as this frontier against government censorship and against big companies. So they were putting rules in place where a system of content moderation didn’t really exist; they were inventing it on the fly. The rules were very lax, but they did have things in place about hate speech, about harassment. These are obviously very subjective terms. YouTube now would prefer them to be written code. But at the time, I think people at YouTube thought, of course these are subjective; this is our community, this is our website, and we’re going to enforce it in the ways that we want to. It’s certainly miles away, light-years away, from where we are today and this idea that a platform has to be fully neutral.

Lizzie O’Leary: But of course, once you start selling ads to multinational corporations, they get very nervous about what those ads are next to, whether that content is sexual or offensive or just really weird.

Mark Bergen: Around 2012 is when they sort of opened up the floodgates: anyone can make money. At the same time, they switched their algorithm to watch time, and you saw this tremendous growth. The leadership at YouTube was saying, we want to tackle television. TV has four to five hours a day of people’s attention; at that point YouTube had like five to seven minutes of daily viewing. It was sort of what you watched on bathroom breaks or something. It wasn’t what it is today. I talked to a lot of people who were there at the time, and, you know, hindsight is 20/20, but when they were measuring watch time, they didn’t see how many times people were violating the rules or coming close to the rules.

Mark Bergen: YouTube has said that about 70 percent of all views come from the recommendation engine. But back then it was a binary, right? The video is on YouTube, it is making money, it is most likely being recommended to viewers. I think the breaking point was in early 2017, right after the PewDiePie incident, which got a lot of press attention in Europe. That’s important to remember, because Europe was where there was a lot more regulation and threat of regulation.

Lizzie O’Leary: Far, far less of a sort of free-speech, American culture.

Mark Bergen: Yes. In Europe, this sort of libertarian free-for-all doesn’t fly very well. So there was some reporting about the brand issue, these household brands that were sponsoring fringe and extremist videos. All of it came out of the Murdoch papers. And there are people at Google who will tell you privately, hey, look, it’s Rupert Murdoch, who’s been our enemy for years; they’re out to get us. Even to this day there’s this accusation that reporters were blowing up this story, which was fundamentally just going to advertisers and saying, here’s what you’re spending money on. Did you know that? Do you feel comfortable doing that?

Mark Bergen: And here is YouTube telling you this is just as good as television. Then you talk to advertisers, and they didn’t know how the system worked. It was very opaque; still to this day, the digital advertising world is incredibly confusing and not transparent. And there was a defining moment where the question became: is this YouTube system just not sustainable?

Lizzie O’Leary: Claire, you are best known as one of the organizers of the Google walkout. But there is this really interesting note in the book that says you might not have gone so far in vocally protesting Google if you hadn’t been at YouTube, that that was really key. And I wonder what exactly that means. What was it about being adjacent to all of this YouTube drama that pushed you in that direction, rather than had you still been part of main Google?

Claire Stapleton: You know, I really think that YouTube is the tip of the spear with a lot of these issues in tech, of companies wanting to be something when the reality is the Frankenstein that Mark describes. My role in articulating and promoting the company’s values was in such contrast to what I saw on the site every day, the way I saw leadership deal with issues related to the health of society and responsibility and brand safety. That tension was killing me. I think that all of what was said around that time about the employees of Google revolting came down to this contradiction: we’ve been sold something around the company’s values, around its contribution to society, around the kind of workplace it has, and that is not happening in my day-to-day life.

Claire Stapleton: And so, yeah, I think that with YouTube there is such an intensity to it. It’s a mirror of societal issues, and it escalates and bottles them up, and in a way it contributes to and propagates these sorts of societal issues. The YouTube aspect of it is a very important part of why I and many others were so mad. Half of the organizers were at YouTube; it wasn’t just me. But the disillusionment and the sort of sense of cultural rot at YouTube is definitely real.

Lizzie O’Leary: Mark, one of the things I find fascinating, both in the book and in conversations you’ve been having around it, is that you call YouTube a social media company. Really? Explain how that’s the case.

Mark Bergen: Yeah, I mean, it’s both-and. You know, YouTube is everything to all people, which is one of its faults, but it’s also one of its gifts. It is a streaming service. It is a utility. It is a search platform, the second biggest in the world, behind Google. And for a lot of people it functions as a social network, or as a parasocial network, to use the academic term. You have relationships with creators and influencers. PewDiePie is a good example: even when YouTubers have a persona, there’s this relationship with the audience that they cultivate, and it’s made in a way that the fans think they know them very intimately when they do not.

Lizzie O’Leary: Why do you think YouTube doesn’t attract the same level of political scrutiny that the rest of big tech does?

Mark Bergen: I’ll give you a quick three reasons. One is structural. Most people, journalists and politicians in particular, anyone over 40, might use YouTube like: oh, I learned to fix my sink, I do yoga routines with it, I bake bread, I watch old archival clips. I do not see my uncle posting weird QAnon memes, typically, in the same way that...

Lizzie O’Leary: The way you’d encounter that on Facebook.

Mark Bergen: Yeah. So I think that’s one. Two is that it’s covered differently, right? It’s covered by the Google reporter, which is important. So it’s often treated not as social media, in part because it’s buried under Google and Alphabet; we have less of a view into its financials. That’s a reason there are incentives, in my industry at least, to cover it less. Third is that Google is a very savvy company and has been around for a long time. At one point, they were spending more than any other corporation on lobbying. They are a very effective political actor, and I think they are very good at staying out of the limelight.

Mark Bergen: And then Susan Wojcicki, and this was difficult in writing a book, is not a known entity. Everyone knows Zuck. Sheryl Sandberg spent a lot of her career building up her public persona. Jack Dorsey is a weirdo that everyone loves or hates. Most people don’t know who Susan is, and I think that’s actually intentional on the company’s part.

Lizzie O’Leary: What do you think, Claire? Why do you think YouTube has been good about kind of flying below the radar? Is that intentional? Is it an accident?

Claire Stapleton: No, I think it’s lucky, but there are also some factors at play. I think Google in general has managed to position itself as more of a utility. It’s something useful. It doesn’t feel bad, I mean, using Google, using YouTube, because it is sort of a passive experience for the most part. I think there’s something about Twitter and Facebook that is, like, it’s just the color red.

Claire Stapleton: There’s a quote in the book, I think from Tim Scott, someone I knew who was in Google HR, saying Susan’s not charismatic, and that actually works to YouTube’s advantage. I think that’s true. When she speaks, it’s like white noise; you never really hear what she’s saying. It’s also such a complex company. It’s doing like a thousand different things, in this constant reshuffle, constantly in a million different pies. I don’t know. I think in some ways its offering is less cohesive. We can’t quite understand where it fits, and that may be part of it.

Lizzie O’Leary: Mark, when you think about the future of YouTube, and I know you’ve been thinking sort of big thoughts in writing and now releasing this book, do you think it will still be able to have this lower profile? Is anybody in Congress who’s been thinking about Section 230 reform paying attention to YouTube? Is YouTube able to just say, look, man, we’re a platform, and what people put up there, that’s on them? Does it get to keep this, quote unquote, neutrality as it moves forward?

Mark Bergen: Well, I hope it’ll attract more attention; that’s the reason I wrote the book. There’s a website called Tubefilter that’s like the Billboard for YouTube, and if you go look at the top-performing YouTube videos by just volume, they are Cocomelon, which, you know, they’re all...

Lizzie O’Leary: As the mother of a toddler, yes, I do know Cocomelon.

Mark Bergen: So it is a juggernaut in kids’ media. There have been multibillion-dollar acquisitions, and I don’t see that going away anytime soon. But that is also getting attention: California is looking at a bill with tighter restrictions around privacy for users under 18, and that’s an area where the FTC has taken action. TikTok is getting more attention in part because of its Chinese ownership, but all of that will have an effect on YouTube. Another example: Apple has restricted ad targeting, and it’s had this big impact, as people might know, on Facebook, and Facebook has cried about that. YouTube’s business has, I don’t know if equally, we can’t say that definitively, but it has been affected, because a key part of YouTube’s business was selling targeted ads on iPhones to people watching video. So it is a...

Lizzie O’Leary: In fact, Apple is out of that business.

Mark Bergen: Apple, yeah, it is. But it’s another example where YouTube was savvy enough not to talk about it, and with the way they’re structured within Google and Alphabet, we don’t know the numbers. But it has had an impact. So it’s a yes and no: I think they’ll continue to be impacted, but the record has shown a lot less scrutiny. One thing that has changed, to be fair, and another reason they haven’t been scrutinized as much, is that they don’t share as much data, and video is harder to analyze. As a researcher, it’s just much more complicated than text, as far as moderation or any type of analysis. YouTube was pressured this summer to start sharing more data with outside researchers. I think that is something that’s really important in understanding how this big platform works and all its ramifications.

Lizzie O’Leary: Claire, what do you want people to understand about YouTube, either from Mark’s book or from this interview?

Claire Stapleton: I think it’s really useful to look under the hood at how kind of chaotic and complicated things are at a tech company like this, because it pushes it back down to you to think about how you and the people around you are engaging with social media. As I was reading through, I was just thinking, like, I can’t believe I’m letting my kids watch. They don’t watch that much YouTube.

Claire Stapleton: But just the exposure to potentially very dark places is something that everybody needs to be really responsible and vigilant about on an individual level, because the companies really aren’t ever going to take the action that you would like to see as a parent, or as an individual who cares about a democratic society, or, you know, a world not so polarized that we’re at war or something like that. YouTube can be trusted to be a part of anything negative or corrosive going on in society, from my point of view. And so it’s a good reminder to look at it with a critical distance.

Lizzie O’Leary: Mark Bergen, Claire Stapleton, thank you both very much.

Mark Bergen: Thanks for having us.

Claire Stapleton: Thank you.

Lizzie O’Leary: Mark Bergen is a reporter at Bloomberg News and the author of Like, Comment, Subscribe: Inside YouTube’s Chaotic Rise to World Domination. Claire Stapleton writes the newsletter Tech Support, about navigating life as a disillusioned tech worker. That is it for our show today. What Next: TBD is produced by Evan Campbell. Our show is edited by Jonathan Fisher. Joanne Levine is the executive producer for What Next. Lisa Montgomery is Vice President of Audio for Slate. TBD is part of the larger What Next family, and it’s also part of Future Tense, a partnership of Slate, Arizona State University, and New America. And if you’re a fan of the show, I have a request for you: become a Slate Plus member. That means you get all of the Slate podcasts ad-free. Just head on over to slate.com/whatnextplus to sign up. All right. We will be back next week with more episodes. I’m Lizzie O’Leary. Thanks for listening.