S1: This ad free podcast is part of your Slate Plus membership.
S2: Welcome to the show about how technology is changing our lives and our future. I’m Shannon Palus.
S3: Hey everyone, welcome to the show. We’re coming to you from Slate and Future Tense, a partnership between Slate, Arizona State University, and New America. We’re recording this on the afternoon of Monday, October 7. On today’s show we’ll be talking to Sarah Roberts, an assistant professor at UCLA and author of the book Behind the Screen: Content Moderation in the Shadows of Social Media. After the interview, my colleague Aaron Mak will join me for Don’t Close My Tabs, where we talk about the best things we saw on the web this week. That’s all coming up on the show.
S2: At a company called MegaTech, there are recent college graduates who spend their days watching some of the most gruesome, violent, and graphic videos that you can find on the Internet. They make less than $50,000 a year. They’re only allowed to do the job for one year straight because it’s so mentally grueling. They don’t have insurance, and they can’t use MegaTech’s rock-climbing wall. They aren’t on track to be promoted within the company, but the job they do is critical to MegaTech; without it, users would have to watch all those ghastly videos instead. You’re probably a MegaTech user yourself: MegaTech is a fake name for a real company. Sarah Roberts, an assistant professor in the Department of Information Studies at UCLA, made up the name MegaTech for her book Behind the Screen: Content Moderation in the Shadows of Social Media. Today we’re going to talk to her about the people who keep the Internet clean and pleasant for the rest of us (well, relatively clean) and what clean even means in the first place. Thank you so much for coming on. We’re really excited to have you. To start off:
S4: I’m wondering if you could help us imagine what the Internet would look like if there were no content moderators.
S5: I think there are corners of the Internet that give us a good idea of what that looks like.
S6: So there are some pretty notorious spaces online that pride themselves on a lack of moderation, and those are places like 4chan and other sordid, dark, disturbing parts of online life. It’s a fraught place to be.
S5: Even with content moderation we know that all kinds of things go on online that we’d rather not see or be exposed to.
S7: But I think we would have to imagine an environment without the work that they do as exponentially worse.
S4: You told Slate this spring that a lot of people you talked to about content moderators actually disbelieved you, noting that it was shocking that “people with no apparent motivation would just tell me, ‘There’s no way legions of people do this job. You’re lying.’” Why isn’t this something that even people who should know, even people with expertise, know about?
S6: Well, I think there are a couple of factors that go into that. The first one is that there is a propensity on the part of the companies that sell us on using these platforms, and on being engaged with this technology, to have us imagine an experience that is fully governed by that technology. That’s not a deficit on the part of regular people; that’s us just hearing the message that we’ve been served for the past decade and a half, that technology can solve all of the problems. There was an active, if tacit rather than overt, effort to get people to, first of all, not think about this problem, and then, second, if they were ever to think about it, imagine that computers were behind solving it. So I think it’s kind of like a one-two punch: first of all, don’t think about it if you can help it. And if you don’t see something, why would you imagine it being there? That goes for things that get deleted before you ever see them, and that also goes for the human intervention. But secondly, if you do happen to think about it, imagine that there’s a big room of servers somewhere churning away, making decisions about what can stay up and what can come down.
S8: And I think somehow that notion was more palatable to people for a variety of reasons, including a sense that it would be somehow fairer, more judicious in decision making, or more evenly applied.
S4: So it’s human content moderators who spend a lot of time removing really gruesome stuff, like images involving bestiality or child pornography. Why aren’t computers capable of doing that right now?
S7: Well, actually, there is a very specific case where computers are doing a good job, and that is the case of child sexual exploitation that you mentioned. The reason that computers are good at that particular case is because, for better or for worse, there is a tendency on the part of people who want to view and consume that material to recirculate known bad material. So there is actually a database out there, it’s called PhotoDNA, in which a majority of the world’s child sexual exploitation material is known and resides. And when that material comes online, it can be mathematically, algorithmically matched to known bad material and removed pretty much immediately upon upload. But that only works in the case where there’s something to match against. Any time there’s new material that’s been organically produced in the world, whatever it is, there’s nothing to match against, and it takes a human, typically, to make a decision. More and more of the big companies are augmenting those decision-making processes with computers, but even in those cases, it’s human intelligence that’s gone into building the system. So, you know, we might say there’s humans all the way down the line.
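[Editor’s note: The matching Roberts describes can be sketched in a few lines of Python. This is a hypothetical illustration, not PhotoDNA itself; PhotoDNA uses a proprietary perceptual hash that survives resizing and re-encoding, whereas the plain cryptographic hash standing in here only catches byte-identical recirculated files. The sample data and function names are invented.]

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Hash an uploaded file. SHA-256 is a stand-in here: it only
    matches byte-identical files, while a PhotoDNA-style perceptual
    hash also matches resized or re-encoded copies."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints of known prohibited material.
KNOWN_BAD = {fingerprint(b"previously flagged upload")}

def should_block(upload: bytes) -> bool:
    """Match the upload against known bad material. Anything novel
    has nothing to match against, passes the automated check, and
    falls through to human review."""
    return fingerprint(upload) in KNOWN_BAD

print(should_block(b"previously flagged upload"))     # True: matches the database
print(should_block(b"brand-new, never-seen content"))  # False: nothing to match against
```

The point the sketch makes is exactly Roberts’ point: the automated system is only as good as its database of known material, and novel content still requires a person.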
S4: Do you think there will eventually be a time when computers can do all of this digital dirty work, and we’ll just look back on this as sort of a dark era when people had to do it? Or will there always be these kinds of digital knowledge workers picking up the pieces somewhere in there?
S7: Well, I really like your framing of thinking about these people as digital knowledge workers, because that’s really what they’re doing. They’re contributing a human type of knowledge, a specific kind of knowledge that comes from balancing so many complexities in an instantaneous decision: cultural norms, linguistic issues, politics, the demands of the platform, local jurisdictional laws, all of these things that go into the decision-making process. And it really is the province of the human mind to make a decision like that. I would say that, of course, for many people in the industry there is a desire to move away from human labor. But we ought to be wary of that in some regards. And I say this as a person who’s very much invested in wanting a better work life for the people who do this. If we were to flip a switch today and make this all computational, it would be worrisome in the sense that there would be no auditing mechanism for how these decisions are made. It’s already kind of a black-box situation as it stands, but as opposed to a computer program, humans can take the decision to discuss their work, under the auspices of anonymity, with someone like me. They can leak to journalists. They can push back on the job when they think decisions don’t make sense. They can organize for better conditions and pay. Those are things that, if we flipped the switch and went to total machine work on this, we would no longer have any insight into. But regardless of whether that’s possible, and I think it’s probably not, even if it were, that human knowledge work you described would have to go in on the input side anyway. So one of the things we’re seeing now is a shift, or an increase, in the amount of work that goes into building such systems on the front end, in the form of people making decisions on large data sets to train machine learning tools that are later applied.
So it’s really just a shift in where in the production cycle, or in the production chain, the moderation happens. Does it happen on the call center floor, in the moment, making a decision on a particular piece of content? Or is that knowledge work shifted and rendered even more invisible on the front end of an algorithmic tool that then gets deployed? But if you talk to people in the industry who are close to this practice, and they’re candid with you, they’ll tell you that there really is not going to be a time when humans will be excised from the process.
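[Editor’s note: A toy sketch of the shift Roberts describes: moderators’ past decisions become labeled training data, and a tool learns from those labels. The data, labels, and token-counting “model” here are invented for illustration; real systems train actual machine-learning classifiers on vastly larger human-labeled datasets.]

```python
from collections import Counter

# Hypothetical human-labeled training data: moderators' past
# decisions become the ground truth an automated tool learns from.
LABELED = [
    ("graphic violence in video", "remove"),
    ("violence and gore", "remove"),
    ("cute cat video", "keep"),
    ("birthday party video", "keep"),
]

def train(labeled):
    """Count how often each token appears under each human-applied label."""
    counts = {"remove": Counter(), "keep": Counter()}
    for text, label in labeled:
        counts[label].update(text.split())
    return counts

def classify(counts, text):
    """Score new content against the human-derived token counts and
    pick the higher-scoring label."""
    scores = {
        label: sum(c[tok] for tok in text.split())
        for label, c in counts.items()
    }
    return max(scores, key=scores.get)

model = train(LABELED)
print(classify(model, "violence in a video"))  # "remove"
```

Even in this toy, every automated decision traces back to a human judgment on the input side, which is Roberts’ point about the labor being shifted rather than eliminated.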
S4: One thing that I found really interesting reading your book and some of your work is that you talk about commercial content moderation as a form of online brand and reputation management, as opposed to just a natural process that happens, like Facebook is naturally kind of an OK place because everybody on Facebook is kind of OK. I’m wondering if you could explain a little bit for our listeners what you mean by content moderation as brand reputation management.
S7: When I first came to this topic, it was 2010, which doesn’t seem that long ago, but I guess it kind of is now. And it was sort of the upswing for these firms; they were coming into their best moments. And we didn’t know a lot about how their media got produced, or how the landscape that we all played and participated in really was constituted. So once I found out about this practice, which was through a New York Times article in 2010 that talked about it, I sort of had to reverse engineer my thinking. I had to undo my own belief, my own received notion, that the Internet was just people, like you said, acting sort of OK with each other most of the time, which I guess we kind of know isn’t true, versus there being an active curation process. And I realized, knowing what I did about social media companies, that they are not really in the business of making lay users like you or me happy; they’re actually in the business of serving their real clients, the advertisers. And so I realized that in order to be able to market to advertisers and be successful in that space, they had to have some control over the ecosystem they were creating. That’s when I realized that content moderation, at this kind of commercial level and at scale, had to be more about the firms maintaining control over their own ecosystem, which they were creating and then trying to sell to advertisers, than about anything else. Of course, there are effects that support healthy community or protect people from being exposed to harmful imagery and so on. But really, first and foremost, the companies are protecting their own brand, which is their platform and their environment, which they then use to sell to advertisers who want to connect to consumers like us.
And so once I realized that, it helped me understand the logic of why these practices might be kept secret or obfuscated, why the companies were loath to discuss them with regular users, who believed they were engaging in an ecosystem that was about free expression and the unfettered circulation of any thought or opinion one might have. That was the lure to get people like us. And at the same time, that was unpalatable to advertisers, and to people who wanted to have their own brands on the platform and not find their brand represented next to something reprehensible or disturbing. So that’s the conceit of the notion of brand management. And again, when I was able to talk to people inside the industry, they absolutely conceded that there would be no circumstance under which they would give up control of the gatekeeping of their platform and let content just freely flow. They were always going to maintain that gatekeeping practice, whether it was to allow or disallow content.
S4: That framework makes a lot of sense to me when I think about the issue on Instagram of people not being able to upload photos of female nipples, where that’s considered really lewd, even though a lot of people argue that it’s a form of free expression. I follow a tattoo artist who does reconstructive tattoos on women who have had mastectomies, and she is always coming up with really clever ways to hide her photos. It seems like there are a lot of people who want that kind of free expression on the platform, but it wouldn’t be so good for advertisers.
S7: And when we’re able to apprehend and get a hold on this fact, and reframe the logic of the platform, it helps us understand so much more of why, for example, marginalized groups of people, or people whose identities are not considered mainstream or not given primacy in terms of advertising dollars, find themselves on the wrong side of content moderation decisions so often. Instagram, again, is a really fundamental case in point, where we’ll see LGBTQ-identified people who might be expressing elements of their sexuality or gender identity through the platform, or artists (frequently, like you said, a tattoo artist comes to mind), or other kinds of people who push the margins of mainstream palatability, finding themselves constantly being deleted. And they are frustrated, because they themselves believe that they’re on a platform that has as a fundamental principle the idea that they should be able to share anything, when in fact that isn’t really a current operating principle, and I would argue never has been. But that was never told to consumers, although I suspect it was very much touted to advertisers.
S4: Do you think more consumers are becoming aware of this fact, that social media platforms exist, in a way, to sell them to advertisers?
S7: Absolutely, as compared to, say, 2010, when I started my work on this topic, we are seeing more and more awareness of this fact, and I would point to the work of so many other scholars in this space, people like Safiya Noble and Frank Pasquale, who try to really unpack the true economic nature of the exchange that these platforms have proposed. But that said, that’s the fundamental point of my work: to set that baseline. Reframing the public’s understanding of commercial content moderation as brand protection will hopefully reorient their relationship to the platforms, help them weigh and consider the costs of engagement, and let us have a healthier sort of civic or social conversation about the role of these platforms in our everyday life, one that is based on a more truthful evaluation of what they are. And it’s not that people are dumb; I don’t think people are suckers. I would include myself in this group. I think that, in essence, we were given a partial story, or sold a half-truth. And so now it’s time to demand greater accountability and understanding of what the platforms are, and if they don’t tell us themselves, it’s down to journalists and researchers and others to unveil that and reframe it.
S4: OK, we’re going to take a quick break, but then we’ll be right back with more from Dr. Sarah Roberts. So you first heard about content moderation reading this article in The New York Times in 2010, about a firm in Iowa where folks were doing this work. What was the moment when you realized, oh, I have enough here to make this a big project of my research? It’s not just this one office building; so many people are affected by this.
S9: I do credit that article. It was so critical in my own awakening about the issue, and I think that for me, it was that one story, because I realized I was seeing the tip of the iceberg. I was seeing just one bit of insight, one crack in the facade that showed the infrastructure behind it. And my own background, having been online since 1993, in the early social Internet, the pre-graphical Internet when everything was text-based and really not commercialized yet, meant I had seen the evolution of that Internet into what we know now. That, combined with my background as an I.T. worker and my academic work, let me imagine, reading that article, that if there were these couple of isolated cases, as reported in The New York Times, there had to actually be an ecosystem out there. So it was pretty much upon reading this brief but powerful article that I knew this was something. And I tell this to a lot of young researchers, too: it’s the thing that you can’t stop talking to your friends about, the thing that you keep coming back to, the article you keep forwarding, the thing you keep asking people about. That’s the thing you need to pursue. And that’s what this was for me. So I started going to my professors, to my peers, to people I knew in industry: Hey, have you ever heard of this practice? Hey, did you know this was going on? And what happened was that in each case where I spoke to someone, they said two things. First they said, huh, I never thought of that, and these were some of the smartest folks in the room, very knowledgeable people. And then they said, don’t computers do that? And I knew enough, of course, about computation to know in 2010 that no, computers couldn’t do that at scale, absolutely not. So if there was this demand, there had to be people all over the world somehow engaged at many levels. And so that was it.
I couldn’t drop it. I kept bringing it up. I forwarded the article to everyone I knew. It sort of kept me up at night. And that’s how I knew that it needed to be pursued.
S4: So at that point, did you start going out and trying to find people who worked in these jobs to interview?
S9: I actually tried to find those people in Iowa. At the time I was a Ph.D. student at the University of Illinois, so not terribly far away by car, and also not terribly far away culturally. I’m a Midwesterner; I’m from Wisconsin, so this is kind of my element. And I thought, well, I have no research money, but I do have a car; I could conceivably drive to Iowa. So I tried to start getting in touch with anybody in any of the communities where these call centers were set up by this company, which at the time was called Caleris. It’s been sold and renamed many times since then. And I couldn’t get anyone to respond. No one. And that was another indicator to me that this was something: there was total radio silence. So I suspected that after that New York Times article came out, there was probably an edict issued throughout the company that was like, don’t talk to reporters, don’t be telling what we’re doing, the firms we contract with don’t want you to do that and we don’t want you to do that. Of course, that just made me more intrigued. But it was hard work in those early days to try to find those figures and those cracks where I could slip in and get to people. I knew that if I went and knocked on the front door, I was going to get no traction, and I was also going to show my hand to firms that I was interested in this topic. So there was a lot of sneaking around that went on, and a lot of convincing people that the fact that they were violating their NDA, the nondisclosure agreement they were all compelled to sign to do this work, would be respected by me. So you probably noticed that throughout the book I use pseudonyms and made-up names for companies in the interviews I’ve conducted, to protect their identities.
S4: How did you convince them to violate their NDAs to talk to you?
S9: The biggest thing was that I would assure them that I would protect their identity, which I’ve done throughout the years. But once that was established, and once it was established that I was really keenly interested in worker welfare and also just process, like, what does your work life look like, what are you asked to do on a daily basis, and that I cared about this work, most of the workers were eager to talk to me, because they knew that they were doing a mission-critical kind of activity for the platforms, and yet they themselves were completely muzzled.
S8: Most of them were being brought in as contract labor, as really kind of disposable labor, poorly paid. The conditions weren’t great, and yet they knew that they were this very thin line of protection, really on the front lines on behalf of the companies for whom they labored. So I think they had a sentiment of wanting the world to know that the Internet ecosystem we’ve come to assume was just how it was naturally was actually the result of a great deal of labor and curation and thoughtful engagement on the part of a lot of smart people, who were, for lack of a better word, being exploited, and also being kept as the dirty little secret while the engineers in the company were using the climbing wall and they couldn’t, because they were contractors being paid hourly by another company with no benefits. So they were kind of eager. Not to put too grandiose a spin on it, but in essence they may have seen themselves, to a certain extent, as whistleblowers, but also as people who were seeking to enlighten the public about the nature of online engagement itself, which they held the keys to in many regards. So there was sometimes a bit of eagerness to get the story out, once I was able to establish that I would protect their actual identities and locations. Just anecdotally, one interesting thing that’s happened over the years since I did this work: I call the Silicon Valley firm in my book that a number of workers come from MegaTech, which is obviously a made-up pseudonym, and I’ve had opportunities over the years to engage with lots of people from major social media firms in Silicon Valley, and many, many times these people come up to me on the side at an event or a meeting and say, you know, I know our company is MegaTech.
S10: People at different companies all think they’re MegaTech, and I can’t confirm or deny which one it is, right? So that’s been fascinating. And it’s another data point telling me, yeah, everybody had this problem. Everybody in the industry had this problem.
S4: So in the MegaTech chapter, you talk to a few recent college grads from pretty prestigious places like Berkeley, and they’re kind of excited to get this job at MegaTech. They’re right out of college, they haven’t majored in STEM, but it’s the cool thing to do, and they end up living in the Bay Area and working for less than $50,000. One of the things that was so striking to me about those interviews is how they don’t seem to fully recognize a lot of the stuff they’re going through. In particular, one interviewee said, you know, this isn’t really a bad job, I can handle it, I just drink a lot. What do you make of that?
S8: I think that for the people I spoke to, I was very cautious in general with how I asked them about the repercussions of the work they did, because, as I’m sure many of your listeners already know, a fundamental part of the job of doing this frontline commercial content moderation is to be exposed to material most of us would never want to see, ever, and then to be exposed to it over and over again. So imagine your sort of worst-nightmare bogeyman imagery, whatever that is for you, and that’s what they see. I tried not to provoke around that issue, because I took it as a given that that was happening. And in some cases there would be this kind of bravado, where workers would say, as you quoted (and I think that’s Max Breen who said that): other people can’t handle it, but I can handle it. I can do this job, I can stomach it, I’ve got the mental fortitude for it. Other people don’t, but I’m doing it, I’m taking one for the team, and I don’t really notice any bad effects. And I would take that at face value, but then these other moments would happen, they would slip through, moments that were just deeply revelatory, like: I drink a lot. Or: I find myself avoiding social situations. Or: in an intimate moment with my partner I suddenly pushed her away, and I didn’t even want to tell her why. And it was because I was seeing an image that I’d seen at work that day, or something horrible, and I just didn’t want to tell her. So part of the job as a researcher is to gather these data points, for lack of a better word, and then do an analysis and interpretation of what people are really saying. They’re saying one thing sometimes, and then contradicting what they’ve said moments later. It’s my job to hear that, receive it, and draw it to the fore. And it’s something that I heard over and over again.
So not only is it important to note that they were struggling in some of these moments, but it was important to note that they were telling themselves over and over again that they weren’t. And this was a coping mechanism. To put it simply: if you knew that you had to go back to that job the next day, would you tell yourself, I can’t do it, I can’t do it? Or would you say, I can do it, and then crack open a beer to ease the pain? I think what you’re seeing there is reaching for mechanisms to make it through. And sometimes those moments of denial are much more powerful than if the person had just come out and said, yeah, I’m struggling, because you can see that they’re wrestling internally with how to handle the nature of their work.
S11: And it was hard to hear that. OK, we’re going to take another quick break, and then we’ll continue our conversation with Dr. Sarah Roberts.
S4: How do you draw boundaries between you and your research subjects when you do this work?
S8: I think that’s a really powerful question for me, because, to be honest with you, I didn’t fully consider that when I went down this path. I would never claim that my experience doing the research I’ve done is the same as, or as difficult as, the kind of work the frontline moderators have done and that I describe in the book. But this is a topic that has put me in touch with the worst of humanity too, by proxy. And just as the protagonists of the book aren’t always fully in touch with what’s going on with them, I would say that for me, it wasn’t until a few years into this work that I realized, man, I’m having to think about stuff like child sexual exploitation. Why does that phrase roll off the tongue for me? Why do I have to think about databases of that material? It’s a problem that people upload images of themselves beating their kids, or abusing animals, or killing each other, and it’s enough of a problem that there are people engaged to remove that content over and over. That sucks, and it’s depressing, and it’s just disheartening. And it took me a while to realize that there might be some repercussions for me. Again, they’re nowhere near what the workers face; I don’t want to compare myself on a one-to-one basis to the people I’ve talked to. But I actually think it’s a good illustration for all of us, to a certain extent. I grew up never having seen somebody’s death recorded live in front of me and broadcast; the shooting death of Philando Castile went all over the Internet, over and over again, for advocacy purposes and for other purposes. And many of us regular people have been exposed to heinous imagery in the course of our use of this technology and of this kind of sociality.
So I think that for me, drawing a line is sort of representative of how we all have to rethink and redraw lines for ourselves, or maybe ask ourselves how those lines have been redrawn for us without our full knowledge and consent.
S10: I’m sorry, that’s a depressing answer.
S12: Well, it’s a depressing world. How do you draw lines around social media use in your own life? And has that changed over the past few years as you’ve done this work?
S8: I think it has. If I’m honest, no one will accuse me of being anti-social-media in terms of my own behavior; I’m quite active on Twitter and easily found there. But I would say that I have curated my engagement down to certain particular corners of the Internet where I’m willing to spend my own personal time that isn’t related to this work. I’m mindful of the impact on me. I try to keep a healthy mental barrier between whatever’s going on on the Internet and what’s actually happening in my immediate physical realm. But I can’t say that it has caused me to tell everyone to throw away their phones and get off social media, because that’s not possible for me myself. So one of the things it brings up, in terms of my own relationship and other people’s relationships, is what is possible in terms of reconstituting the relationship that we have, and what are we willing to do and not do?
S5: And so I think about that a lot. I mean, in terms of where I’m present online, I’ve certainly shrunk my footprint, but I’m still totally online, and maybe that’s not fully a good thing. I don’t know.
S4: So, one last question. You noted earlier in the interview that having humans somewhere in the system to do content moderation is a necessity, whether it’s on the input side of an algorithm or fielding videos as they come in, as happens now. What would be a humane way to set up this job? What type of people would be doing it? How would companies be caring for them in a way that doesn’t set them up for psychological damage?
S8: Well, I think that the status quo right now, in many of the biggest firms that have, as far as they’re concerned, the greatest need for this labor, is to source the labor inexpensively, en masse, and create these large-scale enterprises. And that might not be the best way. But because of the scale they’re already operating at (I don’t know, 2 billion users for one particular platform comes to mind, or 400 hours of video content uploaded per minute for another), their own economic logic has sort of foreclosed, in their view, other options beyond that kind of massive labor infrastructure. I don’t think that’s probably the best way. But even if that persists, even if that’s going to be the model going forward for those big companies, there are things that can be done. This is a question I often put to the moderators themselves when I talk to them, and one of the individuals in the book, when I asked her those questions, said simply: pay us. And I think what she meant was, literally, pay us more, so that we can have a better quality of life. I think that’s absolutely necessary. Many of these workers, just in the United States alone, have been working at minimum wage. And when you think about how companies are outsourcing the labor to known cheap-labor sites in other parts of the world, you see that they’re going for the low bid here, and it means that they don’t value the labor. So within that notion of pay us, there’s also this notion of value us, and value our contribution. They’ve set up an ecosystem where this work is entirely secret; everybody’s covered by NDAs; they’re not really even supposed to acknowledge their existence. Now the companies sort of have to, because journalists and researchers and advocates have pushed in, and that’s changed the landscape.
But these are still people considered very low value in these firms, even though they’re doing a fundamental job that goes to the bottom line of revenue generation. So I think valuing them more is a start. Giving people a pathway through a firm instead of a revolving door, so that you might do this work for a period of time but then take the expertise gained and move on within the company, would be great. But when all of these workers are hired at a remove, through layers of contractors, that’s maybe not always possible either. Benefits, health care, psychological care. Also, there’s a variety of tools that could be built to augment the process and help well-being, so that people wouldn’t necessarily have to be confronted with all of the full imagery every time. I’m working with two researchers, including Libby Hemphill, who go that route. They’re specialists in content moderation and are thinking about what kinds of tools and processes we could build to augment and help the actual workflow. So I think there’s a lot of room for improvement, but it’s going to take the firms themselves, which need this work, recognizing the inherent value and humanity of the people they’re asking to do it. I should say that Facebook came out in May with an announcement that it was raising the minimum wage for all of its content moderators in the United States, across the board, to a baseline of $15 an hour, and in more expensive cities to a higher wage. And I thought that was great.
S9: But it told us some important things. First of all, it told us that prior to that, fifteen dollars had not necessarily been the wage that people were making to do this job. And it also told us that other firms, the other big players in the industry, weren't there either. I waited for press releases to come from other companies, from Google or Snap, to say, oh, us too, we're doing that too. And it was like tumbleweeds. So we know that the status quo is woefully inadequate right now in 2019. Also, twenty dollars an hour in New York or San Francisco is still, forget it.
S2: Yeah. It's not enough to set you up with the kind of life where you can, like, relax; it's more like you've got a second job, you're driving for Lyft. Thank you so much for coming on our program to talk about it.

S8: I really appreciate you having me.
S13: All right. We're gonna take one final quick break, and then Aaron Mak will join me for Don't Close My Tabs, where we'll talk about the best things we saw on the web this week. OK, now it's time for Don't Close My Tabs.
S12: Joining me is my colleague Aaron Mak, who'll be hosting the show next week. Hey, Aaron.

S14: Hey, Shannon.

S12: So what's your tab for this week?
S14: So for my tab this week, I want to recap a Slate article in light of a recent order that the Supreme Court handed down on Monday. So basically, the court denied a petition from Domino's to hear a case concerning the Americans With Disabilities Act. It was a big win for disability rights, because Domino's had been arguing that its website and delivery app did not need to be accessible under the terms of the act. The plaintiff in the case was arguing that he was unable to order a pizza because Domino's didn't make its platforms accessible to people who are visually impaired. Richard Supple wrote about the case for Slate in September, in a piece called "Domino's Wants to Slice Away at the Americans With Disabilities Act." Supple is legally blind, and he talks about just how much of modern life he wouldn't be able to participate in if companies aren't required to make their websites accessible. You know, obviously it's not just pizza. It's also bank accounts, online utility bills, gig-economy jobs, Amazon deliveries. A lot of jobs nowadays also require us to be able to use these websites. And Supple does a really good job describing the challenges of, you know, finding checkboxes and completing CAPTCHAs and a lot of other things we take for granted. So it's really good context for this case. As for whether Domino's is going to have to make its website accessible now: the Supreme Court declining to hear the appeal means that Domino's actually just has to face the plaintiff in court. So it gets thrown back to a lower court, and the trial is going to play out. The lower-court proceedings so far were just about whether or not you can sue Domino's under the Americans With Disabilities Act, and now they actually have to, you know, go through the whole process of holding a trial.
S4: Interesting. So it's going to take a little bit longer.
S14: Yeah, it seems like it's going to drag on for a bit. But, I mean, having the Supreme Court, you know, deny this petition is a big win for disability rights, because it just kind of reinforced the idea that the Americans With Disabilities Act needs to apply to websites as well. And what's your tab for this week?
S12: My tab this week is What If This Were Enough? by Heather Havrilesky. It's a book that's now out in paperback. It is notably not on the internet, though I'm sure you could get an e-book version. But after talking about content moderation this week for my interview, I just had the feeling that I get from time to time that I should be spending less time on social media. And Heather Havrilesky's writing is a really good place for me to land when I'm not just scrolling through Instagram or Facebook. She writes the Ask Polly advice column for the Cut, which is also really good. But this book is just a collection of essays about, basically, being satisfied in your life the way things are, and, you know, doing that in this age where we're constantly communicating that we're happy, and we're striving for stuff, and we're achieving our goals, and it's great, and we're getting engaged, and we're going to weddings, and we feel good all the time. And sometimes we feel bad. Mostly we feel good. And this book just always helps me take a pause from all of that.

S14: Do you have a particular essay that you would recommend in the book?

S12: That's a great question. I don't. I do have this one image from the book where she is at a party, and it's, like, you know, everybody is doing what they do at parties, being fancy and interesting, and she goes and sits in the backyard, and there's this wet dog, and she just sits around with this dog, and she's like, this is exactly what I need to be doing right now. And I just think about it, especially as I have a dog.
S3: All right. That's our show. You can email us at ifthen at slate dot com. Send us your tech questions, show and guest suggestions, or just say hi. You can also follow me on Twitter. Thanks again to our guest, Dr. Sarah Roberts. And thanks to everyone who has left us a comment or review on Apple Podcasts or whatever platform you use to listen. We really appreciate your time.
S15: If Then is a production of Slate and Future Tense, a partnership between Slate, Arizona State University, and New America. If you want more of Slate's tech coverage, sign up for the Future Tense newsletter. Every week you'll get news and commentary on how tech advances are changing the world in ways small and large. Sign up at slate dot com slash future news.
S13: Our producer is Justin D. Wright. Thanks also to Rosemary Belson, who engineered for us in D.C. We will see you next week.