The Surveillance Is Coming From Inside the (Smart) House


S1: This ad-free podcast is part of your Slate Plus membership.

S2: Welcome to the show about how technology is changing our lives and our future. I'm Shannon Palus.

S3: Hey everyone, welcome to If Then. We're coming to you from Slate and Future Tense, a partnership between Slate, Arizona State University, and New America. We're recording this on the afternoon of Tuesday, September 24th. On today's show we'll be talking to Roxanne Leitão, a design researcher at the University of the Arts London who studies surveillance and abuse with smart home devices in intimate relationships.

S4: After the interview, my colleague Aaron Mak will join me for Don't Close My Tabs, where we talk about the best things we saw on the web this week. That's all coming up on If Then.

S2: Do you have any smart technology in your home? Like millions of people, I do: an Alexa, as well as light bulbs and an air conditioner that can be controlled from an app on my phone, even when no one is home. This kind of technology makes it wonderfully easy to control my environment without leaving bed, or to keep things hospitable for my dog when I'm at work. It's becoming commonplace to worry about how tech might be used by companies for nefarious purposes, say, listening in on your conversation about shoes to advertise a pair of sneakers to you. But for folks who find themselves the subject of intimate partner violence, these devices also offer a way for abusers to control them, track them, and even make them doubt their own sanity. Smart home technology can be confusing. Even if you're savvy about how your smart plugs function, it can be hard to convince someone else that your gear isn't just malfunctioning but really is being turned against you. Roxanne Leitão is a design researcher at the University of the Arts London who's exploring ways to make this tech safer. We'll also chat about some of her other work on how to counter biases in user interfaces on gig economy platforms, like the information that's displayed about your host when you book an Airbnb. Roxanne, thank you so much for being here.

S5: Thank you for inviting me. This is exciting.

S6: So we're just going to get right into it. You wrote a piece last year describing how a smart home, with its myriad sensor-enabled devices, can easily become a smart prison, not only due to state and corporate surveillance but also as an instrument of surveillance between intimate partners and family members. Can you tell us a little bit about what that kind of intimate partner surveillance looks like?

S7: Yes, so when we talk about smart homes and intimate partner surveillance, it's within the broader context of what's been called technology-facilitated abuse, or tech abuse, and that's generally part of a larger pattern of abuse and control within domestic abuse. It's when perpetrators use technology to stalk, harass, and threaten their victims, either during their relationship or once it's over. This can include things like threatening or coercing victims into giving up their passwords so that perpetrators can monitor their accounts across a number of different devices and platforms, depending on what the victim uses. But it can also include covert monitoring, monitoring the victim isn't aware of, and this can happen either because the perpetrator has had access to the victim's device previously and installed something like stalkerware, or even just by using legitimate apps like Find My Friends or Find My iPhone.

S5: So when it comes to smart homes, this is all even easier, because devices in the household are usually shared and they're mainly set up using one account. So if the perpetrator sets up a smart thermostat or a smart door lock, it will usually be tied to the perpetrator's account, giving them access to remotely control the device, but also to view all the historic usage logs on the device.

S6: So I'm imagining a situation in which a victim of abuse lives with their partner, they've both set up access to, say, the air conditioner or the thermostat, they can both control it on their phones, and the abuser is, I think this was an example in your paper, say, turning the temperature in the home way up or way down and kind of gaslighting the victim.

S8: Yeah, those are actually the cases that we've seen the most so far, the ones linked to thermostats. It can lead the victim to question whether they're losing their mind, like, why is the temperature changing all the time, or to think that the device is malfunctioning. So there's this sense of questioning their own reality and not really understanding what's going on.

S6: What were some of the other common forms of abuse that you saw, besides turning the temperature up or down?

S9: There was a lot of concern around indoor security cameras, which are quite often marketed as something you can use to keep an eye on your pets while you're at work, or to keep an eye on your kids.

S10: But the fact is that once one of these devices has been set up, whoever has an account on it can remotely log on and get a live video stream from inside the house, without the user in the home being notified that someone's looking in, or being asked for permission for this to happen. I think that's one of the main concerns around smart home devices.

S6: And it sounds like it can be hard for victims to even realize that this is happening.

S5: Yeah, that's true, because these devices don't really have affordances that will tell you what's going on.

S10: So if an indoor security camera were here in the room with me now, someone could potentially be looking in and I would have no way of knowing, because there's nothing visual, nothing on the device, that will notify me that it's being remotely accessed. It's quite hard to know what's happening, and then it's even harder to prove to the authorities.

S6: And I also thought it was interesting that you write about how these devices are sort of black boxes in terms of how they work. I thought about my own home: I have an Alexa, and I have smart lights that my boyfriend set up. Are there any ways that you can see victims, or just everybody in our society, being better educated about how this stuff works, to help someone get a grasp on how it can and might be used against them?

S10: Yes. So with the charities I'm working with, we've been talking a lot about education around digital privacy and digital security, and this is something that probably needs to happen at a young age, in schools; it's so fundamental to everybody's life. But there also needs to be a push in terms of technology redesign and how these devices are being developed, because the onus cannot always be on the victim to educate themselves and protect themselves. Right now, smart home devices default to one user account, and the person who holds that account is responsible for creating accounts for the other users in the house. That might work fine in an idealized family unit where there's no conflict, but in an abusive situation the perpetrator is obviously not going to create accounts for everybody else. So there needs to be thinking and planning at the technology design level that recognizes not everybody has the same home life, and that we can't just assume a device with one account, where one person has access to everything, is adequate for everybody.
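To make the single-account problem concrete, here is a minimal, hypothetical sketch of the alternative setup flow Leitão describes, in which every adult in the home is enrolled with equal access from the start. All names in it (SmartDevice, Resident) are invented for illustration and are not taken from any real product.

from dataclasses import dataclass, field

@dataclass
class Resident:
    # Every adult gets the same rights; there is no privileged "owner."
    name: str
    can_view_history: bool = True
    can_change_settings: bool = True

@dataclass
class SmartDevice:
    label: str
    residents: list[Resident] = field(default_factory=list)

    def setup(self, adults: list[str]) -> None:
        # The setup wizard enrolls every adult with equal rights,
        # instead of creating one account that controls all the others.
        self.residents = [Resident(name) for name in adults]

thermostat = SmartDevice("living-room thermostat")
thermostat.setup(["Alex", "Sam"])  # both partners get full, equal access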

S6: So you worked with survivors of intimate partner abuse for this paper, anticipating smart home security and privacy threats through this process called co-design. Can you tell us a little bit about what that process was like?

S5: Yes, co-design is the process of involving the communities you're designing for in the creative process itself, from figuring out what the issues are to constructing design briefs and deciding avenues for prototyping and development. I've been working with survivors of intimate partner abuse and professional support workers in a number of ways. And I feel like I need to say that all contact I've had with survivors is always mediated by domestic abuse support charities, so that we can ensure the workshops, the interviews, everything, is conducted in an ethical and safe way. The first phase of the project was interviewing survivors and support workers to understand what's going on now in terms of tech abuse, and that's mainly around smartphones and social media. Once we understood what the main issues right now are, we moved on to a phase of running design workshops with survivors, aiming to think about the near future of smart home devices, how we can anticipate the threats posed by these devices, and how we can take a proactive approach in supporting victims with the tools and the information they need to protect themselves.

S6: So you talked about how one solution for future apps could be to have two accounts made automatically, or to make it easier for one person not to have sole control over, say, a smart air conditioner. What are some of the other solutions that you came up with together?

S11: One of the others was related to multi-factor authentication and biometrics, two different ways of authenticating a user in a system. For example, fingerprints and voice recognition are a lot safer than passwords: perpetrators can easily coerce or threaten victims into giving away their passwords, but that might not be as easy to do with biometric authentication.

S8: So that was one thing, but then also creating different levels of authentication at different steps. For example, with remote access to devices, which we were talking about: if the system knows there is a user in the house, there should be different levels of permission for that user than for someone accessing it remotely. Why is it that these devices can be controlled remotely even when there is already somebody in the house? And even if that were to be possible, then perhaps the user in the house should have to give authorization for it to happen.
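As a rough illustration of that tiered-permission idea, here is a minimal sketch in which a remote command is held until someone physically present approves it. The names (Presence, handle_command) are invented for this example and are not drawn from any real device's API.

from enum import Enum

class Presence(Enum):
    IN_HOME = "in_home"   # e.g., confirmed over the device's local network
    REMOTE = "remote"

def handle_command(command: str, presence: Presence,
                   approved_by_local_user: bool = False) -> str:
    # In-home users act immediately; remote users get a lower tier,
    # and their command waits for approval from someone in the house.
    if presence is Presence.IN_HOME:
        return f"executed: {command}"
    if approved_by_local_user:
        return f"executed after local approval: {command}"
    return f"pending local approval: {command}"

print(handle_command("set temperature to 25C", Presence.REMOTE))
# -> pending local approval: set temperature to 25C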

S5: So that was one thing. Then the participants also talked a lot about the usability of privacy and security controls.

S8: Survivors already felt quite overwhelmed by the amount of privacy settings across social media and smartphones, and they felt that all these different smart devices in the house, all from different manufacturers, just make this problem a million times worse. It's too time-consuming and too complex to make sure that nobody you don't want accessing your data will be accessing it, when you've got so many different things to monitor and keep on top of, especially if there's an update that changes the privacy defaults. How do you keep on top of this all the time? So that was another one. And then there was also a big lack of understanding of where data is stored and how it's accessed. Survivors we worked with didn't really understand the cloud, and had trouble distinguishing between data that, for example, the corporation that owns the device would have access to, and data that would be shared within the household. Participants were very keen on tools that would visualize what type of data is being captured, where it's being stored, and exactly who has access to it. And when I say who, I'm not talking about corporate surveillance in this case, I'm talking about peer-to-peer surveillance. So: who in my home has access to this data?

S6: It sounds like there's such an onus right now on victims of abuse, or just everyone living in an intimate partnership, to really have this deep knowledge of all of their devices.

S8: Yeah. I don't know what your experience of it is, but it's already quite challenging to manage one's digital privacy across the things we use now. Imagining a future where your whole house is equipped with all of these devices, it seems like an unreasonable amount of time and effort for people to be spending on protecting themselves.

S12: Oh yeah. I'm a childless woman, I live with roommates, I have a lot of free time, and even I find it hard to dedicate time to making sure my password manager is up to date.

S9: Yeah, and that's actually a really good point, because it's not only in abusive relationships that the way these technologies are being developed is inadequate. I live in London, where house shares for 20- and 30-year-olds are really common. Imagine a housemate brings all these devices into your house: do you really feel comfortable knowing there's a security camera in every room that they could just tap into remotely? It's not just abusive relationships; there are so many different living arrangements that are just not being taken into consideration.

S6: Yeah, that's a great point. I'm thinking right now of a friend who reviews smart speakers for his job, and he said that he has five of them in his room because his roommates don't want them in the living room, which is understandable, but even then they're still in the house. So how can we take the onus off of individuals? Is it just up to companies to design better solutions, to build in some of these things, and to be proactive about educating folks who buy their products about how they work?

S8: I think corporations need to assume their responsibilities, but there also needs to be policy around protecting citizens' rights, protecting rights to autonomy and privacy, which are being infringed. And then on an educational level, we need to raise awareness and make sure that we're educating citizens and equipping them with the tools necessary to manage their digital lives.

S13: OK, we're going to take a quick break, but we'll be right back with more from Roxanne Leitão.

S6: What kind of policies could help with these issues? I'm imagining that maybe, in some ideal future world, there would be a law that all new smart home devices have to go through a review, not unlike the way health devices are reviewed by the FDA here in the States, to make sure they have these safety measures in place. Is that the kind of thing you envision?

S8: Yeah, I would imagine. I mean, I'm not an expert on policy, so I can't really speak much about it. But assuming these devices are going into people's homes, you need to at least make sure that all the adults in the house have the same level of control, or the opportunity to have the same level of control, over these devices, over what data is being gathered, and over who's accessing it. I'm not exactly sure how that would look, because I'm really not a policy expert; I'm a designer and a researcher.

S6: So you talked a little bit about designs that you would like to see on future products. Are there any products out there right now that you think are doing a good job of being secure, or that have features you think are laudable?

S9: I haven't really come across any that are making observable efforts to ensure that data is protected between peers in the household. But there's quite a lot of emerging research in the field, so maybe the design changes will happen once this research has matured.

S9: At the moment, though, there's nothing I could really point to and say, this is a good example.

S12: That must be frustrating. It's at times frustrating to me that companies aren't being proactive about this themselves.

S8: I think they should be, but they're really not aware of the issue. I guess when you're designing with this sort of ideal family unit in mind, and you're not really talking to people who fall outside of that group, then you wouldn't necessarily be aware of what the issues are.

S6: Yeah. And it's hard for me, as someone who has smart home devices that I don't entirely know how they work, while other people in my household have a better handle on them. I don't want to be in a state where I'm thinking, OK, how can I protect myself in case this nice setup that I have goes sour? That just sounds like such a burden.

S9: Yeah, it definitely is. We've been working with Refuge, which is one of the biggest domestic abuse support charities here in the UK, and what we've been doing is developing a chatbot that answers survivors' technical questions, things like: how do I know if my location is being shared through, for example, a family sharing feature? What the chatbot does is spit out a video with visual instructions, and a guide showing you exactly how to check all of these things across a number of different devices and platforms. And while that may be useful to someone who needs to protect themselves, again, it's placing the onus of protecting yourself back onto the survivor. It's a really hard terrain to navigate, because on the one hand you want to give people the tools they need, but on the other hand they really shouldn't have to be doing this for themselves.

S6: I’m wondering if you could talk a little bit more about ways that technology can help domestic violence organizations communicate with victims.

S9: Yeah, that's a bit of a gray area for us, because while technology can do a lot of great things, the risk is that pushing technology into this sector will lead to face-to-face support being slashed in favor of support mediated by technology. So in the case of this chatbot, it sticks strictly to technical questions. But I've also seen proposals out there for using A.I. and chatbots to give victims support around safety planning, emotional support, and a number of other things that victims need.

S14: And I just don't really feel like that's a good idea. It doesn't feel like we should be replacing the services offered by these charities with A.I. or with any other technological solution.

S8: I certainly see that, in the UK, as part of an austerity agenda, where there's a high-level push to slash the costs involved in providing support. You're going to cut the costs, but you're going to be delivering support that is not adequate.

S6: That makes sense, and it's fascinating in a terrible way, because my instinct on hearing about a chatbot for domestic abuse organizations is: oh, this would be another wonderful way to reach more people, or to make the barrier to getting help really low. But it sounds like if it's delivered not as a bonus but as a substitute, it could actually end up replacing people, or preventing them from getting help.

S8: Yeah, preventing them from getting the help they actually need, the emotional support that they really need and that you would never be able to deliver through A.I. or through a chatbot.

S6: That makes a lot of sense. I also report on digital health, and in the States, text message therapy is kind of exploding right now. There is a similar concern there: that someone will see a therapist over what's essentially a secure email chat, and that this will substitute for seeing somebody face to face or over video. What about having, not chatbots, but chats where humans are behind the other screen? Is that potentially helpful, or do the same concerns come up there?

S8: I think that's potentially helpful, especially when you're considering younger age groups, and I think some services for young people here in the UK already do offer that sort of support via web chat. That's super useful, because quite often the younger generation is just more comfortable chatting over the internet rather than picking up a phone and actually speaking to someone. So there is a place for it; you just need to understand who you're trying to reach, and understand that it won't be the best way to do it for everybody.

S6: I think you mentioned in another interview that that can be a good way for people to overcome a language barrier: if a victim doesn't speak the most common language in whatever country they're in, they can still get help, using, say, Google Translate to communicate with a domestic abuse organization.

S8: Actually, I think even if you have a web chat, you would need someone on the other side who speaks the same language as the victim. When you're in a situation like domestic abuse, there's a lot of stress, a lot of anxiety, a lot of fear. I wouldn't feel comfortable asking someone to interact via web chat and having them translate things through Google Translate, for example. It's just another way of adding more stress onto an already difficult interaction and a very difficult situation, especially since, I mean, Google Translate is pretty good, but it's not always accurate.

S13: OK.

S15: We're going to take another quick break, and then we'll continue our conversation with Roxanne Leitão. I'm wondering if you could tell me a little bit about the work you do with Fair UI, and what that is.

S5: So Fair UI is a non-profit project that I'm doing with a design studio based in Copenhagen, and it's about understanding how we can reduce the opportunity for bias on online gig economy platforms, platforms where people pitch for short-term jobs and complete the work, quite often without ever meeting their employer. It's all managed over the internet, and there's quite a lot of evidence of a significant bias in favor of white males on these platforms, where everybody else is discriminated against even though they have the same qualifications.

S7: And what we're doing in Fair UI is testing out different ways of designing the interface, tweaking the information we show about job candidates to see if there's an effect on bias. So, things like displaying or not displaying people's profile pictures, displaying or not displaying their names, or potentially disclosing information at different stages of the process rather than disclosing all of the information upfront. That's what it's about.
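For a concrete sense of that staged-disclosure idea, here is a minimal hypothetical sketch in which identity cues such as a name and photo stay hidden until a stage where they can no longer sway the decision. Every field and stage name here is invented for illustration and is not drawn from Fair UI itself.

STAGES = ["browsing", "shortlisted", "booked"]

# A candidate profile: work-relevant fields are visible from the start,
# identity cues only once the booking is confirmed.
PROFILE = {
    "skills": ["carpentry", "painting"],
    "hourly_rate": 30,
    "name": "Jordan R.",
    "photo_url": "https://example.com/jordan.jpg",
}

VISIBLE_AT = {
    "skills": "browsing",
    "hourly_rate": "browsing",
    "name": "booked",
    "photo_url": "booked",
}

def visible_profile(stage: str) -> dict:
    # Return only the fields that may be shown at this stage or earlier.
    allowed = STAGES.index(stage)
    return {field: value for field, value in PROFILE.items()
            if STAGES.index(VISIBLE_AT[field]) <= allowed}

print(visible_profile("browsing"))  # no name or photo yet
print(visible_profile("booked"))    # full profile once confirmed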

S6: OK, so on the Fair UI front, you mentioned that not displaying a profile picture would be one way to reduce bias. Is there anything that reduces bias that's kind of surprising? Like, an element of someone's profile that you wouldn't expect to create bias, where actually hiding this one thing is helpful?

S8: We're still sorting through all the data, so I don't have a definitive answer for you, but we have been looking at other research on how bias may be reduced through things you can display in the user interface. One of the things we've come across is reviews and ratings, and we were quite surprised: we thought that reviews and ratings would have a positive effect, that they would reduce bias. What we've actually seen in all the research that's out there is that reviews and ratings are themselves biased, with white men often receiving the best reviews, and more reviews, than everybody else, and Black women receiving fewer reviews, and reviews with more negative words in them overall. That was one thing that surprised us quite a bit. And what we're thinking about at the moment is: if removing profile pictures and removing people's names does reduce bias, then how do you design a platform where you're able to maintain some level of trust between users when this information is not being displayed? Imagine a LinkedIn with no profile pictures and no names. Would you be able to cultivate that sense of trust users have in a platform, and would people even be willing to use a platform that doesn't display that sort of information? That's where our thinking is at the moment.

S6: Yeah, I stayed in an Airbnb this weekend upstate, and I'm having a hard time imagining being as comfortable going to someone's home in a remote area of New York if they didn't have their little smiling profile picture there.

S9: Exactly. And I think that's also why we want to explore progressive disclosure of information. So maybe you've seen the pictures of the place and they all seem nice, and then once you book, you see the person's profile picture and their name. So you see it before you go, but you don't see it before you make the booking, for example. There's a lot to explore in this field. Uber did some work around that as well: now they only disclose people's names and profile pictures once the ride has been confirmed, if I'm not mistaken.

S6: Oh, that's smart. So what are you hoping the outcome of this research will be? That you develop a set of guidelines that companies can then pick up and try to implement?

S8: Exactly, that's the aim of the project: to come up with guidance for designers and developers that they can implement and use in their daily practice.

S6: Roxanne, thank you so much for taking the time to chat with us. Your job sounds fascinating and extraordinarily necessary.

S5: Thank you. It was lovely to speak to you.

S6: All right, we're going to take one final quick break, and then Aaron Mak will join me for Don't Close My Tabs, where we'll talk about the best things we saw on the web this week.

S2: OK, now it's time for Don't Close My Tabs. Joining me now is my colleague Aaron Mak, who will be hosting the show next week. Hey, Aaron.

S16: Hey, Shannon.

S2: So what's your tab for this week?

S16: My tab this week is a Guardian op-ed by Julia Carrie Wong titled "The viral selfie app ImageNet Roulette seemed fun, until it called me a racist slur." The piece focuses on the ImageNet Roulette online tool that was making the rounds last week: you basically upload a picture of a face, and the tool tries to identify what kind of person it is using an algorithm. The tool was developed by A.I. researchers based on a data set of more than 14 million photos that had been classified using some crowdsourcing. Wong writes that a lot of tech journalists had been posting about this on Twitter, and they tend to be predominantly white, so they had been given labels like "weatherman" or "pilot," some pretty innocuous stuff. But when Wong, who is biracial, put her face in, the algorithm labeled her with a racist slur for Asian people. And she points out that this was partly on purpose; it was the intent of the tool, which was meant to highlight how biased and imperfect A.I. can be. The whole thing kind of made me think that maybe we need more of these consumer facial recognition apps that go viral, like that Google one that matches people's faces to paintings, or that other one that matches your picture to a dog breed. In all these instances, people have talked about how A.I. systems tend to play up racial stereotypes when users put in their pictures, and maybe this is the best way to illustrate the problems with A.I. for a mass audience.

S12: Interesting. So it's easy to tell someone that A.I. is biased, and I think we all sort of know that at this point, but it really hammers home the ways in which it can be biased, in a very visceral way.

S16: Right. You have a sort of tool to play around with, and you can see firsthand how it plays up certain stereotypes or racialized features. It takes an issue that seems theoretical, like an imagined harm you don't really see happening to you every day, and having this kind of app on your phone, reinforcing biases in real time in front of you, seems valuable, even though it's extremely uncomfortable and upsetting.

S12: But I do wonder, since, as you mentioned, it's mostly white journalists who are playing around with this app and saying, oh, I got "weatherman" or whatever, if it ends up leaving the people who really need to hear this information still kind of in the dark, and maybe ends up calling people racist or sexist things when they're already aware that the world can sometimes consider them this way.

S17: Yeah, that's a good point. I guess it would put an undue burden on people of color and women to talk about their stories online and post them on Twitter or whatever. You can't just ask people to do that work for us. It is a tricky kind of calculus there.

S17: So what's your tab for this week?

S12: Mine is a piece that ran on Slate.com called "Cashing In on Climate Change." It was put together by Slate's Henry Grabar, and it's a series of short articles about companies and people that stand to make money off the climate crisis. I wrote one of the blurbs, about the concept of clothing that's toxic to ticks, and in particular about this one company called Insect Shield. Insect Shield started out making clothing for the military, for folks at West Point. And today, thanks to the expansion of ticks, and the expanded risk of Lyme disease, which is in part being egged on by climate change, you can now buy anti-tick gear of all sorts.

S18: Infinity scarves, dog vests, hammocks, blankets. You could outfit your entire body for summer in clothing that is poisonous to ticks. And so I examined whether that stuff is helpful, or whether it, you know, stands to cash in on folks who are really worried about Lyme disease. The answer is a little of both. There are also little articles about private firefighters, and about AC units: as the planet gets warmer, in part from our use of air conditioners, air conditioning companies are going to sell even more air conditioners. I learned that jellyfish are going to do really well in an ocean that's warming up and getting more acidic, because they actually thrive in low-oxygen water.

S12: So jellyfish are going to be winners of climate change. I don't want to say it was a fine look at climate change, because none of this is fine, but reading everybody else's reporting, and doing my own, made me think really hard and creatively about all of the little ways our world is going to shift, and about the ways some people might be able to profit off of that.

S17: Yeah, I remember reading your tick blurb when this came out. The companies that are making these clothes, were they cognizant of the climate change impact on their business, or did it seem like this was maybe something in the back of their minds but they weren't really hammering home on it?

S12: The marketing person I spoke to had no comment on how climate change affected Lyme disease rates or their clothing. And to be fair to them, I think they're mostly offering a totally legit product that can help protect people against ticks, which are spreading for a large number of reasons, including that humans are moving into their habitat and deer are spreading. When I started the piece, I sort of thought maybe I would find someone sitting there making these dumb scarves that people buy because they're worried; I ended it thinking, you know, maybe the next time I go into the woods I should be wearing their socks. It's hard to give a general ruling on whether this clothing works, because if you're wearing the socks versus the shirt, the socks help a lot more than the shirt. But it seems like the products are, in some ways, a valuable addition to the market.

S17: Yeah, I was looking at those links you had put in the piece. The L.L. Bean stuff is clearly meant for outdoor activity, but the Insect Shield stuff actually seems like it might be for everyday use, which I guess kind of underlines the point about the expanding tick issue. I found it interesting that the styles seem to be more casual than the other things you had pointed to.

S12: Oh yeah, they definitely have some business-casual-type infinity scarves. But maybe you just want to look really stylish while you're hanging out in your backyard, which does have ticks in it, and these scarves do protect against other kinds of bugs too. So there's no definitive rule on where the line should be with anti-tick gear. I just don't think I'm going to be showing up at the office anytime soon in an anti-tick infinity scarf.

S3: All right, that's our show. You can email us at ifthen@slate.com. Send us your tech questions and show and guest suggestions, or just say hi. You can follow me on Twitter. Thanks again to our guest, Roxanne Leitão, and thanks to everyone who's left us a comment or review on Apple Podcasts or whatever platform you use to listen. We really appreciate your time.

S19: If Then is a production of Slate and Future Tense, a partnership between Slate, Arizona State University, and New America. If you want more of Slate's tech coverage, sign up for the Future Tense newsletter: every week you'll get news and commentary on how tech advances are changing the world in ways small and large. Sign up at slate.com/futurenews. Our producer is Justin D. Wright. Thanks also to Rosemary Belson, who engineered for us in D.C.

S20: We’ll see you next week.