Can A.I. Know What You’re Feeling?

S1: When I got reporter Kate Kaye on Zoom, I asked her to engage in a little thought experiment with me.

S2: Okay.

S1: You’re looking at me via Zoom right now, right? Admittedly, you are human, not a computer. But what do you see? How do I seem?

S2: Is this a question about what emotion you might be expressing on your face?

S1: Yeah.

S2: Right now? I mean, who knows? Honestly, I wouldn’t know.

S1: I was asking Kate to try to figure out my emotional state based on my face. That’s because she’s written a series of articles for Protocol about artificial intelligence supposedly learning to do just that: take our expressions and microexpressions and capture them with computer vision.

S2: And then put them through a process that spits out some sort of score regarding whether or not someone’s engaged with what’s being said in a virtual meeting or a virtual classroom scenario.

S1: That’s right. Companies are developing and selling AI products intended to tell your boss or your teacher how you’re feeling.

S2: We’re seeing more examples of this stuff that’s really been kind of just at the research level over the past few years. It’s not necessarily talked about a lot in relation to something you can buy, a product, and now it’s starting to be productized.

S1: But what’s unclear is how well, or whether, it really works. Today on the show: Can A.I. really know what you’re feeling? I’m Lizzie O’Leary, and you’re listening to What Next: TBD, a show about technology, power, and how the future will be determined. Stick with us. I think A.I., to a lot of people, can be extremely confusing: what it can do and what it can’t. I think some people might be familiar with predictive text. That’s a fairly clear example of a machine looking at lots of examples of writing and learning to suggest some common phrases. Emotion AI feels different. I wonder if you could explain what emotion AI is.

S2: What most of these technologies do is, again, they’re taking facial expression data. So they’re using a camera of some sort that is ingesting imagery data, and they’re applying computer vision to detect what kind of facial expression that might be categorized as. Other types of data used in emotion AI typically might be your tone of voice, and then it might use things like just the text of the content. So a lot of times, in addition to capturing image data, the processes are also looking at or using the text of the dialogue in a virtual meeting, for example.
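
To make that description concrete, here is a minimal, purely illustrative sketch of the kind of multimodal scoring Kaye describes: a facial-expression label, a voice-tone score, and the words spoken each feed an overall engagement number. Every label, weight, and score below is a hypothetical stand-in, not any vendor’s actual model.

```python
# Illustrative sketch only: a toy version of the multimodal pipeline described
# above. The labels, weights, and placeholder scorers are hypothetical, not
# how any commercial emotion AI product actually works.

from dataclasses import dataclass

@dataclass
class MeetingFrame:
    face_expression: str   # label a vision model might emit, e.g. "smile", "frown"
    voice_tone: float      # hypothetical 0..1 positivity score from audio analysis
    transcript_text: str   # what was said during this moment of the meeting

def score_face(expression: str) -> float:
    # Placeholder for a facial-expression classifier.
    return {"smile": 0.9, "neutral": 0.5, "frown": 0.2}.get(expression, 0.5)

def score_text(text: str) -> float:
    # Placeholder for text sentiment; real systems use trained language models.
    positive, negative = {"great", "interested", "yes"}, {"confused", "no", "expensive"}
    words = text.lower().split()
    hits = sum(w in positive for w in words) - sum(w in negative for w in words)
    return max(0.0, min(1.0, 0.5 + 0.2 * hits))

def engagement_score(frame: MeetingFrame) -> float:
    # Weighted blend of the three signals; the weights here are arbitrary.
    return (0.5 * score_face(frame.face_expression)
            + 0.3 * frame.voice_tone
            + 0.2 * score_text(frame.transcript_text))

print(engagement_score(MeetingFrame("smile", 0.8, "yes, that sounds great")))
```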

S1: Emotion AI is being used to track whether customer service workers are getting through to their clients, how audiences respond to ads, and whether drivers seem distracted behind the wheel. I was really struck, in one of your stories, by the human problem of sales during the pandemic: sales is an occupation that requires so much attention to these little cues, to tone of voice, to the smile, to the eye that wanders. And it’s really difficult to do online. I wonder if you could walk me through how these companies propose to use emotion AI in sales.

S2: One company that I write about is called Uniphore, and they have software that they sell for sales; they sort of come out of the world of customer service technology. What their system does is, it actually does this in real time. Let’s say you open up a Zoom call and you’re the salesperson initiating the call. What it first does is ask to record. Then, if I’m the salesperson, I’m seeing a box on my screen that is gauging in real time the engagement level and the sentiment of either the one person or the room. And it’s measuring: oh, engagement just went up, or engagement just totally plummeted when you mentioned the price of the software you’re selling, or whatever it is.
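
Here is a hedged sketch of that real-time "engagement meter" idea, not Uniphore’s actual implementation: per-moment scores are smoothed over a short window, and a big rise or drop gets flagged for the salesperson as the call happens. The window size, threshold, and sample scores are all made up.

```python
# Hypothetical sketch of a real-time engagement meter, as described above.
# Not any vendor's product: per-second scores are smoothed with a rolling
# window, and large swings are surfaced while the call is still going on.

from collections import deque

class EngagementMeter:
    def __init__(self, window: int = 10, jump: float = 0.15):
        self.scores = deque(maxlen=window)  # most recent per-second scores
        self.jump = jump                    # change big enough to flag
        self.last_avg = None

    def update(self, score: float) -> str:
        self.scores.append(score)
        avg = sum(self.scores) / len(self.scores)
        message = f"engagement {avg:.2f}"
        if self.last_avg is not None:
            if avg - self.last_avg >= self.jump:
                message += "  (spiking)"
            elif self.last_avg - avg >= self.jump:
                message += "  (plummeting)"
        self.last_avg = avg
        return message

meter = EngagementMeter(window=3)
for s in [0.8, 0.8, 0.7, 0.3, 0.2]:   # e.g. scores dip after the price comes up
    print(meter.update(s))
```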

S1: Is it just engagement that they’re measuring, or are they looking at other things as well?

S2: This is just how this one company decided to productize it: they’re measuring the engagement level and, I think, the sentiment. They also provide reports, which is pretty common with these kinds of companies. Maybe you’re a sales manager and you want to know what all of your salespeople did this month in terms of what kind of engagement level they had. This company provides a report, and it has a little blurb that says, wow, engagement spiked up 10% this month. If you’re managing, like, hundreds of salespeople, not only do you want to help them, you also want to keep tabs on who’s better at what they do outside of just their actual sales. It’s a way that you can have some of this stuff quantified.
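
As a toy illustration of that kind of monthly roll-up report, the snippet below computes a month-over-month engagement change per salesperson. The reps, months, and scores are invented; a real product would draw on its own logged call data and scoring.

```python
# Toy sketch of the monthly roll-up report described above; all data is invented.

calls = [
    {"rep": "A", "month": "2022-04", "engagement": 0.62},
    {"rep": "A", "month": "2022-05", "engagement": 0.70},
    {"rep": "B", "month": "2022-04", "engagement": 0.55},
    {"rep": "B", "month": "2022-05", "engagement": 0.54},
]

def monthly_average(rep: str, month: str) -> float:
    scores = [c["engagement"] for c in calls if c["rep"] == rep and c["month"] == month]
    return sum(scores) / len(scores)

for rep in ("A", "B"):
    prev, curr = monthly_average(rep, "2022-04"), monthly_average(rep, "2022-05")
    change = (curr - prev) / prev * 100
    print(f"Rep {rep}: engagement {'up' if change >= 0 else 'down'} {abs(change):.0f}% this month")
```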

S1: The idea of, you know, recording and evaluating sales experiences isn’t that new. Let’s say you call some customer service line; you’re used to that little message that says this interaction may be recorded for quality control, blah, blah, blah. But this goes further, in part because it is real time. What does the evidence say about whether machine learning can accurately read emotions?

S2: Well, there has been research done recently that suggests that it doesn’t.

S1: In a large study released in 2019, a group of psychologists found that inferring emotions from facial movements is incredibly difficult. They spent two years looking at data and examined more than 1000 underlying studies. Human feelings, they said, are simply too complex to be gauged from expression alone. If I scowl, I might be angry, but I also might just be confused or having trouble seeing something. Moreover, different expressions can mean different things depending on culture.

S2: Whether people are able to assess this stuff is highly questioned. And I think we have to remember that, just on its own, as a base-level question to ask before we ask whether or not technology can do it.

S1: What do you know about how these models are trained? What were they trained on in order to recognize what my face is doing and what I might be thinking on the inside?

S2: I mean, in the world of AI, there’s a whole labor force based all over the world, in countries where the labor is a lot less expensive. People are hired just to label and annotate individual pieces of data that are then fed into A.I. systems to train them. These companies hired people who do this kind of data labeling for all sorts of AI, and they gave them guidance for what would be considered a happy face versus a sad face, or a confused look versus an engaged look, or whatever it might be. And if there is a discrepancy among multiple data labelers about what it should be, then they would only include it in the dataset that’s used to train the AI if there’s agreement among more than one data labeler about how it should be labeled. I mean, the example that’s often used when we talk about data labeling is, oh, people looking at images of apples and bananas. We know the difference between a dog and a cat, an apple and a banana; nobody’s going to argue about that one. Facial expressions are a little different. Shouldn’t they be considered something different?
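
The filtering step Kaye describes, keeping a training example only when labelers agree, can be sketched roughly like this. The frames, labels, and agreement threshold are hypothetical, not any company’s actual pipeline.

```python
# Illustrative sketch of the labeling step described above: keep an example
# only when annotators agree on the facial-expression label. Everything here
# is invented for illustration.

from collections import Counter

annotations = {
    "frame_001": ["happy", "happy", "happy"],       # clear agreement -> kept
    "frame_002": ["confused", "bored", "engaged"],  # no agreement -> dropped
    "frame_003": ["sad", "sad", "neutral"],         # majority agreement -> kept
}

def consensus_label(labels: list[str], min_agreement: int = 2) -> str | None:
    # Return the most common label if enough annotators chose it, else None.
    label, count = Counter(labels).most_common(1)[0]
    return label if count >= min_agreement else None

training_set = {
    frame: label
    for frame, labels in annotations.items()
    if (label := consensus_label(labels)) is not None
}
print(training_set)   # frame_002 is excluded because the labelers disagreed
```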

S1: But it sounds like the train has sort of already left the station in terms of products being created, whether or not that kind of reliance is accurate.

S2: The train’s left the station in terms of products being introduced, and you can probably assume that there’s going to be more. Zoom itself is considering integrating emotion AI-type assessments into sales software it just released, which doesn’t have it right now. But they told me, we’re seriously considering doing this. If Zoom does it, that’s a game changer.

S1: It would especially be game changing for online classrooms, which are a priority for the companies making this technology. Intel and a company called Classroom Technologies are developing an emotion AI based system to run on top of Zoom.

S2: Their goal is to help teachers. The idea is, let’s help the teacher gauge whether or not students are picking up on what he or she is teaching, or pick up on whether a student might be confused about something, or whether a student is bored. At this stage, they told me, we don’t even know how it might be integrated. We just feel like what Intel is providing here is something that could be used as an additional signal for a teacher. And so we don’t really know what it’s going to look like yet.

S1: Right now, this all feels like, you know, early proof-of-concept testing. Did you get a sense from your reporting of the likelihood that something like what Intel is working on is going to become a product sold to educational markets?

S2: A company like Intel can call it a proof of concept, but that is part of the process of ultimately turning something into a product, and of vetting, you know, whether or not it should be, or could be, or would make sense to do that. So it seems like, whenever there are technologies that have been adopted, if there’s a way to add a new feature or find a new market for it, it’s going to happen. Look at the fact that Zoom is possibly going to integrate emotion AI into its sales software, and Zoom’s used for classrooms all over the world. If Zoom sees value in having this emotion AI stuff in a sales setting, maybe sometime down the road it ends up as a feature in your classroom.

S1: When we come back: investors see gold in the hills. The companies making emotion AI software have been attracting a lot of deep-pocketed funders. Classroom Technologies is backed by investors including NFL quarterback Tom Brady, AOL cofounder Steve Case, and Salesforce Ventures. In all your reporting, I was struck by the valuations on these software companies, and the investors, and the amount of money that is behind this. It feels like it has momentum. Do you think that’s correct?

S2: Just the idea of emotion AI being incorporated into this stuff isn’t necessarily what’s driving the momentum. But you can look at a company like Uniphore, which I used as an example, which recently got a Series E round of funding, that’s many series in, of $400 million. That’s a lot of money. And so I think you could look at the fact that they’re out there really promoting this emotion AI component of their technology as a key selling point to what they’re doing, and the fact that they have all this money behind them, as a sign. It just feels so different from other worlds of technology that I’ve seen.

S1: Yeah. This is your beat. Does emotion AI feel different from the other kinds of A.I. you cover?

S2: I cover things that enterprises, that businesses, use. And a lot of times what enterprises use, you know, it’s like they’re doing data analytics. They just want to improve their efficiency as a company, or maybe they’re in manufacturing and they want to predict when a piece of equipment needs maintenance. A lot of times AI is somewhat mundane in terms of its use, and it might also incorporate data that has nothing to do with people, at least not in any direct way. In this case, it feels different to me because we’re talking about a very physical, very personal component of who we are as people, and our bodies. We think of this stuff as biometric data, and it has this really sterile kind of terminology associated with it. But the fact is, it’s usually referring to how we walk, how we talk, what our face looks like. I mean, this is who we are.

S1: Okay. Thank you so much for talking with me.

S2: Thank you. It was great.

S1: Kate Kaye covers A.I. for Protocol. That is it for the show today. TBD is produced by Ethan Brookes; we’re edited by Torie Bosch. Joanne Levine is the executive producer for What Next. Alicia Montgomery is the executive producer for Slate Podcasts. TBD is part of the larger What Next family, and it’s also part of Future Tense, a partnership of Slate, Arizona State University, and New America. And I want to recommend you listen to Tuesday’s episode of What Next; it’s about what intelligence the U.S. is and is not sharing with Ukraine. We’ll be back on Sunday with another episode. I’m Lizzie O’Leary. Thanks for listening.
