The Trust Engineers

Episode Summary

Title: The Trust Engineers

- In 2011, Facebook noticed a spike in users reporting photos over the Christmas holidays. Millions of the reports were for harmless photos miscategorized as inappropriate.
- Facebook engineer Arturo Bejar investigated and found that most reports came from people embarrassed by photos of themselves; they picked arbitrary reasons like "hate speech" just to get the photos removed.
- Facebook added options letting users explain why they disliked a photo, such as "embarrassing" or "bad photo." This boosted responses and revealed the real issues.
- But Facebook still couldn't remove a photo simply for being embarrassing, so it prompted users to message the photo's poster directly. Few did until Facebook provided pre-written messages, which sharply increased the number of messages sent.
- Testing variations, the team found that phrases like "Hey [name], please take down this photo" worked better than more apologetic versions, showing how small tweaks to wording can nudge people's behavior online (a rough sketch of this kind of wording comparison follows below).
- Some critics say Facebook is manipulating users' emotions, but Arturo argues the team is simply trying to facilitate communication and trust online, since nonverbal cues are missing.
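The wording comparisons described above amount to simple A/B tests: show different pre-written phrasings to different groups of users and measure which one gets the message sent (or acted on) more often. The episode doesn't describe Facebook's actual tooling, so the following is only a minimal illustrative sketch in Python; the function name, variant labels, and all counts are hypothetical, invented for illustration.

# Hypothetical sketch only -- not Facebook's code. Compares how often two
# pre-written message wordings ("variants") actually get sent, using a
# two-proportion z-test. All counts below are invented for illustration.
from math import sqrt, erf

def compare_variants(sent_a, shown_a, sent_b, shown_b):
    """Return send rates for each variant plus a z statistic and
    two-sided p-value for the difference between them."""
    rate_a = sent_a / shown_a
    rate_b = sent_b / shown_b
    pooled = (sent_a + sent_b) / (shown_a + shown_b)
    se = sqrt(pooled * (1 - pooled) * (1 / shown_a + 1 / shown_b))
    z = (rate_a - rate_b) / se
    # Two-sided p-value from the normal CDF: p = 2 * (1 - Phi(|z|))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return rate_a, rate_b, z, p_value

# e.g. "Hey [name], please take it down" vs. "Sorry to bring this up, but ..."
rate_a, rate_b, z, p = compare_variants(sent_a=5200, shown_a=10000,
                                        sent_b=4800, shown_b=10000)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  z={z:.2f}  p={p:.4g}")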

Episode Show Notes

First aired in 2015, this is an episode about social media, and how, when we talk online, things can quickly go south. But do they have to? In the early days of Facebook, we met a group of social engineers who were convinced that tiny changes in wording could make the online world a kinder, gentler place.

We just have to agree to be their lab rats.

Because Facebook, or something like it, is where we share and like and gossip and gripe. And before we were as aware of its impact, Facebook had a laboratory of human behavior the likes of which we’d never seen. We got to peek into the work of Arturo Bejar and a team of researchers who were tweaking our online experience, to try to make the world a better place. And even now, just under a decade later, we’re still left wondering if that’s possible, or even a good idea.

EPISODE CREDITS 

Reported by - Andrew Zolli
Original music and sound design contributed by - Mooninites

REFERENCES:

Articles
Andrew Zolli’s blog post about Darwin’s Stickers (https://zpr.io/ZpMeUnRmVMgP), which highlights another one of these Facebook experiments that didn’t make it into the episode.

Books
Andrew Zolli’s Resilience: Why Things Bounce Back (https://zpr.io/7fYQ9iDYAQBu)
Kate Crawford’s Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence (https://zpr.io/9rU5CGSit3W4)


Our newsletter comes out every Wednesday. It includes short essays, recommendations, and details about other ways to interact with the show. Sign up (https://radiolab.org/newsletter)!
Radiolab is supported by listeners like you. Support Radiolab by becoming a member of The Lab (https://members.radiolab.org/) today.
Follow our show on Instagram, Twitter and Facebook @radiolab, and share your thoughts with us by emailing radiolab@wnyc.org.

Leadership support for Radiolab’s science programming is provided by the Gordon and Betty Moore Foundation, Science Sandbox, a Simons Foundation Initiative, and the John Templeton Foundation. Foundational support for Radiolab was provided by the Alfred P. Sloan Foundation.

Episode Transcript

SPEAKER_04: Radiolab is supported by Apple Card. Apple Card has a cash-back rewards program unlike other credit cards. You earn unlimited daily cash on every purchase, receive it daily, and can grow it at 4.15 annual percentage yield when you open a savings account. Apply for Apple Card in the Wallet app on iPhone. Apple Card subject to credit approval. Savings is available to Apple Card owners subject to eligibility requirements. Savings accounts provided by Goldman Sachs Bank USA. Member FDIC terms apply. SPEAKER_19: Listen to this. Listener supported. WNYC Studios. Crack cocaine plagued the United States for more than a decade. This week on Notes from America, author Donovan Ramsey explains how the myths of crack prolonged a disastrous era and shaped millions of lives. Listen now wherever you get your podcasts. SPEAKER_11: You're listening to Radiolab. SPEAKER_13: From WNYC. SPEAKER_04: Rewind. All right. SPEAKER_14: Hey I'm Chad Ibermarad. I'm Robert Krolwich. This is Radiolab, the podcast. So here's a story we've been following for a while. Comes from a friend of mine Andrew Zali who is a great thinker and writer. He wrote a book called Resilience Why Things Bounce Back and he's a guy who thinks a lot about technology. I have been interested in a long time for a long time in the relationship between technology SPEAKER_11: and emotion and because well I've thrown more than one cell phone to the ground. SPEAKER_14: Andrew and I were having breakfast one day and he pitched me on this idea of doing a story about Facebook. I am not a huge believer in doing stories about Facebook but this story was wickedly interesting and profound in its way. So he and I have been following it for a couple of years up and down through this rollercoaster of events. It really begins in 2011. Well let me back up for a minute. SPEAKER_08: SPEAKER_11: One of the challenges talking about Facebook is just the scale of the thing. So you know there's there's one point three billion people on earth as of March 2014. Those are active monthly users. There's a billion people who access the site through through mobile devices. Just to put that in perspective there's more Facebook users than there are Catholics. That can't be true. Yeah. SPEAKER_14: No. Yeah. It turns out it is true but they're neck and neck. Anyhow the overall point is that when you have one out of every seven people on the planet in the same space trying to connect across time and geography you are bound to SPEAKER_11: create problems sometimes. SPEAKER_16: Facebook making headlines again tonight. The issue this time. SPEAKER_14: Before we go there we should introduce you to the guy in our story who is the problem solver. SPEAKER_07: My name is Arturo Bejar and I'm a director of engineering at Facebook. SPEAKER_14: Story begins Christmas 2011. SPEAKER_11: People are doing what they do every holiday season. They're just they're getting back together with their families and they're going to family parties and they're taking lots and lots of pictures. And they're all uploading them to Facebook. SPEAKER_07: And at the time the number of photos that were getting uploaded was going pretty crazy. SPEAKER_14: In fact in just those few days between Christmas and New Year's there are more images uploaded SPEAKER_13: to Facebook than there were the entirety of Flickr. SPEAKER_14: Wait. You're saying more images were uploaded in a week to Facebook than all of Flickr all time? Yeah. Wow. Which created a situation. 
SPEAKER_15: The number of photos was going up and along with the number of photos going up the number SPEAKER_07: of reports was going up. SPEAKER_11: What he means by reports is this. Back in 2011. If you saw something on Facebook that really upset you you could click a button to report SPEAKER_14: it. You could tell Facebook to take it down which from their perspective is a really important mechanism because if you're Facebook you don't want certain kinds of content on your site. SPEAKER_15: You don't want nudity. SPEAKER_07: You don't want like drug use, hate speech, things like that. SPEAKER_11: So a day or so after Christmas. Thereabouts. Facebook engineers come back to work and they find waiting for them literally millions of photo reports. SPEAKER_07: Yes. And then the people that would be necessary to review everything that was coming in. It kind of boggled the mind. How many people would you have needed? I think at the time we were looking at it which is two years ago and again all this has grown much since then. We're looking at like thousands. SPEAKER_14: Like some giant facility in Nevada filled with nothing but humans looking at Christmas born. We were actually joking about this but we found out later there actually are thousands of people across the world who do this for Internet companies all day long which clearly warrants its own show. But for our purposes just know that when our photo is reported a human being has to look at it. SPEAKER_07: Exactly right because there needs to be a judgment on the image and humans are the best at that. SPEAKER_11: So Arturo decided before we do anything let's just figure out what we're dealing with. SPEAKER_07: And so we sat down with a team of people and we started going through the photos that people were reporting. SPEAKER_14: And what they found was that about 97 percent of these million or so photo reports were drastically mis-categorized. They were seeing moms holding little babies. SPEAKER_15: Reported for harassment. Pictures of families in matching Christmas sweaters. Reported for nudity. SPEAKER_07: Pictures of puppies reported for hate speech. SPEAKER_14: Puppies reported as hate speech? SPEAKER_07: Yes. SPEAKER_15: And we're like what's going on right? So they decide let's investigate. SPEAKER_11: OK so step one for Facebook. Just ask a few of these people. SPEAKER_13: Why don't you like this photo? Why did you report this? SPEAKER_14: Responses come back and the first thing they realize is that almost always the person complaining about the image was in the image they were complaining about. And they just hate the picture. Maybe they were doing a goofy dance someone snapped a photo and they're like why did you post that? Take it down. SPEAKER_11: Maybe they were at a party. They got a little too drunk. They hooked up with their ex. Everybody took a picture and that person says oh you know that's a one time thing that's never happening again. Take it down. SPEAKER_14: Arturo said there were definitely a lot of reports from people who used to be couples. And then they broke up and then they're asking to take the photos down. SPEAKER_07: And the puppy? SPEAKER_14: What would what would be the reason for that? SPEAKER_15: Oh because it was maybe a shared puppy. You know maybe it's your ex-wife's puppy. SPEAKER_14: You see it makes you sad. Take it down. SPEAKER_07: So once we've begun investigating you find that there's all of this relationship things that happen that are like really complicated. 
SPEAKER_11: You're talking about stuff that's the kind of natural detritus of human dramas. SPEAKER_14: And the only reason that the person reporting it flagged it as like hate speech is because that was one of the only options. They were just picking because they needed to get to the next screen to submit the report. SPEAKER_07: So we added a step. SPEAKER_13: Arturo and his team set it up so that when people were choosing that option. SPEAKER_13: I want this photo to be removed from Facebook. Some of them would see a little box on the screen that said how does the photo make you feel. And the box gave several choices. The options were embarrassing, sadening, bad photo. SPEAKER_07: And then we always put in an other where you could write in whatever you wanted about the SPEAKER_11: image. SPEAKER_07: And it worked incredibly well. I mean like 50 percent of people would select an emotion like for instance embarrassing. And then 34 percent of people would select other. And we read those we sit down and we're reading the other. And what was the most frequent thing that people were typing into other. It was it's embarrassing. SPEAKER_14: It's embarrassing but you had embarrassing on the list. I know. That's weird. I know. SPEAKER_14: And it's it's. Arturo was like OK. Maybe we should just put it's in front of the choices. SPEAKER_11: As in please describe this piece of content. SPEAKER_13: It's embarrassing. It's a bad photo of me. It makes me sad. Etc. And when they wrote out the choices that way with that extra word we went from 50 percent SPEAKER_07: of people selecting an emotion to 78 percent people selecting an emotion. In other words the word it's all by itself boosted the response by 28 percent from 50 SPEAKER_14: to 78. And in Facebook land that means thousands and thousands of people. I mean just to slow down for a second I'm trying to think of what could it what could SPEAKER_08: that be. It's. Do people like full sentences or. SPEAKER_14: Just thinking it's always good to mirror the way people talk. Right. Arturo's idea though which I find kind of interesting is that when you just say embarrassing and there's no subject it's silently implied that you are embarrassing. But if you say it's embarrassing well then that shifts the sort of emotional energy to this photograph thing. And so then it's less hot and it's easier to deal with. SPEAKER_08: Oh how interesting. That thing is embarrassing. I'm fine. It's embarrassing. It is responsible. Not me. Good for Arturo. That's interesting. It's a subtle thought. SPEAKER_14: It's very subtle. But it still doesn't solve their basic problem because even if Facebook now knows why the person flagged the photo that it was embarrassing and not actually hate speech they still can't take it down. SPEAKER_11: I mean there's nothing in the policy the terms of service that says you can't put up embarrassing photos. SPEAKER_14: And in fact if they took it down they'd be violating the rights of the person who posted it. SPEAKER_15: Like there's nothing we can do I'm sorry. SPEAKER_08: So they'd actually fence themselves in a little bit. Yeah. I mean I'd always put in another. I would just be like go deal with it yourself. SPEAKER_14: That's what I would say. Talk to the person. No honestly that's the solution. He wouldn't put it that way but I what he needed to have happen was for the person who posted the picture and the person who was pissed about it. To talk to each other. To work it out themselves. 
SPEAKER_14: So Arturo and his team made a tweak where if you said this photo was embarrassing or whatever a new screen would pop up and it would ask. SPEAKER_15: Do you want your friend to take the photo down? And if you said yes I would like my stupid friend to take the photo down. SPEAKER_13: We put up an empty message box. Just an empty box that said we think it's a good idea for you to tell the person who SPEAKER_11: upset you that they upset you. SPEAKER_07: And only 20 percent of people would type something in and send that message. They just didn't do it. SPEAKER_11: They just said I'd rather you deal with this. SPEAKER_14: So Arturo and his team were like OK let's take it one step further. When that message box popped up we gave people a default message that we crafted. SPEAKER_13: To start that conversation. Just get the conversation going. SPEAKER_07: And it's kind of funny the first version of the message that we did was like hey I didn't like this photo. SPEAKER_08: Take it down. Hey I don't like that photo that's a little aggressive. SPEAKER_14: It is. But when they started presenting people with a message box with that sentence prewritten in. Almost immediately. SPEAKER_07: We went from 20 percent of people sending a message to 50 percent of people sending a message. Really? It's surprising to all of us. We weren't expecting to see that big of a shift. SPEAKER_08: So this means that people just don't want to write. They'll sign up for pretty much anything. No. Not necessarily. SPEAKER_14: No. Maybe it's just that it's so easy to shirk the responsibility of confronting another person that you need every little stupid nudge you can get. I see. OK. That's how I see it. So they put out this prewritten message. It seems to really have an effect. So they're like OK. If that worked so well why don't we try some different wordings. Instead of Hey I didn't like this photo. SPEAKER_11: Take it down. Why don't we try Hey Robert I didn't like this photo. Take it down. Just putting in your name works about seven percent better than leaving it out. Meaning what? It means that you're seven percent more likely either to get the person to do what you asked them to do. Take down the photo. Or to start a conversation about how to resolve your feelings about it. Oh we're now measuring the effectiveness of the message. SPEAKER_08: If I'm objecting will the other party pull it off the computer. SPEAKER_14: Pull it off or just talk to you about it. OK. They also tried variations like Hey Robert would you please take it down. Throwing in the word please. Or would you mind taking it down. SPEAKER_07: And it turns out that would you please performs four percent better than would you mind. They're not totally sure why. SPEAKER_14: But they tried dozens of phrases like would you please mind, would you mind, I'm sorry to bring this up but would you please take it down, I'm sorry to bring this up but would you mind taking it down and at a certain point. SPEAKER_14: Andrew and I got. We're here to see Arturo. SPEAKER_14: We just want to see this whole process they're going through up close. So we took a trip out to Facebook headquarters Menlo Park California. This is about a year ago. Had you been here before? No I have not. So it's before the hubbub. We met up with Arturo who sort of walked through campus. Yeah that's like the hammock. SPEAKER_07: It's one of these sort of like socialist utopic Silicon Valley campuses where people are like SPEAKER_14: in hammocks and there's volleyball happening. 
We actually have baby foxes here. They had foxes running around at one point. So we were there on a Friday because every Friday afternoon Arturo assembles this really big group. To review all the data you got about 15 people crammed into a conference room like technical folks. SPEAKER_15: Mustaba software engineer trust engineering at Facebook. Dan Farrell I'm a data scientist. Paul I'm also an engineer. SPEAKER_18: A lot of these guys called themselves trust engineers. SPEAKER_14: And every Friday the trust engineers are joined by a bunch of outside scientists. Dakar Keltner professor of psychology UC Berkeley. SPEAKER_14: Matt Gillingworth I studied the causes and nature of human happiness. Miliana Simon Thomas and my background is neuroscience. SPEAKER_14: This is the meeting where the team was reviewing all the data about these phrases. And so everybody was looking at a giant graph projected on the wall. It's kind of supporting your slightly U-shaped curve there in that especially in the deletion SPEAKER_09: numbers the hey I don't like this photo take it down and the hey I don't like this photo would you please take it down are kind of the winners here. It's kind of interesting that you see the person that's receiving a more direct message SPEAKER_03: is higher 11 percent versus 4 percent. SPEAKER_14: One of the things they notice is that anytime they use the word sorry in a phrase like hey Robert sorry to bring this up but would you please take it down. SPEAKER_07: Dan Farrell Turns out the I'm sorry doesn't actually help it makes the numbers go down. Dan Farrell Really? SPEAKER_03: Miliana Simon Seven and nine are the low some of the low points and those are the ones that say sorry. Dan Farrell So like just don't apologize just don't apologize SPEAKER_14: because like it shifts the responsibility back to you I guess. Dan Farrell No it doesn't it's just it's just it's gentler. Dan Farrell No I mean it's like it's a it's a linguistic psychology subtle thing. Miliana Simon You're making that up. Dan Farrell I am kind of but one of the things that really struck me at this meeting on a different subject is that the scientists in the room as they were looking at the graph taken in the numbers a lot of them had this look on their face of like holy. Amelia Simon Thomas I'm just stunned and humbled at the numbers SPEAKER_20: that we generally get in these studies. Dan Farrell That's Amelia Simon Thomas from Berkeley. SPEAKER_14: Amelia Simon Thomas My background is in neuroscience and I'm SPEAKER_20: used to studies where we look at 20 people and that's sufficient to say something general about how brains work. Like in general at Facebook like people would scoff at sample sizes that small. SPEAKER_10: Dan Farrell That's Rob Boyle who's a project manager at SPEAKER_10: Facebook. Rob Boyle The magnitudes that we're used to working with are in the hundreds of thousands to millions. Dan Farrell It's kind of an interesting moment because SPEAKER_14: there's been a lot of criticism recently especially in social science about the sample sizes how they're too small and how there's they're too often filled with white undergraduate college kids and how can you generalize from that. So you could tell that some of the scientists in the room like for example Dacher Keltner he's a psychologist at UC Berkeley like oh my God look at what we can do now we can get all these different people. Rob Boyle Of different class backgrounds different countries. 
Dan Farrell To him this kind of work with Facebook this SPEAKER_14: could be the future of social science right here. Rob Boyle There has never been a human community like SPEAKER_10: this in human history. SPEAKER_14: Dan Farrell It's somewhere in the middle of all the excitement about the data and the speed at which they can now test things. SPEAKER_10: Dan Keltner The bottleneck is no longer how fast we can test how things work it's coming up with the right things to test. SPEAKER_14: Rob Boyle Andrew threw out a question. SPEAKER_11: Andrew SPEAKER_21: Keltner That kind of blew me back a little bit. SPEAKER_11: I was like I've been a research subject and I had no idea. SPEAKER_14: Lulu Haines Coming up everybody gets the idea and the lab rats revolt. Stay with us. Lulu Haines Lulu here. SPEAKER_04: If you ever heard the classic Radiolab episode Sometimes Behave So Strangely you know that speech can suddenly leap into music and really how strange and magic sound itself can be. We at Radiolab take sound seriously and use it to make our journalism as impactful as it can be and we need your help to keep doing it. The best way to support us is to join our membership program the lab this month all new members will get a t shirt that says sometimes behave so strangely to check out the t shirt and support the show go to radiolab.org slash join WNYC studios is supported by Carvana SPEAKER_05: introducing Carvana value tracker where you can track your car's value over time and learn what's driving it. It might make you excited. Whoa didn't know my car was valued this high. It might make you nervous. SPEAKER_10: Oh markets flooded. My car's value just dipped 2.3 percent. It might make you optimistic. SPEAKER_05: Our low mileage is paying off our values up and it might make you realistic car prices SPEAKER_10: haven't gone up in a couple weeks. SPEAKER_05: Maybe it's time to sell but it will definitely make you an expert on your car's value Carvana value tracker visit Carvana dot com to start tracking your car's value today. SPEAKER_04: Radiolab is supported by Capital One with no fees or minimums banking with Capital One is the easiest decision in the history of decisions even easier than deciding to listen to another episode of your favorite podcast and with no overdraft fees. Is it even a decision that's banking reimagined. What's in your wallet. Terms apply. Visit Capital One dot com slash bank Capital One and a member FDIC. SPEAKER_02: After her emails became shorthand in 2016 for the media's deep focus on Hillary Clinton's server hygiene at the expense of policy issues is history repeating itself. SPEAKER_22: You can almost see an equation again I would say led by the times in Biden being old with Donald Trump being under dozens of felony indictments. SPEAKER_02: Listen to on the media from WNYC find on the media wherever you get your podcasts. SPEAKER_14: This is Radiolab and we'll pick up the story with Andrew Zali and I sitting in a meeting at Facebook headquarters. This is about a year and a half ago we had just learned that at any given moment any given Facebook user is part of 10 experiments at once without really their knowledge and sitting there in that meeting. You know this was a while ago we both were like did we just hear that correctly. SPEAKER_11: That kind of blew me back a little bit. I was like I've been a research subject and I had no idea and I had that moment of discovery on a Friday and literally the next day Saturday. This is scary. 
The world had that experience Facebook using you and me as lab rats for a Facebook experiment on emotions. SPEAKER_14: Barely a day after we'd gotten off the plane from Facebook headquarters the kerfuffle occurred. SPEAKER_08: Facebook exposed for using us as lab rats as lab rats lab rats. SPEAKER_14: Facebook messing with your emotions. You may remember the story because for a hot second it was everywhere. Facebook altered the amount of it was all over Facebook. There was an academic paper had come out that showed that with some scientists the company had intentionally manipulated user news feeds to study a person's emotional response. SPEAKER_10: Seriously they wanted to see how emotions spread on social media. SPEAKER_14: They basically tinkered with the news feeds of about 700,000 people. SPEAKER_01: 700,000 users to test how they'd react if they saw more positive versus negative posts and vice versa. SPEAKER_14: And they found an effect that when people saw more positive stuff in their news feeds they would post more positive things themselves and vice versa. It was a tiny effect. Tiny effect. But the results weren't really the story. The real story was that Facebook was messing with us. It gives you pause and scares me when you think that they were just doing an experiment SPEAKER_16: to manipulate how people were feeling and how they then reacted on Facebook. SPEAKER_11: People went apoplectic. It has this big brother element to it that I think people are going to be very uncomfortable SPEAKER_01: with. SPEAKER_11: And some people went so far as to argue. SPEAKER_00: I wonder if Facebook killed anyone with their emotional manipulation stunt. SPEAKER_11: If a person had a psychological or psychiatric disorder, manipulating their social world could cause them real harm. SPEAKER_01: Make sure you read those terms and conditions my friends. SPEAKER_06: Always. That's the big takeaway. What you hear is a sense of betrayal. That I really wasn't aware that this space of mine was being treated in these ways and that I was part of your psychological experimentation. That's Kate Crawford. I'm a principal researcher at Microsoft Research. SPEAKER_14: Visiting professor at MIT, strong critic of Facebook throughout the kerfuffle. SPEAKER_06: There is a power imbalance at work. I think when we look at the way that that experiment was done, it's an example of highly centralized power and highly opaque power at work. And I don't want to see us in a situation where we just have to blindly trust that platforms are looking out for us. Here I'm thinking of an earlier Facebook study actually back in 2010 where they did a study looking at whether they could increase voter turnout. They had this quite simple design. They came up with a little box that would pop up and show you where your nearest voting booth was. And then they said, oh, well, in addition to that, when you voted, here's a button you can press that says I voted. And then you'll also see the pictures of six of your friends who'd also voted that day. Would this change the number of people who went out to vote that day? SPEAKER_14: And Facebook found that it did. That if you saw a bunch of pictures of your friends who had voted and you saw those pictures on election day, you were then 2% more likely to click the I voted button yourself, presumably because you too had gone out and voted. 
Now 2% might not sound like a lot, but it was not insignificant again, I think, by the SPEAKER_06: order of 340,000 votes, the votes that they estimate they actually shifted by getting people to go out. Really? SPEAKER_14: So these are people who wouldn't have voted and did? SPEAKER_06: Who wouldn't have voted and who they have said in their own paper and published paper that they increased the number of votes that day by 340,000. SPEAKER_14: Simply by saying that your neighbors did it too? SPEAKER_06: Yeah, by your friends. SPEAKER_14: Now my first reaction to this, I must admit, was okay, I mean, we're at historic lows when it comes to voter turnout. This sounds like a good thing. Yes. SPEAKER_06: But what happens if someone's running a platform that a lot of people are on and they say, hey, you know, I'm really interested in this candidate. This candidate is going to look out not just for my interests, but the interests of the technology sector. And I think they're, you know, they're a great candidate. Why don't we just show that get out to vote message and that little system design that we have to the people who clearly, because we already have their political preferences, the ones who kind of agree with us and the people who disagree with that candidate, they won't get those little nudges. Now that is a profound democratic power that you have. SPEAKER_14: Kate's basic position is that when it comes to social engineering, which is what this is, companies and the people that use them need to be really, really careful. In fact, when Andrew mentioned to her that Arturo had this group and the group had a name. SPEAKER_11: He actually runs a group called the Trust Engineering Group. His job is to engineer trust. When Andrew told her that, Facebook users, you're smacking your forehead. SPEAKER_15: I think we call that a face palm. SPEAKER_14: She face palm really hard. SPEAKER_06: These ideas that we could somehow engineer compassion, I think to some degree have a kind of hubris in them. SPEAKER_06: Who are we to decide whether we can make somebody more compassionate or not? SPEAKER_14: Couple months after our first interview, we spoke to Arturo Bejar again. At this point, the kerfuffle was dying down. We asked him about all the uproar. I know this is not your work. This is the emotional contagion stuff. But literally like hours after we got back from that meeting, that thing erupted. Do you understand the backlash? SPEAKER_07: No, I mean, I think that, I mean, we really care about the people who use Facebook. I don't think that there's such a thing as, I mean, if anything I've learned in this work is that you really have to respect people's response and emotions, no matter what they SPEAKER_15: are. SPEAKER_14: He says the whole thing definitely made them take stock. SPEAKER_07: There was a moment of concern of what it would mean to the work. And there was like this, is this going to mean that we can't do this? Part of me being honest coming here is I actually want to reclaim back the word emotion and reclaim back the ability to do very thoughtful and careful experiments. I want to come back to the word experiment. You want to reclaim it from what? SPEAKER_07: Well, suddenly like the word emotion and the word experiment, all these things became really charged. SPEAKER_14: Well, yeah, because people thought that Facebook was manipulating emotion and they were like, how could they? Yes, but in our case, right? 
SPEAKER_07: And in the work that we're talking about right now, all of the work that we do begins with a person asking us for help. SPEAKER_14: This was Arturo's most emphatic point. He said it over and over that, you know, Facebook isn't just doing this for fun. People are asking for help. They need help. Which points to one of the biggest challenges of living online, which is that, you know, offline, you know, when we try and engineer trust offline, or at least just read one another, we do it in these super subtle ways using eye contact and facial expressions and posture and tone of voice, all this nonverbal stuff. And of course, when we go online, we don't have access to any of that. SPEAKER_07: In the absence of that feedback, how do we communicate? What does communication turn into? I mean, I think about like what it means to be in the presence of a friend or a loved one and how do you build experiences that facilitate that when you cannot be physically together? SPEAKER_14: Arturo says that's really all he's up to. He's just trying to nudge people a tiny bit so that their online selves are a little bit closer to how they are offline. And I got to say, if he can do that by engineering a couple of phrases like, hey, Robert, would you mind, et cetera, et cetera, well, then I'm all for it. SPEAKER_08: Why not take the position that to create a company that stands between two people who are interacting and then giving them boxes and statuses and advertising and so forth. This is not doing a service. This is a way to wedge yourself into the ordinary business of social intercourse and make money on it. So you're acting like this group of people now is going to try to create the moral equivalent of an actual conversation? First of all, it's probably not engineerable. And second of all, I don't believe that for a moment. All I'm thinking is they're going to just go and figure out other ways in which to make a revenue enhancer. No, I don't think it's one or the other. SPEAKER_14: I think they're in it for the money. In fact, if they can figure this out and make the Internet universe more conducive to trust, less annoying, it could mean trillions of dollars. So yeah, it's the money. But still, that doesn't negate the fact that we have to build these systems, right? That we have to make the Internet a little bit better. SPEAKER_08: That's fine. This idea, however, that you're going to have to coach people into the subtleties of the relationship. Tell them you're sorry. Tell them, just, you know, here's the formula for this. He doesn't want, he did something. You need to repair that, here are the seven ways you might repair that. To do all that, it's as if the Hallmark Card Company, instead of living only on Mother's Day, Father's Day, and birthdays, just spread its evil wings out into the whole rest of your life. And I don't think that's a wonderful thing. SPEAKER_14: I think, you know, I have a slight opinion of it. I mean, you got to keep in mind how this thing came about. I mean, they tried to get people to talk to each other. They gave them the blank text box, but nobody used it, right? So they're like, okay, let's come up with some stock phrases that, yes, are generic. But think about the next step. After you send the message saying, you know, Jad, I don't like the photo, please take it down, presumably then you and I get into a conversation. Maybe I explain myself, I say, oh my God, I'm so sorry, I didn't realize that you didn't like that photo. 
I just thought that that was an amazing night. I just thought that was a great night. I didn't realize you thought you looked so sorry. I'll take it down. It's cool. It's cool. See, now, presumably we're having that conversation as a next step. Why do you presume that? SPEAKER_08: How many of the birthday cards that you've sent to first cousins have resulted in a conversation? Maybe not. See, that's the thing. Sometimes these things are actually not, they're really the opposite of what you're saying. They're conversation substitutes. SPEAKER_14: Maybe. Maybe they're conversation starters. SPEAKER_08: Maybe that's the deep experiment. SPEAKER_14: Are they conversation starters or substitutes? Well, I hope they're conversation starters. Yeah. Because maybe that would be a beginning. SPEAKER_11: It kind of, in my mind, goes back to the beginning of the automobile age. SPEAKER_14: This is how Andrew puts it. SPEAKER_11: There was a time when automobiles were new. And, you know, they didn't have turn signals. The tools they did have, like the horn, didn't necessarily indicate all the things that we use it to indicate. It wasn't clear what the horn was actually there to do. Was it there to say hello or is it there to say get out of the way? And over time, we created norms. We created roads with lanes. We created turn signals that are primarily there for other people so that we can coexist in this great flow without crashing into each other. And we still have road rage. And we still have road rage. We still have places where those tools are incomplete. SPEAKER_14: Thanks to Andrew Zolle. Many, many, many, many thanks. Yes, definitely. For bringing us that story and for reporting it with me for so long. And to Arturo, who you kept bringing back into the studio. SPEAKER_14: Yes, thank you very much to Arturo and the whole team over there. And by the way, they have changed their name. It's no longer Trust Engineering. It is the Facebook Protect and Care team. Really? Yeah. We had some original music this hour from Mooninite, thanks to them. Props to Andy Mills for production support. And also, Andrew Zolle put together a blog post. If you go to radiolab.org, you can see it, which covers some really interesting research that we didn't get a chance to talk about. And if you've ever sent an email with a little smiley face, you're definitely going to want to read this. radiolab.org. I'm Chad Abumrad. I'm Robert Krolwich. Thanks for listening. SPEAKER_18: Radiolab was created by Chad Abumrad and is edited by Soren Wheeler. Lulu Miller and Latif Nasser are our co-hosts. Lillian Keefe is our director of sound design. Our staff includes Simon Adler, Jeremy Blum, Beckham Ressler, Rachel Cusick, Makedi Foster-Keyes, W. Harry Fortuna, David Gable, Maria Pascu-Tieres, Sindhu Nyanasanbandhan, Matt Kielty, Annie McEwen, Alex Neeson, Sara Khare, Ana Rasquette-Bass, Sarah Sandback, Ariane Wax, Pat Walters, and Molly Webster, with help from Andrew Vinales. Our fact checkers are Diane Kelly, Emily Krieger, and Natalie Middleton. SPEAKER_17: Hi, this is Ellie from Cleveland, Ohio. Leadership support for Radiolab science programming is provided by the Gordon and Betty Moore Foundation, Science Sandbox, Assignments Foundation Initiative, and the John Templeton Foundation. Foundational support for Radiolab was provided by the Alfred P. Sloan Foundation. SPEAKER_04: Radiolab is supported by Capital One. With no fees or minimums, banking with Capital One is the easiest decision in the history of decisions. 
Even easier than deciding to listen to another episode of your favorite podcast. And with no overdraft fees, is it even a decision? That's banking reimagined. What's in your wallet? Terms apply. See CapitalOne.com slash bank. Capital One NA, member FDIC.