'Like' button

Episode Summary

Leah Perlman, a former Facebook employee, was on the team that invented the 'Like' button in 2007. It became hugely popular, increasing engagement on the platform. But when Facebook later changed its algorithm in a way that reduced engagement on Perlman's own content, she felt distressed and bought Facebook ads to drive more likes.

Researchers have found that 'likes' and notifications trigger dopamine responses in the brain, similar to slot machines. The like button massively increased engagement for Facebook and other platforms.

Michal Kosinski, a researcher at Cambridge, realized that personality traits and demographics could be predicted from people's likes. With enough likes, he could predict traits better than friends or partners could.

Facebook has full access to users' likes, which helps it tailor newsfeeds to maximize engagement and target ads. Some problematic ad targeting has occurred, such as excluding African Americans from housing ads.

Experts question how effective Facebook's ad targeting really is, but the platform does excel at maximizing attention and screen time. As social approval becomes addictive, emotional literacy and self-love may help manage compulsions.

Episode Show Notes

Facebook’s 'like' button is ubiquitous across the web. It’s how user data is collected, meaning adverts and newsfeeds can be targeted more effectively. Some say there’s nothing to worry about, but others point to the Cambridge Analytica scandal, suggesting how Facebook might shape our opinions. But is there something else we should be worried about? Approval from our friends and family can be addictive – so is the pursuit of “likes” on social media the reason we’re glued to our mobile phones? Tim Harford asks how we should manage our compulsions in this brave new online world.

Episode Transcript

SPEAKER_02: Amazing, fascinating stories of inventions, ideas and innovations. Yes, this is the podcast about the things that have helped to shape our lives. Podcasts from the BBC World Service are supported by advertising.

SPEAKER_01: 50 Things That Made the Modern Economy with Tim Harford. Leah Perlman draws comics, sharing her ideas on topics such as emotional literacy and self-love.

SPEAKER_00: When she started to post them on Facebook, she discovered that her friends found them healing and endearing. But then Facebook changed its algorithm, how it decides what to put in front of our eyeballs. If social media is a big part of your life, an algorithm change can come as a shock. You might suddenly find that your content is being shown to fewer people. And that's what happened to Leah. Her comics started to get fewer likes. She told an interviewer for Vice.com it felt like she wasn't getting enough oxygen. She could pour her heart and soul into a drawing, then watch as it racked up only 20 likes.

It's easy to empathise. Social approval can be addictive. And what's a Facebook like if not social approval distilled into its purest form? Researchers now liken our smartphones to slot machines. They trigger the same reward pathways in our brain. More likes, new notifications, even an old-fashioned email. We never know what we'll get when we pull a lever.

Faced with a sudden drop in her likes, Leah started to buy ads on Facebook. That is, she started to pay Facebook so more people would see her comics. She just wanted the attention, but she felt embarrassed admitting it. There's an irony behind Leah's embarrassment. Before she was a comic artist, Leah was a developer at Facebook. In July 2007, her team invented the Like button. It's now ubiquitous across the web, as content creators invite you to signal your approval to your Facebook friends. There are similar features everywhere from YouTube to Twitter. For the platforms, the benefit is obvious.
A single click is the simplest possible way to get users to engage, much easier than typing out a comment. But the idea wasn't immediately appreciated. Facebook CEO Mark Zuckerberg repeatedly knocked it back. And then there was the symbol. While a thumbs up means approval in most cultures, in some it has a rather cruder and less friendly meaning. Eventually, in February 2009, the Like button launched. Leah Perlman remembered how quickly it took off. Almost immediately, 50 comments would become 150 likes. More engagement, more status updates, more content. It all just worked.

Meanwhile, at Cambridge University, Michal Kosinski was doing a PhD in psychometrics, the study of measuring psychological profiles. A fellow student had written a Facebook app to test the big five personality traits: openness, conscientiousness, extraversion, agreeableness and neuroticism. Taking the test gave the researchers permission to access your Facebook profile with your age, gender, sexual orientation and so on. The test went viral. The data set swelled to millions of people, and the researchers could see every time those people had clicked like.

Kosinski realised he was sitting on a treasure trove of potential insights. It turned out, for example, that a slightly higher proportion of gay men than straight men liked the cosmetics brand MAC. That's only one data point. Kosinski couldn't tell if you're gay from a single like, but the more likes he saw, the more accurate guesses he could make at your sexual orientation, religious affiliation, political leanings and more. Kosinski concluded that if you liked 70 things, he'd know you better than your friends. After 300 likes, he knew you better than your partner.

Facebook has since restricted what data gets shared with app developers like Kosinski's colleague, but one organisation still gets to see all your likes and more besides: Facebook itself. What can Facebook do with its window into your soul? Two things.
First, it can tailor your news feed so you spend more time on the platform, whether that means showing you cat videos, inspirational memes, things that will outrage you about Donald Trump or things that will outrage you about Donald Trump's opponents. Second, it can help advertisers to target you. The better ads perform, the more money it makes.

Targeted adverts are nothing new. Long before the internet and social media, if you were opening a new bicycle shop in Springfield, say, you might have chosen to advertise in the Springfield Gazette or Cycling Weekly rather than the New York Times or Good Housekeeping. Of course, that still wasn't very efficient. Most Gazette readers wouldn't be cyclists and most subscribers to Cycling Weekly wouldn't live near Springfield, but it was the best you could do.

You could say that Facebook simply improves that process and it's nothing to worry about. If you ask it to show your ads only to Springfield residents who've liked content on cycling, who could object to that? This is the kind of example Facebook tends to cite when it defends the concept of relevant advertising.

But there are other possible uses which might make us feel more queasy. How about advertising a house for rent and not showing that advert to African-Americans? The investigative website ProPublica wondered if that would work. It did. Facebook said, oops, that shouldn't have happened. It was a technical failure. Or how about helping advertisers to reach self-proclaimed Jew haters? ProPublica showed that was possible too. Facebook said, oops, it wouldn't happen again. This kind of thing might worry us because not all advertisers are as benign as bicycle shops. You can also pay to spread political messages, which can be hard for users to contextualise or verify.
A firm called Cambridge Analytica claimed it had swung the 2016 election for Donald Trump, in part by harnessing the power of the like button to target individual voters, much to the horror of Michal Kosinski, the researcher who had first suggested what might be possible. In reality, Facebook's potential for mind control still seems to be reassuringly limited. Experts who've looked into Cambridge Analytica question how effective its methods really were. And for all the targeting, analysts report that the click-through rate on Facebook adverts still averages less than 1%.

Perhaps we should worry more about Facebook's undoubted proficiency at serving us more adverts by sucking in an inordinate amount of our attention, hooking us to our screens. How should we manage our compulsions in this brave new social media world? We might cultivate emotional literacy about how the algorithm affects us. And if social approval feels as vital as oxygen, maybe more self-love is the answer. If I see any good comics on the subject, I'll be sure to click like.