Facebook Data Scandal
By now you’ve probably heard that Facebook is scrambling to contain the fallout from its data scandal. Mark Zuckerberg finally addressed the story on Wednesday, but only after a Delete Facebook campaign had started to gain momentum around the world.
“So this was a major breach of trust, and I’m really sorry that this happened. We have a basic responsibility to protect people’s data, and if we can’t do that, then we don’t deserve to have the opportunity to serve people” – Mark Zuckerberg
So how did we get here, and how does Facebook recover? First, let’s figure out what happened.
So on March 16th, The New York Times and The Guardian reported that a data mining firm named Cambridge Analytica, which had worked on Donald Trump’s presidential campaign, had improperly obtained access to more than 50 million user profiles.
Experts believe the firm could have used that data to gain an unfair advantage in targeting voters. Mark Zuckerberg didn’t say anything about these stories for five days. And then on Wednesday, he wrote a Facebook post and gave an interview to CNN.
This scandal is extremely weird because we’ve known the basic details (that Cambridge Analytica got access to these profiles) for more than two years. It’s not clear that the data the firm obtained was really all that useful to it, and the figure of 50 million profiles might turn out to be marketing hype from Cambridge Analytica.
Despite all that, this is the biggest public relations crisis Facebook has faced since the aftermath of the 2016 election. Senators are calling on Zuckerberg to testify, the Federal Trade Commission and British authorities are both investigating, and Facebook’s stock price has been falling.
So let’s look at what happened. The story starts in 2014, when a University of Cambridge researcher named Aleksandr Kogan created an app called “thisisyourdigitallife”. About 270,000 people downloaded it and gave away their information, and Kogan, unbeknownst to them, passed that information along to a data mining and political strategy firm named Cambridge Analytica.
At the time, Facebook’s platform API let developers like Kogan access information about your friends as well as about you. Christopher Wylie, who used to work at Cambridge Analytica, told The Times and The Guardian that this was how his company was able to get access to the information of as many as 50 million people.
And the idea was that by analyzing your Facebook likes, the company could begin to understand your personality and then target political advertising at you more effectively. This kind of thing is known as psychographic profiling. Experts say it can be useful at the margins in persuading voters, but they also say its effect is easily overstated.
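To make that idea concrete, here is a minimal, purely illustrative sketch of like-based trait scoring. The page names, traits, and weights below are invented for illustration; they are not Cambridge Analytica’s actual data or model. In a real system, weights like these would be learned by correlating likes with answers to a personality quiz, such as the one in Kogan’s app.

```python
# Purely illustrative sketch of psychographic scoring from likes.
# The pages, traits, and weights are hypothetical examples only.

# Hypothetical weights: how much liking a page shifts a trait estimate.
TRAIT_WEIGHTS = {
    "Philosophy Quotes": {"openness": 0.75, "extraversion": 0.25},
    "Monster Trucks":    {"openness": -0.25, "extraversion": 0.5},
    "Book Club":         {"openness": 0.5, "extraversion": -0.25},
}

def score_traits(liked_pages):
    """Sum per-page weights into crude trait scores for one profile."""
    scores = {"openness": 0.0, "extraversion": 0.0}
    for page in liked_pages:
        for trait, weight in TRAIT_WEIGHTS.get(page, {}).items():
            scores[trait] += weight
    return scores

if __name__ == "__main__":
    profile_likes = ["Philosophy Quotes", "Book Club"]
    print(score_traits(profile_likes))
    # Prints {'openness': 1.25, 'extraversion': 0.0} -- a rough signal an
    # advertiser might use to decide which message to show this person.
```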
Granted, Trump’s campaign wasn’t the first to use Facebook to gather information about potential supporters. In 2012, President Obama’s reelection team created an app that did something very similar. But there was a big difference.
President Obama’s team told voters what it was doing. Cambridge Analytica obtained this information in total violation of Facebook’s rules and didn’t tell anyone taking Kogan’s personality quiz that their data would eventually be used to target political advertising. The Guardian revealed the scheme in 2015.
Facebook went to Kogan and Cambridge Analytica and demanded that they delete all of the data that they had obtained in violation of Facebook’s rules.
But according to the reports, Cambridge Analytica and Kogan never deleted the data, and Facebook never investigated to confirm that they had.
This gets to the heart of why some people are deleting their Facebook accounts right now. Facebook made it too easy for developers like Kogan to get access to their data and their friends’ data, and to share it with others.
And it never informed people that their data had been improperly used. The scandal also comes at a time when trust in Facebook has never been lower. In the aftermath of the 2016 election, we saw how obviously fake stories could spread faster and more persuasively than many true ones.
And we learned that Kremlin-linked groups had waged a highly effective misinformation campaign on the platform, in some cases even buying ads illegally.
Meanwhile, new research was coming out showing that just browsing the newsfeed can make you feel worse about yourself, and a group of former Facebook executives came forward to express regret about their part in building the social network in the first place.
After a long delay, Zuckerberg announced plans to address abuses like Cambridge Analytica’s.
Facebook had already stopped developers from gaining access to information about your friends back in 2014. Now it’s going a step further: if you go three months without using an app, Facebook will cut off that developer’s access to any information about you.
And for developers who did have access to all of that information back in 2014, including information about your friends, Facebook is going to demand that they submit to an audit or be kicked off the platform.
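As a rough sketch of how that inactivity rule could work: the 90-day figure below is just one reading of “three months,” and the function and variable names are assumptions for illustration, not Facebook’s actual implementation.

```python
from datetime import datetime, timedelta, timezone

# Assumption: "three months" is treated here as 90 days; Facebook has not
# published the exact details of its implementation.
INACTIVITY_CUTOFF = timedelta(days=90)

def should_revoke_access(last_app_use: datetime, now: datetime) -> bool:
    """Return True if the user hasn't used the app within the cutoff window,
    meaning the developer's access to that user's data would be cut off."""
    return (now - last_app_use) > INACTIVITY_CUTOFF

if __name__ == "__main__":
    now = datetime(2018, 3, 21, tzinfo=timezone.utc)
    last_use = datetime(2017, 11, 1, tzinfo=timezone.utc)
    print(should_revoke_access(last_use, now))  # True -> access is cut off
```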
So will this fix the problem? On one hand, it’s a start. Restricting developer access to your data could help Facebook start to rebuild some trust. But the larger issue for Zuckerberg is that he’s really confronting three crises at once. There’s the data privacy issue that the Cambridge Analytica story reveals.
There’s the newsfeed integrity issue and whether we can trust what we see on Facebook.
And there’s the broader cultural reckoning over social media, how we spend our time there, and whether it’s ultimately good for us and the world.
In January, Zuckerberg said that fixing Facebook’s platform would be his personal challenge for the year. And yet, as he wades into this latest crisis, five days after it began, it’s not clear Facebook is doing everything it can to address it.
And as a Delete Facebook campaign starts gaining steam around the world, the challenge of fixing Facebook’s platform feels greater than ever.