The Cambridge Analytica Angle, Explained

Ken Ashford | Election 2016, Science & Technology

The Cambridge Analytica scandal is suddenly a major problem for Facebook.

On Tuesday, the Federal Trade Commission opened an investigation into how Cambridge Analytica, ostensibly a voter-profiling company, accessed data about 50 million Facebook users, according to The Wall Street Journal. It’s not alone: The GOP-controlled Senate Commerce Committee demanded answers from Facebook on Monday, as did Senator Ron Wyden, a Democrat from Oregon.

The social giant’s stock has also lost about 12 percent of its value since The New York Times and The Guardian broke the first stories about the scandal over the weekend.

The scandal sure seems like bad news. But if you’re a little fuzzy on the details, here is the heart of the scandal as it involves Facebook, in one paragraph:

In June 2014, a researcher named Aleksandr Kogan developed a personality-quiz app for Facebook. It was heavily influenced by a similar personality-quiz app made by the Psychometrics Centre, a Cambridge University laboratory where Kogan worked. About 270,000 people installed Kogan’s app on their Facebook accounts. But like any Facebook developer at the time, Kogan could access data about those users and their friends. And when Kogan’s app asked for that data, it saved that information into a private database instead of immediately deleting it. Kogan then provided that private database, containing information about 50 million Facebook users, to the voter-profiling company Cambridge Analytica, which used it to build 30 million “psychographic” profiles of voters.

That’s the whole thing. The Guardian referred to the data misuse as a “breach,” a description which Facebook contests. “No systems were infiltrated, no passwords or information were stolen or hacked,” tweeted one Facebook executive. But it’s not hard to see why U.S. and U.K. lawmakers remain interested in the episode: It’s almost like Facebook was a local public library lending out massive hard drives of music, but warning people not to copy any of it to their home computer. When someone eventually did copy all that music—and got in trouble for it—isn’t the hard-drive-dispensing public library responsible as well?

There is a second part of the scandal, but it concerns Cambridge Analytica and its connection to President Trump’s political world. So, here’s a summary in two paragraphs:

Cambridge Analytica has significant ties to some of President Trump’s most prominent supporters and advisers. Rebekah Mercer, a Republican donor and a co-owner of Breitbart News, sits on the board of Cambridge Analytica. Her father, Robert Mercer, invested $15 million in Cambridge Analytica on the recommendation of his political adviser, Steve Bannon, according to the Times. On Monday, hidden-camera footage appeared to show Alexander Nix, Cambridge Analytica’s CEO, offering to bribe and blackmail public officials around the world. If Nix did so, it would violate U.K. law. Cambridge Analytica suspended Nix on Tuesday.

Cambridge Analytica also used its “psychographic” tools to make targeted online ad buys for the Brexit “Leave” campaign, the 2016 presidential campaign of Ted Cruz, and the 2016 Trump campaign. If any British Cambridge Analytica employees without a green card worked on those two U.S. campaigns, they did so in violation of federal law. And if information or data was passed on to Russians (the aforementioned Aleksandr Kogan had previously unreported ties to St. Petersburg University, and Cambridge Analytica had given a presentation to the Russian energy firm Lukoil, which is now on the U.S. sanctions list), then you have possible “collusion.”

But there’s still much we don’t know about Cambridge Analytica. Did its “psychographic” tools, built with the misused Facebook data, actually work? Did various hard-right campaigns consider Cambridge Analytica so important because its technology reshaped U.S. and U.K. politics—or because using it ingratiated campaigns with Robert and Rebekah Mercer, two of the richest people in the world? And if Cambridge Analytica really was a voter-profiling company, what was its chief executive doing apparently promising to bribe and blackmail public officials?

Questions remain about Facebook’s role, too. Since the 2016 elections, public ire has focused on the company’s powerful News Feed and the role it played in amplifying Russian propaganda and other hoaxes. Lawmakers have also criticized the company’s lax sale of political advertisements to purchasers literally paying with Russian rubles. Political ads are not regulated as closely online as they are on TV or radio.

But the Cambridge Analytica scandal opens a new front for the company. Before Facebook became a distributor of news, it was a platform for online applications, such as personality quizzes and social games like FarmVille. Facebook has allowed third-party app developers to access some private user data since May 2007, when it first opened the Facebook platform. Users must consent to giving apps their data, but sometimes—as in the case of Kogan’s app—developers could access data about a consenting user’s friends without getting those friends’ consent.
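To make that concrete, here is a minimal sketch, in Python, of the kind of harvesting the pre-2015 platform permitted. It is an illustration under assumptions, not Facebook’s documented API or Kogan’s actual code: the v1.0 base URL, the id/name/likes field list, the local SQLite cache, and the cache_friend_data helper are stand-ins for whatever the real app requested and stored.

```python
# Illustrative sketch only: endpoint paths, permission behavior, and field
# names approximate the pre-2015 Graph API era and are assumptions for
# explanation, not a reproduction of Facebook's current API or Kogan's code.
import sqlite3

import requests

GRAPH = "https://graph.facebook.com/v1.0"  # v1.0-era base URL (long since retired)


def cache_friend_data(user_token: str, db_path: str = "profiles.db") -> None:
    """Fetch one consenting user's friend list and cache friends' fields locally."""
    db = sqlite3.connect(db_path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS friends (id TEXT PRIMARY KEY, name TEXT, likes TEXT)"
    )

    # One consenting quiz-taker exposes their whole friend list to the app...
    friends = requests.get(
        f"{GRAPH}/me/friends", params={"access_token": user_token}
    ).json().get("data", [])

    for friend in friends:
        # ...and, under the old friends_* permission scopes, some profile fields
        # of each friend could be read without that friend ever seeing a prompt.
        detail = requests.get(
            f"{GRAPH}/{friend['id']}",
            params={"access_token": user_token, "fields": "id,name,likes"},
        ).json()
        db.execute(
            "INSERT OR REPLACE INTO friends VALUES (?, ?, ?)",
            (detail.get("id"), detail.get("name"), str(detail.get("likes"))),
        )

    db.commit()
    db.close()
```

The point of the sketch is the asymmetry described above: only the quiz-taker’s access token appears anywhere, yet rows about people who never installed the app end up in a private database outside Facebook’s control, exactly the kind of cache Kogan is said to have handed to Cambridge Analytica.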

In the decade since, Facebook has occasionally tweaked how much data apps can access. But over that time, how many developers abided by Facebook’s rules? How many followed Kogan’s route, caching the data and making their own private databases? Where is that information now? And if all that private user data is as powerful as Cambridge Analytica once said it was, what has it been used to do?