- New details have emerged of the secretive terms Facebook is asking developers to sign before they can run apps on the social network.
- The new terms are designed to help Facebook police what data developers use and to prevent a second Cambridge Analytica.
- Facebook is trying to be more transparent, but it isn’t publishing these new developer terms openly.
- This makes it hard to see how Facebook might prevent further abuse of its platform.
Facebook has talked about restricting developer access to its platform in the wake of Cambridge Analytica, but we haven’t heard much about how exactly the company might actually prevent a repeat of the scandal.
Earlier this year, Facebook CTO Mike Schroepfer said developers would need permission before running certain types of app on the platform, particularly if they needed access to certain types of data, such as data from Facebook Pages. Schroepfer didn’t go into further detail about what that permission might look like.
Now Facebook has introduced secretive “supplemental terms” that go far beyond what developers previously agreed to when it came to building apps, a source who has seen the rules tells Business Insider. These terms are in addition to the public policies on Facebook’s developer site, but they are not publicly available and developers who sign them are under a strict non-disclosure agreement.
The new terms are designed to help the social network police what data developers are using, and prevent the kind of abuse that led to the Cambridge Analytica scandal.
Specifically, Business Insider’s source says the policies give Facebook explicit permission to audit what information developers are using, their processes, and controls. They also state that the developer’s terms can’t contradict Facebook’s own policies, and require that an app’s data security safeguards are up to scratch.
Facebook did not comment on the terms, but pointed to a short explanation of the supplemental terms on its developer website.
These updates are significant because of the way the Cambridge Analytica scandal turned on one academic, Aleksandr Kogan, breaking Facebook’s terms of service by building an app that harvested and sold millions of social network profiles.
Kogan told British politicians that his app explicitly stated that it would “transfer and sell data”, and said Facebook didn’t notice this contradicted its terms. And according to former Facebook employee Sandy Parakilas, Facebook didn’t have the safeguards in place to prevent developers from siphoning off data.
The new terms look like an attempt to close those loopholes.
Facebook is generally on a push to be more transparent after the Cambridge Analytica scandal, but the secrecy around the supplemental terms makes it hard to see how the firm will really audit developers’ use of people’s data.
Chris Vickery, a security expert with UpGuard, is sceptical that the new terms will give much protection.
He told Business Insider: “If they do pick up someone that is suspicious, how are they going to audit their systems? Someone can take the data they’ve vacuumed up, stick it on a USB stick, and then put it under their couch… There’s no way Facebook can be sure.
“And if Facebook does find evidence of data that shouldn’t have been collected, how will they guarantee it’s been deleted? Maybe [the developer] copied and shared it. Facebook has no control over that whatsoever.”