Facebook, Inc. will pay a record-breaking $5 billion penalty and submit to new restrictions and a modified corporate structure that will hold the company accountable for the decisions it makes about its users’ privacy, to settle Federal Trade Commission charges that the company violated a 2012 FTC order by deceiving users about their ability to control the privacy of their personal information.
- Facebook must exercise greater oversight over third-party apps, including by terminating app developers that fail to certify that they are in compliance with Facebook’s platform policies or fail to justify their need for specific user data;
- Facebook is prohibited from using telephone numbers obtained to enable a security feature (e.g., two-factor authentication) for advertising;
- Facebook must provide clear and conspicuous notice of its use of facial recognition technology, and obtain affirmative express user consent prior to any use that materially exceeds its prior disclosures to users;
- Facebook must establish, implement, and maintain a comprehensive data security program;
- Facebook must encrypt user passwords and regularly scan to detect whether any passwords are stored in plaintext; and
- Facebook is prohibited from asking for email passwords to other services when consumers sign up for its services.
NOTE: The Commission files a complaint when it has “reason to believe” that the named defendants are violating or are about to violate the law and it appears to the Commission that a proceeding is in the public interest.
Alleged Violations of 2012 Order
The settlement stems from alleged violations of the FTC’s 2012 settlement order with Facebook. Among other things, the 2012 order prohibited Facebook from making misrepresentations about the privacy or security of consumers’ personal information, and the extent to which it shares personal information, such as names and dates of birth, with third parties. It also required Facebook to maintain a reasonable privacy program that safeguards the privacy and confidentiality of user information.
The FTC alleges that Facebook violated the 2012 order by deceiving its users when the company shared the data of users’ Facebook friends with third-party app developers, even when those friends had set more restrictive privacy settings.
In May 2012, Facebook added a disclosure to its central “Privacy Settings” page that information shared with a user’s Facebook friends could also be shared with the apps used by those friends. The FTC alleges that four months after the 2012 order was finalized in August 2012, Facebook removed this disclosure from the central “Privacy Settings” page, even though it was still sharing data from an app user’s Facebook friends with third-party developers.
Additionally, Facebook launched various services such as “Privacy Shortcuts” in late 2012 and “Privacy Checkup” in 2014 that claimed to help users better manage their privacy settings. These services, however, allegedly failed to disclose that even when users chose the most restrictive sharing settings, Facebook could still share user information with the apps of the user’s Facebook friends—unless users also went to the “Apps Settings Page” and opted out of such sharing. The FTC alleges the company did not disclose anywhere on the Privacy Settings page or in the “About” section of the profile page that Facebook could still share information about an app user’s Facebook friends with third-party developers on the Facebook platform.
Facebook announced in April 2014 that it would stop allowing third-party developers to collect data about the friends of app users (“affected friend data”). Despite this promise, the company separately told developers that they could collect this data until April 2015 if they already had an existing app on the platform. The FTC alleges that Facebook waited until at least June 2018 to stop sharing user information with third-party apps used by their Facebook friends.
In addition, the complaint alleges that Facebook improperly policed app developers on its platform. The FTC alleges that, as a general practice, Facebook did not screen the developers or their apps before granting them access to vast amounts of user data. Instead, Facebook allegedly only required developers to agree to Facebook’s policies and terms when they registered their app with the Facebook Platform. The company claimed to rely on administering consequences for policy violations that subsequently came to its attention after developers had already received data about Facebook users. The complaint alleges, however, that Facebook did not enforce such policies consistently and often based enforcement of its policies on whether Facebook benefited financially from its arrangements with the developer, and that this practice violated the 2012 order’s requirement to maintain a reasonable privacy program.
The FTC also alleges that Facebook misrepresented users’ ability to control the use of facial recognition technology with their accounts. According to the complaint, Facebook’s data policy, updated in April 2018, was deceptive to tens of millions of users who had Facebook’s facial recognition setting, called “Tag Suggestions,” because that setting was turned on by default, while the updated data policy suggested that users would need to opt in to having facial recognition enabled for their accounts.
In addition to these alleged violations of the 2012 order, the FTC alleges that Facebook violated the FTC Act’s prohibition against deceptive practices when it told users it would collect their phone numbers to enable a security feature, but did not disclose that it also used those numbers for advertising purposes.
The Commission vote to refer the complaint and stipulated final order to the Department of Justice for filing was 3-2. The Department will file the complaint and stipulated final order in the U.S. District Court for the District of Columbia. Chairman Simons along with Commissioners Noah Joshua Phillips and Christine S. Wilson issued a statement on this matter. Commissioners Rohit Chopra and Rebecca Kelly Slaughter issued separate statements on this matter.