FTC Opens New Probe: Revisits Its Old Charges On Facebook Privacy Practices

Add one more to the host of legal woes Facebook now confronts amid news about the misuse of its member profiles in election politics: the Federal Trade Commission revealed Monday that it is investigating the social media giant’s privacy practices.

The FTC decision comes in the wake of recent reports about the ease with which political consulting firm Cambridge Analytica gained access to as many as 50 million Facebook user profiles by or before 2015. The data firm, which worked with Donald Trump’s presidential campaign in 2016, allegedly used its access to the Facebook profiles as it developed psychological tactics to influence American voters.

The federal consumer protection agency said in an announcement that it was spurred by news reports to re-examine Facebook, which was the subject of a sweeping FTC complaint in 2011. In 2012, that probe ended in a settlement that still governs Facebook’s privacy policies and operations. If the agency concludes that Facebook failed to live up to that agreement, it could impose fines of $40,000 per violation, according to CNBC.

Many Facebook users were apparently shocked to learn from recent news stories and company statements that for years the social media network had routinely revealed their personal information, and that of their friends, often without fully informed consent, to outside entities that met certain criteria. One of those given entrée was a researcher who shared his trove of millions of profiles with Cambridge Analytica. The researcher wasn’t authorized to do this, Facebook CEO Mark Zuckerberg said in a written statement last week. But he acknowledged that other outside organizations could have amassed similar banks of user data under the Facebook practices prevailing at that time. Zuckerberg said Facebook is now trying to find any remaining caches of that data and force its destruction by entities that might be misusing it.

Outraged users are now pushing the #deleteFacebook movement. But anyone who had read the FTC complaint filed against Facebook in 2011 would not have been surprised to learn about the mechanisms the company has used at times to open up member data to advertisers and to outside apps that users activate through their Facebook accounts.

In that complaint, the FTC’s Bureau of Consumer Protection accused Facebook of unfair and deceptive practices: assuring users they could determine who could view their names, locations, photos, friends’ profiles, comments, and other data, while concealing how little control they often had.

When users chose privacy settings to confine sharing of their data to friends only, they could not actually ensure this result without tunneling through the Facebook settings pages to close another opening through which their data could be lost. Unless they knew to do this, Facebook could share their data with third-party apps that their Facebook friends were using. That could expose a user’s name, date of birth, marital status, workplace, photos, videos, and other information to outside tech companies, the FTC complaint stated.

It was through avenues like this that researcher Aleksandr Kogan collected some 50 million profiles after inviting only about 270,000 Facebook users to take an online quiz. Kogan passed tens of millions of these profiles to Cambridge Analytica, though he told USA Today he was not aware that he was violating Facebook’s policies.

According to the FTC complaint in 2011, Facebook also changed its privacy policies on several occasions without making it clear to users that their data would then have less protection, and that their prior privacy settings could be overridden. The agency said the unauthorized sharing of personal data could harm users in various ways: by revealing their political views or sexual orientation to prospective employers or business competitors, by facilitating unwelcome contact from people who could track them to their home cities, or by allowing the publication of embarrassing photographs.

According to the FTC complaint, Facebook also repeatedly and falsely stated that it would not share users’ personal information with advertisers, even as it made money by helping marketers find the audiences most likely to be interested in their products.

The agency cited a 2010 blog post it attributed to Sheryl Sandberg, who is currently Facebook’s chief operating officer and a board member:

“We never share your personal information with advertisers. We never sell your personal information to anyone,” the blog post says. “The only information we provide to advertisers is aggregate and anonymous data, so they can know how many people viewed their ad and general categories of information about them. Ultimately, this helps advertisers better understand how well their ads work so they can show better ads.”

In fact, the FTC complaint charged, Facebook shared with advertisers the user IDs of members who had clicked on their ads, enabling the advertiser to access the user’s real name and other profile information.

The consent decree Facebook agreed to in 2012 does not, by its own language, constitute an admission by Facebook that the facts of the FTC complaint are true, or that it had violated the Federal Trade Commission Act through unfair or deceptive practices. The settlement requires Facebook to develop a comprehensive privacy policy “that is reasonably designed to (1) address privacy risks related to the development and management of new and existing products and services for consumers, and (2) protect the privacy and confidentiality of covered information.”

But the order itself provides few specific requirements as to the shape of that policy. As an agency that guards consumers against deceptive practices, the FTC’s main thrust in the order is to ensure that Facebook’s statements about its privacy policy are truthful and plainly worded, so that users know when Facebook can share their data without their consent. Facebook is also required to make periodic reports to the FTC on its compliance with the decree.

Zuckerberg said last week that Facebook had made changes to company privacy practices in 2014 to put some limits on data that apps could access. When it learned in 2015 that Cambridge Analytica had obtained unauthorized access to tens of millions of user profiles from Kogan, Facebook demanded that Kogan and the data firm certify that they had deleted the data, he said. But Facebook did not then publicly disclose the data loss. Investigators are now trying to determine whether Cambridge Analytica’s data cache has actually been destroyed. Zuckerberg announced new privacy measures Wednesday to try to rebuild trust among users.

But lawmakers may not leave it up to Facebook and other companies to decide what privacy rules they must follow. For example, U.S. Rep. Bobby L. Rush (D-IL) last week introduced a bill that would authorize the FTC to require specific measures to protect the privacy and security of personal data. The bill, called the “Data Accountability and Trust Act,” would also require companies to publicly disclose within 30 days if they sold personal data, or lost it to outsiders due to a security breach or other unauthorized access.

“Facebook took advantage of the industry’s lack of regulation and knowingly allowed their users to believe their data was safe when, in fact, it was not,” Rush said in a written statement. “Strong, national data privacy and data protection rules are long overdue.”

Bernadette Tansey is Xconomy's San Francisco Editor. You can reach her at btansey@xconomy.com. Follow @Tansey_Xconomy
