The Federal Trade Commission alleges that Meta has “repeatedly violated” privacy rules and proposes to toughen the agency’s 2020 order against the company, completely barring it from monetizing the data of anyone under 18 in any way, among other new restrictions.
The order in question went into effect in 2020 but was drawn up in 2019 as part of Facebook’s $5 billion settlement for violating an earlier order. Now the FTC says Facebook/Meta has violated not only the 2020 order but also the Children’s Online Privacy Protection Act (COPPA) Rule.
“The company’s recklessness has put young users at risk, and Facebook must answer for its failures,” Samuel Levine, director of the FTC’s Bureau of Consumer Protection, said in a press release. (Because the order and the alleged failures span the company’s time under both names, both “Facebook” and “Meta” are used throughout.)
The 2020 order established an independent third-party assessor to evaluate whether Meta was adhering to its privacy requirements — things like putting new products through privacy reviews and restricting how facial recognition data and phone numbers are used.
This assessor recently submitted its report to the FTC, and it is apparently not pretty, containing evidence of numerous gaps and violations: “The Commission notes that the breadth and significance of these deficiencies pose substantial risks to the public,” the agency wrote.
Specifically, Facebook promised (in 2018; the timeline is long and confusing) to cut off app developers’ access to a user’s data if that user hadn’t used the app in 90 days. But it did not, the FTC alleges, and allowed some of that data to keep flowing well into 2020.
The company also “misrepresented that parents could control who their children communicated with through its Messenger Kids product.” The contact controls implemented by Facebook were inadequate, allowing children to communicate with unapproved contacts through video calls and group chats.
These may not sound like the most egregious failings, but regulations on children’s tech are strict for good reason, and COPPA violations are serious. Considering that Facebook not only had a decade’s worth of warnings about sloppy privacy practices but also knew the FTC was watching its every move — especially with sensitive data like that of users under 13 — one is less inclined to offer grace.
This seemingly cavalier approach to complying with the FTC’s order has led the agency to tighten the screws with a number of proposed changes, something it can do when warranted by “changes in legal or factual conditions” or the public interest. Businesses should consider themselves forewarned that FTC orders are, to a large extent, living documents.
In this case, the 2020 order, which covers all of Meta’s businesses (Facebook, Instagram, WhatsApp, and Oculus), would be modified to add the following:
- A total prohibition on monetizing the data of users under 18. That data could be used only to provide the service or for security purposes. (Nor does monetizing it retroactively become legal once the user turns 18.)
- A ban on releasing new or modified products or services without the independent assessor confirming that the new features comply with the privacy requirements.
- If Meta acquires a shiny new company, these privacy requirements apply to it as well.
- Expanded limitations on facial recognition, requiring disclosure and affirmative consent.
- Strengthened requirements around privacy reviews, data inventories, access controls, and so on.
Today the FTC is issuing an Order to Show Cause detailing the issues briefly noted above; it was not publicly available at the time of this writing. Meta has 30 days to respond, after which the agency “[shall] carefully consider the facts and arguments presented by the parties” and decide whether the expanded order is warranted. I asked when the new order might go into effect and will update this post if I hear back.