Here’s How Congress Should Respond to Facebook/Cambridge Analytica

Photo: Todd Barnard. CC-BY-SA.

Facebook and Cambridge Analytica. By now we know the basic facts: Aleksandr Kogan, purporting to be a researcher, developed an authorized Facebook application. As was Facebook’s practice at the time, when users connected the app to their Facebook accounts, the app scooped up not only the users’ personal information, but also their friends’ personal information. In this manner, Dr. Kogan was able to amass information about 50 million Facebook users – even though only 270,000 individuals used the app. Dr. Kogan then, exceeding his authorized use of the data, funneled that information to Cambridge Analytica, a firm that purported to engage in “psychographics” to influence voters on behalf of the Trump campaign.

We also know that Facebook/Cambridge Analytica is hardly unique. Rather, unauthorized access to personal data seems to be a running theme this year – whether it takes the form of Facebook/Cambridge Analytica, where data obtained through authorized access was misused and shared in ways that exceeded that authorization, or the form of an old-fashioned data breach à la Equifax or Orbitz. (Did you notice that Orbitz announced this week that it had a data breach, and hackers may have acquired 880,000 people’s payment information?)

In the twenty-first century, it is impossible to meaningfully participate in society without sharing our personal information with third parties. Those third parties should have commensurate obligations to protect that personal information. Unfortunately, it has become increasingly clear that too many third parties are failing to live up to this responsibility. It is therefore incumbent on Congress to step in to protect consumers. Here’s how they should start:

1. Notice and Consent. Until the digital age, individual ownership and control of one’s own personal information was the basis for privacy law in the United States. We should return to this principle. While we cannot avoid sharing information with third parties, we can have greater control over that information. At a minimum, consumers should have a right to know a) what information is being collected and retained about them; b) how long that information is being retained; c) for what purposes that information is being retained; d) whether the retained information is identifiable or anonymized; e) whether and how that information is being used; f) with whom that information is being shared; g) for what purposes that information is being shared; and h) under what rubric that information is being shared (for free, in exchange for compensation, subject to a probable cause warrant, etc.). And, when I say right to know, I don’t mean companies should be able to get away with putting this information in the fine print of a privacy policy that would be forty pages printed in eight-point font. I mean this information should be made available in a user-friendly, easily accessible way. Think small pop-up screen with large print.

But, having that information alone is insufficient. Consumers must also have meaningful opportunities to consent to data collection, retention, and sharing. And, that consent should be as granular as possible. For example, you may use my data for research purposes, but not for targeted advertising – or vice versa. Or you may retain my data for no more than two years. As with notice, the consent I am describing must be real (e.g., not it-was-buried-on-page-thirty-nine-of-a-forty-page-privacy-policy-and-consent-was-implied consent). The General Data Protection Regulation, which goes into effect in Europe in May, will require some kinds of granular consent on the continent and in the UK, so companies already have to figure out how to offer their users opportunities for meaningful consent. There is no reason for them not to do the same in the United States.

2. Security Standards. When we trust a third party with something we own – particularly something personal – we expect that third party to take care of our possession. It should be no different with personal information. Third parties that are stewards of our personal information should be expected to adhere to state-of-the-art security standards. This is particularly true when an individual cannot avoid sharing the information without forgoing critical services or declining to participate in modern society.

3. Meaningful Recourse. When there is unauthorized access to personal information, individuals must be made whole to the greatest extent possible. There are two major barriers to this. The first is the Federal Arbitration Act, which requires courts to honor the forced arbitration clauses in contracts. Remember that forty-page privacy policy you didn’t read? Yes, that’s a contract. Forced arbitration clauses require consumers to settle any dispute they have with a company through arbitration rather than having their day in court – and often consumers don’t even know an arbitration clause is in their contract until they go to sue. This presents three problems: 1) Arbitrators are often more sympathetic to large companies, which are repeat players in the arbitration system, than most juries would be. 2) Arbitration creates no legal precedent, which means that one arbitrator may find in my favor against a big company, but another arbitrator may find against you in the exact same situation – the second arbitrator need not adhere to what the first did in my case. 3) It is frequently not cost-effective for an individual to bring a claim against a large company by herself; the damages she could win likely would not exceed her legal costs. But, when customers can band together in a class action lawsuit, it becomes much more feasible to bring a case against a large company engaged in bad behavior. Forced arbitration clauses preclude class actions.

Congress should explicitly exempt cases addressing the failure to protect personal information from the Federal Arbitration Act to make sure consumers can have their day in court when their information is misused and their trust abused. Congressman Lieu’s Ending Forced Arbitration for Victims of Data Breaches Act of 2018 is a step in the right direction, although more expansive legislation is required to cover situations like Facebook/Cambridge Analytica.

The other major barrier to meaningful recourse is the difficulty of calculating the damages associated with unauthorized access to personal information. While one may be able to quantify her damages when her credit card information is breached or her identity is stolen, it is much harder to do so in a situation like Facebook/Cambridge Analytica. How do you put a dollar amount on having your privacy preferences ignored? Having your personal information revealed to third parties without your knowledge or consent? Having that information used for “psychographics” to influence your behavior in the voting booth? Fortunately, there is a concept called liquidated damages, which was designed for exactly these circumstances: it applies when the damage is real but hard to quantify. In fact, liquidated damages are already used to address other privacy harms. For example, the Cable Privacy Act provides for liquidated damages when cable companies impermissibly share or retain personally identifiable information.

Yes, it’s true that the Federal Trade Commission (FTC) can step in when companies engage in unfair or deceptive practices, but the FTC is likely to intervene only in the most egregious cases. Moreover, the FTC can extract penalties from a company only after it has screwed up once, entered into a consent decree with the agency, and then screwed up again by violating that decree. That’s a lot of consumers who have to have their personal information abused before the company feels any pain. And when the FTC is involved, any money collected goes to the government, not to making individuals whole.

By contrast, allowing private class action lawsuits for liquidated damages when companies fail to safeguard personal information will create the necessary incentives for companies to take appropriate precautions to protect the information they have been entrusted with. Companies, after all, understand the technology and the risks, and are in the best position to develop safeguards to protect consumers.

Congress is going home for a two-week recess. Now is the perfect time to tell your Senators and Representatives that when they return to Washington you expect them to address unauthorized access to personal data by requiring 1) meaningful notice and consent for data collection, retention, and sharing, 2) that companies adhere to appropriate security standards, and 3) that consumers have the opportunity for meaningful recourse when their data is abused.

Thanks to the folks at Public Knowledge.
