Privacy and Security

Crying “Privacy” is Not Enough to Support a Lawsuit

The Mere Collection and Retention of Personal Data Is Not Sufficient Injury to Support a Lawsuit, Court Rules

A judge in the U.S. District Court for the Northern District of Illinois handed down a decision that may impact future state, or even federal, privacy legislation. In the case, Rivera v. Google, Inc., the plaintiffs sued the technology giant for allegedly violating Illinois’s Biometric Information Privacy Act (BIPA). The court granted Google’s motion for summary judgment, determining that the plaintiffs did not suffer a “concrete injury” and therefore lacked Article III standing.

The case may have a substantial impact on future privacy legislation. The crux of the case is whether the collection and retention of information about a person can form the basis of a lawsuit, absent any evidence that the information was compromised by hackers, used in a manner inconsistent with terms of service, or the like. The court meticulously analyzed the allegations and the evidence presented by both sides, demonstrated an understanding of the underlying technology, and concluded that the mere collection and retention of data, even without individuals’ consent, is not a concrete injury.

By way of background, Illinois has some of the nation’s most stringent biometric privacy laws. According to the Electronic Frontier Foundation, the law

  • Requires private entities, including big for-profit businesses, to obtain consent from a person before collecting or disclosing their biometric identifiers.
  • Requires private entities that possess such identifiers to timely destroy them: when the purpose of collection ends, and in no event more than three years after the last contact with the subject.
  • Requires private entities to securely store such identifiers.
  • Allows parties injured by violations of these rules to file lawsuits to hold businesses accountable.

Google has software that detects images of faces and creates face templates. From there, Google compares the face templates within a user’s private account, identifies faces that appear similar, and groups the photographs containing those similar faces in the user’s private account. Google implements facial recognition software for the user’s convenience, and the user may turn off this functionality at any time.
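For readers unfamiliar with how such grouping works, the sketch below illustrates the general idea in Python: each detected face is represented as a numeric “template” (an embedding vector), templates are compared with a similarity measure, and photos whose templates exceed a chosen threshold are grouped together. This is a minimal, hypothetical illustration only; the random embeddings, the cosine_similarity helper, and the 0.9 threshold are assumptions for demonstration and do not describe Google’s actual implementation.

    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Similarity between two face templates (embedding vectors).
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def group_photos(templates: dict[str, np.ndarray], threshold: float = 0.9) -> list[set[str]]:
        # Greedily group photos whose face templates look alike.
        groups: list[tuple[np.ndarray, set[str]]] = []
        for photo, template in templates.items():
            for representative, members in groups:
                if cosine_similarity(template, representative) >= threshold:
                    members.add(photo)
                    break
            else:  # no existing group is similar enough; start a new one
                groups.append((template, {photo}))
        return [members for _, members in groups]

    # Hypothetical templates: real ones come from a face-detection model, not random numbers.
    rng = np.random.default_rng(0)
    base = rng.normal(size=128)
    templates = {
        "photo_1.jpg": base + rng.normal(scale=0.01, size=128),  # same face
        "photo_2.jpg": base + rng.normal(scale=0.01, size=128),  # same face
        "photo_3.jpg": rng.normal(size=128),                     # different face
    }
    print(group_photos(templates))  # e.g. [{'photo_1.jpg', 'photo_2.jpg'}, {'photo_3.jpg'}]

Turning the feature off, as the article notes a user can, simply means this grouping step is never run on the user’s photos.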

The problem, according to the plaintiffs, is that Google implemented the technology without obtaining their consent in violation of BIPA. They admitted, though, that they “did not suffer any financial, physical, or emotional injury apart from feeling offended by the unauthorized collection.”

Harm, or injury, is an essential component in a lawsuit. Without an injury, there is no “controversy” as required by Article III, Section 2 of the Constitution. The court’s opinion asked whether the plaintiffs suffered an injury from two different angles, relating to Google’s retention of face scans and its collection of face scans.

Plaintiffs in any action must be able to identify a right or interest that the defendant violated. Many times, these rights or interests are tangible and thus easily identified. A person has a property interest in their house or the goods they own. When someone trespasses or takes another’s goods without permission, the law provides a remedy for the property owner.

Some enforceable rights are created by statute. In this case, Illinois requires companies to “develop a written policy, made available to the public, establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information…” Plaintiffs argued that Google did not have such a written policy and that it violated their “right to control their own biometric identifiers and information” as a result.

The court quickly dismissed the notion that mere retention of data is a sufficient legal injury, pointing to precedent holding that “mere retention of an individual’s personal data (without disclosure or risk of disclosure) [is] insufficient to confer Article III standing.” In other words, feeling offended is not a legally recognized injury for constitutional purposes without a showing of a “substantial risk” that a future harm, such as bad actors accessing the personal data, will occur.

The court acknowledged that the “closer question” regarding a concrete injury arose from “Google’s creation of [the plaintiffs’] face templates without their knowledge.” BIPA requires companies to disclose that they are collecting biometric information and what they intend to do with it, and to obtain “a written release” signed by the consumer. The plaintiffs alleged that Google violated this statutory right by creating face templates without permission.

The plaintiffs did not allege that Google improperly shared the face templates with third parties or that the Google Photos database was hacked. BIPA’s legislative history identified the risk of identity theft as the legislature’s primary concern. According to the court, without “evidence of a substantial risk that the face templates will result in identity theft,” the plaintiffs failed to allege a concrete injury.

Google was not independently culling information, and few technology platforms independently seek out information about consumers. Instead, Google and other technology platforms rely on consumers to generate the information themselves by uploading photos, posting status updates, filling out profile information, and so on. The voluntary nature of the relationship between individuals and technology companies complicates the typical “privacy” analysis, as most consumers should be aware, at the least, that platforms are likely to use the information within their posts.

The mere collection and retention of data, even personal data, cannot form the basis of a legal action. A company, tech or otherwise, does not harm individuals simply by collecting information about them. Harm comes when a company fails to implement commercially reasonable data protection programs or misuses the data in ways people did not agree to or could not reasonably foresee.


In Depth: Privacy and Security

A market environment is essential to the future success of the Internet. A consumer- and private-sector-driven approach to privacy via self-regulation avoids undue regulatory burdens that would threaten a thriving electronic marketplace. The Internet has flourished due in large part to the unregulated environment in which it has developed and grown.
