Each year, Verisk’s Emerging Issues team works closely with students and faculty at several university insurance and risk management departments on research projects to help nurture the next generation of insurance talent. In one of our most recent collaborations, Illinois State University’s Sejal Pamnani analyzed several lawsuits arising from the use of facial recognition technology.
Biometrics are unique physical characteristics that are used to identify an individual. Facial Recognition Technology (FRT) is one of the many technologies that use biometrics. From unlocking our phones with a quick glance to identifying soldiers during times of war, FRTs have a vast number of applications. FRTs are rapidly gaining popularity, but the ability to identify a person through a picture or video comes with various risks. The complete legal landscape for FRTs has yet to be determined, but trends in settlement severity are slowly emerging. This article will examine various FRT lawsuits dealing with unlawful biometric data collection and their outcomes.
Overview of Cases
Patel v. Facebook, 932 F.3d 1264 (9th Cir. 2019)
In 2010, Facebook introduced a new feature called Tag Suggestions. When the feature is enabled, Facebook uses facial recognition to scan a user’s photo, measuring biometric data points such as the distance between the eyes, nose, and ears to create a ‘face signature.’ The technology then compares the signature to the face signatures of other Facebook users and recommends a tag if a match is found. The face signatures are stored on Facebook’s servers.1 Facebook users in Illinois filed a class action against the company under the Biometric Information Privacy Act (BIPA). Under BIPA, “[a]ny person aggrieved by a violation of this Act shall have a right of action in a State circuit court or as a supplemental claim in federal district court against an offending party.”2 The plaintiffs alleged that Facebook violated BIPA by collecting, using, and storing biometric information (face signatures) from their photos without consent.3
Miracle-Pond et al. v. Shutterfly, Inc., 68 NE 2d 746 (N.D. Ill. 2020)
Shutterfly is an online photo storage company that also offers photo printing services. Shutterfly used facial recognition technology to create “geometric maps” of users’ faces from the photos they uploaded. The technology analyzes data points along the contours of a face, and the resulting geometric maps were used for a tagging feature.4 In 2019, plaintiffs filed a class action under BIPA. Under BIPA, a company must inform people in writing that it is collecting their biometric information and obtain a written release.5 The plaintiffs alleged that they never gave permission for the collection and use of their biometric data.6
TikTok Consolidated Amended Class Action Complaint, MDL No. 2948 (N.D. Ill., Feb. 25, 2021)
TikTok used facial recognition algorithms to identify the age, ethnicity, and gender of its users. Facial biometric data was also collected and used to track and profile users for a targeted advertising and marketing platform.7 Twenty-one lawsuits were filed against the company on behalf of minors and were consolidated into a single multidistrict action under BIPA. The plaintiffs claim they did not consent to the collection, storage, and use of their biometric information.8
FTC Complaint in the Matter of Everalbum, Inc. Docket No. 1923172
Everalbum provides a photo storage app called Ever. In 2017, Everalbum launched its “Friends” feature, which uses facial recognition technology for automatic tagging. The feature was enabled by default for all users, with no option to disable it.9 Beginning in May 2018, Everalbum gave users located in Texas, Illinois, Washington, and the EU the option to turn the facial recognition feature on or off. Between July 2018 and April 2019, the company allegedly claimed that it would not apply its FRT to users unless they chose to activate the feature; however, the choice to enable or disable the feature was not rolled out to all users until April 2019. Everalbum also promised to delete the biometric data of users who deactivated the Friends feature but did not follow through until October 2019.10 The Federal Trade Commission (FTC) filed a complaint against the company, alleging that Everalbum misled users of the Ever app into thinking the company would delete their biometric data if they deactivated their accounts. The FTC also claims that the company used the biometric data for more than the photo tagging feature: between September 2017 and August 2019, Everalbum extracted millions of facial images from Ever app users and compiled four datasets, which were used to develop facial recognition services sold to external parties.11
Settlement Analysis
With a payout of $650 million, Facebook’s settlement is the largest BIPA settlement to date,12 which may be due to the particularly sensitive nature of biometric data. Biometric information is unique to every individual.13 Unlike a password, the information about an individual’s face cannot be changed after a data breach. As part of the settlement, Facebook must also make facial recognition features optional and delete the face signatures of users who do not provide consent.14
The Facebook and Shutterfly lawsuits both dealt with the privacy issues surrounding an FRT-based tagging feature and resulted in similar settlement conditions. However, the Shutterfly settlement included additional requirements: the company must publish a supplemental retention schedule and guidelines that comply with BIPA,15 and it must notify users about its data collection practices, including the purpose of biometric data collection and the length of time the data will be used.16 Although the monetary payout dropped significantly, the injunctive outcomes of the Shutterfly case suggest that the misuse of FRTs is being viewed with increasing scrutiny in court. This matches what some law firms hypothesized after the Patel v. Facebook lawsuit.17
Lawyers from the TikTok case believe this settlement is one of the largest privacy-related payouts.18 Under the terms, TikTok will no longer use the app to collect and store users’ biometric or geographic information, nor will it transmit or store data outside of the US. TikTok must also develop a new training program on data privacy compliance for employees and contractors. Additionally, TikTok must hire, at its own expense, a third-party firm to review its data privacy training program for three years and provide written verification of the review.19
Finally, the Everalbum settlement does not include any monetary relief, as the FTC does not have the authority to obtain civil penalties for Section 5 violations, and its authority to obtain monetary relief is under review.20 Hence, the settlement includes only injunctive relief. Everalbum must delete three categories of information:
- The videos and photos of users that have deactivated their account
- All facial biometric data that was obtained without consent
- All models and algorithms created using the biometric data collected from the Ever app21
The third category is the most significant, because prior FTC settlements required only that a company delete data that was unlawfully collected. This is reportedly the first time the FTC has required a company to delete algorithms or work product built from improperly collected data.22 In a statement, FTC Commissioner Rohit Chopra compared the Everalbum settlement to the case against Google, which resulted in a $170 million fine for collecting personal information about minors but did not require the company to delete its algorithms and models.23 The Everalbum case continues the trend of increasingly severe FRT-related settlements seen in the Facebook and Shutterfly cases. Moreover, this settlement is the FTC’s first enforcement action focused solely on facial recognition technology,24 and thus likely sets the benchmark for how the FTC may handle future FRT-related cases.
Facing the Future
While the full scope of legal liability for the use of FRTs is still being defined, the severity of the settlements appears to be increasing. The high settlements and demands for injunctive relief suggest that future cases dealing with the misuse of FRTs and unlawful biometric data collection will likely face more scrutiny in court. As these and new cases move forward, there may be more clarity as to which practices trigger legal action and how settlements are determined.25
Cases that deal with other types of risks related to FRTs may also arise. For example, in January 2020, the Detroit police arrested Robert Williams for a crime he did not commit. The police department’s decision was based on the results of a facial scan of a picture of the suspect. The American Civil Liberties Union (ACLU) and the University of Michigan Law School’s Civil Rights Litigation Initiative filed a lawsuit on behalf of Williams, alleging that the arrest violated his Fourth Amendment rights.26 This case is still ongoing, and its outcome could serve as a future reference point for how racial bias in facial recognition technology is handled in court.