Everalbum, Inc. has settled Federal Trade Commission (FTC) allegations that it deceived users about its use of facial recognition and improperly retained photos and videos from users who had deactivated their accounts.
As described in the FTC’s complaint, the company used photos uploaded to a consumer app called “Ever” (which has since shut down) to develop facial recognition technologies that it then marketed to enterprise customers under a different name, originally “Ever AI” and now “Paravision.” The proposed settlement would require the company not only to obtain users’ express consent before using facial recognition technology, but also to destroy all algorithms and models it developed using photos it deceptively obtained from Ever users.
In a statement, FTC Commissioner Rohit Chopra called the requirement that Everalbum delete its models and algorithms an “important course correction” to previous settlements that allowed companies to retain algorithms and other technologies developed or enhanced using illegally obtained data. Chopra specifically contrasted the Everalbum settlement with earlier cases against Google (a $170 million fine for collecting personal information about children in violation of the Children’s Online Privacy Protection Act) and Facebook (a record-setting $5 billion fine for violating a 2012 FTC order by deceiving users about privacy controls), neither of which required the company to relinquish algorithms and models derived from deceptively obtained data.
The Ever app, launched in 2015, began as a straightforward photo storage app. Basic facial recognition features, which allowed users to group photos based on the faces of the people who appeared in them, were not added until 2017. At first, these features were powered by publicly available technology. Eventually, though, the company began developing its own algorithms, using its users’ images as training data to improve the algorithms’ accuracy. According to the FTC’s complaint, the company combined user images with publicly available datasets to create four separate training datasets, which it used to improve its facial recognition algorithms over several years. In addition to deploying these algorithms within the Ever app, Everalbum marketed the resulting technology (but not direct access to user data) to business and government customers for uses including identity verification and access control.
The FTC also alleged that Everalbum deceived users about what would happen after accounts were deleted. Despite representations that the company would delete users’ photos and videos if they deactivated their accounts, the company instead stored user data indefinitely.
The FTC’s proposed settlement requires Everalbum to delete three broad categories of information: (1) all photos or videos uploaded by users who have deactivated their accounts, (2) all face data, including photos or other numeric representations, derived from photos of users who did not affirmatively consent to the use of facial recognition, and (3) all models and algorithms developed in any part using biometric information (including face data) collected from users of the Ever app. The settlement also requires Everalbum to obtain affirmative consent before using users’ biometric information and prohibits it from misrepresenting its use of that information going forward.
Privacy researcher and former FTC Chief Technologist Ashkan Soltani called the requirement that Everalbum delete not only deceptively collected data but also the algorithms and models trained on it a “significant precedent” for artificial intelligence and machine learning.
Everalbum’s use of consumer photos to develop facial recognition technology marketed to enterprise customers, including law enforcement and military clients, was first reported by NBC News in 2019.
The FTC voted unanimously to issue the complaint and accept the proposed settlement agreement with Everalbum, which now awaits 30 days of public comment before it is finalized by the FTC. The settlement is the first FTC case primarily focused on misuse of facial recognition.
Zachary Sorenson is a 1L at Harvard Law School.