In the wake of the COVID-19 pandemic, social distancing restrictions have made it hard for consumers to try on clothes, accessories and makeup before making a purchase. As an alternative, consumers have turned to augmented reality (AR) “virtual try-ons” offered by brands like Garnier, Maybelline and Warby Parker. AR overlays digital information, such as text, graphics, audio or other virtual enhancements, onto real-world objects. AR technology allows people to “try on” clothing, makeup, glasses, hair styles and colors, or even eyebrow shapes on their phone or computer by uploading a picture of themselves or activating their live camera and then selecting the items they want to superimpose onto their image. AR try-on features introduce privacy concerns beyond those in a simple photograph because the technology also captures the user’s unique facial measurements, a form of biometric data. Biometric data is information that uniquely identifies an individual and can also include fingerprints, retina scans, voice recognition and even body odors.
Many virtual try-on experiences have been introduced in the last several years. For example, in 2019 Warby Parker introduced a virtual try-on experience that allows customers to experiment with virtual frames using augmented reality. Warby Parker’s phone application uses Apple’s Face ID feature to place 30,000 invisible dots to create a map of a customer’s face and then suggests frames that would look good on the customer. AR technology, like that used by Warby Parker, has experienced an increase in popularity during the COVID-19 pandemic and has raised questions about whether try-on functions are worth the privacy risk. Testing a pair of glasses or a lipstick color by webcam exposes personal and biometric data, which companies may use for a variety of purposes, such as gaining insights into customers’ interests and preferences. Also, data breaches have become more common in recent years, raising concerns over how these companies are storing data and whether such data could be compromised in the event of a cyber-attack.
One of the more notable AR companies is ModiFace. ModiFace was acquired by L’Oréal in 2018 and provides AR experiences for 84 beauty brands that can run on mobile apps, e-commerce sites and even in stores. The company’s own statistics state that its technology can double conversion rates and multiply customer time on mobile apps six-fold. Since its start in 2006, ModiFace has improved its technology to display all types of cosmetics including foundations, eyeliners, mascara, skin care products and hair dyes. ModiFace has also created integrations so its technology can be used during livestreams on Facebook Messenger and live tutorials that teach customers makeup routines by walking them through each step.
ModiFace charges brands between $200,000 and $500,000 per year to use its AR technology depending on the level of integration into mobile apps, websites or store locations. Many brands work with ModiFace to ensure consistency between virtual and real-life looks, making certain that customers receive products that look the same as the product they virtually “tried on.” Brands are then able to track data on what customers are virtually “trying on” and what they end up purchasing. This allows them to adjust their products based on demand and create products that they know customers will want to purchase.
ModiFace’s popularity has not come without legal concerns, however. Technologies like ModiFace’s AR try-on features open the door to a variety of privacy concerns. For example, the U.S. District Court for the Northern District of Illinois was presented with a case against ModiFace and Sephora for alleged violations of Illinois’ Biometric Information Privacy Act (BIPA). The named plaintiff in the class action alleged that she visited a Sephora store in Chicago, Illinois that featured a “Virtual Artist Kiosk.” The kiosk captures customers’ facial geometry so that customers can superimpose Sephora’s beauty products on their faces. According to the complaint, the kiosk captured and stored the plaintiff’s facial scans, then asked her to provide personal information in order to receive a picture of herself with Sephora beauty products superimposed on it. The plaintiff claimed that ModiFace and Sephora violated three BIPA provisions: (1) private entities must release a publicly accessible written policy describing their practices for retaining and destroying biometric data, (2) private entities may not collect, capture, purchase, receive through trade or otherwise obtain biometrics from someone without first informing them that such information is being obtained and the purpose of obtaining such information, and (3) private entities may not disclose or disseminate biometric data without the subject’s consent. While the court dismissed the case against ModiFace for lack of personal jurisdiction in May 2020, Sephora was not so lucky. Sephora agreed to a $1.25 million settlement to be shared by anyone who filed a claim and had used a Sephora virtual makeup kiosk dating back to July 2018.
Currently, only three states have laws in effect that specifically regulate the collection and use of biometric information. Illinois is the only state that allows a private right of action, i.e., where consumers can bring a lawsuit for collection of their biometric data. Therefore, if companies like ModiFace can claim lack of personal jurisdiction in Illinois, they can avoid liability to consumers. By contrast, in Texas and Washington, companies must follow certain regulations with regard to biometric data; otherwise the Attorney General of each state can recover civil penalties. Beyond statutes currently in effect, Alaska, Arkansas, Arizona, Massachusetts, Michigan, New Hampshire and New York are considering enacting statutes to regulate the collection of biometric data.
As more states enact biometric data statutes, companies like ModiFace, and brands that utilize AR virtual try-ons, should follow certain protocols to ensure compliance with biometric data collection statutes. First, companies should notify customers that they are collecting biometric data. Second, companies should enact, and make publicly available, privacy policies that govern the collection, use and storage of biometric information. Third, companies should protect sensitive data, such as body and face measurements, with a commitment to reasonable data breach prevention and response measures. Specifically, under Illinois’ BIPA, companies must “store, transmit, and protect” biometric data using the reasonable standard of care for the industry and in a manner consistent with how the company protects other sensitive information.
In addition to statutes governing solely biometric data, the California Consumer Privacy Act (CCPA) addresses information access, user control, protection and non-discrimination. Since the enactment of the CCPA, Californians are entitled to know what information is being collected about them and whether their information is sold or disclosed (and to whom), to prevent companies from selling their personal information, to access the information that is being collected from them and to receive the same services and prices as those who do not exercise their privacy rights.
Lutzker & Lutzker is experienced in privacy, artificial intelligence and cybersecurity law and will continue to provide updates on privacy laws as they become available. Please contact us if we can be of help in analyzing and addressing any issues concerning AR virtual try-on technology or related areas.