Facial Recognition and the 2021 California Bar Exam

Imagine you are an African American woman with an undergraduate degree from Princeton University and a law degree from Harvard Law School. You passed the Illinois bar exam and have embarked on a promising career at a prestigious “Big Law” firm in Chicago. However, a very talented summer associate is working at the firm, and there is a mutual attraction. The firm, like so many employers, frowns upon employees dating one another, so you decide to sit for the Winter 2021 California bar exam to keep your options open. Weeks before the exam, the State Bar of California announces that it will use facial recognition security measures, since this year’s exam will be conducted remotely in light of continuing COVID-19 social distancing guidelines. You are distracted during your last few days of preparation because your work in privacy law and artificial intelligence (AI) has taught you that these tools are often inaccurate, particularly for persons of color and for women. The day of the exam arrives, and you are locked out within the first few minutes. The facial recognition software has determined that a man is taking the exam in your place.

While this scenario is fictional, some of the facts are real. Michelle Robinson Obama graduated from Princeton University and Harvard Law School, then passed the Illinois bar exam in 1988. She did not move to California, but rather married that persistent young man and became the First Lady of the United States and one of the most recognized faces in the world. Facial recognition software has misidentified her as male. This month, thousands of black, brown and female applicants for the California Bar could face a similar circumstance. Hopefully, for most individuals, their fears will be unfounded, and the only harm suffered will be a few unproductive hours of study time. But what about the others? They may lose the right to take the exam, which is offered only twice a year, or have their reputations tarnished, even if it is ultimately confirmed that they took the exam themselves. In either scenario, the launch of a legal career could be derailed for months, if not longer, because of the use of flawed technology.

Facial recognition can identify an individual (a one-to-many search against a gallery of known faces) or verify an identity (a one-to-one comparison against a reference photo) in photographs, video and in real time. However, facial recognition has proven to be less accurate than fingerprint, palm, iris and voice recognition. Unfortunately, the opening scenario involving a woman of color was not chosen at random. The MIT Media Lab’s Gender Shades project highlighted the bias in facial recognition as it exists today, showing that black female celebrities such as Serena Williams, Oprah Winfrey and, yes, Michelle Obama were mistaken for males.
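
To make the verification step concrete, the sketch below shows a one-to-one comparison of the kind a remote-proctoring system might perform. It uses the open-source Python face_recognition library purely for illustration; the file names and the 0.6 distance threshold are assumptions, and nothing here reflects the State Bar's actual vendor or software.

```python
# A minimal sketch of one-to-one face verification, assuming the open-source
# "face_recognition" Python library. The file names and the 0.6 tolerance are
# illustrative assumptions, not details of any proctoring vendor's system.
import face_recognition

# Load the applicant's reference photo and a frame captured from the webcam.
reference_image = face_recognition.load_image_file("applicant_id_photo.jpg")
exam_frame = face_recognition.load_image_file("webcam_frame.jpg")

# Each detected face is encoded as a 128-dimensional vector.
# (This assumes the reference photo contains exactly one detectable face.)
reference_encoding = face_recognition.face_encodings(reference_image)[0]
frame_encodings = face_recognition.face_encodings(exam_frame)

if not frame_encodings:
    print("No face detected; this may be a lighting or camera issue, not fraud.")
else:
    # compare_faces treats two encodings as the same person when their
    # Euclidean distance falls below the tolerance threshold.
    match = face_recognition.compare_faces(
        [reference_encoding], frame_encodings[0], tolerance=0.6
    )[0]
    distance = face_recognition.face_distance(
        [reference_encoding], frame_encodings[0]
    )[0]
    print(f"Match: {match}, distance: {distance:.3f}")
```

One design choice worth noting: the entire decision turns on a single distance threshold, and a cutoff tuned on one population can yield very different error rates on another.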

Two other studies, by the National Institute of Standards and Technology (NIST) and the American Civil Liberties Union (ACLU), confirm the inaccuracies in gender classification as well as the disparities in error rates among darker-skinned females, darker-skinned males, lighter-skinned females and lighter-skinned males. In the latter study, 28 members of Congress, a disproportionate number of whom were persons of color, were falsely matched against a database of mugshots. This substantial error rate can be traced in part to the fact that the algorithms are trained largely on images of white males, by predominantly white engineers. Current versions of the technology also produce “false negatives” and “false positives” that are unrelated to race or gender.
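
Findings like these are typically reported as false-positive and false-negative rates broken out by demographic group; disparity means those rates differ sharply across groups. The short sketch below, which uses entirely hypothetical records rather than data from either study, shows how such per-group error rates are computed.

```python
# A sketch of how per-group error rates are typically reported in audits of
# facial recognition systems. The records below are entirely hypothetical and
# are not drawn from the NIST or ACLU studies.
from collections import defaultdict

# Each record: (demographic group, true match?, system reported a match?)
results = [
    ("darker-skinned female", True, False),   # false negative
    ("darker-skinned female", False, True),   # false positive
    ("lighter-skinned male", True, True),     # correct acceptance
    ("lighter-skinned male", False, False),   # correct rejection
    # ... a real audit would use thousands of labeled comparisons per group
]

stats = defaultdict(lambda: {"fp": 0, "fn": 0, "pos": 0, "neg": 0})
for group, is_true_match, predicted_match in results:
    s = stats[group]
    if is_true_match:
        s["pos"] += 1
        if not predicted_match:
            s["fn"] += 1      # genuine user wrongly rejected
    else:
        s["neg"] += 1
        if predicted_match:
            s["fp"] += 1      # impostor (or mismatch) wrongly accepted

for group, s in stats.items():
    fnr = s["fn"] / s["pos"] if s["pos"] else 0.0
    fpr = s["fp"] / s["neg"] if s["neg"] else 0.0
    print(f"{group}: false-negative rate {fnr:.0%}, false-positive rate {fpr:.0%}")
```

For a bar applicant, a false negative is the lockout described in the opening scenario; a false positive would let an impostor sit for the exam undetected.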

Law enforcement agencies were among the early adopters of facial recognition technology. However, as we will discuss separately, several departments have since walked back their use of it. As of this writing, and despite a threat of litigation by the Lawyers’ Committee for Civil Rights Under Law regarding the technology’s disparate impact on certain examinees, the State Bar of California has not reversed course.

Facial recognition remains a promising application of AI, but there are strong arguments for limiting its use until the privacy, equity and other growing pains typical of an emerging technology are addressed. The privacy issues will require a combined effort from the legal and technological communities to craft standards governing how an individual’s image may be used and how their identity may be determined through facial recognition.

One useful preventive measure would be a publicly available policy that explains how facial recognition will be used, the redress available to examinees and the requirement that applicants affirmatively consent to its use. With public pressure, particularly from the legal community, the equity issue can be addressed through design improvements, software updates and a development process that puts accuracy and equity front and center.

At a minimum, thoughtful consideration should be given before facial recognition is deployed in high-stakes situations such as bar exams. Stay tuned for our continuing discussion of this emerging technical, legal and ethical issue.