 {"id":2930,"date":"2020-04-11T00:09:05","date_gmt":"2020-04-11T00:09:05","guid":{"rendered":"https:\/\/live-journal-of-law-and-public-policy.pantheonsite.io\/?p=2930"},"modified":"2020-04-11T00:09:05","modified_gmt":"2020-04-11T00:09:05","slug":"facial-recognition-software-race-gender-bias-and-policing","status":"publish","type":"post","link":"https:\/\/publications.lawschool.cornell.edu\/jlpp\/2020\/04\/11\/facial-recognition-software-race-gender-bias-and-policing\/","title":{"rendered":"Facial Recognition Software, Race, Gender Bias, and Policing"},"content":{"rendered":"<p style=\"text-align: center\">(<a href=\"https:\/\/www.theguardian.com\/technology\/2014\/may\/04\/facial-recognition-technology-identity-tesco-ethical-issues\"><em>Source<\/em><\/a>)<\/p>\n&nbsp;\n\nFacial Recognition Technology (FRT) identifies a person\u2019s face by navigating through computer programs that access thousands of cameras worldwide to identify a suspect, criminal or fugitive. FRT could even accurately identify a person\u2019s from a blurry captured image or instantaneously identify the subject among a <em><a href=\"https:\/\/www.itproportal.com\/2015\/10\/20\/facial-recognition-you-see-in-movies-can-only-be-seen-in-movies\/\">crowd<\/a><\/em>. This is the fantasy portrayed in Hollywood movies. In reality, facial recognition software is inaccurate, bias, and under-regulated.\n\nFRT creates a facial <a href=\"https:\/\/www.americanbar.org\/groups\/criminal_justice\/publications\/criminal-justice-magazine\/2019\/spring\/facial-recognition-technology\/\"><em>template<\/em><\/a> of a person\u2019s facial image and compares the template to millions of photographs stored in databases\u2014driver\u2019s license, mugshots, government records, or social media accounts. 
While this technology aims to accelerate law enforcement investigative work and more accurately identify crime suspects, it has been criticized for its bias against<em> <a href=\"https:\/\/www.wired.com\/story\/best-algorithms-struggle-recognize-black-faces-equally\/\">African-American women<\/a><\/em>. The arm of the United States government responsible for setting technology standards\u2014the National Institute of Standards and Technology (<a href=\"https:\/\/www.nist.gov\/speech-testimony\/facial-recognition-technology-ensuring-transparency-government-use\"><em>NIST<\/em><\/a>)\u2014conducted a test in July 2019 that showed FRT\u2019s bias against African-American women. NIST tested the accuracy of algorithms from Idemia\u2014a French company that provides facial recognition software to police in the United States. The software mismatched white women\u2019s faces once in 10,000 instances, but it mismatched black women\u2019s faces once in <a href=\"https:\/\/www.wired.com\/story\/best-algorithms-struggle-recognize-black-faces-equally\/\"><em>1,000<\/em><\/a>.\n\nThis disparity results from the failure of current facial algorithms to recognize darker <a href=\"https:\/\/www.wired.com\/story\/best-algorithms-struggle-recognize-black-faces-equally\/\"><em>skin<\/em><\/a> and is not limited to Idemia\u2019s software. In fact, NIST reported similar results from 50 different companies. Moreover, facial recognition software that relies on mugshots to identify suspects captured on camera or video likely includes a disproportionate number of African-Americans due to higher arrest <a href=\"https:\/\/www.perpetuallineup.org\"><em>rates<\/em><\/a>. This inaccuracy could lead to false arrests. For instance, police used facial recognition software in Ferguson, Missouri to arrest protesters following the death of Michael Brown. 
Considering the high mismatch rate for African-Americans, police could have arrested peaceful protesters due to <em><a href=\"https:\/\/www.eff.org\/pages\/face-recognition\">false positives<\/a><\/em>. Another prominent instance of facial recognition software\u2019s mismatch is <em><a href=\"https:\/\/aws.amazon.com\/rekognition\/\">Amazon\u2019s Rekognition software<\/a><\/em>\u2014software that uses machine learning to identify objects and people\u2019s faces. The software falsely matched 28 members of Congress to mugshot photos, and the mismatches disproportionately affected members of color. One mismatch was civil rights activist Rep. <a href=\"https:\/\/www.usatoday.com\/story\/tech\/2019\/11\/19\/police-technology-and-surveillance-politics-of-facial-recognition\/4203720002\/\"><em>John Lewis<\/em><\/a> of Georgia.\n\nFacial recognition software could arguably affect how police conduct investigations. For instance, facial recognition software failed to identify the pale-skinned Tsarnaev brothers\u2014the <em><a href=\"https:\/\/www.americanbar.org\/groups\/criminal_justice\/publications\/criminal-justice-magazine\/2019\/spring\/facial-recognition-technology\/\">Boston Marathon terrorists<\/a><\/em>. Had the suspects been black, however, it could have falsely identified an African-American in a police database. In fact, a Massachusetts Institute of Technology study described positive identification of African-American women as a \u201c<a href=\"http:\/\/gendershades.org\"><em>coin toss<\/em><\/a>.\u201d While NIST publishes its findings on facial recognition software, little to <em><a href=\"https:\/\/www.perpetuallineup.org\/\">no law exists<\/a><\/em> to regulate the use of this technology.\n\nNo state in the United States has a <a href=\"https:\/\/www.perpetuallineup.org\"><em>comprehensive<\/em><\/a> law regulating law enforcement\u2019s use of facial recognition software. Without laws limiting the use of this software, it is prone to misuse. 
Neither a search warrant nor <a href=\"https:\/\/www.perpetuallineup.org\"><em>reasonable suspicion<\/em><\/a> is required to use FRT to identify suspects. Law enforcement agencies may be pressured to make an arrest when FRT identifies a person as matching a suspect. For instance, in the Boston Marathon bombing, <em><a href=\"https:\/\/www.washingtonpost.com\/world\/national-security\/inside-the-investigation-of-the-boston-marathon-bombing\/2013\/04\/20\/19d8c322-a8ff-11e2-b029-8fb7e977ef71_story.html\">it was critical<\/a><\/em> to quickly identify and arrest the suspects before they struck again. Granted, FRT has been useful in a number of situations. For instance, the Los Angeles Police Department used FRT to quickly apprehend a dangerous suspect wanted in a<em> <a href=\"https:\/\/www.wired.com\/story\/best-algorithms-struggle-recognize-black-faces-equally\/\">fatal shooting case<\/a><\/em>. But law enforcement agencies are not transparent about their use of this technology. A Georgetown Law Center on Privacy and Technology report found that only 4 out of 52 agencies surveyed have a publicly available use <a href=\"https:\/\/www.perpetuallineup.org\"><em>policy<\/em><\/a>. Maryland\u2019s facial recognition software has never been <a href=\"https:\/\/www.perpetuallineup.org\"><em>audited<\/em><\/a> for misuse. Even if suspects have been misidentified, there is no way for the public to know because no record exists.\n\nComprehensive laws guiding the use of FRT ought to be enacted at the federal level as a counterpart to the <em><a href=\"https:\/\/www.lawyers.com\/legal-info\/personal-injury\/types-of-personal-injury-claims\/wiretap-act-privacy.html\">Wiretap Act<\/a><\/em>. 
Although FRT is still developing, Congress ought to enact laws to guide its use in the right direction, especially in areas of racial and gender bias.\n\n<img loading=\"lazy\" decoding=\"async\" class=\"alignleft  wp-image-2931\" src=\"https:\/\/live-journal-of-law-and-public-policy.pantheonsite.io\/wp-content\/uploads\/2020\/04\/Oluwasegun-Joeseph.jpg\" alt=\"Oluwasegun Joseph\" width=\"120\" height=\"122\" srcset=\"https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-content\/uploads\/sites\/3\/2020\/04\/Oluwasegun-Joeseph.jpg 2179w, https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-content\/uploads\/sites\/3\/2020\/04\/Oluwasegun-Joeseph-293x300.jpg 293w, https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-content\/uploads\/sites\/3\/2020\/04\/Oluwasegun-Joeseph-1001x1024.jpg 1001w, https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-content\/uploads\/sites\/3\/2020\/04\/Oluwasegun-Joeseph-768x786.jpg 768w, https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-content\/uploads\/sites\/3\/2020\/04\/Oluwasegun-Joeseph-1502x1536.jpg 1502w, https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-content\/uploads\/sites\/3\/2020\/04\/Oluwasegun-Joeseph-2002x2048.jpg 2002w\" sizes=\"auto, (max-width: 120px) 100vw, 120px\" \/>About the Author: Oluwasegun Joseph is a third-year Cornell Law student who enjoys following developing stories at the intersection of public policy and law. One of his goals is to become a federal prosecutor.\n\n&nbsp;\n\n&nbsp;\n\nSuggested Citation: Oluwasegun Joseph, <em>Facial Recognition Software, Race, Gender Bias, and Policing<\/em>, Cornell J.L. &amp; Pub. Pol\u2019y, The Issue Spotter (Apr. 
10, 2020), <em><a href=\"https:\/\/live-journal-of-law-and-public-policy.pantheonsite.io\/facial-recognition-software-race-gender-bias-and-policing\/\">https:\/\/live-journal-of-law-and-public-policy.pantheonsite.io\/facial-recognition-software-race-gender-bias-and-policing<\/a>.<\/em>","protected":false},"excerpt":{"rendered":"<p>(Source) &nbsp; Facial Recognition Technology (FRT) identifies a person\u2019s face by navigating through computer programs that access thousands of cameras worldwide to identify a suspect, criminal, or fugitive. FRT could even accurately identify a person\u2019s face from a blurry captured image or instantaneously identify the subject among a crowd. This is the fantasy portrayed in Hollywood&#8230;<\/p>\n","protected":false},"author":1,"featured_media":2934,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[14,15,16,17,18,19,21,24,25,27,28],"tags":[195,609,694,879,1198,1231],"class_list":["post-2930","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-archives","category-authors","category-blog-news","category-certified-review","category-feature","category-feature-img","category-spotters","category-notes","category-policycontributor-blogs","category-recent-stories","category-student-blogs","tag-bias","tag-facial-recognition","tag-frt","tag-jlpp","tag-policing","tag-privacy"],"acf":[],"_links":{"self":[{"href":"https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-json\/wp\/v2\/posts\/2930","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/publications.laws
chool.cornell.edu\/jlpp\/wp-json\/wp\/v2\/comments?post=2930"}],"version-history":[{"count":0,"href":"https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-json\/wp\/v2\/posts\/2930\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-json\/wp\/v2\/media\/2934"}],"wp:attachment":[{"href":"https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-json\/wp\/v2\/media?parent=2930"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-json\/wp\/v2\/categories?post=2930"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/publications.lawschool.cornell.edu\/jlpp\/wp-json\/wp\/v2\/tags?post=2930"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}