United States Senate
WASHINGTON, DC 20510

September 17, 2018

The Honorable Joseph Simons
Chairman
Federal Trade Commission
600 Pennsylvania Ave NW
Washington, D.C. 20002

The Honorable Maureen K. Ohlhausen
Commissioner
Federal Trade Commission
600 Pennsylvania Ave NW
Washington, D.C. 20002

The Honorable Noah Joshua Phillips
Commissioner
Federal Trade Commission
600 Pennsylvania Ave NW
Washington, D.C. 20002

The Honorable Rohit Chopra
Commissioner
Federal Trade Commission
600 Pennsylvania Ave NW
Washington, D.C. 20002

The Honorable Rebecca Kelly Slaughter
Commissioner
Federal Trade Commission
600 Pennsylvania Ave NW
Washington, D.C. 20002

Dear Chairman Simons, Commissioner Ohlhausen, Commissioner Phillips, Commissioner Chopra, and Commissioner Slaughter,

Facial analysis technologies are a growing presence in Americans' everyday lives. While they can offer many benefits, we are concerned by the mounting evidence that these technologies can perpetuate gender, racial, age, and other biases. [1] As a result, their use may violate civil rights laws and could be unfair or deceptive. As the federal agency charged with protecting consumers from unfair and deceptive practices, we write to urge the Federal Trade Commission (FTC) to investigate the risks that biases in facial analysis technologies may pose to Americans.

Facial analysis technologies use images or videos of a person's face to identify them or to infer characteristics about them, such as their mood or health. [2] Applications of the technology can be found throughout our society, including in hiring and for timekeeping in the workplace, attendance in churches, security in schools and shopping centers, and diagnoses in health care.

Troublingly, facial analysis technologies do not work equally well across racial, gender, age, and other characteristics. They are often developed with data that disproportionately represents white males, and thus they perform poorly for other groups. [3] Research from MIT, for example, shows that leading facial recognition algorithms are 30 times more likely to misidentify darker-skinned women than lighter-skinned men. [4] The American Civil Liberties Union recently showed that facial recognition technology incorrectly identified 28 members of Congress as people who had been arrested. [5] Notably, the technology disproportionately misidentified African American and Latino members. Such disparities can encode and amplify gender, racial, and other biases that exist in our society and which many are working so hard to combat.

Consider, for example, a situation in which an African American female in a retail store is misidentified as a shoplifter by a biased facial recognition technology and is falsely arrested based on this information.
Such a false arrest can cause trauma [6] and substantially injure her future housing, employment, credit, and other opportunities. [7] Or, consider a scenario in which a young man with a dark complexion is unable to withdraw money from his own bank account because his bank's ATM uses facial recognition technology that does not identify him as their customer. [8]

These are just a few scenarios that could be considered unfair or deceptive practices under the Federal Trade Commission Act. As you know, under Section 5 of the FTC Act, a practice is unfair if it causes or is likely to cause substantial injury to consumers that they cannot avoid and that is not outweighed by other benefits to them or to competition. Furthermore, information or the omission of information may be deceptive if it affects the consumer's decision to purchase or not purchase a product, and it misleads or is likely to mislead a consumer acting reasonably. [9]

Users of facial analysis technologies, such as the stores or banks in the above examples, typically purchase or lease systems developed by other companies. These developers rarely, if ever, test and then disclose biases in their technology. [10] Without information about the biases in a technology or the legal and ethical risks attendant to using it, good faith users may be unintentionally and unfairly engaging in discrimination. Moreover, failure to disclose these biases to purchasers may be deceptive under the FTC Act.

We are aware that the FTC held a public workshop on the commercial use of facial recognition in 2012. However, this workshop and its resulting report did not discuss race, gender, age, or other biases in the technology. [11] As facial analysis technology is more widely deployed, we cannot wait any longer to have a serious conversation about how we can create sound policy to address these concerns.
We are encouraged to see that one of the topics of the FTC's recently announced hearings on "Competition and Consumer Protection in the 21st Century" this fall is "the consumer welfare implications associated with the use of algorithmic decision tools, artificial intelligence, and predictive analytics." [12] We urge you to ensure that these issues are a primary topic at the November 13-14 hearing on Algorithms, Artificial Intelligence, and Predictive Analytics. In anticipation of that hearing, we ask that the FTC consider the following questions and provide answers by September 28, 2018:

• How many complaints has the FTC received to date related to bias in facial analysis or bias in data mining or artificial intelligence?

• What is the FTC's approach to investigating such complaints, given that detecting bias in algorithms can be difficult?

• Will the FTC commit to producing the following as outputs of this fall's hearings:

  o An assessment of how facial recognition and other facial analysis technologies may be biased and perpetuate discrimination?

  o An assessment of how the marketing and use of biased facial analysis may violate laws prohibiting unfair and deceptive practices?

  o A set of best practices on the lawful, fair, and transparent use of facial analysis?

Thank you for your attention to this important matter. We look forward to engaging with you as the Commission works to address these issues.

Sincerely,

Kamala D. Harris
United States Senator

Richard Blumenthal
United States Senator

Cory A. Booker
United States Senator

Ron Wyden
United States Senator

[1] Garvie, Clare and Jonathan Frankle. "Facial-Recognition Software Might Have a Racial Bias Problem," The Atlantic, April 7, 2016. https://www.theatlantic.com/technology/archive/2016/04/the-underlying-bias-of-facial-recognition-systems/47699; Buolamwini, Joy. "When the Robot Doesn't See Dark Skin," New York Times, June 21, 2018. https://www.nytimes.com/2018/06/21/opinion/facial-analysis-technology-bias.html

[2] Sandoiu. "Why Facial Recognition Is The Future Of Diagnostics," Medical News Today, December 9, 2017. https://www.medicalnewstoday.com/articles/320316.php

[3] Buolamwini, Joy. "Gender Shades: Intersectional Phenotypic and Demographic Evaluation of Face Datasets and Gender Classifiers," Master's Thesis, Massachusetts Institute of Technology, September 2017. https://dam-prod.media.mit.edu/x/2018/02/05/buolamwini-ms-17_WEMjoGY.pdf; Ngan, Mei and Patrick Grother. "Face Recognition Vendor Test (FRVT) Performance of Automated Gender Classification Algorithms," National Institute of Standards and Technology, NISTIR 8052, April 2015. http://dx.doi.org/10.6028/NISTIR.8052; Phillips, P. Jonathon, et al. "An Other-Race Effect For Face Recognition Algorithms," ACM Transactions on Applied Perception (TAP), Vol. 8, No. 2, 2011. https://dl.acm.org/citation.cfm?id=1870082; Klare, Brendan F., et al. "Face Recognition Performance: Role of Demographic Information," IEEE Transactions on Information Forensics and Security, vol. 7, no. 6, pp. 1789-1801, December 2012. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6327355&isnumber=6342844

[4] Buolamwini, 2017.

[5] Singer, Natasha. "Amazon's Facial Recognition Wrongly Identifies 28 Lawmakers, A.C.L.U. Says," New York Times, July 26, 2018. https://www.nytimes.com/2018/07/26/technology/amazon-aclu-facial-recognition-

[7] Fields, Gary and John R. Emshwiller. "As Arrest Records Rise, Americans Find Consequences Can Last a Lifetime," Wall Street Journal, August 18, 2014. https://www.wsj.com/articles/as-arrest-records-rise-americans-find-consequences-can-last-a-lifetime-1408415402

[8] Rueter, Thad. "Biometric ATMs Use Facial Recognition Instead Of Cards and PINs," SecureIDNews, July 26, 2017. https://www.secureidnews.com/news-item/biometric-atms-use-facial-recognition-instead-of-cards-and-pins/

[9] Federal Trade Commission, "FTC Policy Statement on Deception," October 14, 1983. https://www.ftc.gov/system/files/documents/public_statements/410531/831014deceptionstmt.pdf

[10] Garvie, Clare, Alvaro Bedoya, and Jonathan Frankle. "The Perpetual Line-Up: Unregulated Police Facial Recognition in America," Center on Privacy & Technology at Georgetown Law, October 18, 2016. https://www.perpetuallineup.org/

[11] Federal Trade Commission, "Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies," October 2012. https://www.ftc.gov/sites/default/files/documents/reports/facing-facts-best-practices-common-uses-facial-recognition-technologies/121022facialtechrpt.pdf

[12] Federal Trade Commission, "Hearings on Competition and Consumer Protection in the 21st Century," undated. https://www.ftc.gov/policy/hearings-competition-consumer-protection
