Facial Recognition: Inevitable or Invasive?


By Shivaune Field

Facial recognition technology has become ubiquitous in our daily lives, whether we realize it or not. With so much of our information going online, and with both overt and covert surveillance cameras monitoring our movements in the physical world, it seems almost impossible to avoid having one’s face detected, recorded, and cataloged.

Federal legislation introduced in the Senate this week aims to put a stop to the government’s identification of citizens using the digital technology. The Facial Recognition and Biometric Technology Moratorium Act would prevent local and state police departments, as well as federal agencies, from using facial recognition. The bill has backing in both the House of Representatives and the Senate, and if it passes, it would be the first piece of federal legislation curtailing the technology.

While Boston and San Francisco have banned the technology outright, California passed a narrow law last year that prevents police departments from using facial recognition with body cameras. A controversial proposal currently making its way through the California legislature aims to regulate (and therefore expand, say critics) the use of the technology by both government and commercial entities.

The Coalition for Critical Technology, a consortium of academics committed to the abolition of the ‘Tech to Prison Pipeline,’ also spoke out against facial recognition this week. In an open letter published on Medium and signed by more than 1,000 academics, the group called for the cancellation of a paper that had been accepted for publication by Springer Nature. The manuscript was written by faculty at Harrisburg University who claim to have developed “automated computer facial recognition software capable of predicting whether someone is likely going to be a criminal.” In other words, the Harrisburg researchers believe they have found a way to use AI to predict criminality simply by reviewing an individual’s face.

Springer Nature has since said it will not publish the research, though outrage from the academic and wider community continues. A predictive algorithm modeling criminality will only reproduce the biases already reflected in the criminal justice system, the Coalition for Critical Technology notes.

Michigan resident Robert Williams agrees. He was wrongly arrested by the Detroit police in January after being misidentified by facial recognition technology. He has since written a Washington Post op-ed detailing his experience, titled ‘I Was Wrongfully Arrested Because Of Facial Recognition. Why Are Police Allowed To Use It?’

Microsoft, IBM, and Amazon are also rethinking their sales of the technology. Microsoft President Brad Smith announced this week that the company “will not sell facial recognition technology to police departments in the United States until we have a national law in place, grounded in human rights, that will govern this technology.” IBM has gone a step further, noting it will not sell the technology at all.

Reports from Georgetown University and the U.S. Department of Commerce further illuminate the shortcomings of facial recognition technology. The National Institute of Standards and Technology (NIST), a division of the Department of Commerce, issued a paper in December of last year studying facial recognition. The paper noted that “false positives rates often vary by factors of 10 to beyond 100 times,” and found higher rates of false positives for Asian and African American faces relative to images of Caucasians.
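To see why such differentials matter, consider the arithmetic of a one-to-many database search. The sketch below is purely illustrative (the gallery size and the two rates are assumptions for the example, not NIST figures): even a small false positive rate, multiplied across a large photo database, yields many wrongly flagged people, and a rate 100 times higher yields 100 times as many.

```python
# Illustrative only: how a false positive rate scales in a one-to-many
# facial recognition search. All numbers below are assumptions.

def expected_false_matches(gallery_size: int, fpr: float) -> float:
    """Expected number of innocent people wrongly flagged when one
    probe photo is compared against every image in a gallery."""
    return gallery_size * fpr

gallery = 1_000_000                 # e.g. a state ID-photo database
baseline_fpr = 1e-5                 # hypothetical rate, one demographic
elevated_fpr = baseline_fpr * 100   # a rate "100 times" higher

print(expected_false_matches(gallery, baseline_fpr))  # roughly 10
print(expected_false_matches(gallery, elevated_fpr))  # roughly 1,000
```

The point of the toy calculation is that the harm is not evenly distributed: under these assumed numbers, members of the group with the elevated rate would face on the order of a thousand false matches per search, rather than ten.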

Clearly, technology with error rates that can lead to the incorrect identification of criminal suspects is hugely problematic. We must do better to ensure that if we use facial recognition and other invasive digital technologies, we do so with extreme care and caution for the well-being of citizens. Becoming aware of the pitfalls is essential, as is designing policy and stopgap measures to combat the injustice that could otherwise become embedded in the digital solutions we create.
