If Police Glasses Come to America, Here's What Might Happen

You could get arrested!

Leah Zitter

Chicago. Technology companies are working with police departments to put surveillance glasses on officers so they can single out suspects anywhere, anytime, in a heartbeat. The setup already works in Dubai and in China - where it is also used to track people who take more than the approved amount of toilet paper in public restrooms. Police departments across the U.S. hope to pair body cameras with facial recognition that identifies suspects in real time. This would help them catch unsavory characters like the nude gunman of the Waffle House shooting faster.

In theory, this technology sounds wonderful. Missing people could be found in a jiffy: authorities simply slip their photographs into the feed, the feed connects to the device, and the A.I. picks them out of a crowd anywhere. Terrorist attacks could be avoided. The feed would tap into the database of the U.S. Bureau of Counterterrorism to root out potential terrorists using facial recognition. Cameras could zero in on matching features in real time, and police could swoop in to pick up the suspect. Humans are biased and make mistakes. In contrast, artificial intelligence works on mathematical algorithms that are supposed to cut down on errors.
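In rough terms, recognition like this boils down to comparing each face the camera sees against a watchlist of reference photos. The sketch below is a minimal, hypothetical illustration of that matching step - the embed() function, the watchlist entries, and the threshold are all invented for this example; a commercial system would use a deep neural network to turn each face into a vector, but the comparison logic is similar.

```python
import math

# Hypothetical stand-in: a real system would use a deep neural network to map
# a face image to a numeric vector. Here we pretend photos are already vectors.
def embed(photo):
    return photo

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# The "feed": reference photos slipped in by investigators (all values made up).
watchlist = {
    "missing_child": embed([0.9, 0.1, 0.3]),
    "robbery_suspect": embed([0.2, 0.8, 0.5]),
}

MATCH_THRESHOLD = 0.95  # invented cutoff; real systems tune this carefully

def check_frame(face_in_crowd):
    """Compare one face from the camera feed against every watchlist entry."""
    vector = embed(face_in_crowd)
    for name, reference in watchlist.items():
        score = cosine_similarity(vector, reference)
        if score >= MATCH_THRESHOLD:
            return name, score
    return None, 0.0

print(check_frame([0.88, 0.12, 0.31]))  # close to the missing child's photo -> match
print(check_frame([0.50, 0.50, 0.50]))  # an uninvolved passer-by -> no match
```

Everything hinges on the threshold and on how well the face-to-vector step separates people - which is exactly where the errors described below creep in.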

To test the capabilities of the system, the BBC sent journalist John Sudworth to Guiyang, a city in southwestern China with an urban population of about 3.5 million people, to see if he could get lost in the crowd. Surveillance cameras readily identified Sudworth as a “suspect,” and police had him in custody within seven minutes.

The algorithm works. Seemingly, the only ways to hide are coloring your face, donning a sombrero, or maybe wearing a mask.

Some People Could Beat the A.I. by Changing Their Facial, Ethnic, or Racial Characteristics

Back in 2014, Brisha Borden was running late to pick up her god-sister from school when she spotted an unlocked kid’s blue Huffy bicycle and a silver Razor scooter. She and her friend grabbed the bike and scooter and tried to ride them down the street. When the 18-year-old girls realized they were too big for the tiny conveyances, they dropped them and walked away. Too late. A witness had already called the police. Borden and her friend were charged with burglary and petty theft for the items, valued at a total of $80. This was Borden's first crime.

The previous summer, 41-year-old Vernon Prater was picked up for shoplifting $86.35 worth of tools from a nearby Home Depot store. Prater already had a record that included armed robbery and attempted armed robbery.

An officer using facial recognition glasses in Beijing

Something odd happened when Borden and Prater were booked into jail: a computer program spat out a score predicting the likelihood of each committing a future crime. Borden - who is black - was rated a high risk. Prater - who is white - was rated a low risk.
Two years later, Borden has not been charged with any new crimes. Prater is serving an eight-year prison term for breaking into a warehouse and stealing thousands of dollars’ worth of electronics.

In 2016, ProPublica analyzed the computerized risk scores assigned to more than 7,000 people arrested in Broward County, Florida, in 2013 and 2014, and checked to see how many were charged with new crimes over the next two years.

The results?

  • The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate of white defendants.
  • White defendants were mislabeled as low risk more often than black defendants. (A simplified sketch of how such error rates are computed follows this list.)
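The figures above are, at bottom, simple error-rate arithmetic. The sketch below shows one way such group-wise rates could be computed; the records and numbers are made up for illustration and are not ProPublica's actual data.

```python
# Illustrative only: made-up records, not ProPublica's data.
# Each record: (group, was_flagged_high_risk, reoffended_within_two_years)
records = [
    ("black", True,  False),
    ("black", True,  True),
    ("black", False, True),
    ("black", True,  False),
    ("white", False, False),
    ("white", False, True),
    ("white", True,  True),
    ("white", False, True),
]

def error_rates(records, group):
    rows = [r for r in records if r[0] == group]
    # False positive rate: flagged high risk among those who did NOT reoffend.
    stayed_clean = [r for r in rows if not r[2]]
    fpr = sum(r[1] for r in stayed_clean) / len(stayed_clean)
    # False negative rate: rated low risk among those who DID reoffend.
    reoffended = [r for r in rows if r[2]]
    fnr = sum(not r[1] for r in reoffended) / len(reoffended)
    return fpr, fnr

for group in ("black", "white"):
    fpr, fnr = error_rates(records, group)
    print(f"{group}: falsely flagged {fpr:.0%} of non-reoffenders, "
          f"missed {fnr:.0%} of reoffenders")
```

A gap in the first number between groups is the "false flag" disparity; a gap in the second is its flip side, defendants wrongly rated low risk.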

In another study, Harvard professor Latanya Sweeney looked at the Google AdSense ads that came up during searches of names associated with white babies (Geoffrey, Jill, Emma) and names associated with black babies (DeShawn, Darnell, Jermaine). She found that ads containing the word “arrest” appeared for at least 80 percent of searches on “black” names but fewer than 30 percent of searches on “white” names. In another incident, two black men found that Google's photo software had labelled them “gorillas” - the training data lacked enough examples of people of color.

To the unending disappointment of the 5,956 entrants in the online pageant Beauty A.I., the algorithm chose 44 winners, only one of whom had dark skin. The program seemed to conflate “beauty” with Western-centric standards. No, the A.I. is neither racist nor bent on punishing people of color for daring to integrate with whites. Rather, machines are programmed by humans and subsequently fed their programmers’ biases.

Let’s say programmers are building a computer model to identify terrorists. First, they train the algorithms with photos that are tagged with certain names and descriptors the programmers think typify terrorists. Then, they put the program through its paces with untagged photos of people and let the algorithms single out the “terrorist” based on what they learned from the training data. The programmers see what worked and what didn't and fine-tune from there.

The program is supposed to work, but bias intrudes when the training data is insufficiently diverse, prompting the software to guess based on what it “knows” - as happened with the black men tagged “gorillas”. Mistakes also occur when the training set contains too few counterexamples - say, when 85% of the photographs show swarthy, bearded men wearing turbans. The terrorist may be a white woman, but the smart glasses flag a Muslim man instead.
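To make that failure mode concrete, here is a toy classifier trained on a deliberately lopsided set of made-up “photos” - every attribute, label, and count below is hypothetical, invented only to mirror the 85% example above. Because nearly all of its “terrorist” examples share one look, the model waves the actual attacker through and flags the bearded man in a turban instead.

```python
from collections import Counter

# Training data: the programmers' tagged examples (all invented).
# 85% of the "terrorist" photos happen to show the same look, so the model
# never learns that anyone else can be a threat.
training = (
    [({"beard": True,  "turban": True},  "terrorist")] * 85
    + [({"beard": False, "turban": False}, "terrorist")] * 15
    + [({"beard": True,  "turban": True},  "innocent")] * 20
    + [({"beard": False, "turban": False}, "innocent")] * 180
)

def train(examples):
    """Count how often each (attribute, value) pair co-occurs with each label."""
    counts = {}
    for features, label in examples:
        for key, value in features.items():
            counts.setdefault((key, value), Counter())[label] += 1
    return counts

def predict(model, features):
    """Each attribute votes for the label it co-occurred with most often."""
    votes = Counter()
    for key, value in features.items():
        votes += model.get((key, value), Counter())
    return votes.most_common(1)[0][0]

model = train(training)

# The white woman who is actually the attacker slips past...
print(predict(model, {"beard": False, "turban": False}))  # -> "innocent"
# ...while a bearded man in a turban gets flagged.
print(predict(model, {"beard": True, "turban": True}))    # -> "terrorist"
```

Real facial-recognition systems rely on deep neural networks rather than simple counting, but the lesson is the same: a model can only generalize from whatever its training data happens to contain.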

Unfortunately, mistaken arrests carry enduring consequences.

Borden and her friend spent two nights in jail before being released on bond. “We literally sat there and cried” the whole time they were in jail, Borden's friend Sade Jones told ProPublica. Jones, who had never been arrested before, was rated a medium risk and has struggled to find work ever since. “I went to McDonald’s and a dollar store, and they all said no because of my background,” she said. “It’s all kind of difficult and unnecessary.”

While some applaud A.I. for returning lost kids and catching potential terrorists, others deem the technology more insidious. It could haul you to jail instead.