Gomez is one of 28 members of the U.S. Congress who were falsely matched with mugshots of people who had been arrested, as part of the American Civil Liberties Union's test of Amazon's Rekognition program last year.

Nearly 40% of the false matches produced by Amazon's tool, which is used by police, involved people of color.

These findings have drawn growing attention from civil liberties groups, legislators and even some technology companies. As the technology becomes more mainstream, facial recognition may harm ethnic minorities. Facial recognition is already used to unlock iPhone and Android phones, and police, retailers, airports and schools are gradually adopting it. But studies have shown that facial recognition systems have more difficulty identifying women and people with darker skin, which can lead to disastrous false positives.

“This is an example of how the application of technology in law enforcement can have harmful consequences for communities that are already over-policed,” said Jacob Snow, a technology and civil liberties attorney at the ACLU of Northern California.

Facial recognition has its benefits. Maryland police used the technology to identify a suspect in the mass shooting at the Capital Gazette; in India, it helped police identify nearly 3,000 missing children within four days; Facebook uses it to identify people in photos for visually impaired users; and it has become a convenient way to unlock smartphones.

But the technology is not perfect, and there have been some embarrassing mistakes. Google Photos once identified two Black people as gorillas. In China, a woman claimed a colleague could unlock her iPhone X using Face ID. And when law enforcement agencies use facial recognition to identify suspects or people at protests, the risk of misidentification increases.

“When law enforcement is using this technology to determine whether someone is wanted for a crime, that’s a completely different situation,” Gomez said. “A mistaken identity can lead to a fatal interaction between law enforcement and that person.”

Legislators were not shocked by the ACLU’s findings, noting that technologists often think more about how to make something work than about how the tools they build might affect minorities.

Technology companies have responded to the criticism by improving the data used to train their facial recognition systems, but, like civil rights activists, they have also called for more government regulation to help guard against abuse of the technology. Researchers at Georgetown University’s law school estimate that one in two American adults is included in a facial recognition network used by law enforcement.

Amazon disagrees with the ACLU’s research, arguing that the organization used the wrong methodology when testing the recognition system.

“Machine learning is a very valuable tool for helping law enforcement agencies. While there may be misjudgments, we can’t throw away the oven because we set the wrong temperature and burned the pizza,” Matt Wood, general manager of artificial intelligence at Amazon Web Services, wrote in a blog post defending the technology.
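Methodology disputes like this often come down to a single tunable parameter: the confidence threshold above which a candidate match is reported. A minimal Python sketch, using made-up names, similarity scores and threshold values (none of which reflect Amazon's actual API or the ACLU's data), shows how loosening or tightening that threshold changes which matches surface:

```python
# Minimal sketch of how a confidence threshold gates face-match decisions.
# The names, similarity scores and thresholds below are illustrative
# assumptions, not real API output or real test data.

candidate_matches = [
    {"name": "Person A", "similarity": 0.82},
    {"name": "Person B", "similarity": 0.91},
    {"name": "Person C", "similarity": 0.995},
]

def matches_above(candidates, threshold):
    """Return only the candidates whose similarity clears the threshold."""
    return [c for c in candidates if c["similarity"] >= threshold]

# A permissive threshold returns more (and riskier) matches...
print(matches_above(candidate_matches, 0.80))  # all three candidates

# ...while a strict threshold suppresses the weaker, likely-false matches.
print(matches_above(candidate_matches, 0.99))  # only Person C
```

The same gallery yields three “matches” at a permissive setting and one at a strict setting, which is why arguments over threshold configuration carry so much weight in these disputes.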

Recognizing the problem

Facial recognition services may have more difficulty identifying ethnic minorities and women than white men, for a variety of reasons.

Clare Garvie, a senior associate at the Center on Privacy & Technology at Georgetown Law, said the public photos that scientists use to train computers to recognize faces may include more white people than people of color. For example, if a company uses photos from a celebrity database, the data will skew white because minorities are underrepresented in Hollywood.
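One way researchers probe this kind of skew is simply to count who appears in the training set. A minimal sketch of such an audit, with fabricated file names and group labels (real datasets rarely carry clean, self-reported demographic labels):

```python
# Minimal sketch of auditing the demographic balance of a training set.
# File names and group labels are fabricated for illustration.

from collections import Counter

training_photos = [
    {"file": "img001.jpg", "group": "white"},
    {"file": "img002.jpg", "group": "white"},
    {"file": "img003.jpg", "group": "black"},
    {"file": "img004.jpg", "group": "white"},
    {"file": "img005.jpg", "group": "asian"},
]

counts = Counter(photo["group"] for photo in training_photos)
total = sum(counts.values())

# A skewed distribution here foreshadows skewed accuracy later:
# the model simply sees far more examples of the majority group.
for group, n in counts.most_common():
    print(f"{group}: {n} photos ({n / total:.0%} of training data)")
```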

Garvie said that the engineers at technology companies, who are mostly white, may also unknowingly design facial recognition systems that better recognize certain races. Research shows that people have more difficulty recognizing faces of another race, and that “cross-race bias” can seep into artificial intelligence. She added that lower color contrast on darker skin, as well as women’s use of cosmetics to cover wrinkles or changes in hairstyle, also pose challenges.

According to a study by researchers at the MIT Media Lab, facial recognition systems made by Microsoft, IBM and Face++ had trouble determining the gender of darker-skinned women, such as African Americans. The systems misidentified the gender of darker-skinned women as much as 35% of the time, compared with an error rate of about 1% for lighter-skinned men.
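Findings like MIT’s come from disaggregated evaluation: scoring a system’s predictions separately for each demographic group rather than reporting a single overall accuracy. A minimal sketch of that approach, with fabricated records (the group names and counts here are illustrative, not the study’s data):

```python
# Minimal sketch of per-group evaluation: compare predicted gender against
# ground truth, broken out by skin-type group. Records are fabricated.

records = [
    # (skin_type_group, true_gender, predicted_gender)
    ("darker_female",  "female", "male"),
    ("darker_female",  "female", "female"),
    ("darker_female",  "female", "male"),
    ("lighter_male",   "male",   "male"),
    ("lighter_male",   "male",   "male"),
]

def error_rate(records, group):
    """Fraction of a group's records where the prediction was wrong."""
    in_group = [r for r in records if r[0] == group]
    wrong = sum(1 for _, truth, pred in in_group if truth != pred)
    return wrong / len(in_group)

# Disparities only show up when accuracy is disaggregated like this;
# a single overall accuracy number would hide them.
print(f"darker_female error rate: {error_rate(records, 'darker_female'):.0%}")
print(f"lighter_male error rate: {error_rate(records, 'lighter_male'):.0%}")
```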

Another study, released by MIT researchers in January, showed that Amazon’s facial recognition technology had more difficulty than Microsoft’s or IBM’s tools in determining the gender of darker-skinned women.
