
Facial recognition software often misgenders transgender people: Study

By IANS | Updated: October 30, 2019 14:00 IST

Facial recognition software can categorise the gender of many men and women with remarkable accuracy, but if that face belongs to a transgender person, such systems get it wrong more than one-third of the time, says a new study.


"We found that facial analysis services performed consistently worse on transgender individuals, and were universally unable to classify non-binary genders," said study lead author Morgan Klaus Scheuerman from the University of Colorado Boulder.

"While there are many different types of people out there, these systems have an extremely limited view of what gender looks like," Scheuerman added.

Previous research suggests such systems tend to be most accurate when assessing the gender of white men, but misidentify women of colour as much as one-third of the time.

"We knew there were inherent biases in these systems around race and ethnicity and we suspected there would also be problems around gender," said senior author Jed Brubaker.

"We set out to test this in the real world," Brubaker said.

For the findings, researchers collected 2,450 images of faces from Instagram, each of which had been labelled by its owner with a hashtag indicating their gender identity.

The pictures were then divided into seven groups of 350 images (#woman, #man, #transwoman, #transman, #agender, #genderqueer, #nonbinary) and analysed by four of the largest providers of facial analysis services (IBM, Amazon, Microsoft and Clarifai).

Notably, Google was not included because it does not offer gender recognition services.
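The services in the study expose gender only as a binary attribute of a detected face. As a rough illustration of the kind of API call involved, and not the researchers' actual code, a single image could be submitted to Amazon Rekognition along these lines; the file name "face.jpg" is a placeholder and configured AWS credentials are assumed:

```python
# Minimal sketch: query a commercial facial analysis API for its gender attribute.
# Assumes AWS credentials are configured; "face.jpg" is a placeholder image.
import boto3

client = boto3.client("rekognition")

with open("face.jpg", "rb") as f:
    image_bytes = f.read()

# Request all facial attributes; the response includes a binary
# Gender field with a confidence score.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    gender = face["Gender"]  # e.g. {"Value": "Female", "Confidence": 99.1}
    print(gender["Value"], round(gender["Confidence"], 1))
```

Because such a gender field can only return "Male" or "Female", a non-binary label is impossible by design, which is consistent with the 100 per cent misclassification rate the study reports for agender, genderqueer and nonbinary participants.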

On average, the systems were most accurate with photos of cisgender women (those born female and identifying as female), getting their gender right 98.3 per cent of the time. They categorised cisgender men accurately 97.6 per cent of the time.

But trans men were wrongly identified as women up to 38 per cent of the time.

And those who identified as agender, genderqueer or nonbinary - indicating that they identify as neither male nor female - were mischaracterised 100 per cent of the time.

The study also suggests that such services identify gender based on outdated stereotypes.

When researcher Scheuerman, who is male and has long hair, submitted his own picture, half of the services categorised him as female.

"These systems run the risk of reinforcing stereotypes of what you should look like if you want to be recogniwed as a man or a woman. And that impacts everyone," said Scheuerman.

The authors said that they would like to see tech companies move away from gender classification entirely and stick to more specific labels like 'long hair' or 'make-up' when assessing images.

The research is scheduled to be presented in November at the ACM Conference on Computer Supported Cooperative Work in the US.

(With inputs from IANS)

Tags: University of Colorado Boulder, IBM, Microsoft