
Mace & Crown | December 14, 2017


Artificial intelligence recognizing sexuality through pictures

Audra Reigle | Technology Editor

Artificial intelligence is expanding into the world of sexuality. It’s getting its own “gaydar,” and it’s suggested that it could be better than a human’s.

A study from Stanford University found that a machine classifier, shown a single image of a person, could correctly distinguish between gay and heterosexual men 81% of the time, and between lesbian and heterosexual women 74% of the time. Human judges managed only 61% accuracy for men and 54% for women. The algorithm’s accuracy rose to 91% for men and 83% for women when it was shown five images of a person.
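To make those numbers concrete, here is a minimal sketch of one way such pairwise accuracy can be computed, and of why seeing five images helps: averaging several noisy per-image scores reduces the noise. The scores below are simulated placeholders, not the study’s data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-image classifier scores ("probability of being gay").
# In a pairwise evaluation, the model sees one image of a gay person and
# one of a heterosexual person, and is "correct" when it assigns the
# higher score to the gay person's image.
scores_gay = rng.normal(0.6, 0.2, size=(1000, 5))       # 1000 people, 5 images each
scores_straight = rng.normal(0.4, 0.2, size=(1000, 5))

def pairwise_accuracy(a, b):
    """Fraction of (gay, straight) pairs where the gay person scores higher."""
    return float(np.mean(a > b))

# One image per person: use only the first image's score.
print("1 image :", pairwise_accuracy(scores_gay[:, 0], scores_straight[:, 0]))

# Five images per person: average the scores before comparing.
print("5 images:", pairwise_accuracy(scores_gay.mean(axis=1),
                                     scores_straight.mean(axis=1)))
```

Averaging five noisy scores shrinks the random error in each one, which is one plausible explanation for the jump in accuracy the study reported when the algorithm saw more photos.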

The artificial intelligence was trained on more than 35,000 public images posted by men and women on a U.S. dating site, according to The Guardian. “People of color were not included in the study, and there was no consideration of transgender or bisexual people,” the article says.

The software used for the study was called VGG-Face, according to The Economist. It converts each image into a long string of numbers known as a faceprint. “The next step was to use a simple predictive model, known as logistic regression, to find correlations between the features of those faceprints and their owners’ sexuality (as declared on the dating website),” the article says.
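As a rough illustration of that two-stage pipeline, here is a minimal sketch: a stand-in function plays the role of VGG-Face by producing a faceprint vector for each image, and scikit-learn’s logistic regression looks for correlations between those vectors and the labels. The extract_faceprint function and the random data are hypothetical placeholders, not the study’s actual code or dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def extract_faceprint(image):
    """Hypothetical stand-in for VGG-Face: map an image to a numeric vector."""
    return rng.normal(size=4096)

# In the real pipeline each entry would be a photo from the dating site;
# here the "images" are dummies and the faceprints are random placeholders.
faceprints = np.stack([extract_faceprint(f"photo_{i}.jpg") for i in range(500)])
labels = rng.integers(0, 2, size=500)   # sexuality label as declared on the site

X_train, X_test, y_train, y_test = train_test_split(
    faceprints, labels, test_size=0.2, random_state=0)

# Logistic regression finds a weighted combination of faceprint features
# that correlates with the label; on random data it scores near chance.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```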

Sexuality isn’t the only thing AI can tell by looking at your face. The Russian app FindFace matches photos of strangers to profiles on the social network VKontakte with about 70% accuracy, according to The Economist. The Chinese government keeps records of its citizens’ faces, and the FBI has photos of about half of America’s adult population stored in a database.

However, the technology has its benefits. Machines can analyze faces to diagnose rare genetic conditions earlier than is normally possible, and they could help autistic people grasp social signals they find elusive.

These systems can develop biases, though. If a system is trained mostly on white faces, it will not work as well on faces of other ethnicities. Such biases have already shown up in automated assessments that inform courts’ decisions about bail and sentencing.
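One way such a bias is caught is by reporting a model’s accuracy separately for each demographic group rather than as a single overall number. The sketch below does this on fabricated audit data; the group labels, error rates, and predictions are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical audit data: true labels, model predictions, and a demographic
# group tag for each face. A system trained mostly on one group often shows
# a gap like the one simulated here.
groups = rng.choice(["majority", "minority"], size=2000, p=[0.9, 0.1])
y_true = rng.integers(0, 2, size=2000)
noise = np.where(groups == "majority", 0.10, 0.30)  # minority faces misread more often
flip = rng.random(2000) < noise
y_pred = np.where(flip, 1 - y_true, y_true)

# Reporting accuracy per group, not just overall, is how this kind of bias
# gets surfaced before a system reaches courts or other high-stakes uses.
for g in ("majority", "minority"):
    mask = groups == g
    print(f"{g:9s} accuracy: {np.mean(y_pred[mask] == y_true[mask]):.2f}")
```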

It is clear that artificial intelligence is evolving, with pros and cons of its own. To eliminate its biases, those training the AI will need to bring in people of color and those who are transgender or bisexual so the machines learn to recognize those faces as well.