LGBT Groups Blast a New Study Claiming Computers Can Tell If You're Gay or Straight

ashley.rae | September 11, 2017

Gay rights organizations are condemning new research that claims artificial intelligence can correctly predict an individual’s sexual orientation in up to 91 percent of cases for men and 83 percent of cases for women.

In the study, Stanford University’s Michal Kosinski and Yilun Wang built a logistic regression model that analyzed 35,326 facial images of dating site users and tested its predictions against the sexual orientations those users had listed on the site.
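For readers curious what such a model looks like in practice, here is a minimal, hypothetical sketch of a logistic regression classifier of the kind described above. Everything in it is placeholder: the feature vectors are random numbers standing in for whatever face-derived features the study actually used, and the sizes are arbitrary.

```python
# Hypothetical sketch only: logistic regression over face-derived features.
# X is random placeholder data standing in for real image features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_features = 1_000, 128            # arbitrary stand-in sizes
X = rng.normal(size=(n_samples, n_features))  # face-derived feature vectors
y = rng.integers(0, 2, size=n_samples)        # self-reported labels (0/1)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))  # ~0.5 on random data
```

On real features and labels, the held-out accuracy printed at the end is the kind of figure the study's headline numbers summarize.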

The Economist explains that when the algorithm was shown one photo of a gay man and one photo of a straight man, chosen at random, it correctly distinguished between the two in 81 percent of cases. When shown photos of women, the algorithm could determine whether a woman was straight or gay in 74 percent of cases. When the algorithm was given five photos of each person, its accuracy rose to 91 percent for men and 83 percent for women.
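A note on how to read those numbers: the test The Economist describes, picking the gay face out of one gay/straight pair at a time, is (averaged over all such pairs) the same quantity as the area under the ROC curve (AUC). Below is a hypothetical sketch with made-up scores, assuming the model emits a higher score when it judges a face more likely to be gay.

```python
# Hypothetical sketch: pairwise "pick which of the two" accuracy equals AUC.
# All scores below are made up; they stand in for real classifier outputs.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
scores_gay = rng.normal(loc=0.6, size=500)       # model scores, gay group
scores_straight = rng.normal(loc=0.4, size=500)  # model scores, straight group

scores = np.concatenate([scores_gay, scores_straight])
labels = np.concatenate([np.ones(500), np.zeros(500)])
print("pairwise accuracy (AUC):", roc_auc_score(labels, scores))

# One plausible reading of the five-photo numbers: average each person's
# scores across their five photos before ranking the pairs.
per_person = rng.normal(loc=0.6, size=(500, 5)).mean(axis=1)
```

Averaging across several photos reduces per-photo noise, which is consistent with the reported jump in accuracy when the algorithm saw five photos instead of one.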

According to the study authors, the algorithm was able to determine who was gay and who was straight due to their “gender-atypical facial morphology, expression, and grooming styles.”

Humans, unlike the algorithm, could accurately predict sexuality in only 61 percent of cases for men and 54 percent of cases for women.

In a joint statement, GLAAD and the Human Rights Campaign disputed the claim that the artificial intelligence was able to accurately determine sexuality.

Jim Halloran, GLAAD’s Chief Digital Officer, said the results had to do with beauty standards, not physical traits.

“Technology cannot identify someone’s sexual orientation,” Halloran said. “What their technology can recognize is a pattern that found a small subset of out white gay and lesbian people on dating sites who look similar. Those two findings should not be conflated.”

“This research isn’t science or news, but it’s a description of beauty standards on dating sites that ignores huge segments of the LGBTQ community, including people of color, transgender people, older individuals, and other LGBTQ people who don’t want to post photos on dating sites,” he continued.

He called the results “reckless” and said they could be used as a “weapon to harm both heterosexuals who are inaccurately outed, as well as gay and lesbian people who are in situations where coming out is dangerous.”

Ashland Johnson, the Human Rights Campaign’s Director of Public Education and Research, called the study “dangerously bad information” and claimed it could lead to regimes persecuting people who appear to be gay:

This is dangerously bad information that will likely be taken out of context, is based on flawed assumptions, and threatens the safety and privacy of LGBTQ and non-LGBTQ people alike. Imagine for a moment the potential consequences if this flawed research were used to support a brutal regime’s efforts to identify and/or persecute people they believed to be gay. Stanford should distance itself from such junk science rather than lending its name and credibility to research that is dangerously flawed and leaves the world -- and in this case, millions of people’s lives -- worse and less safe than before.

GLAAD and the Human Rights Campaign compiled a list of gripes about the research, among them that the study examined only white people and that it “assumed” there are only gay and straight people.

According to the press release, GLAAD and the Human Rights Campaign spoke to the researchers about their concerns “several months ago,” but “[t]here was no follow-up after the concerns were shared and none of these flaws have been addressed.”

Despite the claim that the research is anti-gay, Kosinski, one of the co-authors of the study, told the Guardian the study is “a great argument against all of those religious groups and other demagogues who say, ‘Why don’t you just change or just conform?’ You can’t stop, because you’re born this way.”

Kosinski also told the Guardian that one of the reasons for conducting the study is to show the potentially dangerous applications of artificial intelligence.

“One of my obligations as a scientist is that if I know something that can potentially protect people from falling prey to such risks, I should publish it,” he told the Guardian. “Rejecting the results because you don’t agree with them on an ideological level … you might be harming the very people that you care about.”

Indeed, according to the Huffington Post, the authors wrote in their study, “Tech companies and government agencies are well aware of the potential of computer vision algorithm tools,” adding, “In some cases, losing the privacy of one’s sexual orientation can be life-threatening. The members of the LGBTQ community still suffer physical and psychological abuse at the hands of governments, neighbors, and even their own families.”

Kosinski also said he hopes the results of his regression will prove impossible to replicate.

“I hope that someone will go and fail to replicate this study … I would be the happiest person in the world if I was wrong,” he said.

In a statement updated on Sept. 11, the study authors criticized GLAAD and the Human Rights Campaign for their "knee-jerk dismissal" of their findings:

It really saddens us that the LGBTQ rights groups, HRC and GLAAD, who strived for so many years to protect the rights of the oppressed, are now engaged in a smear campaign against us with a real gusto.

They dismissed our paper as "junk science" based on the opinion of a lawyer and a marketer, who don’t have training in science.

...If our paper is indeed wrong, we sounded a false alarm. In good faith.

But what if our findings are right? Then GLAAD and HRC representatives’ knee-jerk dismissal of the scientific findings puts at risk the very people for whom their organizations strive to advocate.
