AI challenges gender-neutral connotations of ‘people’
What is a person? In theory, the term is gender-neutral, denoting a member of our species – ‘people’ and ‘person’ can refer to groups or individuals without implying maleness or femaleness. However, a study from New York University has challenged this idea. According to a team of psychology and linguistics researchers, ‘person’ and ‘people’ may be gender-neutral by definition, but not in how we actually use these words and ideas in society: in practice, we tend to prioritise men when referring to people in general. These findings concur with Richard A. Lovett’s observation for Cosmos magazine: “Whatever terms people may use in describing the average human, they are often mentally defaulting to ‘male’.”
Language has always been gendered, both in its actual structure (as a French and Spanish speaker, I can more than confirm that) and in the societal connotations attached to words through their use over time. In English, for example, imagine I described someone as ‘pretty’ – you’re more likely to picture a woman, because over the years the word has been associated more with femininity than masculinity. It’s here that the concept of neutral words crops up: in theory, the use of ‘person’ should not skew in favour of one gender, and yet the research suggests otherwise.
According to April Bailey, a postdoctoral researcher in New York University’s Department of Psychology and lead author of the paper, research in this area has generally focused on explicitly gendered questions: “Many forms of bias, such as the tendency to associate ‘science’ with men more than women, have been studied in the past, but there has been much less work on how we view a ‘person’.”
So how do you examine a word and the way it is used in wider society? The team employed artificial intelligence algorithms that learn the meaning of words based on how they’re used, and then provided them with a language repository – Common Crawl, which includes more than 630 billion mostly English-language words appearing on nearly three billion web pages. The researchers were particularly interested in how meaning is linked to context and use – put another way, a word’s implicit meaning can shift, and an unfamiliar word can acquire a definition, through the language and context that surrounds it.
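To make the approach concrete, here is a minimal sketch of that kind of comparison using gensim’s pre-trained GloVe vectors. This is not the researchers’ pipeline – their embeddings were derived from Common Crawl, whereas the model below is trained on Wikipedia and Gigaword, and the word choices are mine – so the numbers are illustrative only.

```python
# Minimal sketch (not the study's pipeline): load pre-trained word
# embeddings and compare how close 'neutral' person-words sit to
# 'man' versus 'woman' in the vector space.
import gensim.downloader as api

# GloVe vectors trained on Wikipedia + Gigaword (~128MB download);
# the NYU study used embeddings derived from Common Crawl instead.
vectors = api.load("glove-wiki-gigaword-100")

for neutral in ("person", "people", "human"):
    sim_man = vectors.similarity(neutral, "man")      # cosine similarity
    sim_woman = vectors.similarity(neutral, "woman")
    print(f"{neutral!r}: man={sim_man:.3f}, woman={sim_woman:.3f}")
```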
The team conducted three studies. In the first, they compared the similarity in meaning between words for people and words for men and women, finding that ‘people’ words overlapped with the concept ‘men’ significantly more than with ‘women’ – a statistically significant difference implying that ‘people’ and ‘men’ are closer in meaning. In the second, they examined descriptor words for the three groups, and again found greater overlap between men and people (for traits such as ‘extroverted’ and ‘analytical’). The third study examined the use of verbs, and once again found these words were used more similarly to words for men than to words for women. As the title of the Science Advances paper puts it, “based on billions of words on the internet, people = men”.
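Following the same logic, the kind of overlap the researchers measured can be approximated by averaging pairwise similarities between a set of ‘people’ words and sets of gendered words. A rough sketch follows; the word lists are illustrative placeholders rather than the study’s actual stimuli, and avg_similarity is a helper written for this example.

```python
# Rough sketch of the overlap comparison: average pairwise cosine
# similarity between 'people' words and gendered word sets. The word
# lists are illustrative placeholders, not the study's stimuli.
import numpy as np
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-100")

people_words = ["person", "people", "human", "individual", "somebody"]
men_words = ["man", "men", "male", "he", "him"]
women_words = ["woman", "women", "female", "she", "her"]

def avg_similarity(group_a, group_b):
    """Mean pairwise cosine similarity between two word lists."""
    sims = [vectors.similarity(a, b) for a in group_a for b in group_b]
    return float(np.mean(sims))

print("people vs men:  ", avg_similarity(people_words, men_words))
print("people vs women:", avg_similarity(people_words, women_words))
```

In the paper itself, the equivalent gap – computed over billions of words and tested for statistical significance – is what grounds the claim that ‘people = men’.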
Molly Lewis, a psychologist at Carnegie Mellon University who was not involved in the study, said: “This is the first [research] to study this really general gender stereotype – the idea that men are sort of the default humans – in this quantitative computational social science way.” And therein lies one of the most immediate implications of this work. Although past research has uncovered other types of gender bias, this study is novel in its interrogation of the supposedly neutral use of ‘person’, a concept that the paper says is “the basis for nearly all health, safety and workplace policy-making enacted in modern societies”. If the very concept of neutrality favours men, then even policies that strive for neutral representation subtly and unknowingly disadvantage women.
As Bailey notes, it’s also true that AI tools could develop major gender bias of their own, as this technology is trained on data similar to that used in this study: “AI learns from us, and then we learn from it. And we’re kind of in this reciprocal loop, where we’re reflecting it back and forth. It’s concerning because it suggests that if I were to snap my fingers right now and magically get rid of everyone’s own individual cognitive bias to think of a person as a man more than a woman, we would still have this bias in our society because it’s embedded in AI tools.”