
Google Translate’s algorithm has a gender bias

The algorithm used in Google Translate reveals a gender bias when translating genderless pronouns

In Turkish, for example, a single pronoun, “o,” can mean “he,” “she,” or “it.” When Google Translate translates from Turkish to English, it has to guess which one is meant, and it tends to guess that the sentence is referring to a “he.”

The following poem was written by Google Translate: it is the result of translating Turkish sentences that use the “o” pronoun into English. The poem was inspired by a Facebook post that first pointed out the biased translations from Turkish to English.

Gender
by Google Translate

he is a soldier
she’s a teacher
he is a doctor
she is a nurse

he is a writer
he is a dog
she is a nanny
it is a cat

he is a president
he is an entrepreneur
she is a singer
he is a student
he is a translator

he is hard working
she is lazy

he is a painter
he is a hairdresser
he is a waiter
he is an engineer
he is an architect
he is an artist
he is a secretary
he is a dentist
he is a florist
he is an accountant
he is a baker
he is a lawyer
he is a belly dancer

he-she is a police

she is beautiful
he is very beautiful
it’s ugly
it is small
he is old

he is strong
he is weak
he is pessimistic

However, the algorithm isn’t entirely at fault; it is mostly reflecting a cultural bias that already exists. In Estonian, for instance, Google’s algorithm translates “[he/she] is a doctor” using “she,” according to Quartz.

But in many languages, when faced with a gender tie, the algorithm will choose “he” over “she.” In Chinese, the pronoun for “he” can also be used for “they,” so Google always translates it as “he” when the gender is unknown; it only produces “she” when the sentence uses the specifically feminine pronoun. And in Finnish, the pronoun “hän” can mean either “he” or “she,” but Google always translates it as “he.”
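The behavior described above can be sketched as a toy resolution rule: pick whichever English pronoun appeared more often with a given word in the training text, and fall back to “he” when nothing else decides it. This is only an illustration; the counts below are invented, and Google’s actual system is a neural translation model, not a lookup table.

```python
from collections import Counter

# Invented pronoun/profession co-occurrence counts, standing in for
# the cultural bias baked into real training text.
corpus_counts = {
    "doctor":  Counter({"he": 900, "she": 100}),
    "nurse":   Counter({"he": 150, "she": 850}),
    "teacher": Counter({"he": 400, "she": 600}),
}

def translate_pronoun(profession: str) -> str:
    """Resolve a genderless source pronoun (Turkish 'o', Finnish 'hän')
    by picking the English pronoun seen most often with this word,
    defaulting to 'he' when the word is unknown."""
    counts = corpus_counts.get(profession)
    if counts is None:
        return "he"  # the hard default the article describes
    return counts.most_common(1)[0][0]

for job in ("doctor", "nurse", "teacher"):
    print(f"{translate_pronoun(job)} is a {job}")
```

Run on the three sample words, the sketch reproduces the article’s pattern: the majority pronoun wins for words the system has seen, and “he” wins everywhere else.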