We need to talk about bias in machine translation. Translations produced by machines are often biased because of ambiguities in gender, number and forms of address. For example, when translating from English into French, should “student” be translated as the male “étudiant” or the female “étudiante”? Should “you” be translated as the informal “tu” or the formal “vous”? Machine translators often resolve these ambiguities badly, and with bias, because they don’t know what the user meant.
Fairslator is an experimental application which removes many such biases. Fairslator works by examining the output of machine translation, detecting when bias has occurred, and correcting it by asking follow-up questions such as “Do you mean a male student or a female student?” and “Are you addressing the person casually or politely?” Fairslator is a human-in-the-loop translator, built on the idea that you shouldn’t guess if you can ask.
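The ask-don’t-guess idea can be sketched in a few lines. This is not Fairslator’s actual implementation, just a minimal illustration under assumed names (the `AMBIGUITIES` table, `detect_ambiguities` and `resolve` are hypothetical): spot the source words whose translation depends on a user choice, ask the follow-up question, and pick the variant that matches the answer.

```python
# Hypothetical table of ambiguous English words, the follow-up question
# to ask the user, and the French variant for each possible answer.
AMBIGUITIES = {
    "student": {
        "question": "Do you mean a male or a female student?",
        "options": {"male": "étudiant", "female": "étudiante"},
    },
    "you": {
        "question": "Are you addressing the person casually or politely?",
        "options": {"casual": "tu", "polite": "vous"},
    },
}

def detect_ambiguities(source_words):
    """Return the source words whose translation depends on a user choice."""
    return [w for w in source_words if w in AMBIGUITIES]

def resolve(word, answer):
    """Pick the translation variant matching the user's answer."""
    return AMBIGUITIES[word]["options"][answer]
```

For example, `detect_ambiguities(["the", "student"])` flags `"student"`, and after the user answers “female”, `resolve("student", "female")` yields “étudiante”. A real system would detect ambiguity from morphology and context rather than a fixed word list, but the human-in-the-loop shape is the same.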
My name is Michal Měchura. I am a freelance language technologist. I started Fairslator because I was frustrated with how badly machine translators handle ambiguous input. No matter how smart the AI gets, some ambiguities will always be unresolvable because there are no clues in the input text. The only way to resolve them is to ask the user to disambiguate. Fairslator is where I’m tinkering with algorithms and UX for doing exactly that.