We need to talk about bias in machine translation. Translations produced by machines are often biased because of ambiguities in gender, number and forms of address. For example, when translating from English into French, should “student” be translated as the male “étudiant” or the female “étudiante”? Should “you” be translated as the informal “tu” or the formal “vous”? Machines often resolve these ambiguities badly, and with bias, because they don’t know what the user meant.
Fairslator is an experimental application which removes many of these biases. Fairslator works by examining the output of machine translation, detecting where bias has occurred, and correcting it by asking follow-up questions such as “Do you mean a male student or a female student?” or “Are you addressing the person casually or politely?” Fairslator is a human-in-the-loop translator, built on the idea that you shouldn’t guess if you can ask.
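To make the ask-instead-of-guess idea concrete, here is a minimal Python sketch of such a loop: detect which source words are ambiguous in the target language, put a follow-up question to the user for each one, and record the answers. Everything in it is an assumption made for illustration (the ambiguity table, the function names, the example sentence); it is not Fairslator’s actual code or API.

# A minimal sketch of the "ask, don't guess" loop described above.
# Illustrative only: the ambiguity table, function names and example
# sentence are assumptions made for this sketch, not Fairslator's code.

AMBIGUITIES = {
    "student": {
        "question": "Do you mean a male or a female student?",
        "options": {"male": "étudiant", "female": "étudiante"},
    },
    "you": {
        "question": "Are you addressing the person casually or politely?",
        "options": {"casually": "tu", "politely": "vous"},
    },
}

def detect_ambiguities(source: str) -> list[str]:
    """Return the words in the English sentence that are ambiguous in French."""
    words = [w.strip(".,?!").lower() for w in source.split()]
    return [w for w in words if w in AMBIGUITIES]

def disambiguate(source: str, ask) -> dict[str, str]:
    """Ask a follow-up question for each ambiguity; return the user's choices."""
    choices = {}
    for word in detect_ambiguities(source):
        entry = AMBIGUITIES[word]
        answer = ask(entry["question"], list(entry["options"]))
        choices[word] = entry["options"][answer]
    return choices

if __name__ == "__main__":
    # A stand-in for a real user interface: always pick the first option.
    pick_first = lambda question, options: options[0]
    print(disambiguate("The student is here.", pick_first))
    # -> {'student': 'étudiant'}

A real system would feed these choices back into the translator to produce the correctly gendered or correctly formal sentence; the sketch stops at collecting the user’s answers.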
My name is Michal Měchura. I am a freelance language technologist. I started Fairslator because I was frustrated with how badly machine translators handle ambiguous input. No matter how smart the AI gets, some ambiguities will always be unresolvable because there are no clues in the input text. The only way to resolve them is to ask the user to disambiguate. Fairslator is where I’m tinkering with algorithms and UX for doing exactly that.
October 2024 — We were talking about bias in machine translation at a Translating Europe Workshop organised by the European Commission in Prague, as part of Jeronýmovy dny, a series of public lectures and seminars on translation and interpreting. Video here »
December 2023 — Fairslator presented a workshop on bias in machine translation at the European Commission’s Directorate-General for Translation, attended by translation-related staff from all EU institutions.
November 2023 — Fairslator went to Translating and the Computer, an annual conference on translation technology in Luxembourg, to present its brand-new API. Proceedings from the conference are here; our paper starts on page 98.
November 2023 — We were talking about gender bias, gender rewriting and Fairslator at the EAFT Summit in Barcelona, where we also launched an exciting spin-off project: Genderbase, a multilingual database of gender-sensitive terminology.
February 2023 — We spoke to machinetranslation.com about bias in machine translation, about Fairslator, and about our vision for “human-assisted machine translation”. Read the interview here: Creating an Inclusive AI Future: The Importance of Non-Binary Representation »
October 2022 — We presented Fairslator at the Translating and the Computer (TC44) conference in Luxembourg, Europe’s main annual event for computer-aided translation. Proceedings from the conference are here; the paper that describes Fairslator starts on page 90. Read our impressions from TC44 in this thread on Twitter and Mastodon.
September 2022 — In her article Error sources in machine translation: How the algorithm reproduces unwanted gender roles (German original: Fehlerquellen der maschinellen Übersetzung: Wie der Algorithmus ungewollte Rollenbilder reproduziert), Jasmin Nesbigall of oneword GmbH talks about bias in machine translation and recommends Fairslator as a step towards more gender fairness.
September 2022 — Fairslator was presented at the Text, Speech and Dialogue (TSD) conference in Brno.
August 2022 — Translations in London are talking about Fairslator in their blog post Overcoming gender bias in MT. They think the technology behind Fairslator could be useful in the translation industry for faster post-editing of machine-translated texts.
July 2022 — We presented a paper titled A Taxonomy of Bias-Causing Ambiguities in Machine Translation at the Workshop on Gender Bias in Natural Language Processing during the 2022 Annual Conference of the North American Chapter of the Association for Computational Linguistics in Seattle.
May 2022 — Slator.com, a website for the translation industry, asked us for a guest post, and of course we didn’t say no. Read What You Need to Know About Bias in Machine Translation »