Tuesday, 27 November 2018

Google removes gendered pronouns from Gmail’s Smart Compose to avoid AI bias

Gmail’s Smart Compose is one of Google’s most interesting AI features in years, predicting what users will write in emails and offering to finish their sentences for them. But like many AI products, it’s only as smart as the data it’s trained on, and prone to making mistakes. That’s why Google has blocked Smart Compose from suggesting gender-based pronouns like “him” and “her” in emails — Google is worried it’ll guess the wrong gender.
Reuters reports that this limitation was introduced after a research scientist at the company discovered the problem in January this year. The researcher was typing “I am meeting an investor next week” in a message when Gmail suggested a follow-up question, “Do you want to meet him,” misgendering the investor.
Gmail product manager Paul Lambert told Reuters that his team tried to fix the problem in a number of ways, but none were reliable enough. In the end, says Lambert, the easiest solution was simply to remove these types of suggestions altogether, a change that Google says affects fewer than one percent of Smart Compose predictions. Lambert told Reuters that it pays to be cautious in cases like these, as gender is a “big, big thing” to get wrong.
This little bug is a good example of how software built using machine learning can reflect and reinforce societal biases. Like many AI systems, Smart Compose learns by studying past data, combing through old emails to find which words and phrases it should suggest. (Its sister feature, Smart Reply, does the same thing to suggest bite-size replies to emails.)
In Lambert’s example, it seems Smart Compose had learned from past data that investors were more likely to be male than female, so wrongly predicted that this one was too.
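The dynamic described above can be sketched with a toy example. This is not Google's actual model, and the tiny "corpus" below is invented for illustration; it just shows how a naive co-occurrence counter trained on skewed data ends up preferring "him" for "investor":

```python
# Toy illustration of learned pronoun bias (NOT Google's model).
# A count-based predictor picks whichever pronoun co-occurred most
# often with a noun in its (skewed) training data.
from collections import Counter, defaultdict

# Hypothetical training emails, skewed toward male investors.
corpus = [
    "i am meeting an investor do you want to meet him",
    "the investor said we should call him tomorrow",
    "our investor asked us to email him the deck",
    "an investor visited and we thanked her afterwards",
]

pronouns = {"him", "her"}
counts = defaultdict(Counter)
for email in corpus:
    words = email.split()
    if "investor" in words:
        for w in words:
            if w in pronouns:
                counts["investor"][w] += 1

def predict_pronoun(noun):
    """Return the pronoun seen most often with this noun in training."""
    return counts[noun].most_common(1)[0][0]

print(predict_pronoun("investor"))  # prints "him": 3 male examples vs. 1 female
```

Because the counts are 3-to-1 in favour of "him", the model confidently misgenders any investor it hasn't seen, which is exactly the failure mode that prompted Google to suppress the suggestions rather than try to patch the statistics.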