MICROSOFT HAS been showing off improvements to its facial recognition tools, which now respond better to different skin tones and genders.
The company says its latest tweaks have cut error rates for men and women with darker skin tones by up to 20 times. Overall, error rates for women, regardless of skin tone, have been reduced ninefold.
Microsoft's John Roach explains: "The higher error rates on females with darker skin highlights an industrywide challenge: Artificial intelligence technologies are only as good as the data used to train them.
"If a facial recognition system is to perform well across all people, the training dataset needs to represent a diversity of skin tones as well as factors such as hairstyle, jewellery and eyewear."
The improvements have now been rolled into the Face API in Azure Cognitive Services, where three big changes are at work - better sampling across a wider spectrum of subjects, revised benchmarks, and a better-organised classifier.
"We had conversations about different ways to detect bias and operationalize fairness. We talked about data collection efforts to diversify the training data. We talked about different strategies to internally test our systems before we deploy them," said Hanna Wallach, a senior researcher at Microsoft's NY research lab and an expert on fairness, accountability and transparency in AI systems.
The gender classifier was apparently the hardest thing to improve. The old adage of "garbage in, garbage out" has been around as long as computing, and in these relatively early days of AI we're having to learn that if we put bias in, we'll get bias out - and compensating for that is going to be ruddy hard.
In the example Microsoft gives, an AI will likely learn that a CEO is a man, because nearly all the pictures of CEOs in its training data show men.
Essentially, AI is currently living through the attitudes of the seventies. It's now up to Microsoft and its ilk to enlighten it. μ