RESEARCH HAS FOUND that AI systems trained on photographs of people have developed a rather sexist view of women. Which is odd.
According to Wired, University of Virginia computer science professor Vicente Ordóñez noticed that when image-recognition software he was working with saw a kitchen, it would associate it with women. This got him wondering whether developers were unintentionally baking a gender bias into their work.
Between you and us, Ordóñez, we think they might be, and you have probably come to that conclusion as well.
Wired says that the researchers had a go at two popular research image collections, one of which is backed by Microsoft and Facebook, and found that the bias was there too. While shopping and washing were tied to lady types, us manly men got linked to stuff like shooting. Shooting machines, maybe.
Any software trained on these datasets will pick up the same bias, but magnified. The researchers' paper shows a photo of a man standing at a cooker that has been labelled "woman" by some stupid box of bolts.
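The amplification effect can be sketched with a toy example (this is our illustration, not the paper's actual method, and the numbers are made up): if two-thirds of "cooking" images in the training data are labelled woman, a predictor that simply maximises accuracy learns to guess the majority label every time, so its output is 100 per cent woman. The bias isn't merely copied, it's magnified.

```python
# Toy sketch of bias amplification (hypothetical numbers, not the
# paper's data): a skewed dataset plus an accuracy-maximising
# majority-class predictor turns a 66% bias into a 100% bias.
from collections import Counter

# Hypothetical training labels for images tagged "cooking"
train_labels = ["woman"] * 66 + ["man"] * 34

counts = Counter(train_labels)
majority = max(counts, key=counts.get)               # "woman"
dataset_bias = counts["woman"] / len(train_labels)   # 0.66

# Naive majority-class predictor applied to 100 new cooking images
predictions = [majority] * 100
predicted_bias = predictions.count("woman") / len(predictions)  # 1.0

print(f"dataset bias: {dataset_bias:.2f}, predicted bias: {predicted_bias:.2f}")
```

Real models are subtler than a majority-class guesser, but the paper's point is the same: optimising against skewed data rewards leaning into the skew.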
The researchers found a way of addressing the stupid bias, but it sounds like it involves a human being sitting around spotting errors of AI judgement, which is hardly the point. Other experts said that it is critical that this gets fixed because otherwise all smart things will be as bad as your uncle when he has a few shandies in him.
Aylin Caliskan, a researcher at Princeton, told Wired that something has to be done. "Steps can be taken afterwards to measure and adjust any bias if needed," she said.
"We risk losing essential information," she says. "The datasets need to reflect the real statistics in the world."
We are sad to report that they kinda do. µ