AMAZON HAS KILLED a secret artificial intelligence (AI) powered tool it used for recruitment due to the tech's hatred of women.
OK, hatred might be a tad hyperbolic, but the seller of everything found that its AI system, which filtered through CVs to select the best ones to put forward for human interview, was dismissing women.
But how did some lines of code end up being sexist, you may ask? Well, thanks to being trained on 10 years of application data that came mostly from male applicants, the system ended up favouring men over women, reported Reuters, which spoke to folks familiar with the AI.
The system worked by assigning a five-star rating to applicants so that the top-scoring ones would go on for more serious job consideration.
But the system was found to be penalising CVs with the word 'women's' in them, such as an applicant noting that they were captain of a women's chess club, for example.
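To make the failure mode concrete, here's a minimal, entirely hypothetical sketch: a toy word-weighting scorer trained on made-up CV snippets where past hires skew male. None of this is Amazon's actual system or data; it just shows how a term that only appears in rejected applications ends up dragging down any CV that contains it.

```python
from collections import Counter

# Hypothetical toy data: (CV text, 1 = hired, 0 = rejected).
# The historical skew in the labels, not the code, produces the bias.
past_cvs = [
    ("executed project plans captained rugby team", 1),
    ("captured market share executed strategy", 1),
    ("led debate society managed budgets", 1),
    ("women's chess club captain managed budgets", 0),
    ("women's coding society led fundraising", 0),
    ("organised charity events led volunteers", 0),
]

def word_weights(data):
    """Crude per-word weight: how often a word shows up in hired vs rejected CVs."""
    hired, rejected = Counter(), Counter()
    for text, label in data:
        (hired if label else rejected).update(text.split())
    words = set(hired) | set(rejected)
    # Laplace smoothing (+1) so unseen words don't cause division by zero.
    return {w: (hired[w] + 1) / (rejected[w] + 1) for w in words}

def score(cv, weights):
    """Average weight of a CV's words; above 1 leans 'hire', below 1 leans 'reject'."""
    ws = [weights.get(w, 1.0) for w in cv.split()]
    return sum(ws) / len(ws)

weights = word_weights(past_cvs)
# "women's" appears only in rejected CVs, so it gets a weight below 1 and
# pulls down any CV containing it - the failure mode described above.
print(score("women's chess club captain executed plans", weights))
print(score("chess club captain executed plans", weights))
```

The same CV scores lower simply for including the word "women's", even though nothing in the scorer mentions gender; the bias is inherited wholesale from the lopsided training data.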
Amazon did apparently try to edit the AI to treat such terms as neutral, but even then it couldn't say for sure that the AI hadn't developed further anti-female biases.
Furthermore, the AI was found to be favouring so-called masculine terms such as "executed" and "captured". The tech gods only know what it would have done with the phrase "getting smashed with the lads then heading for a cheeky Nandos with some top bants from Bob the Bantersaurus"; it probably would have ejaculated a mess of melted silicon.
According to the report, the recruiters thankfully didn't take the AI's recommendations as gospel, but we suspect a few potentially great female workers slipped through the net thanks to the AI's sexism.
The system was also found to be recommending candidates for unsuitable jobs; cue a facepalm moment. As such, it's been canned, with Amazon reportedly extracting what it learnt from the system to create a "much-watered down version" of the recruitment tool.
The whole situation highlights the dangers of bias in AIs and the need to be bloody careful with the datasets they're trained upon.
But AI is getting smarter, and we won't be concerned about its odd development foibles until the Google Assistant tells us to get in the kitchen and make it a sandwich.