A SELFIE-IMPROVING APP that probably never needed to exist in the first place has apologised after one of its filters suggested that the lighter your skin is, the hotter you are.
This is some racist bullshit, and it's bound to be explained away as a mistake by the company, because quite frankly no one would do that deliberately in 2017.
You should read the reviews on the Apple App Store. They are not complimentary about any of the filters, and many people have noticed that the controversial filter will often just make people lighter or remove their glasses. We don't know what people were expecting, to be honest, because this is not the sort of thing we indulge in ourselves.
The top review for the app on the App Store is a one-star effort that does not hold back, and it seems to come from a key part of the FaceApp demographic.
"I was playing around with filters and old pictures and the 'flash' option in the free game took away the glasses in one picture. Also, on a picture of a friend of mine it lightened her skin, took away her glasses, and changed all of her features so that she was unrecognizable. Additionally, the 'male' option took away all my hair in close proximity to or touching my face but kept any hair that is longer or far from the face so that it was floating," it says.
"Finally, this only works in photos with only one person who is in the exact middle. Otherwise it says there's no faces in the photo. I don't get that. It seems to me that the real use of this app is to make people smile in pictures so that you can use them, or other things like that. How can it be any real use if you can pretty much only use selfies? Overall, this app only fully works for white men who don't wear glasses. Fail."
Yaroslav Goncharov, CEO and founder of the company that makes the app, told BBC's Newsbeat that the firm knows it has done a bad thing, wants to blame the computers, and has renamed the filter from 'Hot' to 'Spark'.
"We are deeply sorry for this unquestionably serious issue," he said. "It is an unfortunate side-effect of the underlying neural network caused by the training set bias, not intended behaviour. To mitigate the issue, we have renamed the effect to exclude any positive connotation associated with it. We are also working on the complete fix that should arrive soon."