YOUR FACE IS KINDA A BIG DEAL, because of privacy and all that, which is why Microsoft wants to have ethical principles in place to prevent its facial recognition tech from being abused.
Bloomberg reported that Microsoft's president and chief legal officer Brad Smith said Redmond has plans to get its ethical principles drafted and have tools and governance systems in place to facilitate its self-imposed rules on its tech. It also plans to commence testing of the principles and systems behind them in March.
Such a move might seem a bit like Microsoft is almost shooting itself in the foot by limiting what people can do with its artificial intelligence (AI) powered tech.
But the desire for such restrictions is motivated by Microsoft wanting to cut off some of the risks of biased outcomes for facial recognition, as well as ensure that the tech doesn't abuse one's privacy like ol' Peeping Creepin' Tom from that shabby looking house a few doors down.
Part of this is down to the tendency of facial recognition tech to make mistakes, especially when it comes to people with darker skin. Such problems stem from using data sets that aren't diverse enough and were probably selected by white people who unconsciously didn't notice the lack of diversity.
Thing is, if facial recognition tech fingers you as a criminal, be it by mistake or not, then there are going to be consequences, all of which lead to an invasion of privacy.
As such, Microsoft is pretty keen on keeping its tech out of the hands of law enforcement agencies that don't have proper safeguards in place to prevent tech bias, to the extent that Microsoft had apparently turned down contracts, said Smith, noting one was around the use of the tech for public surveillance in a capital "in a country where we were not comfortable that human rights would be protected".
While Smith doesn't want Microsoft to lose customers, he still wants the company to have its self-imposed principles in place, and believes that industry-wide regulation of facial recognition tech is needed.
"You never want to create a market that forces companies to choose between being successful and being responsible and unless we have a regulatory floor there is a danger of that happening," said Smith.
That might be some wishful thinking given it was found that London's Met rozzers had been snooping on the capital's Xmas shoppers with facial recognition tech that was, admittedly, a bit crap.
But this is a line Smith has been toeing for a while, having already urged governments to get facial recognition tech regulations up and running.