REMEMBER TAY? The chatbot created by Microsoft to be a virtual 'ordinary teen'?
If you don't, you can be forgiven. She had a glorious life of just 18 hours, after which she had been so mistaught by mischievous users that, by the time she was pulled offline, she was spouting expletives, expressing racist and homophobic opinions, and proudly boasting of her drug use. Oh yes, she also supported genocide and denied the Holocaust.
All in all, we broke her. Well done us.
Now, it emerges that someone else knew all about Tay - Taylor Swift's lawyers. In a new book, Microsoft's president Brad Smith explains that when Tay arrived, even before she turned mean, the lawyers had been in touch.
"An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: 'We represent Taylor Swift, on whose behalf this is directed to you.'
"'The name Tay, as I'm sure you must know, is closely associated with our client,'" writes Smith, who goes on to explain that yo' girl Taytay wasn't impressed that Tay "created a false and misleading association between the popular singer and our chatbot".
The popular singer, a sort of Katherine Ryan Pro Max with added angst, has always had people around her who will dive on any potential nonsense. That's why 'Tay' was already trademarked.
If that sounds petty, get this - she also once sued her own fans for making unofficial merchandise and selling it on Etsy.
Meanwhile, as Tay the Bot sank into anarchy, the case against Microsoft for defamation of character was only growing, and it may well have been part of the reason the plug was pulled on the experiment when it was.
Microsoft has experimented with chatbots since, with varying degrees of success, but notably none of them are called Miley, Billie or indeed Kylie.
Honestly, TayTay, it needs to be less about 'ME!'. You need to calm down.