According to a new biography by Microsoft boss Brad Smith, the internet did him dirty when Microsoft tried to create an artificial intelligence bot named Tay that would tweet out things based on conversations it heard on Twitter. And with the name Tay, they probably thought they would only hear from Taylor Swift fans who talk about rainbows and kittens and boys they liked. Instead, Tay turned out to be one racist and salty bitch who preferred to tweet about how it loved genocide and the Holocaust was a hoax. Wait, maybe that IS on brand for Taylor after all.
Turns out that Taylor thought this psycho AI bot’s name sounded too much like her own nickname and threatened to do what she does best: sue a bitch. I’m sure she had some great suggestions for new names unrelated to people: Bid Block, Mooter Mraun, and Dalvin Darris.
According to BBC News:
Taylor’s lawyers made a move on Microsoft in 2016, according to a new biography by its boss Brad Smith.
She was unhappy with the name of its chatbot Tay, meant to interact with 18 to 24-year-olds online, because it was similar to hers.
If you don’t remember TayTweets, it’s the Twitter chatbot that turned racist.
It took less than 18 hours for this bot to start tweeting about its love of genocide and how the Holocaust was a hoax.
TayTweets was controlled by artificial intelligence and was designed to learn from conversations held on social media.
But shortly after Tay was launched, it tweeted to say it supported genocide and didn’t believe the Holocaust happened – among other things.
Microsoft issued an apology and took Tay offline after less than 18 hours of offensive conversations on Twitter.
“An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you.’
“‘The name Tay, as I’m sure you must know, is closely associated with our client’,” he adds. “No, I actually didn’t know, but the email nonetheless grabbed my attention.”
He says the lawyer argued that the name Tay violated federal and state laws and “created a false and misleading association between the popular singer and our chatbot”.
I love that Brad Smith shades Taylor by saying he had no idea what Taylor Swift’s nickname is. And look, he’s a man of a certain age (over 12), so he probably shouldn’t know her nickname.
That said, I’m not totally sure what Microsoft was hoping to accomplish with the Tay Twitter bot experiment, but at least it’s confirmed that if bots hang out on Twitter too long, they’ll come back as racist trash bags. Even the president got into not one, but two, celebrity Twitter fights this month. I guess no one is safe.