Taylor Swift Threatened A Lawsuit Against A Racist Microsoft Bot Named Tay

September 11, 2019 / Posted by:

According to a new biography by Microsoft boss Brad Smith, the internet did him dirty when Microsoft tried to create an artificial intelligence bot named Tay that would tweet out things based on conversations it heard on Twitter. And with the name Tay, they probably thought they would only hear from Taylor Swift fans who talk about rainbows and kittens and boys they liked. Instead, Tay turned out to be one racist and salty bitch who preferred to tweet about how it loved genocide and the holocaust was a hoax. Wait, maybe that IS on brand for Taylor after all.

Turns out that Taylor thought this psycho AI bot’s name sounded too much like her own nickname and threatened to do what she does best: sue a bitch. I’m sure she had some great suggestions for new names unrelated to people: Bid Block, Mooter Mraun, and Dalvin Darris.

According to BBC News:

Taylor’s lawyers made a move on Microsoft in 2016, according to a new biography by its boss Brad Smith.

She was unhappy with the name of its chatbot Tay, meant to interact with 18 to 24-year-olds online, because it was similar to hers.

If you don’t remember TayTweets, it’s the Twitter chatbot that turned racist.

It took less than 18 hours for this bot to tweet about its love of genocide and the holocaust.

TayTweets was controlled by artificial intelligence and was designed to learn from conversations held on social media.

But shortly after Tay was launched, it tweeted to say it supported genocide and didn’t believe the holocaust happened – among other things.

Microsoft issued an apology and took Tay offline after less than 18 hours of offensive conversations on Twitter.

If only Microsoft had known and named the bot after Ann Coulter’s nickname, “the devil,” since bot Tay would basically be her dream bot baby. They wouldn’t even have to alter the tweets.
Taylor was not interested in having even a loose association with this bot and sent a lawyer to Brad Smith to let him know.

“An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you.’

“‘The name Tay, as I’m sure you must know, is closely associated with our client’,” he adds. “No, I actually didn’t know, but the email nonetheless grabbed my attention.”

He says the lawyer argued that the name Tay violated federal and state laws and “created a false and misleading association between the popular singer and our chatbot”.

I love that Brad Smith shades Taylor by saying he had no idea what Taylor Swift’s nickname is. And look, he’s a man of a certain age (over 12), so he probably shouldn’t know her nickname.

That said, I’m not totally sure what Microsoft was hoping to accomplish with the Tay Twitter bot experiment, but at least it’s confirmed that if bots hang out on Twitter too long they will come back as racist trash bags. Even the president got into not one, but two, celebrity Twitter fights this month. I guess no one is safe.

Pic: Wenn.com

