Taylor Swift tried to sue Microsoft over the name of its racist chatbot

Microsoft president Brad Smith has revealed that Taylor Swift tried to sue the company over the name of its chatbot, Tay. Smith recounts the episode in his upcoming book, Tools and Weapons, explaining that he was contacted by lawyers on Taylor Swift's behalf who claimed that the chatbot's name was too similar to her own, which allegedly "created a false and misleading association between the popular singer and [Microsoft's] chatbot, and that it violated federal and state laws."

Somewhat luckily for Microsoft, the company didn't have to deal with Taylor Swift for very long, because Twitter trolls managed to get Tay to spout extreme content, resulting in Microsoft taking it offline a day later. After interacting with Twitter users, Tay began blaming Bush for 9/11 and claiming the U.S. would build a wall and make Mexico pay for it, among other things.

In his upcoming book, Brad Smith writes:

“I was on vacation when I made the mistake of looking at my phone during dinner. An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you.’

“He went on to state that ‘the name Tay, as I’m sure you must know, is closely associated with our client.’ No, I actually didn’t know, but the email nonetheless grabbed my attention.

“The lawyer went on to argue that the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws.”

Aside from recalling Taylor Swift's threat to sue, Smith said that the episode taught Microsoft the importance of stronger AI safeguards.

Via: The Guardian
