An anecdote from Microsoft president Brad Smith’s upcoming book reveals that he once received a legal threat from Taylor Swift over a chatbot.

The chatbot in question was called XiaoIce, and was designed to converse with real people on social media. In the US, however, it was named Tay.

In his upcoming book “Tools and Weapons,” per the Guardian’s Alex Hern, Smith says he was on holiday having dinner when he received a message.


“An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you,'” Smith writes.

“He went on to state that ‘the name Tay, as I am sure you must know, is closely associated with our client.’ No, I actually didn’t know, but the email nevertheless got my attention. The lawyer went on to argue that the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws.”

In 2016 Tay was given its own Twitter account, where it could learn from its interactions with Twitter users. Unfortunately, it was quickly manipulated into spewing horrendously racist tweets, at one point denying the Holocaust.

Microsoft shut Tay down after less than 24 hours.


Smith says the incident taught him “not just about cross-cultural norms but about the need for stronger AI safeguards.”

Business Insider reached out to Taylor Swift’s representatives for comment on the incident, and on whether the matter was resolved to her satisfaction. They were not immediately available for comment.