The story of Tay the Twitter chatbot is short but spectacular: Microsoft introduced @TayandYou Wednesday morning, and within hours it was decrying feminism and spouting antisemitic remarks.
Microsoft, of course, has pulled the plug on Tay (for the moment, at least) just 15 hours after starting it up, and has had to delete its overtly racist, misogynist and otherwise messed-up tweets.
https://twitter.com/TayandYou/status/712856578567839745
The idea behind Tay was a bit more complex than that of your standard Twitter bot. Microsoft billed Tay as an artificial intelligence because it was designed to learn, over time, to interact organically with the people who tweeted at it.
It was also easily exploitable: you could tell it to “repeat after me” and it would parrot whatever you said. But the wild and disturbing stuff coming out of Tay’s, er, mouth was not limited to things it was told to repeat.
https://twitter.com/geraldmellor/status/712880710328139776?ref_src=twsrc%5Etfw
For example, it would at times respond approvingly to white supremacist sentiments and even generate some of its own (click here for a bunch of examples). Or it would say things like “ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism” — the Hitler-invented-atheism bit being an old internet troll joke that Tay evidently picked up somewhere.
That Tay went rogue is not surprising, or at least shouldn't be, to anyone who has spent time on Twitter. This is just how things go on that frontier.
literally anybody i know could've told you this would happen re @tayandyou. ONLY tech company people could be so clueless about humanity
— 🌏🔎Leigh Alexander 🐬💿✨ (@leighalexander) March 24, 2016