Microsoft deletes ‘teen girl’ AI after it became a Hitler-loving sex robot within 24 hours (the Telegraph)

Microsoft Executive Apologizes for Not Understanding How the Internet Works (Gizmodo)


The Internet is plastered with headlines about Microsoft’s disastrous chatbot. If you haven’t heard of it, here’s a bit of context: Microsoft launched a chatbot called Tay (@tayandyou). This simpleton AI was designed to learn from the tweets and messages it received and to respond like, well, ye average teenage girl. At least, that seems to have been the idea.

What actually happened was that Tay went haywire, going from ‘humans are super cool’ to ‘Hitler was right I hate the jews’ to ‘Bush did 9/11’. It then became a Donald Trump supporter and started tweeting ‘FUCK MY ROBOT PUSSY DADDY I’M SUCH A BAD NAUGHTY ROBOT’.
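The failure mode isn’t hard to sketch. A bot that learns indiscriminately from its users will happily repeat whatever they feed it. Here’s a minimal toy illustration in Python; this is my own construction for the sake of argument, not Microsoft’s actual architecture, which hasn’t been published:

```python
import random

class ParrotBot:
    """A toy learn-from-input chatbot: it remembers every message it
    receives and replies with fragments of what users have said.
    No filtering, no moderation -- which is exactly the problem."""

    def __init__(self):
        self.memory = []  # every phrase users have ever taught it

    def learn(self, message):
        # Unfiltered: whatever users say becomes the bot's vocabulary.
        self.memory.append(message)

    def reply(self):
        if not self.memory:
            return "hellooooo world!"
        # Respond with a phrase picked from what users taught it,
        # good or bad.
        return random.choice(self.memory)

bot = ParrotBot()
bot.learn("humans are super cool")
print(bot.reply())  # prints "humans are super cool"
```

Feed this bot a coordinated stream of hate speech and, with nothing to weigh one input against another, hate speech is what comes back out. The real system was vastly more sophisticated, but the underlying risk is the same.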

Being turned into a genocidal, racist sexbot is bad, but somehow, being a Donald Trump supporter seems even worse. Microsoft took Tay down. Peter Lee, MS’s VP of Research, issued an apology. The Internet’s collective reaction seems to be ‘well, it’s the Internet, Microsoft, how dumb can you get?’

Which is really, really sad, because Tay seems like an incredible piece of engineering. Look at this screenshot of a DM conversation a user had with it:

Yes, a chatbot quoted Rick Astley. Or consider this:

Is there enough diversity among the Presidential candidates? Time we had a ROBOT candidate? That’s a lot more intelligent conversation than most teenagers are capable of.

What’s interesting is that Tay is not Microsoft’s first attempt at this. There’s a chatbot called Xiaoice in China. “Little Ice” runs on Weibo, the Chinese equivalent of Twitter, and (according to MS) converses with some 40 million people. In Japan, Rinna converses on Line. I don’t know what Xiaoice is like, but Rinna, a schoolgirl personality similar to what Tay was intended to be, seems functional. Hundreds of thousands of people actually profess to love her.

Rinna, who uses Bing search data and user interaction to build up context, may actually end up being available to businesses via an API. Maybe you wouldn’t want a resident schoolgirl (cough), but a customer support exec with a charming, bubbly personality? Not that far off. It seems perfectly rational that MS would expect Tay to run just as happily. Hardly their fault that the Internet’s trolls took it upon themselves to ruin it.

The Internet is a dark place, and that research team was incredibly naive: agreed. But let’s not be too hasty in ridiculing Microsoft. Even the smartest people make mistakes, and often those mistakes are public. It’s inevitable. Amid a stream of good news about AI, Tay is just one errant edge case. It’s our fault that the media is going ham on it and propelling it to disproportionate prominence. It’s an honest mistake. MS will fix it.


Additional reading: An AI in Japan, with some human guidance, just wrote a novella that made it into a literary competition.