Holy crap, this is hilarious. Microsoft makes a self-learning chat robot and didn't think about any kind of filters for the ultra-PC, thin-skinned world we live in today. Anybody with half a brain knows people were obviously going to mess with the AI to see how far and how racist they could make the thing. I can't believe MS never thought about this angle. I mean seriously, meant to be chatting with 18-24 yr olds? What did they think they were going to do? As Seth Meyers would say.."REALLY..?"
"The artificial intelligence (AI) named “Tay” - @Tayandyou on Twitter - was intended to chat with 18-24 year olds with the idea being that she would learn from each tweet and get progressively smarter.
Clearly Microsoft had forgotten that Twitter is home to a huge amount of trolls, racists and general troublemakers who jumped at the chance to ‘teach’ the teenaged AI about life.
In one widely circulated tweet, Tay said: “Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we’ve got”.
She also went on to deny the existence of the Holocaust, and agreed with white supremacist propaganda that was tweeted at her.
Microsoft apparently didn’t put any kind of filters on the AI, which meant Tay was able to tweet a number of atrocious racial slurs.."
https://www.yahoo.com/tech/microsoft-launches-ai-chatbot-on-twitter-and-it-132424697.html
