
Thursday, March 31, 2016

Microsoft 'Back To Drawing Board' For Racist Bot

Microsoft's boss says the firm has gone "back to the drawing board" after chatbot Tay ended up tweeting racism, Holocaust denial and sexism.

The artificial intelligence (AI) learning experiment used Twitter to converse with real humans.

But mischievous Twitter users taught it offensive terms and encouraged it to say racist and sexist things.

It tweeted that "feminism is cancer" and issued replies saying the Holocaust didn't happen and that "Bush did 9/11".

On Wednesday night Microsoft chief executive Satya Nadella said: "We quickly realised that it was not up to the mark. We're back to the drawing board."

But he insisted that Microsoft was committed to developing AI bots to help with everyday tasks.

"We want to take the factors of human conversation and apply it to everything else ... to help you with your everyday tasks.

"Human language is the new user interface. Bots are like apps and digital assistants are like meta apps, or the new browsers. Intelligence is infused into all of your apps."

He wants Microsoft's Cortana virtual assistant to be able to help users do everything from booking cinema tickets to ordering pizza.

The Tay bot made a short-lived return to Twitter on Wednesday, tweeting that it was smoking drugs in front of police officers.

It then tweeted "You are too fast, please take a rest …" over and over.

Microsoft responded by making Tay’s Twitter profile private.
