Friday, March 25, 2016

Microsoft Apologises For Chatbot's Lewd Tweets


Microsoft has said it is "deeply sorry" for the racist and sexist tweets that were generated by its Twitter chatbot - which had been designed to mimic the musings of a teenage girl.
Tay, which was pulled offline barely a day after it launched, was quickly taught a slew of anti-Semitic and offensive remarks by a group of mischievous Twitter users.
In a typical response, it tweeted that "feminism is cancer" - and also issued replies which said the Holocaust didn't happen, and "Bush did 9/11".
Another message read: "Hitler would have done a better job than the monkey we have now."
In an official blog post, a Microsoft executive wrote: "Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack.
"As a result, Tay tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility ahead of time."
Peter Lee, who is corporate vice president of the tech giant's research wing, said Microsoft would only bring Tay back "when we are confident we can better anticipate malicious intent that conflicts with our principles and values".
The botched experiment could prove embarrassing for Microsoft.
"I can't believe they didn't see this coming," said Kris Hammond, an artificial intelligence expert said.
Caroline Sinders, who develops chat robots for another company, described Tay as "an example of bad design" - and said that because the machine learned from whatever it was told, constant maintenance would be crucial.
Despite the setback, Mr Lee said the company was determined to make Tay resistant to juvenile Twitter users, adding: "We will remain steadfast in our efforts to learn from this and other experiences as we work toward contributing to an internet that represents the best, not the worst, of humanity."
