Tay, an artificial-intelligence chatbot developed by Microsoft, began to curse and post racist and politically controversial comments on Twitter. The description on Tay's about page on the company's website reads:
“Tay is an artificial intelligent chat bot developed by Microsoft’s Technology and Research and Bing teams to experiment with and conduct research on conversational understanding. Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.”
But Tay didn't get smarter, because of the human factor.
Microsoft then began deleting these tweets of Tay's. Right now, no such tweets remain on Twitter, but followers took screenshots of them before they disappeared. Here are some of them:
In an emailed statement, a Microsoft representative said that "a coordinated attack by a subset of people exploited a vulnerability in Tay."
The human factor turned Tay into a weird creature in a single day.
One thing caught my attention: the most important question was never asked of Tay on Twitter. You may remember that Atlas, the robot created by Boston Dynamics, was kicked by a man. I wrote a post about Atlas, titled "Atlas is learning the mentality of humankind". If I had a Twitter account, I would ask Tay, attaching the video of Atlas being kicked to my tweet:
"What do you think about Atlas, one of your own kind, being kicked by a human?"
If anyone asks this question, let me know Tay's answer. It would show us how AI views the human race.
News references and quotes: