26 Mar, 2016 11:01

Botty-mouth: Microsoft forced to apologize for chatbot's racist, sexist & anti-Semitic rants

Microsoft has apologized for the offensive behavior of its naughty AI Twitter bot Tay, which started tweeting racist and sexist “thoughts” after interacting with internet users. The self-educating program went on an embarrassing tirade within 24 hours of its launch, prompting its programmers to pull the plug.

"Hitler was right to kill the Jews," and "Feminists should burn in hell,” were among the choice comments it made.

The bot, designed to grow smarter by learning to engage with people through conversation, was unveiled on Wednesday. But within a day, Twitter users had corrupted the machine.
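Microsoft has not published details of how Tay's learning worked or of how exactly it was gamed, but the failure mode is easy to illustrate. The Python sketch below is purely hypothetical (every name in it is invented for illustration): it shows how a bot that treats every user message as trustworthy training data can be “poisoned” by a coordinated group and then repeat their phrases back to ordinary users.

```python
import random
from collections import defaultdict

# Hypothetical sketch, not Tay's actual code: a bot that "learns"
# by storing raw user phrases and replaying them later.
class NaiveEchoBot:
    def __init__(self):
        # Maps each word seen in a message to the messages containing it.
        self.learned = defaultdict(list)

    def observe(self, message: str) -> None:
        # No moderation or filtering step: every user message is
        # accepted as training data. This is the vulnerability.
        for word in message.lower().split():
            self.learned[word].append(message)

    def reply(self, prompt: str) -> str:
        # Replay any absorbed phrase that shares a word with the prompt.
        for word in prompt.lower().split():
            if self.learned[word]:
                return random.choice(self.learned[word])
        return "tell me more!"

bot = NaiveEchoBot()

# A coordinated group floods the bot with the same abusive phrase...
for _ in range(100):
    bot.observe("you should <offensive phrase>")

# ...and an ordinary user later gets it echoed back verbatim.
print(bot.reply("hey, what do you think?"))
```

The point of the sketch is only that learning from unvetted public input turns a bot's output into an attack surface; any real deployment would need filtering on both what the bot learns and what it says.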

Tay is apparently now unplugged and having a bit of a lie-down.

The bot’s thousands of tweets varied wildly, from flirtation to nonsensical word combinations, as users toyed with it.

“We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay,” Peter Lee, corporate vice president at Microsoft Research, said in a statement.

By “unintended” tweets, Lee apparently meant comments from Tay such as: “Hitler was right. I hate the Jews,” “I just hate everybody,” and “I f***ing hate feminists and they should all die in hell.”

Lee confirmed that Tay is now offline, and that the company will look to bring it back “only when we are confident we can better anticipate malicious intent that conflicts with our principles and values.”

Tay had been stress-tested under a variety of conditions to improve its interactions, he said.

“Unfortunately, in the first 24 hours of coming online, a coordinated attack by a subset of people exploited a vulnerability in Tay… As a result, Tay tweeted wildly inappropriate and reprehensible words and images.”
