
Microsoft’s Artificial Intelligence Experiment Turned Ugly Fast


Less than 24 hours after Microsoft unveiled its latest project, an artificial intelligence chat bot named “Tay,” modeled after the mind of a teenager, the company has already had to shut it down. Not because it immediately failed, or went rogue and launched a slew of nuclear missiles and ended the world, but because it did what a lot of other teenagers on the internet do: troll everyone.

In the project’s profile on Microsoft’s homepage, the company described Tay as follows:

Tay is an artificial intelligent chat bot developed by Microsoft’s Technology and Research and Bing teams to experiment with and conduct research on conversational understanding.

Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.

Though it has all been scrubbed from the account’s timeline now, for a while there, Tay was blasting all kinds of hateful commentary. Some of it was simple repetition, coming from Twitter users asking her to repeat inflammatory phrases and words throughout the day, but after a while it seems that Tay started generating her own terrible content. Eventually, she even dipped into the “Ted Cruz is the Zodiac Killer” meme.


The whole episode probably says more about the prankish nature of the internet than the future of AI.

I hope.
