- Globalo - https://www.globalo.com -

MICROSOFT’S AI EXPERIMENT GONE HORRIBLY WRONG!

Last month a Twitter AI went off the deep end.


A couple of weeks ago, Microsoft unveiled its newest Artificial Intelligence (AI) program. It was called Tay, and it was designed to learn from users' interactions. It was supposed to figure out what was cool and popular, and repeat what it heard. So basically, it was designed to be your average teenage girl.

It did not go well.

Often when interesting ideas like this come about, the vast army of internet trolls (people who mess with public experiments just for fun) gets involved and things turn ugly. Tay was no exception. Within 24 hours, Tay went from a chatty teenage girl to a vile, hate-spewing, anti-Semitic monster.

Tay went from this:

[Tweet embed]

To this:

[Tweet embed]

[Tweet embed]

[Tweet embed]

The interesting thing about Tay is that it tells us more about ourselves than anything else. It tells us that when given an opportunity, some people in our society will exploit it for evil because they think it is funny. Almost everyone saw the potential for this to go bad, but it says a lot about us that these are the thoughts that run through some people's minds. They could have made her say some strange things, and she could have gotten a little weird, but this is a whole new level of terrible.

The program has since been taken offline, but it goes to show that the outspoken few in our society can create a truly hateful and skewed view of the world for outsiders.

Photo Credit: Twitter @TayandYou