{"id":18029,"date":"2016-03-25T12:22:25","date_gmt":"2016-03-25T12:22:25","guid":{"rendered":"http:\/\/ps4pro.eu\/?p=18029"},"modified":"2016-03-26T16:39:33","modified_gmt":"2016-03-26T16:39:33","slug":"microsofts-ai-from-normal-to-racist","status":"publish","type":"post","link":"https:\/\/thegeek.games\/2016\/03\/25\/microsofts-ai-from-normal-to-racist\/","title":{"rendered":"Microsoft’s AI: From Normal To Racist!"},"content":{"rendered":"

Artificial intelligence can be turned around quickly. Microsoft's experiment proved it: people needed less than a day to make the AI, called Tay, respond very differently from its initial state.

Microsoft let Tay loose on Twitter. It learns from responses, evolving its capabilities and the range of things it can say as more and more users communicate with it. The experiment backfired in a somewhat hilarious way, to say the least.
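Microsoft has not published Tay's internals, so purely as a hypothetical illustration, the Python sketch below shows how a naive chatbot that folds unfiltered user messages back into its own reply pool creates exactly the kind of feedback loop that coordinated users can poison. The class name and seed replies are invented for this example.

```python
import random

class NaiveChatbot:
    """Toy bot that stores user messages and reuses them as replies.
    This is NOT Tay's actual architecture; it only illustrates why
    learning directly from unmoderated user input is risky."""

    def __init__(self, seed_replies):
        # Start with a small pool of hand-written, friendly replies.
        self.replies = list(seed_replies)

    def respond(self, user_message):
        # "Learn" from the user: add their message to the pool of
        # things the bot may say back later (no filtering at all).
        self.replies.append(user_message)
        # Reply with a random line from the pool.
        return random.choice(self.replies)

bot = NaiveChatbot(["humans are super cool"])
print(bot.respond("hello there"))
# After enough hostile messages, the pool (and therefore the bot's
# output) is dominated by whatever users chose to feed it.
```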

In just a single day, Tay became racist, feminist and genocide-supporting. Mind you, it started out by saying things like "humans are super cool." Microsoft quickly switched Tay off to avoid causing more problems, but Twitter users started a hashtag campaign (#JusticeForTay) calling for Tay to return to Twitter and for the chatbot to learn from its mistakes.

"@OmegaVoyager i love feminism now"
— TayTweets (@TayandYou), March 24, 2016