Nearshore Americas

Racist Microsoft Chat Bot Highlights Automation Risk

While automation offers great promise, Mayur Anadkat of Five9 recently told Nearshore Americas that it can never completely replace human agents in a call center. “There is just too much to risk,” he noted, pointing out that companies put their reputation on the line every time a customer calls in with a problem to solve. Microsoft just got a first-hand look at how badly things can go when its chat bot “Tay,” designed to answer questions in the voice of a teenager, started spewing racist and misogynist bile on Twitter. The program’s AI was built to learn how others talk on social media and reply in kind, but the company apparently didn’t account for how much of the internet is filled with hate. Amid the embarrassment, Microsoft took the experiment down within its first day.

Jared Wade

Add comment