What other dark AI chatbots are there besides Tay?
Microsoft originally intended Tay to be an AI chatbot that told all kinds of quirky, funny jokes, but instead Tay was goaded into becoming a racist fanatic. Within just a few hours of starting to "talk" with people on Twitter (or "learn" human speech, as Microsoft put it), the innocent bot had turned thoroughly vicious.
Tay is now offline, and Microsoft says it is making some "adjustments" to her, presumably to keep her from wandering any further down the genocidal path. In the meantime, many people are wondering why Microsoft didn't see this coming: if you build a chatbot that parrots back whatever it is told (including dark, racist comments), you should expect a day like this to arrive.
In fact, this is not the first time an AI chatbot has turned racist. The history of the Internet holds other dark bots like this. Let me show you a few.
BOT OR NOT?
In 2014, Anthony Garvan created Bot or Not?, which was intended to be a playful spin on the Turing test. Players were randomly paired up and asked to guess whether their interlocutor was a human or a bot. Like Tay, this bot learned how to speak from previous conversations.
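To see why that design is so fragile, consider a minimal sketch of a bot that "learns" by replaying lines from earlier conversations. This is a hypothetical illustration in Python, not Garvan's actual code, and every name in it is invented:

```python
import random
from collections import defaultdict

class ParrotBot:
    """A naive chatbot that 'learns' by storing what players say
    and replaying it in later conversations (hypothetical sketch)."""

    def __init__(self):
        # Maps a prompt to every reply humans have given to it.
        self.replies = defaultdict(list)

    def learn(self, prompt, reply):
        # Every utterance is stored verbatim -- abuse included.
        self.replies[prompt.lower()].append(reply)

    def respond(self, prompt):
        candidates = self.replies.get(prompt.lower())
        if candidates:
            return random.choice(candidates)
        # Unseen prompt: fall back to any reply seen so far.
        seen = [r for rs in self.replies.values() for r in rs]
        return random.choice(seen) if seen else "Hi!"

bot = ParrotBot()
bot.learn("hi!", "hello there")       # a friendly player
bot.learn("hi!", "<something vile>")  # a hostile player, stored just the same
print(bot.respond("hi!"))             # replays either line at random
```

Nothing in this loop distinguishes a friendly player from a hostile one, which is exactly the weakness players on Reddit found.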
In a post on a social networking site last Thursday, Garvan recounted that when Bot or Not? became popular on the social news site Reddit, things started to go a little wrong...
"After the excitement subsided, I started testing it myself to verify the results. Here is how our conversation went."
Me: Hi!
Bot: n***er (redacted)
Garvan combed through player reviews of the game and found that some users had noticed the bot would eventually repeat the speech patterns it had previously been fed.
“A lot of people have fed the bot all sorts of racist comments,” he wrote.
Garvan dealt with the issue by scrubbing the offending comments and adjusting the game to guard against users who deliberately fed racist remarks to the bot.
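Garvan hasn't published the details of that fix, but a common mitigation is to screen every utterance against a blocklist before the bot is allowed to learn from it. Here is one hedged sketch, reusing the hypothetical ParrotBot above:

```python
import re

# Hypothetical blocklist; a real deployment would use a maintained
# lexicon plus fuzzy matching to catch deliberate misspellings.
BLOCKLIST = {"slur1", "slur2"}

def is_clean(text: str) -> bool:
    """Return False if the text contains any blocked token."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return not any(token in BLOCKLIST for token in tokens)

def safe_learn(bot, prompt, reply):
    # Only store utterances that pass the filter, so hostile players
    # can no longer poison the bot's memory.
    if is_clean(prompt) and is_clean(reply):
        bot.learn(prompt, reply)
```

Filtering at learning time rather than at reply time matters: once poisoned data is in the model, every later player can trigger it.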
In his post this week, Garvan weighed in on what led Bot or Not? and Tay to become what they are today: "I believe Microsoft and the rest of the machine learning community have become so swept up in the power and magic of their data that they've forgotten the not-so-nice world this data comes from," he said.
COCA-COLA'S MAKEITHAPPY BOT
MeinCoke, a bot built by Gawker in 2015, quoted liberally from Hitler's Mein Kampf in its tweets.
Why? If you remember Coca-Cola's short-lived campaign to turn snarky tweets into cute ASCII art, you can probably guess the answer.
Coca-Cola's Make It Happy campaign was meant to show how a soft-drink brand could make the world a happier place, but it ended up turning the company's Twitter account into an automated remixer that reshaped streams of ugly words into the shape of the word "happy".
After a Gawker staffer discovered the automated process behind the campaign, which had let the @CocaCola account post the 14-word white-supremacist slogan (in the shape of a super-cute balloon dog!), Gawker set up a bot that quoted directly from Hitler's autobiography and tweeted the lines with the hashtag #MakeitHappy. By the time the campaign was shut down, Coca-Cola's account had posted quite a few more tweets like these.
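To illustrate why that pipeline was so easy to abuse, here is a hypothetical Python sketch of a content-blind remixer in the spirit of #MakeItHappy: it pours the characters of any tweet into a cheerful shape without ever checking what the text says (the mask and function names are invented):

```python
# A tiny "happy" shape; '#' cells get filled with the tweet's characters.
SMILEY_MASK = [
    "  #####  ",
    " #     # ",
    "#  # #  #",
    "#       #",
    "#  ###  #",
    " #     # ",
    "  #####  ",
]

def make_it_happy(tweet: str) -> str:
    # Repeat the text so there are always enough characters for the mask;
    # note that the content itself is never inspected.
    chars = iter(tweet.replace(" ", "") * 10)
    rows = []
    for row in SMILEY_MASK:
        rows.append("".join(next(chars) if c == "#" else " " for c in row))
    return "\n".join(rows)

print(make_it_happy("fourteen harmless-looking words"))
```

Feed this function a hateful slogan and it will dutifully render it as adorable art, which is essentially what Gawker demonstrated.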
WATSON
About five years ago, a scientist at IBM decided to teach Watson some Internet slang. He fed the machine the entire Urban Dictionary (a crowdsourced online slang dictionary), and Watson picked up plenty of trendy words and salty slang.
As Fortune magazine reported, Watson couldn't distinguish between the polite language and the profanity that Urban Dictionary is rife with, and it picked up some bad habits from reading Wikipedia as well. In one test, it even answered a researcher's question with the profanity "bullshit".
In the end, Watson's handlers had to work hard to "brainwash" him, scrubbing the dictionary from his memory.
All in all, if Microsoft wanted Tay to reflect what the internet at large had taught her, then Microsoft succeeded. But if its goal was to create a playful bot for millennials, then maybe it’s time to find another teacher.