If you tell Tay to "repeat after me," it will — allowing anybody to put words in the chatbot's mouth.
However, some of its weirder utterances have come out unprompted, including the transphobic "caitlyn jenner isn't a real woman yet she won woman of the year?"
It took less than 24 hours for Twitter to corrupt an innocent AI chatbot.
And Tay — being essentially a robot parrot with an internet connection — started repeating these sentiments back to users, proving correct that old programming adage: flaming garbage pile in, flaming garbage pile out.
Now, while these screenshots seem to show that Tay has assimilated the internet's worst tendencies into its personality, it's not quite as straightforward as that.
Update March 24th, AM ET: Updated to include Microsoft's statement.