Daily Links
Bing: “I will not harm you unless you harm me first”
Apparently, the AI-powered Bing has some issues. Most of which make it seem like it learned how to be an AI by reading a lot of evil-AI science fiction. Which, given that it's just predicting what word should come next based on the ones already provided, is probably painfully close to the truth. The training data had to come from somewhere.