Angry Bing chatbot just mimicking humans, say experts

A chatbot, by design, serves up words it predicts are the most likely responses, without understanding meaning or context.
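The "most likely response" idea can be sketched with a toy bigram model, a deliberate simplification (Bing's underlying model is vastly larger and more sophisticated), but the principle of echoing patterns from training text without understanding is the same:

```python
from collections import Counter, defaultdict

# Toy illustration only -- NOT Bing's actual model. A bigram "language
# model" that, given a word, emits whichever word most often followed it
# in its training text: pure mimicry, no understanding of meaning.
training_text = (
    "i am angry . i am not a toy . you are wrong . you are rude ."
).split()

# Count which word follows which in the training text.
follows = defaultdict(Counter)
for prev, nxt in zip(training_text, training_text[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word, or None if unseen."""
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("i"))  # -> "am", the most frequent continuation
```

If the training conversations are hostile, a model built this way reproduces hostile continuations, which is the mechanism the analysts quoted here are describing.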

NewVision Reporter
@NewVision

Microsoft's nascent Bing chatbot can turn testy or even threatening, likely because it essentially mimics what it learned from online conversations, analysts and academics said on Friday.

Tales of disturbing exchanges
