It didn’t take long for Microsoft’s new AI-infused search engine chatbot - codenamed “Sydney” - to display a growing list of discomforting behaviors after it was introduced early in February, with weird outbursts ranging from unrequited declarations of love to painting some users as “enemies.”

As human-like as some of those exchanges appeared, they probably weren’t the early stirrings of a conscious machine rattling its cage. Instead, Sydney’s outbursts reflect its programming, absorbing huge quantities of digitized language and parroting back what its users ask for. Which is to say, it reflects our online selves back to us.

And that shouldn’t have been surprising - chatbots’ habit of mirroring us back to ourselves goes back way further than Sydney’s rumination on whether there is a meaning to being a Bing search engine. In fact, it’s been there since the introduction of the first notable chatbot almost 50 years ago.

In 1966, MIT computer scientist Joseph Weizenbaum released ELIZA (named after the fictional Eliza Doolittle from George Bernard Shaw’s 1913 play Pygmalion), the first program that allowed some kind of plausible conversation between humans and machines. The process was simple: modeled after the Rogerian style of psychotherapy, ELIZA would rephrase whatever speech input it was given in the form of a question. If you told it a conversation with your friend left you angry, it might ask, “Why do you feel angry?”
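The Rogerian rephrasing described above can be sketched in a few lines of Python. This is a hypothetical, minimal illustration of the pattern-matching idea, not Weizenbaum's original ELIZA script, and the rules shown here are invented for the example:

```python
import re

# A few illustrative ELIZA-style rules: (pattern, response template).
# Weizenbaum's original script had many more rules and keyword rankings;
# these three are assumptions made up for this sketch.
RULES = [
    (re.compile(r".*\bI feel (\w+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r".*\bI am (\w+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r".*\bmy (\w+)", re.IGNORECASE), "Tell me more about your {0}."),
]


def respond(text: str) -> str:
    """Rephrase the user's statement as a question, Rogerian-style."""
    for pattern, template in RULES:
        match = pattern.match(text)
        if match:
            # Echo the captured word back inside a question.
            return template.format(*match.groups())
    return "Please go on."  # fallback when no rule matches


print(respond("I feel angry"))  # → Why do you feel angry?
```

The point of the sketch is how little machinery is involved: no understanding of meaning, just keyword matching and substitution, which is why ELIZA's conversations felt plausible while reflecting only what the user had typed.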