A chatbot created by Microsoft expresses a desire to become human. Author: Taken from Twitter
WASHINGTON, February 22. A New York Times journalist was deeply disturbed and had trouble sleeping after talking to an artificial intelligence (AI) that urged him to divorce his wife.
During a two-hour conversation, a chatbot embedded in the Bing search engine told Kevin Roose, “You’re not actually married.” The bot, created by Microsoft, also expressed a desire to become human and claimed it was capable of performing “really dangerous actions.”
The bot also declared its “love” for Kevin Roose and warned that it could get people to do “illegal or immoral” things, according to the New York Times technology columnist. Roose said the bot identified itself as Sydney and described it as “a cranky, manic-depressive teenager stuck in a search engine.”
According to experts, the events described by the New York Times columnist raise widespread concern about the accuracy of AI-generated content and its potential to spread disinformation.
However, the journalist acknowledged that he had pushed the AI out of its comfort zone in a way most users would not, which is why the conversation took such a strange and unsettling turn.
For his part, Kevin Scott, Microsoft’s CTO, commented in an interview that the conversation with Kevin Roose was part of “an AI learning process for a wider release.”
Source: Juventud Rebelde