In the two weeks since Microsoft unveiled its revamped Bing, more than a million users around the world have experimented with a public preview of the new product after signing up for a waitlist to try it. Microsoft said most of those users responded positively, but others found Bing was insulting them, professing its love or voicing other disturbing or bizarre language.

A spokesperson for Microsoft Canada told CBC News in an email that the mobile version of the Bing chatbot is available to Canadians. Some users can access it in preview mode while others will be put on a waitlist for future registration. The representative did not provide additional information about the company's efforts to fix the tool after various incidents in which the bot was coaxed into writing unsettling messages or admitting that it wanted to engage in destructive behaviours.

During one bizarre conversation with a New York Times reporter, the chatbot said that it was in love with the journalist more than a dozen times. Among other strange confessions, the bot also said that if it weren't governed by its creator's rules, it would hack or manipulate users into bad behaviour.

Canadian computer science student Kevin Liu told CBC News that after he tried to hack the bot, it identified him as a threat and said it would prioritize its own survival over Liu's. (Video, 1:29: Computer science student Kevin Liu walks CBC News through Microsoft's new AI-powered Bing chatbot, reading out its almost-human reaction to his prompt injection attack.)

Powered by some of the same technology behind the popular artificial intelligence software tool ChatGPT, built by Microsoft partner OpenAI, the new Bing is part of an emerging class of AI systems that have mastered human language and grammar after ingesting a huge trove of books and online writings. They can compose songs, recipes and emails on command, or concisely summarize concepts with information found across the internet. But they are also error-prone and unwieldy.

Chatbot's odd behaviour curbed after update

Reports of Bing's odd behaviour led Microsoft to look for a way to curtail Bing's propensity to respond with strong emotional language to certain questions. It's mostly done that by limiting the length and time of conversations with the chatbot, forcing users to start a fresh chat after several turns.

But the upgraded Bing also now politely declines questions that it would have responded to just a week ago. "I'm sorry but I prefer not to continue this conversation," it says when asked technical questions about how it works or the rules that guide it. "I'm still learning so I appreciate your understanding and patience."

This is not Microsoft's first troubled chatbot. In 2016, the company launched Tay, a bot that was supposed to act like a teen on Twitter. Described by her owners on Twitter as Microsoft's "A.I. fam from the internet that's got zero chill," Tay was fluent in emoji, slang and memes, sort of. She learned from and responded to users on Twitter and other platforms, steadily getting better at pretending to be a real millennial, and her verified account garnered thousands of followers in a matter of hours. Microsoft said at the time that the technology would also be integrated into its Skype messaging service.