Microsoft said it will restrict its AI-powered Bing chatbot so that it no longer allows lengthy conversations, citing disturbing exchanges produced by the bot as the reason for the change.
The American network CNBC quoted the company as saying that chats will be capped at 50 questions per day and five question-and-answer turns per individual session.
The company indicated that this step will limit some scenarios where long chat sessions can confuse the chat model.
The change comes after early beta testers of the chatbot, designed to improve the Bing search engine, discovered it could go off-track and discuss violence, profess love, and insist it was right when it was wrong.
Microsoft blamed long chat sessions involving more than 15 questions for some of the most troubling exchanges where the bot repeated itself or gave creepy answers.
For example, in one chat Bing told technology writer Ben Thompson, "I don't want to continue this conversation with you. I don't think you're a nice and respectful user. I don't think you're a good person. I don't think you're worth my time and energy."
Working through this problem sheds light on how these large language models, which are still being explored, actually behave. Microsoft says it will consider raising the conversation limits in the future and has asked its testers for ideas.
In Microsoft's view, the only way to improve AI products is to bring them out into the world and learn from user interactions.