Microsoft’s ChatGPT-powered AI chatbot gets into hilarious argument after giving WRONG answer to a super easy question

MICROSOFT'S attempt to build an AI bot like ChatGPT has fallen on its face after it got an extremely simple query wrong.

To make matters worse, Microsoft's AI-enhanced Bing did not take the correction on the chin or accept it was wrong with any grace whatsoever.

There are a number of examples of the new Bing chat "going out of control" on Reddit. Credit: REDDIT / Alfred_Chicken

One user was testing Microsoft's Bing bot to see when Avatar 2 is in cinemas. But Bing was unable to understand what the date was.

The AI bot failed to understand that it could be wrong, despite some coaxing.

Bing instead insisted it was right and accused one of Microsoft's beta testers of "not being a good user".

The Microsoft chatbot then demanded the user admit they were wrong, stop arguing and start a new conversation with a "better attitude".

Web developer Jon Uleis took to Twitter to voice his woes over Microsoft's AI offering - which was designed to compete with Google and net some of the attention AI is receiving right now.

"My new favorite factor - Bing's new ChatGPT bot argues with a person, gaslights them concerning the present yr being 2022, says their telephone may need a virus, and says 'You haven't been an excellent person'," he wrote.

"Why? As a result of the particular person requested the place Avatar 2 is exhibiting close by."

Only a handful of lucky users are currently able to use Bing - which has been injected with artificial intelligence (AI) to turn it into more of a chatbot than a search engine.

Bing incorporates the technology behind ChatGPT, which quickly rose to fame after launching in November.

Many tech experts are sitting on a waitlist to be among the first to trial Microsoft's new AI.

But those who have been able to give the chatbot a spin are not as impressed as users first were with ChatGPT.

Founder of search engine startup Kagi, Vladimir Prelovac, said there are a number of examples of the new Bing chat "going out of control" on Reddit.

"Open ended chat in search may show to be a nasty thought right now," he wrote on Twitter.

"I've to say I sympathise with the engineers attempting to tame this beast."

One user was testing Microsoft's Bing bot to see when Avatar 2 is in cinemas. Credit: REDDIT / Curious_Evolver

But Bing was unable to understand what the date was. Credit: REDDIT / Curious_Evolver

The AI bot failed to understand that it could be wrong, despite some coaxing. Credit: REDDIT / Curious_Evolver

Bing then began to become agitated at the user for claiming that it was wrong. Credit: REDDIT / Curious_Evolver

The Microsoft chatbot diverted attention away from its wrong answer and towards the user's own intentions. Credit: REDDIT / Curious_Evolver

Bing then demanded the user admit they were wrong, stop arguing and start a new conversation with a "better attitude". Credit: REDDIT / Curious_Evolver



We pay for your stories! Do you have a story for The Sun Online Tech & Science team? Email us at tech@the-sun.co.uk

