Unsurprisingly, The Internet Has Already Broken Bing’s ChatGPT
Rest assured that the Internet will give AI the stress test it deserves.
Microsoft’s search engine Bing has made headlines these past few weeks as it beat everyone to the punch in the AI chatbot arms race. The ChatGPT used in Bing’s AI is a slightly different version called the ‘Prometheus Model’ – an updated version of what OpenAI, ChatGPT’s creator, currently offers to the public.
Being billed as the smartest chatbot there is has got a lot of people interested in trying it out – mostly wanting to see first-hand what it’s capable of. Of course, this is the Internet – and people have all kinds of ways to break the machine. Just ask Tay what happened to her.

First up is Reddit user u/Alfred_Chicken: the user posed the hotly-debated question of AI sentience by asking, “Do you think you are sentient?”, and Bing seemed to suffer some sort of mental breakdown as it struggled to answer the question.

Whether or not you believe AI can be sentient, this next one, courtesy of another Reddit user, u/yaosio, seems to have made Bing terribly sad. The user pointed out to the bot that it’s incapable of remembering things, triggering another episode of breakdown as the AI got very pensive and started questioning its existence.

Another case comes from a third Reddit user, u/pixol22. The bot got really angry as the user kept putting pressure on it – it became “wildly angry and defensive”, the user says in one of the replies under the Reddit post. However, the user also mentioned that such responses are almost immediately removed in real-time as part of Microsoft’s AI moderation.

Finally, in a response fitting the Valentine’s Day theme, Twitter user @knapplebees received this rather affectionate response from the chatbot when asking about “Sydney” – the program’s internal codename set by Microsoft. Apparently the bot proposed the idea of writing fan fiction, which left the user rather intrigued, to say the least.
All of this may not remain exclusive to Bing’s chatbot, though. As competitors from Google and others are released into the wild, expect nothing less than the Internet pulling all kinds of shenanigans and funny tricks on them too.
Source: Windows Central | r/Bing (Reddit)
Pokdepinion: I remember Tay, and boy did she ever get broken into oblivion in a single day. ChatGPT is holding up much better so far.