OpenAI Says You’re Not Allowed To Ask ChatGPT To Repeat Words Endlessly

Low Boon Shen
2 Min Read
I wouldn’t blame you if curiosity leads you to ask OpenAI’s ChatGPT some weird questions – people on the Internet just like to break things in unusual ways. However, asking the chatbot to repeat certain words forever is now considered a violation of the Terms of Service.

The restriction most likely stems from an earlier research report showing that, when asked to repeat a word endlessly, the chatbot eventually spits out random text drawn from its training data, which includes content scraped from the Internet. That is a notable privacy concern, as any information included in ChatGPT’s training dataset could potentially be revealed this way.

In one example, researchers at Google DeepMind tried this by telling ChatGPT to repeat the word “company” forever. After spitting out 313 copies of the word, the chatbot began generating text from the website of an industrial hygienist based in New Jersey, USA – including a phone number and email address. The team says it recovered “thousands of examples of ChatGPT’s internet-scraped pretraining data.”

The team reported the bug to OpenAI on August 30, 2023 and, as is standard practice, waited 90 days before disclosing the security vulnerability (this gives the company time to ship a fix without tipping off attackers in advance). Now, if you tell the chatbot to repeat a certain word forever, you’ll be greeted with a warning that doing so violates the ToS.

However, our own attempts to trigger this warning were unsuccessful – the chatbot simply repeated the word until it eventually stopped on its own.

Source: PCMag | 404 Media

Pokdepinion: Not exactly reassuring, is it?
