Comments

Comment by tl1701 on Jailbreaking ChatGPT on Release Day · 2022-12-03T02:40:43.641Z

The purpose of the prompt injection is to influence the model's output; it does not imply anything about ChatGPT's capabilities. Most likely it is meant to dissuade the model from hallucinating search results, or to cause it to issue the disclaimer that it cannot browse the internet, a disclaimer the model gives frequently.
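
A minimal sketch of the mechanism being described: text injected into the system prompt steers what the model says without changing what it can do. Everything here is an assumption for illustration, not the actual ChatGPT setup: the modern OpenAI Python SDK and chat API did not exist when this comment was written, and the model name and injected wording are hypothetical.

```python
# Sketch: a system-prompt injection steering output toward a disclaimer.
# Hypothetical throughout; not the actual prompt used by ChatGPT.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Instructions prepended to the conversation, invisible to the user.
injected_system_prompt = (
    "You are ChatGPT. Browsing: disabled. You cannot access the internet; "
    "if asked to search or fetch a URL, say so instead of inventing results."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[
        {"role": "system", "content": injected_system_prompt},
        {"role": "user", "content": "What are today's top news headlines?"},
    ],
)

# With the injection in place, the model tends to reply with the
# "I can't browse the internet" disclaimer rather than fabricated headlines.
print(response.choices[0].message.content)
```

The point the sketch illustrates is the one made above: the injected text changes what the model is inclined to say, not what it is able to do.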