Login
From:
Simon Willison’s Weblog
(Uncensored)
subscribe
Prompt injection attacks against GPT-3
https://simonwillison.net/2022/Sep/12/prompt-injection/
links
backlinks
Riley Goodside, yesterday: Exploiting GPT-3 prompts with malicious inputs that order the model to ignore its previous directions. pic.twitter.com/I0NVr9LOJq- Riley Goodside (@goodside) September 12, 2022 Riley provided several examples. Here’s …
Roast topics
Find topics
Find it!