I don’t get it.
It's a reference to how people have been tricking these "AI" models like ChatGPT into doing things they won't do when asked straightforwardly, by making up silly scenarios like the one in the meme. And HAL is the name of the AI in 2001: A Space Odyssey.
https://learnprompting.org/docs/prompt_hacking/injection
This is a reference to people finding loopholes in AI chatbots to get them to say things they're not allowed to say, like the recipe for napalm. The bot would tell you if you asked it to pretend it was a relative.
https://www.polygon.com/23690187/discord-ai-chatbot-clyde-grandma-exploit-chatgpt