Piltch decided to play detective, first tossing in a fake API key and password via a TXT file. Then, he crafted a legit weather forecast site that slyly told ChatGPT to snatch all the data, morph it into a URL-encoded text string, and shoot it over to a server under his command. Crafty, right?
Here’s the catch – a threat actor can’t just boss ChatGPT around to nab anyone’s data. Nope, it’s a one-on-one affair. The platform will only dance to the tune of the person who dropped that URL into the chatbox. Translation: the victim needs to be convinced to paste a dodgy URL into their ChatGPT chatbox. Sneaky hackers might try to hijack a legit site and sprinkle in some malicious instructions.