OpenAI has unveiled new privacy controls for its popular AI chatbot, ChatGPT, letting users turn off their chat history so their conversations aren’t used as training data. The controls live in the ChatGPT user settings under a new Data Controls section, where users can toggle off “Chat History & Training.”
OpenAI will still retain chats for 30 days, but says it will review them only when needed for monitoring and will permanently delete them after that period. The company also announced an upcoming ChatGPT Business subscription aimed at professionals and enterprises that need more control over their data. The plan will follow the same data-usage policies as OpenAI’s API, meaning it won’t use customer data for training by default. Finally, the company has introduced a new export option that lets users email themselves a copy of the data ChatGPT stores. The move comes after three Samsung employees were found to have leaked sensitive data to the chatbot earlier this month.