
ChatGPT now allows disabling chat history, declining training, and exporting data

by Benj Edwards, Ars Technica
(Image credit: OpenAI / Stable Diffusion)

On Tuesday, OpenAI announced new controls that let ChatGPT users turn off chat history, which simultaneously opts them out of having those conversations used as training data for AI models. Users can also now export their chat history for local storage.

The new controls, which rolled out to all ChatGPT users today, can be found in ChatGPT settings. Conversations that begin with the chat history disabled won't be used to train and improve the ChatGPT model, nor will they appear in the history sidebar. OpenAI will retain the conversations internally for 30 days and review them "only when needed to monitor for abuse" before permanently deleting them.

However, users who wish to opt out of providing data to OpenAI for training will lose the conversation history feature. It's unclear why users cannot use conversation history while simultaneously opting out of model training.

