ChatGPT is inescapable at the moment. The AI chatbot can answer your questions, write emails based on Zoom calls, create entire PowerPoint presentations, and much more. However, you might want to think twice about confiding your secrets to it.
A few Samsung workers learned that lesson the hard way after asking ChatGPT for help with confidential materials, which then became available to OpenAI, the company behind ChatGPT (via Mashable).
Think twice about what you share on ChatGPT
Now, to be clear, this information wasn't leaked onto the open web or to other users of ChatGPT, but it was available for review by the OpenAI team. OpenAI makes it explicitly clear in its privacy policy that anything you enter into ChatGPT may be reviewed by OpenAI and used to improve the quality and performance of its responses.
Now, if the secret you're sharing with ChatGPT is that you took an extra cookie from the cookie jar, that's probably not going to come back to haunt you, but for these Samsung workers, the stakes were a little higher. Two of them shared code with ChatGPT, one asking it to check for errors and the other asking it to optimize the code. The third employee uploaded an entire meeting and asked ChatGPT to produce notes from it.
Set the confidentiality issue aside and it's astounding that ChatGPT can do all of these things, but this should serve as a reminder that it isn't a magical AI genie carrying out your commands. The information you enter can be seen by OpenAI employees and is retained by the company. It's a trade-off we make all the time for free services around the web, but it's important not to lose sight of it. With the conversational nature of services like ChatGPT and Google Bard in particular, it's easy to forget that this is another tech company absorbing your data.
In the case of Samsung, it has since placed an upload limit on what can be sent to ChatGPT, and it may build its own AI chatbot to ensure nothing like this can happen in the future. It's safe to assume that will become a common measure for companies in the coming years: chatbots clearly offer tremendous value in the workplace, but relying on employees to avoid sharing confidential information with them is both limiting and too prone to mistakes.
Cautionary tale aside, if you haven't tried one yet, I would still recommend that you sign up for ChatGPT and give it a shot. It's amazing what these tools can already do, and these are obviously still early days. Just be aware of what you're sharing.