

We can see this clearly in the race among tech giants to stay ahead with large language models (LLMs) such as ChatGPT and its competitors. New gadgets and software come with new bugs, especially when they're rushed out the door. While there is no reason to assume that information submitted to ChatGPT will be shared with others, there is also no certainty that it will not. To remind employees of these risks, several companies have taken action: JPMorgan has restricted workers' use of ChatGPT, for example, and Amazon, Microsoft, Samsung Electronics, and Walmart have all warned employees to take care when using generative AI services.

Helping to reduce costs and enhance productivity are both things that your employer will look kindly upon. But what if you use an external tool for those tasks, and the tasks involve confidential data that ends up on a server outside your company's control? That's a problem. As a news writer at Tom's Hardware reported, there were three incidents in 20 days in which Samsung staff shared confidential information with ChatGPT. In other organizations, an executive cut and pasted their firm's 2023 strategy document into ChatGPT and asked it to create a PowerPoint deck, and a doctor submitted a patient's name and medical condition and asked ChatGPT to craft a letter to the patient's insurance company. All of these actions were performed with the best interests of the organization in mind, but they ended up taking confidential information outside the company.
