After Samsung Semiconductor allowed its fab engineers to use ChatGPT for assistance, they began relying on it to quickly fix errors in their source code, leaking confidential information such as internal meeting notes and data related to fab performance and yields in the process. The company now plans to develop its own ChatGPT-like AI service for internal use, but for now it limits the length of prompts submitted to the service to 1,024 bytes, the Korean publication Economist reports.
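Enforcing a byte cap like this is straightforward at a gateway or client layer. The sketch below is purely illustrative (the function name and approach are assumptions, not Samsung's actual implementation); the important detail is that the limit is measured in bytes, so multi-byte characters such as Hangul count more than once:

```python
def within_limit(prompt: str, max_bytes: int = 1024) -> bool:
    """Return True if the prompt's UTF-8 encoding fits under the byte cap."""
    return len(prompt.encode("utf-8")) <= max_bytes

# ASCII text: one byte per character.
print(within_limit("a" * 1024))   # True
print(within_limit("a" * 1025))   # False

# Hangul characters are 3 bytes each in UTF-8, so 400 characters = 1,200 bytes.
print(within_limit("\uac00" * 400))  # False
```

A check like this would reject an over-long prompt before it ever leaves the corporate network, rather than relying on the external service to truncate it.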
So far, Samsung Semiconductor has recorded three incidents in which ChatGPT use led to data leaks. While three may not seem like many, they all occurred within the space of 20 days, which makes the situation worrying.
In one case, a Samsung Semiconductor employee submitted the source code of a proprietary program to ChatGPT to fix errors, effectively disclosing the code of a top-secret application to an AI service operated by an external company.
The second case was perhaps even more alarming: another employee entered test patterns meant to identify defective chips and requested their optimization. Test sequences designed to identify defects are strictly confidential. At the same time, optimizing these sequences, and possibly reducing their number, can speed up silicon test and verification procedures, which significantly cuts costs.
In the third case, an employee used the Naver Clova application to convert a recording of a meeting into a document, then submitted that document to ChatGPT to prepare a presentation.
These actions clearly put confidential information at risk, prompting Samsung to warn its employees about the dangers of using ChatGPT. Samsung Electronics informed its executives and employees that data entered into ChatGPT is transmitted to and stored on external servers, making it impossible for the company to retrieve or delete it and increasing the risk of confidential information leaking. While ChatGPT is a powerful tool, prompts submitted to it may be retained and used to further train the model, which can expose sensitive information to third parties, something that is unacceptable in the highly competitive semiconductor industry.
The conglomerate is currently preparing protective measures to prevent similar leaks in the future. If another incident occurs even after these emergency information-protection measures are in place, access to ChatGPT may be blocked on the company network. Still, generative AI and AI-enabled electronic design automation tools are clearly an important part of the future of chip production.
When asked about the information leakage incidents, a Samsung Electronics spokesperson declined to confirm or deny them, describing the matter as internal.