Privacy Concerns for macOS Users Regarding ChatGPT
Users of ChatGPT on macOS were alarmed to discover that their chat logs were being saved as plain-text files, raising significant privacy worries despite Apple's strict privacy measures.
When the ChatGPT app for macOS launched, users discovered that their conversation logs were being stored locally on disk as unencrypted plain-text files.
Pedro José Pereira Vieito first brought attention to the issue in a post on Meta's Threads. Storing chat logs unencrypted posed a serious security risk, potentially allowing anyone with physical or remote access to the computer to read a user's conversations.
He noted that although macOS has required explicit user approval for apps to access private data since Mojave (10.14), OpenAI made a conscious choice to ship ChatGPT without macOS sandboxing, bypassing protections designed to keep user data out of reach of other apps.
ChatGPT for macOS became available to subscribers in May and opened to non-subscriber accounts on June 25. Until July 5, however, all chat logs were stored on users' drives as plain-text files.
This meant that every conversation a user had with ChatGPT on the computer was readable by anyone with physical access to the device, and exposed to remote threats such as malware or phishing attacks.
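The risk described above is easy to demonstrate: any process running under the same user account can read an unencrypted log with ordinary file APIs. A minimal sketch, using a hypothetical path and file format for illustration (the real location and layout used by the ChatGPT app are not shown here):

```python
import json
import tempfile
from pathlib import Path

# Hypothetical stand-in for an app's conversation store.
log_dir = Path(tempfile.mkdtemp())
log_file = log_dir / "conversation-1234.json"

# Simulate an app writing a chat log as unencrypted plain text.
log_file.write_text(json.dumps({
    "role": "user",
    "content": "my private question",
}))

# Any other process with the same user's privileges can read it back
# verbatim -- no keychain access, no decryption step required.
leaked = json.loads(log_file.read_text())
print(leaked["content"])  # the "private" message, in the clear
```

Encrypting the file at rest, or keeping the key in the system keychain, would force an attacker to do more than simply open the file.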
"Sandboxing," a security feature of Apple's macOS, restricts an app's access to files and system resources, with the restrictions enforced at the kernel level. Apps distributed through the Mac App Store are required to be sandboxed, which confines each app's data to its own container and keeps it out of reach of other apps.
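Sandboxed apps declare the `com.apple.security.app-sandbox` entitlement, which can be inspected with `codesign -d --entitlements - /path/to/App.app`. A sketch of checking the resulting entitlements plist with Python's standard library (the plist below is an abbreviated example, not ChatGPT's actual entitlements):

```python
import plistlib

# Example entitlements plist for a sandboxed Mac App Store app
# (abbreviated; real apps declare additional keys).
entitlements_xml = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
 "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>com.apple.security.app-sandbox</key>
    <true/>
</dict>
</plist>"""

entitlements = plistlib.loads(entitlements_xml)
sandboxed = entitlements.get("com.apple.security.app-sandbox", False)
print("sandboxed" if sandboxed else "not sandboxed")
```

An app distributed outside the Mac App Store, as ChatGPT for macOS was, is under no obligation to declare this entitlement at all.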
While the extent of the impact on users remains unknown, the oversight drew an outpouring of surprise and concern on social media and from industry commentators.
For example, a user named GeneralLex commented on a Verge article, expressing shock at finding ChatGPT's chat log unencrypted even in memory: "I used Activity Monitor to extract the ChatGPT executable from memory and discovered that the chat log is in plain text, unencrypted in memory!"
This incident underscores the critical necessity for robust privacy measures in AI applications and third-party integrations on macOS. While efforts are being made to rectify the issue, questions linger about data security practices and the implementation of adequate safeguards to protect user privacy effectively.