Apple and OpenAI’s partnership has hit a snag following the unsettling discovery that the ChatGPT app for macOS was storing users’ chats unencrypted. Apple, a company that prides itself on privacy, failed to protect users’ data from a third-party integration like ChatGPT on macOS, as data engineer Pedro José Pereira Vieito revealed in a post.
The ChatGPT macOS app, launched in May and made widely available by June 25, had been storing chat logs as unencrypted text files. This meant that any individual or entity with physical or remote access to a user’s computer could read all of their ChatGPT conversations.
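Vieito reportedly demonstrated the exposure with a small proof-of-concept app. The Swift sketch below illustrates the general idea only: any program running as the same user, and not itself confined by the sandbox, could list and read plain-text files left in another app’s data folder. The directory name is an assumption used for illustration, not a confirmed path.

```swift
import Foundation

// Illustration only: read plain-text files from another app's data directory.
// The folder name below is a hypothetical placeholder, not a verified location.
let chatDir = FileManager.default.homeDirectoryForCurrentUser
    .appendingPathComponent("Library/Application Support/com.openai.chat")

let fm = FileManager.default
if let files = try? fm.contentsOfDirectory(at: chatDir, includingPropertiesForKeys: nil) {
    for file in files {
        // Unencrypted contents come back as ordinary readable text.
        if let text = try? String(contentsOf: file, encoding: .utf8) {
            print("\(file.lastPathComponent): \(text.prefix(80))")
        }
    }
}
```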
Although macOS offers a security measure known as “sandboxing,” which walls off an app’s data so other applications can’t reach it, the issue arose because the ChatGPT app is distributed directly from OpenAI’s website, bypassing the Mac App Store and its sandboxing requirement. According to Vieito, OpenAI opted out of the sandbox and stored chat logs in unprotected plain-text files, sidestepping the operating system’s built-in protections.
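For context, a sandboxed app is confined to its own container under ~/Library/Containers, which is what keeps its files out of other apps’ reach. The short Swift sketch below is a general illustration of that mechanism, not code from either company, and the environment-variable check it uses is an undocumented heuristic rather than an official API.

```swift
import Foundation

// In a sandboxed app, NSHomeDirectory() resolves to the app's private container
// (~/Library/Containers/<bundle-id>/Data); in an unsandboxed app it is simply
// the user's real home directory.
print("Home as seen by this process:", NSHomeDirectory())

// Informal check: sandboxed processes carry an APP_SANDBOX_CONTAINER_ID
// environment variable (undocumented, so treat this as a heuristic only).
let isSandboxed = ProcessInfo.processInfo.environment["APP_SANDBOX_CONTAINER_ID"] != nil
print(isSandboxed ? "Running inside the App Sandbox" : "Not sandboxed")
```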
This apparent error has caused shockwaves on social media and provoked questions about its origin. It’s suspected OpenAI opted for this approach so it could use chat logs to improve ChatGPT; the app’s terms require users to explicitly opt out if they don’t want their data shared. Still, the incident raises the question of why Apple didn’t intervene earlier and why OpenAI stored sensitive data unencrypted on user devices. Neither OpenAI nor Apple has responded to these queries yet.