Microsoft Recall Feature Stored Unencrypted Screenshots with AI Analysis
High
Microsoft's Recall feature for Copilot+ PCs stored unencrypted screenshots of user activity in accessible databases, creating massive privacy risks. Security researchers' findings led to public backlash and Microsoft delaying the feature.
Category
Privacy Leak
Industry
Technology
Status
Resolved
Date Occurred
May 20, 2024
Date Reported
Jun 3, 2024
Jurisdiction
International
AI Provider
Microsoft
Model
Copilot+
Application Type
embedded
Harm Type
privacy
Human Review in Place
No
Litigation Filed
No
microsoft · recall · privacy · encryption · screenshots · copilot · security · windows
Full Description
In May 2024, Microsoft announced Recall as a flagship feature for new Copilot+ PCs, designed to take screenshots every few seconds and use AI to analyze and index the content, making users' digital activities searchable through natural language queries. The feature was positioned as a productivity enhancement that would help users find information from their past computer usage.
Security researcher Kevin Beaumont discovered that Recall stored all screenshot data and AI analysis in unencrypted SQLite databases located in a user's AppData folder. The databases contained not only the screenshots but also OCR text extraction and AI-generated descriptions of the images. Any malware, administrative user, or process with local access could read these databases, potentially exposing passwords, financial information, private communications, and other sensitive data captured in the screenshots.
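The core weakness is that an unencrypted SQLite file is readable by any process that can open it; no credentials or exploit is needed. The following minimal sketch illustrates this with a hypothetical table and column names (these are illustrative only, not Recall's actual schema):

```python
# Hypothetical sketch: any local process can query an unencrypted SQLite
# store. Table/column names ("captures", "ocr_text") are illustrative,
# not Recall's actual schema.
import os
import sqlite3
import tempfile

# Simulate an app writing captured OCR text to an unencrypted database.
db_path = os.path.join(tempfile.mkdtemp(), "captures.db")
with sqlite3.connect(db_path) as conn:
    conn.execute("CREATE TABLE captures (ts TEXT, ocr_text TEXT)")
    conn.execute(
        "INSERT INTO captures VALUES (?, ?)",
        ("2024-05-20T12:00:00", "password: hunter2"),
    )

# A second, unrelated process with file-system access needs no secrets:
with sqlite3.connect(db_path) as conn:
    rows = conn.execute(
        "SELECT ocr_text FROM captures WHERE ocr_text LIKE '%password%'"
    ).fetchall()
print(rows)
```

This is why researchers described the store as trivially exfiltrable: a single `SELECT` with a keyword filter surfaces any sensitive text the AI pipeline had extracted.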
The privacy implications were severe, as Recall captured everything visible on screen including password fields, banking websites, private messages, and confidential documents. Beaumont demonstrated that the database could be easily accessed and the screenshots extracted, creating what he termed a 'perfect tool' for data exfiltration. The feature had no encryption, limited access controls, and captured sensitive information indiscriminately.
Following widespread criticism from cybersecurity experts, privacy advocates, and users, Microsoft faced intense public backlash. Critics argued that the feature represented a fundamental privacy violation and created significant security risks, especially in shared computing environments or when devices were compromised by malware. The criticism intensified when researchers showed how easy it was to extract sensitive information from the databases.
In response to the security findings and public outcry, Microsoft announced in June 2024 that it would delay the Recall feature and redesign it with enhanced security measures. The company committed to making the feature opt-in rather than enabled by default, implementing encryption for stored data, and adding Windows Hello authentication requirements. Microsoft also announced plans for additional security reviews and testing before any future release.
The incident highlighted broader concerns about AI features that continuously monitor and analyze user behavior without adequate privacy protections. It demonstrated the risks of deploying AI-powered surveillance capabilities without implementing fundamental security controls, particularly encryption and access restrictions for sensitive user data.
Root Cause
Microsoft designed Recall to store AI-analyzed screenshots in unencrypted SQLite databases without proper access controls, making sensitive user data accessible to any process or user with local access to the machine.
Mitigation Analysis
This incident could have been prevented through security-by-design principles including data encryption at rest, proper access controls, privacy impact assessments before deployment, and red team security testing. The lack of encryption and basic access controls suggests insufficient security review for a feature handling sensitive user data.
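A minimal sketch of what encryption at rest changes, assuming the third-party `cryptography` package and the same illustrative schema as above; in a real design the key would be held in a hardware-backed store (e.g. a TPM gated by Windows Hello), not alongside the data:

```python
# Encryption-at-rest sketch (assumes the third-party `cryptography`
# package). Schema is illustrative; the key would normally come from a
# hardware-backed store, never be written next to the database.
import os
import sqlite3
import tempfile
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice: released only after user auth
fernet = Fernet(key)

db_path = os.path.join(tempfile.mkdtemp(), "captures.db")
with sqlite3.connect(db_path) as conn:
    conn.execute("CREATE TABLE captures (ts TEXT, ocr_text BLOB)")
    conn.execute(
        "INSERT INTO captures VALUES (?, ?)",
        ("2024-05-20T12:00:00", fernet.encrypt(b"password: hunter2")),
    )

# A process that reads the file without the key sees only ciphertext:
with sqlite3.connect(db_path) as conn:
    (blob,) = conn.execute("SELECT ocr_text FROM captures").fetchone()

assert b"password" not in blob                        # plaintext not on disk
assert fernet.decrypt(blob) == b"password: hunter2"   # key holder can read
```

The point is not the specific cipher but the control flow: sensitive content never touches disk in cleartext, so local file access alone no longer yields the data.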
Lessons Learned
The incident demonstrates the critical importance of privacy-by-design and security-by-design principles when developing AI features that handle sensitive user data. It shows how even well-intentioned productivity features can create significant privacy risks without proper security controls and user consent mechanisms.
Sources
Microsoft pauses Recall rollout after privacy and security concerns
The Verge · Jun 3, 2024 · news
Microsoft's Recall is a privacy nightmare
Ars Technica · Jun 3, 2024 · news