Microsoft just announced Recall will ship as opt-in rather than opt-out. That is a complete reversal from the original design, and it matters more than the technical detail itself. It marks a line in the sand for what users will accept from AI features running on their own hardware.
What Recall actually does
Recall is the AI memory feature announced in May as part of Copilot+ PCs. The premise is genuinely useful: your PC takes periodic screenshots of everything you do and uses an on-device LLM to make it all searchable. You could search "that document I was looking at last Tuesday" and find it, even if you never saved it or cannot remember the filename. Think of it as a photographic memory for your computer.
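To make the pipeline concrete, here is a minimal sketch of a Recall-style snapshot index. This is illustrative only, not Microsoft's schema: it assumes some OCR or vision model has already turned each screenshot into text, and it uses SQLite's FTS5 full-text index (available in most standard builds) to make that text searchable.

```python
import sqlite3

# In-memory stand-in for the on-disk snapshot database.
db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE snapshots USING fts5(taken_at, text)")

def index_snapshot(taken_at: str, extracted_text: str) -> None:
    """Store one snapshot's OCR-extracted text so it is full-text searchable."""
    db.execute("INSERT INTO snapshots VALUES (?, ?)", (taken_at, extracted_text))

def search(query: str) -> list[tuple[str, str]]:
    """Return (timestamp, text) rows ranked by FTS5 relevance."""
    return db.execute(
        "SELECT taken_at, text FROM snapshots WHERE snapshots MATCH ? ORDER BY rank",
        (query,),
    ).fetchall()

index_snapshot("2024-06-04T10:12", "Q3 budget draft - budget.xlsx open in Excel")
index_snapshot("2024-06-04T10:15", "chat with Sam about lunch plans")
print(search("budget"))  # finds the spreadsheet snapshot by content, not filename
```

The point of the sketch is the last line: you search by what was on screen, not by what you saved.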
The reaction from the security community was immediate and sharp. A local database containing screenshots of everything you do on your computer (banking sessions, passwords, private messages, medical records) is a single point of failure for your entire digital life. Researchers quickly demonstrated that the Recall database was readable by any app running as the logged-in user, not just Recall itself, and that the data was not encrypted at rest.
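The pre-fix problem fits in a few lines. The function below is a hedged sketch of what the researchers demonstrated: the snapshot store was a plain SQLite file, so any process running with ordinary user privileges could read it wholesale. The path and schema here are hypothetical stand-ins, not Microsoft's actual layout.

```python
import sqlite3
from pathlib import Path

def dump_recall_text(db_path: Path) -> list[str]:
    """Read every captured text fragment out of an unencrypted snapshot DB.

    No admin rights, no exploit, no memory tricks -- just an ordinary
    file read by whatever process happens to be running as the user.
    """
    with sqlite3.connect(db_path) as conn:
        return [row[0] for row in conn.execute("SELECT text FROM snapshots")]
```

That is the whole attack surface: one file, one SELECT.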
What Microsoft changed
Microsoft pulled Recall from the initial Copilot+ launch in June and spent the summer rearchitecting the security model. The version shipping in October adds encryption for the snapshot database, requires Windows Hello biometric authentication before Recall can be accessed, and moves the whole feature to opt-in. Nothing is captured by default. You have to explicitly enable it.
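The shape of the new access model can be sketched as follows. This is an illustration of the pattern, not Microsoft's implementation: snapshots are encrypted at rest, and decryption is gated on a successful authentication check (Windows Hello in the real feature). The toy SHA-256 keystream cipher exists only to keep the example self-contained; a real system would use AES with keys held by the OS key store or a hardware enclave.

```python
import hashlib
import secrets

def _keystream(seed: bytes, n: int) -> bytes:
    """Toy counter-mode keystream -- illustration only, not real cryptography."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def _xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

class SnapshotStore:
    """Snapshots encrypted at rest; plaintext released only after user auth."""

    def __init__(self) -> None:
        # In the real feature the key never lives in app memory like this;
        # it sits behind the OS key store / enclave.
        self._key = secrets.token_bytes(32)
        self._blobs: list[tuple[bytes, bytes]] = []  # (nonce, ciphertext)

    def add(self, snapshot: bytes) -> None:
        nonce = secrets.token_bytes(16)
        ct = _xor(snapshot, _keystream(self._key + nonce, len(snapshot)))
        self._blobs.append((nonce, ct))

    def read_all(self, authenticated: bool) -> list[bytes]:
        # Stand-in for the Windows Hello biometric/PIN check.
        if not authenticated:
            raise PermissionError("authentication required before Recall data is released")
        return [
            _xor(ct, _keystream(self._key + nonce, len(ct)))
            for nonce, ct in self._blobs
        ]
```

The design point is that a process which can read the file on disk now gets ciphertext, and the decryption path runs through an authentication gate rather than a plain file handle.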
The content filtering that was always part of the design, where Recall skips capturing banking sites and DRM-protected content, is still there. But the fundamental shift is that unless you turn the feature on, your PC never builds this database at all.
The broader lesson
Recall is a case study in what happens when you ship a genuinely innovative feature without thinking through the threat model first. The capability was real. The privacy and security design was not ready for the capability. The fix required delaying a flagship feature by months and rebuilding the security architecture from scratch.
For developers building AI features: the question of what data your feature touches, how it is stored, who can access it, and what an attacker could do with it is not a post-launch concern. It is a design question that shapes whether you can ship at all. Microsoft learned that the hard way, in public.