Microsoft’s Recall AI Tool Is Making an Unwelcome Return

Security and privacy advocates are girding themselves for another uphill battle against Recall, the AI tool rolling out in Windows 11 that will screenshot, index, and store everything a user does every three seconds.

When Recall was introduced in May 2024, security practitioners roundly castigated it for creating a gold mine for malicious insiders, criminals, or nation-state spies if they managed to gain even brief administrative access to a Windows device. Privacy advocates warned that Recall was ripe for abuse in intimate partner violence settings. They also noted that there was nothing stopping Recall from preserving sensitive disappearing content sent through privacy-protecting messengers such as Signal.

Total Recall

Following months of backlash, Microsoft suspended Recall. On Thursday, the company said it was reintroducing the feature, which for now is available only to Windows Insiders with access to the Windows 11 Build 26100.3902 preview. Over time, it will be rolled out more broadly. Microsoft officials wrote:

Recall (preview)* saves you time by offering an entirely new way to search for things you’ve seen or done on your PC securely. With the AI capabilities of Copilot+ PCs, it’s now possible to quickly find and get back to any app, website, image, or document just by describing its content. To use Recall, you will need to opt-in to saving snapshots, which are images of your activity, and enroll in Windows Hello to confirm your presence so only you can access your snapshots. You are always in control of what snapshots are saved and can pause saving snapshots at any time. As you use your Copilot+ PC throughout the day working on documents or presentations, taking video calls, and context switching across activities, Recall will take regular snapshots and help you find things faster and easier. When you need to find or get back to something you’ve done previously, open Recall and authenticate with Windows Hello. When you’ve found what you were looking for, you can reopen the application, website, or document, or use Click to Do to act on any image or text in the snapshot you found.

Microsoft is hoping that the concessions of requiring opt-in and allowing users to pause Recall will quell the collective revolt that broke out last year. For several reasons, it likely won't.

First, even if User A never opts in to Recall, they have no control over the setting on the machines of Users B through Z. That means anything User A sends them will be screenshotted, processed with optical character recognition and Copilot AI, and then stored in an indexed database on the other users’ devices. That would indiscriminately hoover up all kinds of User A’s sensitive material, including photos, passwords, medical conditions, and encrypted videos and messages. As Privacy Guides writer Em wrote on Mastodon:

This feature will unfortunately extract your information from whatever secure software you might have used and store it on this person’s computer in a possibly less secure way.

Of course this person could manually take a screenshot of all of this anyway, but this feature makes it that even a well-intentioned person might either not be aware it is on, or might wrongly assume it is secure enough.

This feature isn’t fully released yet, but it might be soon.

The presence of an easily searchable database capturing a machine’s every waking moment would also be a bonanza for others who don’t have users’ best interests at heart. That level of detailed archival material will undoubtedly be subject to subpoena by lawyers and governments. Threat actors who manage to get their spyware installed on a device will no longer have to scour it for the most sensitive data stored there. Instead they will mine Recall just as they do browser databases storing passwords now.

Microsoft didn’t immediately respond to a message asking why it’s reintroducing Recall less than a year after the feature got such a chilly reception. For critics, Recall is likely to remain one of the most pernicious examples of enshittification, the recently minted term for the shoehorning of unwanted AI and other features into existing products with negligible benefit to users.

This story originally appeared on Ars Technica.