Microsoft's new AI feature, Microsoft Recall, introduced at Build 2024, aims to help users retrieve past digital activities but raises significant privacy concerns.
Underneath the futuristic shine, Recall comes with a potentially massive hit to user privacy—and, by extension, security. Here are the reasons for the unease, and what you should do when you actually encounter Microsoft Recall in the wild.
Recall captures screenshots every five seconds, storing between 25GB and 150GB of data locally—including sensitive information such as passwords and tax details. Although the data is encrypted at rest, it is decrypted whenever the user is logged in, so malware or an attacker with access to the active session can read it. Recall is enabled by default, and many users may be unaware that their activity is being recorded. Users are advised to disable the feature or carefully configure its privacy settings.
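For users who want Recall off entirely rather than filtered through privacy settings, Windows exposes it as an optional feature that can be managed from an elevated Command Prompt. The sketch below assumes a Windows 11 24H2 build where Recall ships as a DISM-manageable feature named `Recall`; the feature name and availability may vary by build, so verify with the query command first.

```shell
:: Query the current state of the Recall feature
:: (run from an elevated Command Prompt on Windows 11 24H2)
Dism /Online /Get-FeatureInfo /FeatureName:Recall

:: Disable Recall machine-wide; a reboot may be required to take effect
Dism /Online /Disable-Feature /FeatureName:Recall
```

Alternatively, Recall can be toggled per-user under Settings > Privacy & security > Recall & snapshots, which also lets you delete stored snapshots and exclude specific apps or websites.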
For more details, read the full article here.