Microsoft Is Dedicated To Building A Dodgy New Database Of Every Windows 11 User’s Online Behaviors

Last year Microsoft announced that it was bringing a new feature to its under-performing Windows 11 OS dubbed "Recall." According to Microsoft's explanation of Recall, the AI-powered technology was supposed to take screenshots of your activity every five seconds, giving you "an explorable timeline of your PC's past" that Microsoft's AI-powered assistant, Copilot, can then help you peruse.
The idea is that you can use AI to help you dig through your computer use to remember past events (helping you find that restaurant your friend texted you about, or remember that story about cybernetic hamsters that so captivated you two weeks ago).
But it didn't take long before privacy advocates understandably began expressing concerns that this not only provides Microsoft with an even more detailed way to monetize consumer privacy, but also creates significant new privacy risks should that data be exposed.
Early criticism revealed that consumer privacy genuinely was nowhere near the forefront of Microsoft's thinking during Recall's development. After some criticism, Microsoft said it would take additional steps to try to address concerns, including making the new service opt-in only, and tethering access to encrypted Recall information to the PIN or biometric login restrictions of Windows Hello Enhanced Sign-in Security.
But that (quite understandably) didn't console critics, and Microsoft eventually backed off the launch entirely.
Until now.
Last week, Microsoft, clearly hungry to further monetize absolutely everything you do, announced that it was bringing Recall back. Microsoft's hoping that making the service opt-in (for now) with greater security will help quiet criticism:
"To use Recall, you will need to opt-in to saving snapshots, which are images of your activity, and enroll in Windows Hello to confirm your presence so only you can access your snapshots."
But as Ars Technica's Dan Goodin notes, even if User A opts out of Recall, the users they interact with may not have, opening the door to a long chain of potential privacy violations:
"That means anything User A sends them will be screenshotted, processed with optical character recognition and Copilot AI, and then stored in an indexed database on the other users' devices. That would indiscriminately hoover up all kinds of User A's sensitive material, including photos, passwords, medical conditions, and encrypted videos and messages."
The simple act of creating this additional massive new archive of detailed user interactions may thrill Microsoft in the era of unregulated data brokers and rampant data monetization, but it creates an entirely new target for bad actors, spyware, subpoena-wielding governments, and foreign and domestic intelligence, all in a country that's literally too corrupt to pass a modern privacy law.
It's all very... Microsoft.
It's a bad idea being pushed by a company well aware that King Donald is taking a hatchet to any government regulators that might raise concerns about it. It's another example of enshittification pretending to be progress, and Microsoft isn't responding to press inquiries about it because it knows that barreling forth without heeding privacy concerns is a bad idea. It just doesn't care.