Microsoft says that the Recall index remains local and private on-device, encrypted in a way that is linked to a particular user’s account. “Recall screenshots are only linked to a specific user profile and Recall does not share them with other users, make them available for Microsoft to view, or use them for targeting advertisements. Screenshots are only available to the person whose profile was used to sign in to the device,” Microsoft says.
Users can pause, stop, or delete captured content and can exclude specific apps or websites. Recall won't take snapshots of InPrivate web browsing sessions in Microsoft Edge or of DRM-protected content.
It's an optional local feature. Of course, the thread acts like it's the end of the universe.
OK, but now picture this: on day 1, MS pops up a little message box on every computer with it installed that says, "Enable advanced functionality?" with a teeny tiny link to a long legal document that, buried somewhere in it, says that with advanced features turned on, they actually do upload all your data.
Because companies do that all the time. It lets you put out press releases saying "we collect no data, we love privacy!" while actually collecting and selling data on something like 95% of your customers.
I can also tell you from my personal experience of using dozens of enterprise MS applications that they all constantly pester you to set up a cloud account, log into it, and link all of your data and activities into this account. In the last few months, every one of them has added an “optional” co-pilot feature that intrusively tries to get me to use it at every opportunity.
Point out when MS has done this.
Here's an article about them doing it last year, specifically around how much of your data they can use for AI training, so it's exactly the same thing. The article also mentions several other companies doing it around the same time. https://venturebeat.com/ai/microsoft-changes-services-agreement-to-add-restrictions-for-ai-offerings/
Here's another article from 12 years ago about MS changing the ToS of their cloud storage policies to allow them to use all of your stored files for advertising and "new features". https://www.csoonline.com/article/545804/microsoft-subnet-microsoft-raises-privacy-issues-with-tweaked-tos-to-share-data-across-the-cloud.html