Is Microsoft trying to commit suicide? (2024)

The breaking tech news this year has been the pervasive spread of "AI" (or rather, statistical modeling based on hidden layer neural networks) into everything. It's the latest hype bubble now that cryptocurrencies are no longer the freshest sucker-bait in town, and the media (who these days are mostly stenographers recycling press releases) are screaming at every business in tech to add AI to their product.

Well, Apple and Intel and Microsoft were already in there, but evidently they weren't in there enough, so now we're into the silly season with Microsoft's announcement of CoPilot plus Recall, the product nobody wanted.

CoPilot+ is Microsoft's LLM-based add-on for Windows, sort of like 2000's Clippy the Talking Paperclip only with added hallucinations. Clippy was rule-based: a huge bundle of IF ... THEN statements hooked together like a 1980s Expert System to help users accomplish what Microsoft believed to be common tasks, but which turned out to be irritatingly unlike anything actual humans wanted to accomplish. Because CoPilot+ is purportedly trained on what users actually do, it looked plausible to someone in marketing at Microsoft that it could deliver on "help the users get stuff done". Unfortunately, human beings assume that LLMs are sentient and understand the questions they're asked, rather than being unthinking statistical models that cough up the highest probability answer-shaped object generated in response to any prompt, regardless of whether it's a truthful answer or not.
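To make that contrast concrete, here's a toy sketch of the two architectures (mine, not anything from Redmond, and the rule text is invented for illustration): a Clippy-style assistant is a fixed pile of IF ... THEN rules and simply has no opinion outside them, while an LLM just emits the statistically likeliest continuation of whatever you type, true or not.

```python
# Hypothetical rule-based helper in the Clippy mould: a handful of
# hand-written IF ... THEN rules, tried in order. Not real Microsoft code;
# it only illustrates the architecture described above.
RULES = [
    (lambda text: text.lstrip().startswith("Dear"),
     "It looks like you're writing a letter. Would you like help with that?"),
    (lambda text: "=" in text,
     "It looks like you're building a formula. Would you like help with that?"),
]

def clippy_suggest(text: str):
    """Return the first canned suggestion whose rule fires, else None."""
    for condition, suggestion in RULES:
        if condition(text):
            return suggestion
    return None  # outside its rule set, a rule-based system is simply silent

# An LLM, by contrast, has no rules at all: it repeatedly picks the most
# probable next token given the prompt, which is why it always produces an
# answer-shaped object whether or not a correct answer exists.
print(clippy_suggest("Dear Sir or Madam,"))  # -> the letter-writing suggestion
```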

Anyway, CoPilot+ is also a play by Microsoft to sell Windows on ARM. Microsoft don't want to be entirely dependent on Intel, especially as Intel's share of the global microprocessor market is rapidly shrinking, so they've been trying to boost Windows on ARM to orbital velocity for a decade now. The new CoPilot+ branded PCs going on sale later this month are marketed as being suitable for AI (spot the sucker-bait there?) and have powerful new ARM processors from Qualcomm, which are pitched as "MacBook Air killers", largely because they're playing catch-up with Apple's M-series ARM-based processors in terms of processing power per watt, and because they have an on-device coprocessor optimized for running neural networks.

Having built the hardware and the operating system, Microsoft faces the inevitable question: why would a customer want this stuff? And being Microsoft, they took the first answer that bubbled up from their in-company echo chamber and pitched it at the market as a forced update to Windows 11. And the internet promptly exploded.

First, a word about Apple. Apple have been quietly adding AI features to macOS and iOS for the past several years. In fact, they got serious about AI in 2015, and every Apple Silicon processor they've released since 2017 has had a neural engine (an AI coprocessor) on board. Now that the older phones and laptops are hitting end of life, the most recent operating system releases are rolling out AI-based features. For example, there's on-device OCR for text embedded in any image. There's a language translation service for the OCR output, too. I can point my phone at a brochure or menu in a language I can't read, activate the camera, and immediately read a surprisingly good translation: this is an actually useful feature of AI. (The ability to tag all the photos in my Photos library with the names of people present in them, and to search for people, is likewise moderately useful: the jury is still out on the pet recognition, though.) So the Apple roll-out of AI has so far been uneventful and unobjectionable, with a focus on identifying things people want to do and making them easier.

Microsoft Recall is not that.

"Hey, wouldn't it be great if we could use AI in Windows to help our users see everything they've ever done on their computer?" Is a great pitch, and Recall kinda-sorta achieves this. But the implementation is soemthing rather different. Recall takes snapshots of all the windows on a Windows computer's screen (except the DRM'd media, because the MPAA must have their kilo of flesh) and saves them locally. The local part is good: the term for software that takes regular screenshots and saves them in the cloud is "part of a remote access trojan". It then OCRs any text in the images, and I believe also transcribes any speech, and saves the resulting output in an unencrypted SQLite database stored in:

C:\Users\$USER\AppData\Local\CoreAIPlatform.00\UKP{GUID}

And there are tools already out there to slurp through the database and see what's in it, such as TotalRecall.
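To see why that's so alarming, here's a minimal sketch (in Python, and emphatically not TotalRecall's actual code) of what any such tool boils down to: the store is ordinary, unencrypted SQLite, so anything running under your user account can walk it with the standard library. The base directory comes from the path above; the one-folder-per-GUID layout and the `*.db` filename pattern are assumptions for illustration, and the real schema can be listed via `sqlite_master` as shown.

```python
# Hypothetical illustration: enumerate whatever SQLite databases Recall has
# left under the per-user store and list their tables. No decryption step
# is needed -- these are plain files readable by any process running as the
# logged-in user. Folder layout and filename pattern are assumptions.
import glob
import os
import sqlite3

base = os.path.expandvars(
    r"C:\Users\%USERNAME%\AppData\Local\CoreAIPlatform.00\UKP"
)

for db_path in glob.glob(os.path.join(base, "*", "*.db")):  # one subfolder per GUID (assumed)
    con = sqlite3.connect(db_path)
    try:
        tables = [name for (name,) in con.execute(
            "SELECT name FROM sqlite_master WHERE type='table'"
        )]
        print(db_path, "->", tables)  # from here, dumping the OCR'd text is a SELECT away
    finally:
        con.close()
```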

Surprise! It turns out that the unencrypted database and the stored images may contain your user credentials and passwords. And other stuff. Got a p*rn habit? Congratulations, anyone with access to your user account can see what you've been seeing. Use a password manager like 1Password? Sorry, your 1Password passwords are probably visible via Recall, now.

Now, "unencrypted" is relative; the database is stored on a filesystem which should be encrypted using Microsoft's BitLocker. But anyone with credentials for your Microsoft account can decrypt it and poke around. Indeed, anyone with access to your PC, unlocked, has your entire world at their fingertips.

But this is an utter privacy sh*t-show. Victims of domestic abuse are at risk of their abuser trawling their PC for any signs that they're looking for help. Anyone who's fallen for a scam that gave criminals access to their PC is also completely at risk.

Worse: even if you don't use Recall, if you send an email or instant message to someone else who does, then it will be OCR'd and indexed via Recall, and preserved for posterity.

Now imagine the sh*t-show when this goes corporate.

And it turns out that Microsoft is pushing this feature into the latest update of Windows 11 for all compatible hardware and making it impossible to remove or disable, because that tactic has worked so well for them in the past at driving the uptake of new technologies that Microsoft wanted its ~~customers~~ victims to start using. Like, oh, Microsoft Internet Explorer back in 2001, and remember how well that worked out for them.

Suddenly every PC becomes a target for Discovery during legal proceedings. Lawyers can subpoena your Recall database and search it, no longer limited to email but able to search for terms that came up in Teams or Slack or Signal messages, and potentially for things said aloud over Zoom or Skype, if speech-to-text is included in Recall data.

It's a sh*t-show for any organization that handles medical records or has a duty of legal confidentiality; indeed, for any business that has to comply with GDPR (how does Recall handle the Right to be Forgotten? In a word: badly), or with HIPAA in the US. This misfeature contravenes privacy law throughout the EU (and the UK), and medical confidentiality obligations in healthcare organizations everywhere. About the only people whose privacy it doesn't infringe are the Hollywood studios and Netflix, which tells you something about the state of things.

Recall is already attracting the attention of data protection regulators; I suspect in its current form it's going to be dead on arrival, and those CoPilot+ PCs due to launch on June 18th are going to get a hurried overhaul. It's also going to be interesting to see what Apple does, or more importantly doesn't, announce at WWDC next week, which is being trailed as the event where Apple goes all-in on AI.

More to the point, though, Windows Recall blows a hole under the waterline of Microsoft's trustworthiness. Microsoft "got serious" about security around the time Steve Ballmer stepped down as CEO a decade ago, and managed to recover somewhat from having a reputation for taking a slapdash approach to its users' data. But they've been going backwards since 2020, with dick moves like disabling auto-save to local files in Microsoft Word (your autosave data only autosaves to OneDrive), slurping all incoming email for accounts accessed via Microsoft Outlook into Microsoft's own cloud for AI training purposes (ask the Department of Justice how they feel about Microsoft potentially having access to the correspondence for all their investigations in progress), and now this. Recall undermines trust, and once an institution loses trust it's really hard to regain it.

Some commentators are snarking that Microsoft really really wants to make 2025 the year of Linux on the Desktop, and it's kind of hard to refute them right now.
