Security Risks Of Windows Copilot Are Unknowable
Arthur T Knackerbracket has processed the following story:
I am still amazed how few people - even in IT - have heard of Windows Copilot. Microsoft's deep integration of Bing Chat into Windows 11 was announced with much fanfare back in May.
Microsoft hasn't been quiet about it - indeed it can't seem to shut up about Copilot this and Copilot that - yet it seems that the real impact of this sudden Copilotization of all the things has somehow managed to fly under the radar.
[...] Microsoft has rushed to get Copilot into its operating system
[...] Windows Copilot looks just like Bing Chat - which may be why IT folks haven't given it a second look. Bing Chat has been available in Microsoft's Edge Browser for months - no biggie.
But Windows Copilot only looks like Bing Chat. While Bing Chat runs within the isolated environment of the web browser, Copilot abandons those safeties. Copilot can touch and change Windows system settings - not all of them (at least not yet) but some of them, with more being added all the time. That means Microsoft's AI chatbot has broken loose of its hermetically sealed browser, and has the run of our PCs.
[...] Every day we learn of new prompt injection attacks - weaponizing the ambiguities of human language (and, sometimes, just the right level of noise) to override the guardrails keeping AI chatbots on the straight and narrow. Consider a prompt injection attack hidden within a Word document: Submitted to Windows Copilot for an analysis and summary, the document also injects a script that silently transmits a copy of the files in the working directory to the attacker.
Read more of this story at SoylentNews.