
Intel, Microsoft discuss plans to run Copilot locally on PCs instead of in the cloud

by Andrew Cunningham, Ars Technica
The basic requirements for an AI PC, at least when it's running Windows. (credit: Intel)

Microsoft said in January that 2024 would be the year of the "AI PC," and we know that AI PCs will include a few hardware components that most Windows systems currently lack: namely, a built-in neural processing unit (NPU) and Microsoft's new Copilot key for keyboards. But so far we haven't heard a whole lot about what a so-called AI PC will actually do for users.

Microsoft and Intel are starting to share a few details, as part of Intel's announcement of a new AI PC developer program that encourages software developers to leverage local hardware to build AI features into their apps.

The main news comes via Tom's Hardware: AI PCs will be able to run "more elements of Copilot," Microsoft's AI chatbot assistant, "locally on the client." Currently, Copilot relies on server-side processing even for small requests, introducing lag that is tolerable if you're making a broad request for information but less so if all you want to do is change a setting or get basic answers. Running generative AI models locally could also improve user privacy, making it possible to take advantage of AI-infused software without automatically sending information to a company that will use it for further model training.
