Microsoft May Store Your Conversations With Bing If You're Not an Enterprise User
An anonymous reader quotes a report from The Register: Microsoft prohibits users from reverse engineering or harvesting data from its AI software to train or improve other models, and will store inputs passed into its products as well as any output generated. The details emerged as companies face fresh challenges with the rise of generative AI. People want to know what corporations are doing with information provided by users. And users are likewise curious about what they can do with the content generated by AI. Microsoft addresses these issues in a new clause titled 'AI Services' in its terms of service. The five new policies, introduced on July 30 and coming into effect on September 30, state that:

- Reverse Engineering. You may not use the AI services to discover any underlying components of the models, algorithms, and systems. For example, you may not try to determine and remove the weights of models.
- Extracting Data. Unless explicitly permitted, you may not use web scraping, web harvesting, or web data extraction methods to extract data from the AI services.
- Limits on use of data from the AI Services. You may not use the AI services, or data from the AI services, to create, train, or improve (directly or indirectly) any other AI service.
- Use of Your Content. As part of providing the AI services, Microsoft will process and store your inputs to the service as well as output from the service, for purposes of monitoring for and preventing abusive or harmful uses or outputs of the service.
- Third party claims. You are solely responsible for responding to any third-party claims regarding Your use of the AI services in compliance with applicable laws (including, but not limited to, copyright infringement or other claims relating to content output during Your use of the AI services).

A spokesperson from Microsoft declined to comment on how long the company plans to store user inputs into its software.
"We regularly update our terms of service to better reflect our products and services. Our most recent update to the Microsoft Services Agreement includes the addition of language to reflect artificial intelligence in our services and its appropriate use by customers," the representative told us in a statement. Microsoft has previously said, however, that it doesn't save conversations or use that data to train its AI models for its Bing Enterprise Chat mode. The policies are a little murkier for Microsoft 365 Copilot: although it doesn't appear to use customer data or prompts for training, it does store information. "[Copilot] can generate responses anchored in the customer's business content, such as user documents, emails, calendar, chats, meetings, contacts, and other business data. Copilot combines this content with the user's working context, such as the meeting a user is in now, the email exchanges the user has had on a topic, or the chat conversations the user had last week. Copilot uses this combination of content and context to help deliver accurate, relevant, contextual responses," it said.
Read more of this story at Slashdot.