Signal Creator Moxie Marlinspike Wants to Do for AI What He Did for Messaging

by jelizondo from SoylentNews on (#72WZ3)

jelizondo writes:

Ars Technica published an interesting article about a new AI assistant that provides strong assurances that user data is unreadable even to the platform operator.

Moxie Marlinspike, the pseudonym of an engineer who set a new standard for private messaging by creating Signal Messenger, is now aiming to revolutionize AI chatbots in a similar way.

His latest brainchild is Confer, an open source AI assistant that provides strong assurances that user data is unreadable to the platform operator, hackers, law enforcement, or anyone other than the account holder. The service, including its large language models and back-end components, runs entirely on open source software that users can cryptographically verify is in place.
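To make "cryptographically verify" concrete: one common approach (a hypothetical sketch, not a description of Confer's actual mechanism) is reproducible builds, where anyone can rebuild the published source and check that the digest of their own build matches the software measurement the service attests to running. The file paths below are illustrative.

    import hashlib

    def sha256_hex(path: str) -> str:
        # Digest a build artifact so it can be compared against an attested measurement.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # Both paths are hypothetical: a binary rebuilt locally from the open source
    # repo, and the measurement reported by the server's attestation.
    local_digest = sha256_hex("confer-server-rebuilt.bin")
    attested_digest = sha256_hex("confer-attested-measurement.bin")

    if local_digest == attested_digest:
        print("attested software matches the open source build")
    else:
        print("mismatch: the server may not be running the published code")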

Data and conversations originating from users, along with the resulting responses from the LLMs, are encrypted in a trusted execution environment (TEE) that prevents even server administrators from peeking at or tampering with them. Confer stores conversations in the same encrypted form, using a key that remains securely on users' devices.
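A minimal sketch of the client-held-key idea, assuming an AES-GCM scheme for illustration (the article does not specify Confer's actual cipher or protocol):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # generated and kept on the device
    nonce = os.urandom(12)                      # must be unique per message

    plaintext = b"user prompt and model response"
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)

    # The platform stores only (nonce, ciphertext); without the device-held
    # key, operators and hackers alike see nothing but ciphertext.
    assert AESGCM(key).decrypt(nonce, ciphertext, None) == plaintext

Because the key never leaves the device, even a subpoena served on the platform can yield only ciphertext.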

As with Signal, Confer's under-the-hood workings are elegant in their design and simplicity. Signal was the first end-user privacy tool that made strong encryption a snap to use. Prior to it, using PGP email or other options to establish an encrypted channel between two users was a cumbersome process that was easy to botch. Signal broke that mold: key management was no longer a task users had to worry about, and the service was designed to prevent even the platform operators from peering into messages or identifying users' real-world identities.

All major platforms are required to turn over user data to law enforcement or to private parties in a lawsuit when either presents a valid subpoena. Even when users opt out of having their data stored long term, parties to a lawsuit can compel a platform to retain it, as the world learned last May when a court ordered OpenAI to preserve all ChatGPT users' logs, including deleted chats and sensitive chats logged through its API business offering. Sam Altman, CEO of OpenAI, has said such rulings mean even psychotherapy sessions on the platform may not stay private. Another carve-out to opting out: AI platforms like Google Gemini may have humans read chats.

"AI models are inherent data collectors," Em [she keeps her last name off the Internet] told Ars. "They rely on large data collection for training, improvements, operations, and customizations. More often than not, this data is collected without clear and informed consent (from unknowing training subjects or from platform users), and is sent to and accessed by a private company with many incentives to share and monetize this data."

Read more of this story at SoylentNews.
