Google’s Hugging Face deal puts ‘supercomputer’ power behind open-source AI

by
Emilia David
from The Verge - All Posts
Illustration by Alex Castro / The Verge

Google Cloud's new partnership with AI model repository Hugging Face lets developers build, train, and deploy AI models without needing a Google Cloud subscription. Outside developers using Hugging Face's platform will now have "cost-effective" access to Google's tensor processing units (TPUs) and GPU supercomputers, including thousands of Nvidia's in-demand and export-restricted H100s.

Hugging Face is one of the more popular AI model repositories, hosting open-source foundation models like Meta's Llama 2 and Stability AI's Stable Diffusion. It also hosts many datasets for model training.

There are over 350,000 models hosted on the platform for developers to work with or upload their own models to Hugging...


External Content
Source RSS or Atom Feed
Feed Location http://www.theverge.com/rss/index.xml
Feed Title The Verge - All Posts
Feed Link https://www.theverge.com/