Nvidia DGX Cloud: Train Your Own ChatGPT in a Web Browser For $37K a Month

by msmash from Slashdot (#6A1GM)
An anonymous reader writes: Last week, we learned that Microsoft spent hundreds of millions of dollars to buy tens of thousands of Nvidia A100 graphics chips so that partner OpenAI could train the large language models (LLMs) behind Bing's AI chatbot and ChatGPT. Don't have access to all that capital, or space for all that hardware, for your own LLM project? Nvidia's DGX Cloud is an attempt to sell remote web access to the very same thing. Announced today at the company's 2023 GPU Technology Conference, the service rents virtual versions of its DGX Server boxes, each containing eight Nvidia H100 or A100 GPUs and 640GB of GPU memory. The service includes interconnects that scale to the neighborhood of 32,000 GPUs, along with storage, software, and "direct access to Nvidia AI experts who optimize your code," starting at $36,999 a month for the A100 tier. Meanwhile, a physical DGX Server box with the same hardware can cost upwards of $200,000 if you're buying it outright, and that doesn't count the effort companies like Microsoft say it took to build working data centers around the technology.
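To put that pricing in perspective, here is a rough back-of-the-envelope sketch based only on the figures quoted above ($36,999 a month for an 8-GPU A100 instance, roughly $200,000 for a physical box). The hours-per-month figure and the break-even framing are illustrative assumptions, not numbers from Nvidia.

```python
# Back-of-the-envelope comparison of DGX Cloud rental vs. buying the hardware.
# The $36,999/month and ~$200,000 figures come from the story; hours-per-month
# is an approximation, and utilization/operating costs are ignored.

MONTHLY_RENT_USD = 36_999     # A100 tier, per 8-GPU DGX Cloud instance
GPUS_PER_INSTANCE = 8
HOURS_PER_MONTH = 730         # ~365 * 24 / 12

per_gpu_hour = MONTHLY_RENT_USD / (GPUS_PER_INSTANCE * HOURS_PER_MONTH)
print(f"Implied rental cost: ~${per_gpu_hour:.2f} per GPU-hour")

# Months of rent needed to match the ~$200,000 sticker price of a physical
# DGX server (ignoring power, cooling, staff, and data-center build-out).
BOX_PRICE_USD = 200_000
breakeven_months = BOX_PRICE_USD / MONTHLY_RENT_USD
print(f"Rent equals the hardware price after ~{breakeven_months:.1f} months")
```

Run as-is, this works out to roughly $6 per GPU-hour and a little over five months of rent to match the hardware's purchase price, which is the trade-off Nvidia is betting customers without their own data centers will accept.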


