Frontier trained a ChatGPT-sized large language model with only 3,000 of its 37,888 Radeon GPUs: the world's fastest supercomputer blasts through a one-trillion-parameter model with only 8 percent of its GPUs

by Matthew Connatser (mc@matthewconnatser.net), from Tomshardware (#6HN9X)
Frontier, the world's fastest supercomputer, can train a one-trillion-parameter large language model in the class of GPT-4 using just 8% of its 37,888 GPUs, about 3,000 of them.