Frontier trained a ChatGPT-sized large language model with only 3,000 of its 37,888 Radeon GPUs: the world's fastest supercomputer blasts through a one-trillion-parameter model with only 8 percent of its GPUs
by Matthew Connatser (Tom's Hardware)
Frontier, the world's fastest supercomputer, can train a large language model like GPT-4 with just 8% of its GPUs.