AMD releases new chips to power faster AI training
by Emilia David from The Verge
AMD wants people to remember that Nvidia's not the only company selling AI chips. It's announced the availability of new accelerators and processors geared toward running large language models, or LLMs.
The chipmaker unveiled the Instinct MI300X accelerator and the Instinct MI300A accelerated processing unit (APU), which the company said are built to train and run LLMs. The MI300X has 1.5 times more memory capacity than the previous MI250X, AMD said, and both new products offer greater memory capacity and better energy efficiency than their predecessors.
"LLMs continue to increase in size and complexity, requiring massive amounts of memory and compute," AMD CEO Lisa Su said. "And we know the availability of GPUs is the...