AMD Says Its MI300X AI Accelerator Is Faster Than Nvidia's H100

by
hubie
from SoylentNews on (#6HBQF)

Arthur T Knackerbracket has processed the following story:

AMD has finally launched its Instinct MI300X accelerators, a new generation of server GPUs designed to provide compelling performance levels for generative AI workloads and other high-performance computing (HPC) applications. MI300X is faster than H100, AMD said earlier this month, but Nvidia tried to refute the competitor's statements with new benchmarks released a couple of days ago.

Nvidia tested its H100 accelerators with TensorRT-LLM, an open-source library and SDK designed to efficiently accelerate generative AI algorithms. According to the GPU company, an H100 running TensorRT-LLM with proper optimizations delivered twice the performance AMD had reported for the MI300X.

AMD is now providing its own version of the story, refuting Nvidia's claims of H100 superiority. Nvidia ran TensorRT-LLM on the H100 instead of the vLLM library used in AMD's benchmarks, and compared FP16 performance on the AMD Instinct MI300X against FP8 performance on the H100. Furthermore, Team Green inverted AMD's published relative latency numbers into absolute throughput figures.
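AMD's last objection turns on a subtle point: a relative latency result cannot simply be inverted into throughput unless batch sizes and request patterns are identical. A minimal sketch, using hypothetical numbers rather than either vendor's actual figures, shows how the naive inversion can even flip a ranking:

```python
# Hypothetical illustration (not AMD's or Nvidia's actual benchmark data):
# turning a batch *latency* into *throughput* requires knowing the batch size.

def throughput(latency_s: float, batch_size: int) -> float:
    """Requests completed per second for one batch."""
    return batch_size / latency_s

# Suppose chip A serves a batch of 8 requests in 2.0 s,
# while chip B serves a batch of 16 requests in 2.5 s.
a = throughput(2.0, 8)    # 4.0 req/s
b = throughput(2.5, 16)   # 6.4 req/s  -> chip B is actually faster

# Naively inverting latency alone (ignoring batch size) reverses the verdict:
naive_a = 1 / 2.0         # 0.5
naive_b = 1 / 2.5         # 0.4      -> chip A looks faster

print(a, b, naive_a, naive_b)
```

The point is not which chip wins, but that the conversion from one metric to the other silently embeds assumptions about the test setup.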

AMD suggests that Nvidia tried to rig the game, while AMD itself is still identifying new ways to unlock performance from the Instinct MI300 accelerators. The company provided the latest performance figures achieved by the Llama 70B chatbot model on the MI300X, showing an even larger edge over Nvidia's H100.

By using the vLLM inference library for both accelerators, the MI300X achieved 2.1x the performance of the H100, thanks to the latest optimizations in AMD's software stack (ROCm). Earlier in December, the company had highlighted a 1.4x advantage over the H100 with an equivalent datatype and library setup. vLLM was chosen because of its broad adoption within the community and its ability to run on both GPU architectures.
