
Nvidia details efficiency of the NVFP4 format for LLM training — new paper reveals how NVFP4 offers benefits over FP8 and BF16
Nvidia has demonstrated that its NVFP4 4-bit floating-point format, originally intended for inference, can be used for stable, large-scale LLM training with minimal accuracy loss, delivering significant gains in compute and memory efficiency over both FP8 and BF16.