
Nvidia banking on TensorRT to expand generative AI dominance

by Emilia David, from The Verge - All Posts
Illustration by Alex Castro / The Verge

Nvidia looks to build a bigger presence outside GPU sales as it puts its AI-specific software development kit into more applications.

Nvidia announced that it's adding support for its TensorRT-LLM SDK to Windows and models like Stable Diffusion. The company said in a blog post that it aims to make large language models (LLMs) and related tools run faster.

TensorRT speeds up inference, the process of running a pretrained model to calculate probabilities and produce a result, such as a newly generated Stable Diffusion image. With this software, Nvidia wants to play a bigger part in the inference side of generative AI.
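For readers unfamiliar with the term, the sketch below shows what inference looks like in practice. It is a minimal example using the Hugging Face diffusers library and an illustrative model ID, neither of which the article names; TensorRT-style acceleration targets exactly this step, while this sketch uses the plain PyTorch path.

```python
# Minimal sketch of Stable Diffusion inference: load pretrained weights and
# run them once to turn a text prompt into an image. The library and model ID
# below are assumptions for illustration, not details from the article.
import torch
from diffusers import StableDiffusionPipeline

# "runwayml/stable-diffusion-v1-5" is an illustrative model checkpoint.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # run the pretrained model on an Nvidia GPU

# Inference: a forward pass through the pretrained model produces a new image.
image = pipe("a watercolor painting of a data center at dusk").images[0]
image.save("generated.png")
```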

Its TensorRT-LLM breaks down LLMs and lets them run faster on Nvidia's H100 GPUs. It works with LLMs like...

Continue reading...

Source: The Verge - All Posts (https://www.theverge.com/)
RSS feed: http://www.theverge.com/rss/index.xml