New GeForce RTX 3050 variant offers the same performance but lower power use

by Andrew Cunningham, Ars Technica
(Image credit: MSI)

Nvidia's GeForce RTX 3050 is no one's idea of a powerhouse, but it's a decent 1080p GPU and still the cheapest way to buy into Nvidia's RTX 3000-series ecosystem if you want DLSS 2.0 support or Nvidia's ray-tracing implementation. MSI has published specs for a revised version of one of its RTX 3050 cards (via VideoCardz), advertising the same general features and performance but cutting the power consumption estimate by 15 W.

The lower power consumption appears to come from the GPU's use of a smaller graphics die, called GA107. Older RTX 3050s use the same GA106 die as the RTX 3060 series, but with many of that die's 3,840 CUDA cores switched off. Binning like this lets Nvidia reuse partially defective GA106 dies, but as chip yields improve and fewer dies come out defective, Nvidia must either ship fewer RTX 3050s or put perfectly good chips into its cheaper GPUs. The GA107 die tops out at 2,560 CUDA cores, and it apparently needs a bit less power than a GA106 die with the same number of cores enabled.
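For a sense of scale, here's a quick back-of-the-envelope sketch in Python built only from the core counts above; it just makes the binning trade-off concrete rather than reflecting anything Nvidia has published about yields.

```python
# Core counts from the article.
GA106_MAX_CORES = 3840   # full GA106 die, as used by the RTX 3060 series
GA107_MAX_CORES = 2560   # full GA107 die
RTX_3050_CORES = 2560    # cores enabled on an RTX 3050

# On a GA106-based RTX 3050, a third of the die's CUDA cores sit disabled.
disabled = 1 - RTX_3050_CORES / GA106_MAX_CORES
print(f"GA106 silicon disabled on an RTX 3050: {disabled:.0%}")  # -> 33%

# A fully enabled GA107 supplies exactly the cores the RTX 3050 needs,
# so no working silicon goes dark.
assert RTX_3050_CORES == GA107_MAX_CORES
```

In other words, every GA106-based RTX 3050 leaves a third of its die unused, which is only a good deal when those cores were defective anyway.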

The two MSI cards in question otherwise have nearly identical specs, power consumption aside: an 1,807 MHz boost clock, 2,560 CUDA cores, and 14 Gbps GDDR6 memory on a 128-bit interface. One other change is that the newer revision has two DisplayPorts and two HDMI ports rather than three DisplayPorts and one HDMI port, a small difference that most likely has nothing to do with the GPU swap. Another is that the card now requires a 6-pin power connector rather than an 8-pin one.
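Two of those numbers invite quick arithmetic. Peak memory bandwidth follows directly from the 14 Gbps per-pin data rate and the 128-bit bus, and the connector change makes sense given standard PCIe power-delivery limits (75 W from the slot, 75 W from a 6-pin plug, 150 W from an 8-pin plug). The sketch below just runs those figures; the wattage limits come from the PCIe spec, not the article.

```python
# Peak memory bandwidth: per-pin data rate times bus width, bits -> bytes.
data_rate_gbps = 14          # GDDR6 speed per pin (gigabits per second)
bus_width_bits = 128         # memory interface width

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")   # -> 224 GB/s

# Power budget: the PCIe spec allows 75 W from the slot, 75 W per 6-pin
# plug, and 150 W per 8-pin plug, so a 6-pin design tops out at 150 W.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150
print(f"6-pin board budget: {SLOT_W + SIX_PIN_W} W")         # -> 150 W
print(f"8-pin board budget: {SLOT_W + EIGHT_PIN_W} W")       # -> 225 W
```

A 150 W ceiling leaves comfortable headroom for a card whose power estimate just dropped by 15 W, which is presumably why MSI could step down to the smaller connector.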
