Complete News World

New GPU variant to reduce costs

By Valentin Sattler
Nvidia will presumably ship the new AD103-301 GPU for the Geforce RTX 4080 soon. It integrates a circuit that was previously located on the graphics card's PCB, which should reduce production costs.

Nvidia’s Geforce RTX 4080 launched only three months ago, so it’s still very early for an update. Nevertheless, the company now appears to be changing the graphics card’s GPU: according to Videocardz, board partner Gainward recently mentioned an AD103-301 chip alongside the AD103-300.

New GPU revision

Similar rumors previously circulated about the upcoming Geforce RTX 4070, which likewise mentioned two GPUs: AD104-250 and AD104-251. Twitter user HKEPC offers a ready explanation, which can probably also be applied to the RTX 4080: GPUs whose number ends in “1” have an integrated comparator circuit, whereas on the previous “0” GPUs this circuit was accommodated on the graphics card’s PCB.


Which comparison the circuit performs in this particular case is unknown; a common application for comparators is digital signal processing. Regardless of the intended use, the newer GPUs will likely save costs according to Videocardz, since circuitry integrated into the chip is much cheaper than additional components on a PCB. For board partners, producing an RTX 4080 with the new AD103-301 GPU should therefore be a bit less expensive.
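The article does not say what the comparator on the card actually compares. Purely as an illustration of the concept, a comparator's behavior (here with hysteresis, as in a Schmitt trigger; the threshold values are invented for this sketch) can be modeled in software:

```python
def comparator(samples, v_ref=0.5, hysteresis=0.05):
    """Software analogue of a hardware comparator with hysteresis:
    output switches to 1 when the input rises above v_ref + hysteresis
    and back to 0 when it falls below v_ref - hysteresis."""
    out = []
    state = 0
    for v in samples:
        if state == 0 and v > v_ref + hysteresis:
            state = 1
        elif state == 1 and v < v_ref - hysteresis:
            state = 0
        out.append(state)
    return out

print(comparator([0.1, 0.6, 0.52, 0.3, 0.7]))  # [0, 1, 1, 0, 1]
```

Note how the sample 0.52 keeps the output high: the hysteresis band prevents small fluctuations around the reference voltage from toggling the output, which is why comparators of this kind are useful for turning noisy analog signals into clean digital levels.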


However, the savings are likely in the cent range, so the change won’t result in a noticeable price drop. Apart from that, no noticeable changes are expected from the new GPUs: if the modifications indeed only affect said comparator circuit, gamers won’t have to pay attention to which GPU is used in their graphics card.

Source: Videocardz