Modder installs 16GB of GDDR6 memory on the Geforce RTX 3070

Russian hardware enthusiast VIK-on has previously appeared on SweClockers with graphics card mods. We last reported on his antics in January, when he outfitted a Geforce RTX 2070 with 16 GB of GDDR6 memory purchased from the Aliexpress marketplace. That attempt was not a performance success, but the card was at least correctly identified.

The experience seems to have given VIK-on a taste for more, as VideoCardz now reports that he has sunk his teeth into Palit's Geforce RTX 3070. The recipe is the same as last time: a card that ships with 8 GB of GDDR6 memory has had its memory chips replaced with double-capacity ones, for a total of 16 GB.


The physical replacement of BGA-mounted memory can be done without much hassle, but the swap alone is not enough for the card to recognize the new amount of memory. The usual route would be tweaks to the card's BIOS, but here VIK-on ran into a snag: Nvidia does not allow even partner manufacturers to modify the BIOS code on modern graphics cards.

The solution instead had to be hardware-based, via a series of resistors that define the card's strap settings. These straps tell the BIOS how the card's memory is configured: for example, 00000 corresponds to 8 GB of memory from Samsung, while 00001 means 8 GB with Micron chips. After diligent experimentation, VIK-on landed on the 00110 configuration, which enables the full 16 GB of memory; see around 5:00 in the Youtube clip at the top of the article.
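The strap mechanism described above is essentially a lookup table: a five-bit value, set by resistor placement, that the BIOS maps to a memory vendor and capacity. A minimal sketch of that idea in Python, using only the three strap values mentioned in the article (the dictionary name, function, and the vendor for 00110 are illustrative assumptions, not Nvidia's actual firmware logic):

```python
# Hypothetical model of the memory-strap lookup described in the article.
# Only three strap values are known from the video; everything else here
# (names, structure, the 00110 vendor) is an illustrative assumption.

STRAP_CONFIGS = {
    "00000": ("Samsung", 8),   # 8 GB, Samsung GDDR6 (per the article)
    "00001": ("Micron", 8),    # 8 GB, Micron GDDR6 (per the article)
    "00110": ("Samsung", 16),  # 16 GB; article confirms capacity, vendor assumed
}

def decode_strap(bits: str):
    """Return (vendor, capacity_gb) for a five-bit strap value, if known."""
    if len(bits) != 5 or any(b not in "01" for b in bits):
        raise ValueError("strap must be five binary digits")
    return STRAP_CONFIGS.get(bits, ("unknown", None))

print(decode_strap("00110"))  # → ('Samsung', 16)
```

On real hardware the "lookup" happens in firmware at power-on, and changing the input means physically moving the strap resistors on the PCB rather than passing a string.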


With the card running in Windows and identified as an RTX 3070 with 16 GB of GDDR6 memory, VIK-on saw behavior familiar from the RTX 2070 experiment. The card worked but was unstable and underperformed, until the clock frequencies were locked with EVGA Precision, after which the card performed slightly better than the RTX 3070 Founders Edition.
