NVIDIA Answers Some Burning RTX 30-Series Questions: PCIe 4, 3080 RAM
The RTX 30-series, comprising the RTX 3070, 3080, and 3090 GPUs, might just be the biggest tech topic of the week, and with NVIDIA promising big jumps in capability from these cards, prospective buyers naturally have some burning questions. NVIDIA has taken the time to answer several of them, giving us a better idea of what will be required to use these cards and why the company made some of the design choices it did with this new series.
The questions and their answers were posted in an AMA recap thread on the official NVIDIA subreddit. The very first question in that thread – "Why only 10GB of memory for the RTX 3080?" – is one that a lot of folks probably want an answer to, especially since the flagship card in the 30-series, the RTX 3090, has a whopping 24GB of memory.
That's a pretty big gap, but as it turns out, the RTX 3080 carries only 10GB because of what NVIDIA set out to accomplish with the card. "The goal of the 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price," NVIDIA's Justin Walker said.
He went on to explain that in games such as Shadow of the Tomb Raider, Assassin's Creed Odyssey, Metro Exodus, Red Dead Redemption 2, and Gears of War 5, NVIDIA was able to achieve frame rates of 60-100fps with graphics settings maxed out at 4K resolution and RTX turned on in the titles that support it. Despite those high settings, each game only used between 4GB and 6GB of memory, so 10GB was determined to be sufficient for 4K gameplay at max settings.
"Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance," Walker concluded. Indeed, even though the RTX 3090 boasts 24GB of memory, that's also a $1,500 card. The RTX 3080, on the other hand, settled at $700, and while that's still very expensive, it's less than half the cost of the 3090.
Elsewhere in that thread, NVIDIA's Qi Lin confirmed that 30-series cards will indeed support 10-bit HDR. "In fact," Lin continued, "HDMI 2.1 can support up to 8K@60Hz with 12bit HDR, and that covers 10bit HDR displays."
We also learned what kind of performance hit owners can expect if they run these cards on PCIe 3.0 instead of PCIe 4.0. "System performance is impacted by many factors and the impact varies between applications," NVIDIA's Seth Schneider said. "The impact is typically less than a few percent going from a x16 PCIE 4.0 to x16 PCIE 3.0. CPU selection often has a larger impact on performance. We look forward to new platforms that can fully take advantage of Gen4 capabilities for potential performance increases."
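For a rough sense of the raw link speeds behind that answer, here's a quick back-of-envelope sketch (our own illustration, not something from the AMA) comparing the theoretical bandwidth of a x16 slot on PCIe 3.0 versus PCIe 4.0, using the commonly cited per-lane rates:

```python
# Rough comparison of theoretical x16 link bandwidth (illustration only).
# Per-lane figures are the commonly cited effective rates after 128b/130b
# encoding; actual in-game impact depends on far more than link speed.

LANES = 16
GB_PER_SEC_PER_LANE = {
    "PCIe 3.0": 0.985,   # ~8 GT/s per lane
    "PCIe 4.0": 1.969,   # ~16 GT/s per lane
}

for gen, per_lane in GB_PER_SEC_PER_LANE.items():
    print(f"{gen} x16: ~{per_lane * LANES:.1f} GB/s theoretical bandwidth")

# Prints roughly 15.8 GB/s for PCIe 3.0 x16 and 31.5 GB/s for PCIe 4.0 x16.
# Games rarely saturate the link, which lines up with NVIDIA's "less than a
# few percent" figure for typical applications.
```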
So, if you're planning to stick with PCIe 3.0 for the foreseeable future, it sounds like you won't see any major performance loss. If you're in the market for an RTX 30-series card, the entire AMA is worth reading through, as it covers a lot of different topics. We'll undoubtedly be learning more about the RTX 30-series in the days and weeks to come, so stay tuned for more.