



The NVIDIA GeForce GTX 1660 Ti Review, Feat. EVGA XC GAMING: Turing Sheds RTX for the - harlan4096 - 25 February 19

Quote: [Image: EVGA_1660ti_xc_b_678x452.jpg]

When NVIDIA put their plans for their consumer Turing video cards into motion, the company bet big, and in more ways than one. In the first sense, NVIDIA dedicated whole logical blocks to brand-new graphics and compute features – ray tracing and tensor core compute – and they would need to sell developers and consumers alike on the value of these features, which is no easy task. In the second sense, however, NVIDIA also bet big on GPU die size: these new features would take up a lot of space on the 12nm FinFET process they’d be using.

The end result is that all of the Turing chips we’ve seen thus far, from TU102 to TU106, are monsters in size; even TU106 is 445mm², never mind the flagship TU102. And while the full economic consequences of that decision are NVIDIA’s to bear, for the first year or so of Turing’s life, all of the die space that is driving up NVIDIA’s costs isn’t going to contribute to improving NVIDIA’s performance in traditional games; it’s a value-added feature. That is all workable for NVIDIA in the high-end market, where they are unchallenged and can essentially dictate video card prices, but it’s another matter entirely once you start approaching the mid-range, where the AMD competition is alive and well.

Consequently, in preparing their cheaper, sub-$300 Turing cards, NVIDIA had to make a decision: do they keep the RT and tensor cores in order to offer these features across the line – at a literal cost to both consumers and NVIDIA – or do they drop these features in order to make a leaner, more competitive chip? As it turns out, NVIDIA has opted for the latter, producing a new Turing GPU that is leaner and meaner than anything that’s come before it, but for that very reason also quite different from its predecessors.