Review: Nvidia's $999 GTX Titan X Shines in 4K

Mark Walton


It wasn't all that long ago that Nvidia released the GTX 980, which, until now, remained the fastest single-GPU graphics card on the market. At the time, though, thanks to its conservative power requirements, I noted that it wasn't the generational leap in performance one might have hoped for from Nvidia's brand-new Maxwell architecture. Six months on, that generational leap is finally here. Enter the GTX Titan X, a full-fat implementation of the power-efficient Maxwell architecture that pairs a wildly over-the-top 12GB of VRAM with 7 teraflops of processing power. The Titan X is big, bold, and badass, but at $999, that power is going to cost you.

Specs

The Titan X is Maxwell without compromise. It's the first GPU from Nvidia to use the GM200 core, which packs 8 billion transistors into a 28nm die. It consists of six Graphics Processing Clusters (GPCs), each of which contains four Streaming Multiprocessor (SMM) units, with 128 CUDA cores in each SMM block. That adds up to 24 SMM units, 3,072 CUDA cores, 192 texture mapping units, and 96 ROPs.
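That shader count is where the headline teraflops figure comes from. As a back-of-the-envelope sketch (the helper function below is mine, assuming the usual two floating-point operations per CUDA core per clock, i.e. one fused multiply-add):

```python
# Peak single-precision throughput: CUDA cores x 2 FLOPs (one FMA) per clock.
# Function name and figures-as-arguments are illustrative, not from the review.
def peak_tflops(cuda_cores, clock_mhz):
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

print(peak_tflops(3072, 1000))  # at base clock: 6.144 TFLOPS
print(peak_tflops(3072, 1075))  # at rated boost: ~6.6 TFLOPS
```

At rated clocks this lands a little under the quoted 7 teraflops; GPU Boost typically runs the card above its rated boost clock in practice, which closes the gap.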

Aside from the obvious increases in ROPs and CUDA cores, GM200 is notable for bringing back the 384-bit memory interface of the original GTX Titan, GTX 780, and GTX 780 Ti. Armed with a huge 12GB of 7GHz GDDR5, the Titan X can shift data at 336.5GB/sec, a 50 percent increase over the GTX 980. Whether or not you actually need all that memory is debatable.

While it's certainly possible to max out 4GB of VRAM (just ask a disgruntled GTX 970 owner about that one), the more obvious move would have been to include 8GB. Even if you ran multiple displays and games at 4K with some seriously beefy textures, that'd be a lot of memory to fill. But hey, while no GPU is ever really future-proof, with 12GB of VRAM on board, the Titan X will be able to keep up with the ever-increasing VRAM requirements of games for far longer than your average graphics card.
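For reference, the 336.5GB/sec figure quoted above is essentially bus width times effective memory data rate. A minimal sketch (helper name and unit choices are mine):

```python
# Bandwidth (GB/s) = bus width in bytes x effective data rate.
# Helper name is illustrative, not from the review.
def bandwidth_gb_s(bus_width_bits, data_rate_mhz):
    return (bus_width_bits / 8) * data_rate_mhz * 1e6 / 1e9

print(bandwidth_gb_s(384, 7000))  # Titan X: 336.0 GB/s
print(bandwidth_gb_s(256, 7000))  # GTX 980: 224.0 GB/s
```

At the rated 7000MHz the formula gives an even 336GB/sec; Nvidia's 336.5 figure reflects a marginally higher effective data rate.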

| GPU | GTX Titan X | GTX 980 | GTX 970 |
| --- | --- | --- | --- |
| CUDA Cores | 3072 | 2048 | 1664 |
| Base Clock | 1000 MHz | 1126 MHz | 1050 MHz |
| GPU Boost Clock | 1075 MHz | 1216 MHz | 1178 MHz |
| Memory | 12GB | 4GB | 4GB |
| Memory Data Rate | 7000 MHz | 7000 MHz | 7000 MHz |
| Memory Bandwidth | 336.5 GB/s | 224 GB/s | 224 GB/s |
| Memory Interface | 384-bit | 256-bit | 256-bit |
| ROPs | 96 | 64 | 56 |
| TDP | 250W | 165W | 165W |
| Fabrication Process | 28nm | 28nm | 28nm |

What hasn't increased over the GTX 980 is the clock speed, which is actually lower by over 100MHz. However, as the benchmarks below show, the vastly increased CUDA core and ROP counts more than make up the difference. And hey, if you're into overclocking, the Titan X's 6+2 phase power design (six phases for the GPU, two for the memory) means you can crank the power target up to 110 percent, raising the board power from 250W to 275W. That leaves quite a bit of headroom for overclocking, even on air. Nvidia claims that speeds of 1.4GHz are possible; while I didn't quite hit that in my testing, 1.2GHz was relatively easy.

The eagle-eyed among you will have noticed that the Titan X's 250W TDP represents a significant increase over the 165W of the GTX 980. That's still less than the near-300W of the R9 290X, the power-efficient architecture giving Nvidia room to increase performance without dramatically increasing heat output or needing a particularly exotic cooling solution. Still, you'll need one 6-pin and one 8-pin power connector from your power supply this time, with Nvidia recommending a minimum of a 600W power supply to get the job done.

One major change from the previous GTX Titan is the Titan X's double-precision performance. Like the GTX 980, the Titan X runs double-precision instructions at 1/32 the rate of single-precision. That's of little importance to gamers, but scientific types making use of double-precision instructions will find the Titan X lacking; perhaps Nvidia grew wary of people using the Titan as a cheaper alternative to its Tesla cards. Whatever the reason, Nvidia is recommending the $3,000 GTX Titan Z as the best GPU for those who need double-precision.

Aside from the snazzy black exterior, the Titan X uses the same excellent reference cooler that launched with the original GTX Titan and made its way through Nvidia's high-end models such as the GTX 980. However, unlike with the GTX 980, Nvidia is only allowing partners to release cards with the reference cooler, so don't expect the likes of MSI's excellent Twin Frozr cooling system to be hitting Titans anytime soon.

Not that it matters too much. In testing, I found the GTX Titan X to be relatively quiet. However, thanks to the increased wattage, it is noticeably louder than the 980 under load. It's not overly distracting by any means, and it's certainly quieter than AMD's cards, but it's something to bear in mind if noise is a big issue for you. In terms of outputs, the Titan X sports three full-size DisplayPorts, one HDMI 2.0 port, and one dual-link DVI. There's also support for up to four-way SLI, should you have insanely large wads of cash to spare.

Performance

On to performance then, and no surprises here: the GTX Titan X is the fastest single-GPU card I've ever tested. At stock speeds, paired with an eight-core Intel Core i7-5960X overclocked to 4GHz, 16GB of Corsair DDR4 RAM at 3000MHz, and two Crucial M550 SSDs in RAID 0, I saw between a 25 and 30 percent boost in performance over the GTX 980, and up to a 40 percent boost over AMD's R9 290X. Naturally, there's really no point in buying this card for 1080p gaming (a GTX 970 would serve you just as well), so all testing was done at 1440p and UHD (4K) resolutions.

At 1440p, the Titan X easily ran everything we threw at it, averaging 60fps. Yes, that includes perennial system killer Crysis 3. 1440p is definitely the sweet spot for high-res gaming, particularly if you want 60fps, but there were some impressive results for 4K too.

| 1440p (average fps) | GTX Titan X | GTX 980 | R9 290X |
| --- | --- | --- | --- |
| Heaven @ Ultra, 8X AA | 53 | 38 | 34 |
| Far Cry 4 @ Ultra, SMAA | 85 | 58 | 58 |
| Crysis 3 @ Very High, FXAA | 60 | 22 | 19 |
| Tomb Raider @ Ultimate, FXAA, TressFX | 87 | 73 | 65 |
| Bioshock Infinite @ Ultra, AA | 130 | 104 | 90 |
| Battlefield 4 @ Ultra, 4X MSAA, HBAO | 86 | 60 | 47 |
| Metro Last Light @ Very High | 86 | 65 | 59 |

| UHD (4K) (average fps) | GTX Titan X | GTX 980 | R9 290X |
| --- | --- | --- | --- |
| Heaven @ Ultra, 8X AA | 32 | 23 | 19 |
| Far Cry 4 @ Ultra, SMAA | 46 | 38 | 34 |
| Crysis 3 @ Very High, FXAA | 28 | 22 | 19 |
| Tomb Raider @ Ultimate, FXAA, TressFX | 52 | 39 | 35 |
| Bioshock Infinite @ Ultra, AA | 74 | 55 | 46 |
| Battlefield 4 @ Ultra, 4X MSAA, HBAO | 61 | 46 | 36 |
| Metro Last Light @ Very High | 40 | 32 | 32 |

The Titan X is the first single-GPU card with which you can reasonably play games at 4K without resorting to an SLI or CrossFire setup. You'll have to make do with a locked 30fps for the most part, or put up with some screen tearing (unless you're using a G-Sync monitor), but the slowdowns during busier scenes that have plagued single-GPU 4K setups up to this point aren't an issue with the Titan X. With the exception of Crysis 3, no game dropped below 40fps.
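The 25 to 30 percent advantage over the GTX 980 mentioned earlier can be sanity-checked against the tables. A quick sketch averaging the per-game gains at 4K (fps values copied from the UHD table above; variable names are mine):

```python
# Per-game 4K averages from the review's tables, in table order
# (Heaven, Far Cry 4, Crysis 3, Tomb Raider, Bioshock, BF4, Metro).
titan_x = [32, 46, 28, 52, 74, 61, 40]
gtx_980 = [23, 38, 22, 39, 55, 46, 32]

gains = [t / g - 1 for t, g in zip(titan_x, gtx_980)]
mean_gain = sum(gains) / len(gains)
print(f"{mean_gain:.0%}")  # roughly a 30 percent average advantage
```

The same arithmetic over the 1440p numbers skews higher, largely because of the outsized Crysis 3 result.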

Verdict

Some people might call the Titan X's 12GB of VRAM and powerful innards overkill--and those people might be right. But what the Titan X represents is a glimpse at the future of the GPU, the place where they'll all be in a few years' time, minus the price tag. You can even get a glimpse of what those games might look like, thanks to the likes of Epic's Kite demo, which was powered by a single Titan X. Having seen the demo first-hand, and paused it at random to fly around in real-time, I can tell you that it's light-years ahead of today's games, at least in terms of visuals.

The Titan X is for the guy or gal who wants to be at the forefront of that future, and won't accept anything less than the very best in performance. This is an enthusiast-grade graphics card, with enthusiast pricing to match. But even if you don't take into account all the great Nvidia extras--G-Sync, GeForce Experience, DSR, to name but a few--the Titan X is, finally, a generational leap in performance over the 700 series, and the absolute best graphics card money can buy.
