Today the first benchmark results for the new Nvidia GeForce GTX Titan video card started hitting the web, but most reviewers that got the card to test seem to be using just a single monitor – 1080p, 1440p or 1600p. We even get to see dual and triple SLI setups with the GeForce GTX Titan, yet only a few reviewers test on multi-monitor Surround setups in 2D mode. Can you really get the maximum out of a GeForce GTX Titan on a single monitor in 2D mode? Most of the press seems to think so, but what about multi-monitor and stereo 3D gaming with the Titan? Fortunately there are at least some benchmark results that cover stereoscopic 3D gaming – you can see the short video review with benchmarks from Linus Tech Tips embedded above. Interestingly enough, a dual GeForce GTX 660 Ti setup in SLI ($600-$700 USD) seems to outperform the GTX Titan ($999 USD) in 1080p stereo 3D mode, and other reviews show that the same 660 Ti SLI setup gives better results than the Titan in 2D mode as well. This means that for about 2/3 of the price of the Nvidia GeForce GTX Titan you can get more FPS from a 660 Ti SLI setup. That suddenly makes the Titan seem not so powerful, or maybe the GeForce GTX 660 Ti is just too good, especially when you couple two of them.
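A quick back-of-the-envelope way to frame that price argument is frames per second per dollar. The prices below are the ones quoted above; the FPS figure is a purely hypothetical placeholder, not a measured result:

```python
# Back-of-the-envelope FPS-per-dollar comparison.
# Prices are the ones mentioned in the text; the FPS number is a
# purely hypothetical placeholder, NOT a measured benchmark result.
def fps_per_dollar(fps, price_usd):
    """Frames per second delivered per dollar spent."""
    return fps / price_usd

titan_price = 999       # suggested retail price of the GTX Titan
sli_660ti_price = 650   # midpoint of the $600-$700 range for two 660 Tis

# If the 660 Ti SLI setup merely matched the Titan's frame rate,
# its price advantage alone would already make it the better value:
example_fps = 60
print(fps_per_dollar(example_fps, sli_660ti_price) >
      fps_per_dollar(example_fps, titan_price))
```

And since the 3D benchmarks suggest the SLI setup actually delivers more FPS, not just equal FPS, the value gap only widens.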
Before seeing the first reviews I was actually thinking about replacing my two trustworthy water-cooled GeForce GTX 580 video cards running in SLI with a single GeForce GTX Titan, but now I’m not so sure about that. I’ll wait for more benchmarks, hopefully more of them in stereoscopic 3D mode using 3D Vision and 3D Vision Surround, and so should you if you are considering upgrading from your current graphics hardware to a GTX Titan or something newer. Suddenly, going for two GeForce GTX 660 Ti cards in SLI instead does sound like a very attractive idea…
Nvidia has just officially announced the GeForce GTX Titan, a new video card based on the much talked about GK110 GPU. And while Nvidia was trying to be very secretive about this launch, in the last few days pretty much all of the important details leaked on the Internet anyway. The new GeForce GTX Titan is essentially more of the same and what could’ve been the GTX 680 as hardware: the Kepler architecture pushed to the “acceptable” maximum it can offer within a 250W TDP. This means that Nvidia had to lower the GPU clocks, including the GPU Boost clocks, to levels significantly below those of the GTX 680, but thanks to the significantly increased number of CUDA cores the new card should outperform a single GTX 680 and is supposed to get close to the performance offered by the dual-GPU GTX 690 (and even outperform it in games where SLI does not scale). It will be interesting to see how overclock-friendly the GTX Titan will be; if it can overclock to clock levels achievable by the GTX 680, then it may easily outperform even a GTX 690 with SLI scaling well. The GeForce GTX Titan comes with a 384-bit memory interface and 6GB of video memory clocked at 6008MHz DDR (GDDR5), and while that amount of VRAM certainly is appealing, it also raises the question of which games can take advantage of the full capacity – probably none so far. When talking about multi-monitor Surround or even 3D Vision Surround gaming there are certainly times when 2GB of video memory may not be enough, but games are still far from needing such huge amounts of video memory, so it is probably more of a marketing point than something that is needed.
* The numbers in red and green represent the upgrade or downgrade of the specific parameter in the GeForce GTX Titan as compared to the GTX 680!
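From the memory figures quoted above you can also derive the card’s peak memory bandwidth yourself: a 384-bit bus at an effective 6008MHz works out to roughly 288GB/s. A quick check:

```python
# Peak memory bandwidth from the specs quoted above:
# 384-bit bus width, 6008 MHz effective (DDR-rate) GDDR5 clock.
bus_width_bits = 384
effective_clock_hz = 6008e6

# bandwidth = effective clock * bus width in bytes
bytes_per_second = effective_clock_hz * bus_width_bits / 8
print(bytes_per_second / 1e9)  # ~288.4 GB/s
```

That is a healthy step up from the 256-bit bus of the GTX 680, and one of the few specs where the Titan is unambiguously ahead.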
The new GTX Titan also comes with GPU Boost 2.0, the main difference in version 2.0 being that GPU temperature now has priority over power consumption, the opposite of the previous version of the technology. GPU Boost 2.0 is supposed to automatically boost the core clock speed as long as the GPU temperature stays below 80 degrees Celsius (the default target, which will be user adjustable). There is also a new feature being introduced called Display Overclocking that should allow the user to increase the pixel clock of the display over the standard 60Hz to, let’s say, 80Hz with VSync remaining enabled. This is an interesting feature that needs to be explored, as some people are already overclocking their monitors (some models that support it) to achieve higher refresh rates even with older GPUs. In the end the new GTX Titan just offers a lot more raw power for people that need it and can afford it, along with a few new extras that you may or may not need. For stereoscopic 3D gamers the extra graphics power is always welcome, especially for those using multi-monitor stereoscopic 3D setups. But whether the GeForce GTX Titan will be better than two GTX 680s in SLI (hardly, if it can be outperformed by a GTX 690) that you can get for slightly less than what you will have to pay for a single Titan, this is yet to be seen.
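The temperature-driven behavior described above, raising the core clock while the GPU stays under its temperature target and backing off above it, can be sketched roughly like this. To be clear, this is just a simplified illustration, not Nvidia’s actual GPU Boost 2.0 algorithm: the clock values, step size, and function name are all invented for the example, and only the 80°C default target comes from the text:

```python
# Simplified sketch of a temperature-target boost loop as described
# in the text. This is NOT Nvidia's real GPU Boost 2.0 algorithm;
# the clock values and step size below are invented for illustration.
BASE_CLOCK_MHZ = 837   # hypothetical base clock
MAX_BOOST_MHZ = 993    # hypothetical boost ceiling
TEMP_TARGET_C = 80     # default temperature target (user adjustable)
STEP_MHZ = 13          # hypothetical adjustment step

def next_clock(current_mhz, gpu_temp_c):
    """Raise the clock while below the temperature target, back off above it."""
    if gpu_temp_c < TEMP_TARGET_C:
        return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)
    return max(current_mhz - STEP_MHZ, BASE_CLOCK_MHZ)

clock = BASE_CLOCK_MHZ
for temp in (65, 70, 75, 82, 85, 78):  # hypothetical temperature readings
    clock = next_clock(clock, temp)
    print(temp, clock)
```

The point of the change in 2.0 is simply which limit the loop watches first: temperature rather than board power, which is why raising the temperature target (or improving cooling) should directly translate into higher sustained boost clocks.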
The new GeForce GTX Titan is supposed to appear on the market starting February 25th in the US with a suggested retail price of $999 USD, the same launch price as the multi-GPU GeForce GTX 690. Considering that the Titan may have a limited number of units available, the actual price may spike well over the suggested one, so beware if you want to be one of the first owners of a GeForce GTX Titan. For now we’ll have to wait a few more days, until Thursday at least, when we expect to start seeing the first reviews of the new card that include benchmark results and comparisons.