The recent announcement of Nvidia's new top single-GPU graphics card, the GTX Titan, came not only with the expected performance boost, but also with a few new features, one of which is Display Overclocking, part of the new GPU Boost 2.0. Unfortunately Nvidia did not share many details about the Display Overclocking feature, and the first reviews did not help much in providing more details either. With the GTX Titan still very hard to obtain (even though it is already on sale, it is out of stock pretty much everywhere or is being sold at a significantly inflated price where limited quantities are available), we are still far from finding out what exactly this feature does and seeing how well it works…
Display overclocking became quite popular last year when a Korean company released IPS-based monitors with driver boards capable of driving the LCD panels at up to 120Hz and sometimes even more: the infamous Catleap Q270 with the 2B revision of the electronics. Unfortunately these were available in very limited quantities and are hard to find nowadays, even though there is really big interest and demand, fueled by the fact that there are still no stock monitors available that use something other than TN panels and offer a refresh rate of 120Hz or more. Overclocking these monitors was not an easy task either, as there are limits on the supported pixel clock rates in both AMD and Nvidia video drivers, and you had to play with the individual timings of the display with the help of a tool such as PowerStrip or the custom resolution settings of the video drivers. So what we suspect Nvidia has done with the Titan is simply to streamline the process: removing the pixel clock limits from the video drivers (no need to patch them anymore) and providing an easy to use interface to overclock the display with the new GTX Titan, but that is yet to be confirmed.
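To get a feel for why those driver pixel clock limits matter, remember that the pixel clock a display mode needs is simply the total pixels per frame (active pixels plus blanking) multiplied by the refresh rate. The timing totals below are illustrative reduced-blanking-style numbers, not the exact figures for any particular monitor, but they show why a 2560×1440 panel pushed to 120Hz quickly runs past the dual-link DVI limits the drivers enforce:

```python
# Rough pixel-clock estimate for an overclocked display mode.
# The horizontal/vertical totals used below are illustrative
# reduced-blanking-style values, not exact CVT-RB timings.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz = total pixels per frame x frames per second."""
    return h_total * v_total * refresh_hz / 1_000_000

# 2560x1440 with modest blanking intervals, pushed to 120 Hz
clock = pixel_clock_mhz(2720, 1481, 120)
print(f"{clock:.1f} MHz")  # ~483.4 MHz with these example timings
```

That is well above the 330 MHz total pixel clock of the dual-link DVI specification, which is exactly why tools like PowerStrip had to squeeze the blanking intervals and why patched drivers were needed to raise the clock caps.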
As part of the latest EVGA Precision X GPU tweaking software package there is a tool called EVGA Pixel Clock OC that is supposed to give you this simple solution for overclocking the refresh rate of the monitor without having to play with a lot of settings or manually create a custom refresh rate; all you get is a slider that you can use to adjust the refresh rate and find the maximum your display can work at. As expected, this tool does not work on video cards other than the GTX Titan, so there is no way to test it yet and see how well it works or whether there are any issues using it, but if it works well this can really be a feature that deserves more attention. There are already a lot of gamers going for 120Hz and even the newer 144Hz-capable 2D/3D monitors and discovering the advantage these provide for gaming, but with display overclocking you may be able to get some more Hz out of your current LCD monitor as well, and that may also improve your gaming experience.
Update: It seems that EVGA Pixel Clock OC does work on cards other than the GTX Titan after all; I've just managed to make it work on a GeForce GTX 580 using the 314.07 and 314.09 (modified) drivers. So apparently you can try to overclock your monitor with it even if you don't have a GTX Titan yet, though I'm not sure if the pixel clock limitations are still active; on an Asus VG278H the maximum result was 122Hz. You can download and try the tool included with the full EVGA Precision X package, or download only the EVGA Pixel Clock OC tool and try it on your PC. Feel free to report your results in the comments below, including the GPU, monitor and driver version you've used.
Tags:120hz·Display Overclocking·EVGA Pixel Clock OC·EVGA Precision X·GTX Titan·Nvidia GTX Titan·PowerStrip
Nvidia has just officially announced the GeForce GTX Titan, a new video card based on the much talked about GK110 GPU. And while Nvidia was trying to be very secretive about this launch, in the last few days pretty much all of the important details leaked on the Internet anyway. The new GeForce GTX Titan is essentially more of the same, and in hardware terms what could've been the GTX 680: the Kepler architecture pushed to the "acceptable" maximum it can offer within a 250W TDP. This means that Nvidia had to lower the GPU clocks, including the GPU Boost clock, to levels significantly lower than on the GTX 680, but thanks to the significantly increased number of CUDA cores the new video card will outperform a single GTX 680 and is supposed to get close to the performance offered by the dual-GPU GTX 690 (even outperforming it in games where there is no SLI scaling). It will be interesting to see how overclocking-friendly the GTX Titan will be, because if it can overclock to levels achievable by the GTX 680, then it can easily outperform a GTX 690 even when SLI scales well. The GeForce GTX Titan comes with a 384-bit memory interface and 6GB of video memory clocked at 6008MHz DDR (GDDR5), and while that amount of VRAM certainly is appealing, it also raises the question of what game can take advantage of the full capacity; probably none so far. When talking about multi-monitor Surround or even 3D Vision Surround gaming there are certainly times when 2GB of video memory may not be enough, but games are still far from needing such huge amounts of video memory, so it is probably more of a marketing thing than something that is needed.
Specifications of GeForce GTX Titan:
Graphics Card – GeForce GTX Titan 6GB
CUDA Cores – 2688 +1152
Texture Units (TMU) – 224 +96
Raster Operator Units (ROP) – 48 +16
Graphics Clock (Base) – 837 MHz -169
Graphics Clock (Boost) – 876 MHz -182
Standard Memory Configuration – 6144 MB GDDR5 +4096
Memory Interface Width – 384-bit +128
Memory Clock – 3004 MHz (6008 effective)
Memory Bandwidth – 288.4 GB/sec +96.2
Texture Filtering Rate (Bilinear) – 187.5 GigaTexels/sec +58.7
Fabrication Process – 28 nm
Transistor Count – 7.1 Billion +3.56
Connectors – Dual-Link DVI-I, Dual-Link DVI-D, HDMI 1.4 High Speed, DisplayPort 1.2
Form Factor – Dual Slot
Power Connectors – 1 x 6-pin, 1 x 8-pin PEG
Power Consumption – 250W TDP +55
GPU Thermal Threshold – 95 degrees Celsius -3
Bus Interface – PCI Express 3.0
* The numbers with + and – after each parameter represent the upgrade or downgrade of that specific parameter in the GeForce GTX Titan as compared to the GTX 680!
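The memory bandwidth figure in the list above follows directly from the bus width and the effective memory clock, so it is easy to sanity-check. A quick calculation (using the GTX 680's 256-bit bus at the same effective clock for comparison) reproduces both the 288.4 GB/sec figure and the roughly 96 GB/sec advantage over the GTX 680:

```python
# Sanity-check of the memory bandwidth figure from the spec list above.
def bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """GB/s = (bus width in bytes) x effective transfers per second."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

titan = bandwidth_gbs(384, 6008)    # GTX Titan: 384-bit at 6008 MHz effective
gtx680 = bandwidth_gbs(256, 6008)   # GTX 680: 256-bit at the same clock
print(f"{titan:.1f} vs {gtx680:.1f} GB/s")  # prints "288.4 vs 192.3 GB/s"
```

The 50% wider bus at an unchanged memory clock is where the whole bandwidth advantage comes from.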
The new GTX Titan also comes with GPU Boost 2.0, the main difference in version 2.0 being that the GPU temperature now has priority over the power consumption, which was the deciding factor in the previous version of the technology. GPU Boost 2.0 is supposed to automatically boost the core clock speed as long as the temperature is below 80 degrees Celsius (the default value, but it will be user adjustable). There is also a new feature being introduced called Display Overclocking that should allow the user to increase the pixel clock of the display to go over the standard 60Hz, to let's say 80Hz for example, with Vsync remaining enabled. This is an interesting feature that needs to be explored, as some people are already overclocking their monitors (some models that support it) to achieve higher refresh rates even with older GPUs. In the end the new GTX Titan just offers a lot more raw power for people that need it and can afford it, along with a few new extras that you may or may not need. For stereoscopic 3D gamers the extra graphics power is always welcome, especially for those using multi-monitor stereoscopic 3D setups. But whether the GeForce GTX Titan will be better than two GTX 680s in SLI (hardly, if it can be outperformed by a GTX 690) that you can get for slightly less than what you will have to pay for a single Titan, that is yet to be seen.
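The temperature-priority behavior described above can be illustrated with a toy control loop. To be clear, this is not Nvidia's actual GPU Boost 2.0 algorithm (which Nvidia has not published in detail); the clock step size is a hypothetical value, and only the base clock, boost clock and 80-degree default target come from the announcement:

```python
# Toy sketch of the temperature-priority idea behind GPU Boost 2.0.
# NOT Nvidia's actual algorithm: just the concept of stepping the clock
# up while the GPU stays under the temperature target and backing off
# when the target is exceeded.

BASE_MHZ = 837       # GTX Titan base clock from the spec list
TEMP_TARGET_C = 80   # default target, user adjustable per the announcement
STEP_MHZ = 13        # hypothetical boost bin size for illustration

def adjust_clock(current_mhz, temp_c):
    """One control step: boost if cool, back off (not below base) if hot."""
    if temp_c < TEMP_TARGET_C:
        return current_mhz + STEP_MHZ
    return max(BASE_MHZ, current_mhz - STEP_MHZ)

clock = 876                       # advertised boost clock
clock = adjust_clock(clock, 71)   # cool: steps up one bin to 889
clock = adjust_clock(clock, 83)   # over target: steps back down to 876
```

The difference from version 1.0 is simply which quantity gates that decision: here the thermometer, rather than the power draw, decides whether another bin of clock speed is allowed.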
The new GeForce GTX Titan is supposed to appear on the market starting February 25th in the US with a suggested retail price of $999 USD, the same launch price as the dual-GPU GeForce GTX 690. Considering that the Titan may have a limited number of units available, the price may spike well over the suggested one, so beware if you want to be one of the first owners of a GeForce GTX Titan. For now we'll have to wait a few more days, until Thursday at least, when we expect to start seeing the first reviews of the new card that include benchmark results and comparisons.
Tags:3d vision·GeForce GTX Titan·GTX Titan·Nvidia GeForce GTX Titan