3D Vision Blog

A normal user's look into the world of 3D Stereo Technologies


The New Nvidia GeForce GTX 680 (Kepler) Finally Making an Appearance

March 22nd, 2012 · 16 Comments · GeForce 3D Vision

Nvidia has just officially announced its new flagship GPU, the GeForce GTX 680, based on the new Kepler architecture and manufactured on a 28nm process. Video cards based on the new graphics processor offer increased performance over the previous Fermi generation and its flagship, the GTX 580, as well as numerous improvements and new features. Below I’ve prepared a short list of the specifications of the new GTX 680 GPU as compared to the previous GTX 580 single-GPU flagship from Nvidia, so you can easily see what has changed.

Nvidia GeForce GTX 680 Specifications:

Graphics Card: GeForce GTX 680 2GB
Graphics Processing Clusters: 4
Streaming Multiprocessors: 8 (-8)
CUDA Cores: 1536 (+1024)
Texture Units: 128 (+64)
ROP Units: 32 (-16)
Graphics Clock: 1006 MHz (+234)
GPU Boost Clock: 1058 MHz
Memory Clock (Data rate): 6008 MHz (+2000)
L2 Cache Size: 512KB (-256)
Total Video Memory: 2048MB GDDR5 (+512)
Memory Interface: 256-bit (-128)
Total Memory Bandwidth: 192.26 GB/s (-0.14)
Texture Filtering Rate (Bilinear): 128.8 GigaTexels/sec (+79.4)
Fabrication Process: 28 nm
Transistor Count: 3.54 Billion (+0.54)
Connectors: Dual-Link DVI-I, Dual-Link DVI-D, HDMI 1.4 High Speed, DisplayPort 1.2
Form Factor: Dual Slot
Power Connectors: 2x 6-pin
Thermal Design Power (TDP): 195 Watts (-49)
Thermal Threshold: 98 degrees C
Bus Interface: PCI Express 3.0

* The numbers in parentheses show the change as compared to the specs of the GTX 580.

I don’t want to go into too much detail about the changes in the new Kepler architecture, as I’m sure not a lot of people are that interested in such technical details. I just want to mention a few things and then move on to the more interesting part, namely the new features the GTX 680 offers over the previous generation. The basic modules that build up the GPU have changed significantly in Kepler, and as a result you get far more CUDA cores and there is no longer a separate shader clock; there is just one clock frequency for the GPU, although how it functions has changed as well.

And while the number of CUDA cores has essentially been tripled, you should not expect a single GTX 680 to deliver three times the performance of a GTX 580, as other things are also responsible for the overall performance a video card can provide: the texture and ROP units, as well as the memory frequency and bandwidth. Looking at the specs of the new GTX 680 you may notice that the operating frequency of the memory chips has been increased significantly, but the width of the memory bus has been reduced, so the memory bandwidth remains pretty much the same as on the GTX 580. So instead of triple the performance you should expect something more like 1.5x up to 2x that of the previous generation, depending on the usage scenario of course, although that would require some extra testing to confirm, especially in stereo 3D mode.
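To see why the bandwidth ends up almost unchanged despite the much faster memory chips, here is the standard memory bandwidth arithmetic as a small Python sketch, using the clocks and bus widths from the spec list above:

```python
# Memory bandwidth = effective memory clock (MHz) x bus width (bits) / 8 bits-per-byte,
# reported in GB/s the way spec sheets do (1 GB = 10^9 bytes).
def memory_bandwidth_gbs(effective_clock_mhz, bus_width_bits):
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

gtx_680 = memory_bandwidth_gbs(6008, 256)  # faster chips, narrower bus
gtx_580 = memory_bandwidth_gbs(4008, 384)  # slower chips, wider bus

print(f"GTX 680: {gtx_680:.2f} GB/s")  # GTX 680: 192.26 GB/s
print(f"GTX 580: {gtx_580:.2f} GB/s")  # GTX 580: 192.38 GB/s
```

The ~50% higher memory clock is almost exactly cancelled out by the bus shrinking from 384-bit to 256-bit, which is why the two cards end up within 0.14 GB/s of each other.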

And now a bit about some of the new features. One interesting thing is the GPU Boost functionality, which controls the operating frequency of your graphics processor in real time in order to maximize the performance you get whenever you need it, automatically increasing the working frequency of the GPU when an application is not fully loading the graphics card, so you can squeeze out some more performance. And since GPU Boost cannot be disabled by the user, it will ultimately change the way you overclock the video card, especially considering that there are no longer two different frequencies for the GPU. And while you cannot disable GPU Boost, you can control how it works, so you can still get the most out of your video card in terms of performance even when you overclock it.

Thanks to the GPU Boost function and the extra electronics used to monitor the current utilization, temperature and power consumption of the GTX 680, you also get some neat new extras such as the ability to limit the maximum framerate in a 3D application to, let’s say, 60 or 120 fps (NVIDIA Frame Rate Target). So you can look at GPU Boost not only as something that can help you get the most out of your GPU, but also as a function that can help you save power when you don’t actually need the full performance: when you limit the maximum framerate, there is usually no need for the video card to use all of its processing power to maintain it, so it will run cooler and quieter.
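As a rough illustration of what a framerate cap does, here is a minimal Python sketch of a fixed frame-time budget loop. This is only an assumption about the general idea, not Nvidia’s actual driver implementation:

```python
import time

def run_capped(render_frame, n_frames, target_fps=60):
    """Toy single-threaded sketch of a framerate cap in the spirit of the
    Frame Rate Target feature described above: if a frame finishes ahead of
    its time budget, sleep away the remainder instead of starting the next
    frame immediately, so the hardware can idle, draw less power and run
    cooler."""
    frame_budget = 1.0 / target_fps
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()                          # the actual work for one frame
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # idle for the leftover budget

# Even a trivially cheap "frame" still takes ~1/30 s per frame with a 30 fps cap:
t_start = time.perf_counter()
run_capped(lambda: None, n_frames=6, target_fps=30)
print(f"{time.perf_counter() - t_start:.2f} s for 6 frames")  # roughly 0.2 s
```

The point of the sketch is the `sleep` branch: capping the framerate converts surplus rendering capacity into idle time, which is exactly where the power and heat savings come from.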

Another new thing is the improvement in the anti-aliasing modes you get at your disposal in order to get rid of the jaggies and achieve a smoother-looking image in games. Aside from the FXAA mode that is also supported, the new GTX 680 introduces two new TXAA modes that bring better-quality AA than MSAA with a smaller performance hit. Another interesting new feature is Adaptive VSync, which can help smooth out the transitions when the framerate drops below a certain level, something that usually leads to tearing with no VSync, or to stuttering with normal VSync. And while Adaptive VSync may not be able to completely eliminate tearing when the framerate drops significantly, it can reduce it greatly, making it hardly noticeable in most cases if you are not paying special attention. So that’s another good thing if you are a gamer; if you are not a gamer, going for a GTX 680 may seem like a somewhat pointless thing to do anyway.
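The decision Adaptive VSync makes can be summed up in a couple of lines. This is only a sketch of the concept as I understand it, assuming the simple fps-versus-refresh-rate rule, not Nvidia’s actual driver logic:

```python
def adaptive_vsync_enabled(fps, refresh_hz=60):
    """Sketch of the Adaptive VSync idea (illustrative assumption, not
    Nvidia's code): keep VSync on while the game can sustain the monitor's
    refresh rate, and switch it off the moment fps falls below it, so the
    framerate degrades gradually instead of snapping down to a divisor of
    the refresh rate (60 -> 30 -> 20 fps with plain VSync)."""
    return fps >= refresh_hz

print(adaptive_vsync_enabled(75))  # True: sync to 60 Hz, no tearing
print(adaptive_vsync_enabled(45))  # False: let frames through, avoid the 30 fps lock
```

The trade-off is that while VSync is temporarily off you can get some tearing, which matches the caveat above that tearing is reduced rather than eliminated.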

And here comes another very interesting new feature: Single GPU 3D Vision Surround. Since the GeForce GTX 680 is capable of driving four independent monitors at the same time, you are now able to build a 3D Vision Surround setup with just a single video card; there is no more need for at least two GPUs in SLI to drive 3D Vision Surround. Have in mind though that the GTX 680 has Dual-Link DVI-I, Dual-Link DVI-D, DisplayPort 1.2 and HDMI 1.4 High Speed interfaces. Obviously you can’t use the HDMI 1.4 HS interface for a 3D Vision Surround setup, so the third display needs to be connected either through the DisplayPort or with a DP to DL-DVI adapter. The HDMI 1.4 High Speed interface should be capable of providing more than the 1080p 24Hz 3D mode that the normal HDMI 1.4 interface currently supports, however you would also need a 3D monitor supporting it, and apparently there are still no such consumer products available.

There are also some improvements in the Surround support itself. For example, you can use a fourth accessory display together with the surround setup, for showing your email or something else while playing, although switching to that monitor can be a bit tricky. You also finally get the taskbar displayed only on the center display when using a Surround setup, plus the ability to maximize windows on a single display instead of across all three (user selectable); these are actually software improvements, so you should get them on older hardware as well. There is also a new Bezel Peek function that lets you use a hotkey to briefly see in-game menus or objects that may appear hidden due to the use of bezel correction, there is faster display acceleration when using only a single display in a surround setup, and there is an improvement in the list of active resolutions you get in a Surround setup, so you will not be bothered by a huge list of resolutions to go through.
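For a rough idea of the numbers involved in bezel correction, here is a small Python sketch of the bezel-corrected surround width for three 1080p panels. The per-bezel compensation value is purely illustrative, something you would tune in the driver, not an Nvidia figure:

```python
def surround_width_px(panel_width=1920, panels=3, bezel_comp_px=100):
    """Hypothetical bezel-corrected surround width: the panels side by side,
    plus compensation pixels "hidden" behind each of the inner bezel pairs
    so that objects crossing a bezel stay geometrically continuous.
    The 100 px per bezel is an illustrative assumption."""
    return panels * panel_width + (panels - 1) * bezel_comp_px

print(surround_width_px(bezel_comp_px=0))  # 5760: plain 3x1920 surround
print(surround_width_px())                 # 5960: with bezel correction
```

Those hidden pixels are exactly what Bezel Peek temporarily reveals with its hotkey, which is why in-game menus that land behind a bezel become visible again.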
One thing that I almost missed is the DirectX 11.1 support. Should you actually care that the hardware supports it? Not really at the moment, as it brings nothing major for now.

The new GeForce GTX 680 from Nvidia is definitely a good improvement, not only in terms of performance but also in terms of new features that can help you get the most out of your gaming experience, including in stereoscopic 3D mode. It is more powerful and more energy efficient than the previous generation and brings some useful new features that are surely going to be interesting for gamers. The new GeForce GTX 680 should already be available at a price of about $499 USD, and I hope to soon get the card to test and provide you with some benchmarks of the 680 in stereoscopic 3D mode, so stay tuned for more…


16 responses so far ↓

  • 1 Razor Kid // Mar 22, 2012 at 18:47

    I’m glad the card is finally out and not higher than $499.99, I’m making a new build and it’s the perfect time for this. With all the new benchmarks out, I wish someone would remember the S3D users out there. I want those 3DVision benchmarks!

  • 2 Arioch // Mar 22, 2012 at 20:05

    I got a pair of them coming my way from Newegg tomorrow – can’t wait!

  • 3 claydough // Mar 22, 2012 at 22:06

    The first round of cards was supposed to be the affordable 560ti replacement in the $3xx dollar range!
    The high-end card normally released first was supposed to release later in the year. ( the reverse release strategy due to economic factors )
    But the performance was so good that this card got the 680 designation and is priced at the insulting price of $499.

    If true, I will easily boycott this card?
    I am hoping, those rumors are actually just that. Puts me in a tight spot when I usually get better Maya performance from nvidia drivers.
    Also hard when I consistently buy the highend nvidia card. Where my financial resources are already strained by consistently going for broke with tri-sli. Only to be rewarded at the highend with such a story of unfathomable greed!?

    can I get an amen?

  • 4 eqzitara // Mar 22, 2012 at 22:11

    I just got a 580 5 days ago… I am such a jerk but I am gonna return it. Will sell my sli one on ebay.

  • 5 Bloody // Mar 22, 2012 at 23:11

    I have two GTX 580s in SLI and I’m still not convinced if a single GTX 680 will be a better solution performance wise, especially for stereo 3D. Need to do some testing, but I expect that two GTX 680s will be much better… however two cards, also paired with full cover water cooling blocks (EK is releasing theirs on 2nd of April) can be a serious hit on the upgrade budget.

  • 6 claydough // Mar 22, 2012 at 23:53

    Hopefully the rumors of gtx 780 at the end of the year were true?
    Even if this card was supposed to be the 560ti replacement. I would be happy with kepler if the 780 was released with the following benchmarks:

  • 7 eqzitara // Mar 23, 2012 at 00:32

    I been way out of the specs scene. I know that having a pci express 3.0 mobo didn’t really matter for 7970. Does it matter at all for 680’s?

  • 8 Bloody // Mar 23, 2012 at 01:28

    Probably won’t matter much for the GTX 680 as well, but cannot say for sure… I haven’t tested the card yet.

  • 9 eqzitara // Mar 23, 2012 at 04:24

    I got a response from an Nvidia guy on the forums. He basically said there are no TRUE PCI Express 3.0 mobos yet. Down the line when they come out it should make a difference.

  • 10 moe // Mar 23, 2012 at 04:31

    Search Linus Tech Tips on YouTube, he made a 3D Vision benchmark for the GTX 680, and it gets twice the fps of the GTX 580 in 3D, and comes really close to the performance of the 590

  • 11 badelhas // Mar 23, 2012 at 04:55

    i would love to have 3 of these paired with 3 f35 as3d full hd 3d 120hz projectors. Can you imagine that?!

  • 12 eqzitara // Mar 23, 2012 at 06:23

    I find it ironic that when nvidia decides to do a picture of skyrim in 3d as part of advertising. It breaks.

  • 13 Dogor // Mar 23, 2012 at 21:57

    Just got home from work and the UPS truck was outside my house! Am gonna clean up my rig and install the new babies!

  • 14 massaker // Mar 24, 2012 at 05:38

    @Dogor have you some 3D-Benchmarks /comparison to your previous rig yet??? Can’t wait to see how 680er performs in 3D! :D

  • 15 Branden Jew // Mar 28, 2012 at 14:24

    Please benchmark Battlefield 3 in 3D Vision 2 with GTX 680 and SLI using Ultra settings. Lets see what happens. My two 580’s only get 40FPS.

  • 16 Benoit Michel // Mar 29, 2012 at 14:54

    About GTX680 and Nvidia 3D Surround:
    According to Nvidia, any three digital outputs may be used; practically speaking, it means you have to use both DVIs and the DisplayPort.
    Besides that, the only practical option to set up 3D Vision Surround is through the DisplayPort connector, on which you can daisy-chain the three 3D screens, but you need daisy-chainable DisplayPort screens (not so common, to say the least, though several are expected in 2012).
    As a final alternative, if your displays do not support the daisy-chain DisplayPort 1.2 specification, you may add a hardware splitter such as the Hydra display (http://cirago.com/wordpress/products/hydradisplay/hdx3dv01/), with the rather annoying limitation to 3840×1024 pixels however…
