3D Vision Blog

A normal user's look into the world of 3D Stereo Technologies

The New Nvidia GeForce GTX 780 GPUs Just Got Official

May 23rd, 2013 · 5 Comments · GeForce 3D Vision


Today Nvidia officially announced their new flagship GPU, the first from the 700 series – the GeForce GTX 780, based on the GK110 Kepler architecture and manufactured on a 28nm process. You could say that the new Nvidia GeForce GTX 780 is a more affordable, lite version of the recently announced GTX TITAN: it is based on the same GK110 GPU, but with less video memory and slightly fewer CUDA cores. Other than that, things look very similar to what the GTX TITAN offers. Keep in mind, though, that the new GTX 780 is not here to replace the GTX TITAN, but to succeed the older GTX 680 (GK104) – and looking at the specs, it does that quite nicely…


Nvidia GeForce GTX 780 Specifications:

Graphics Card: GeForce GTX 780 3GB
Graphics Processing Clusters: 4 or 5
Streaming Multiprocessors: 12 +4
CUDA Cores: 2304 +768
Texture Units: 192 +64
ROP Units: 48 +16
Graphics Clock: 863 MHz -143
GPU Boost Clock: 900 MHz -158
Memory Clock (Data rate): 6008 MHz
L2 Cache Size: 1536KB +1024
Total Video Memory: 3072MB GDDR5 +1024
Memory Interface: 384-bit +128
Total Memory Bandwidth: 288.4 GB/s +96.14
Texture Filtering Rate (Bilinear): 165.7 GigaTexels/sec +36.9
Fabrication Process: 28 nm
Transistor Count: 7.1 Billion +3.56
Connectors: Dual-Link DVI-I, Dual-Link DVI-D, HDMI 1.4 High Speed, DisplayPort 1.2
Form Factor: Dual Slot
Power Connectors: 1x 8-pin, 1x 6-pin
Thermal Design Power (TDP): 250 Watts +55
Thermal Threshold: 95 degrees C -3
Bus Interface: PCI Express 3.0

* The numbers in green and red show the change compared to the specs of the GTX 680.
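The headline memory bandwidth and texture fill-rate figures follow directly from the clock and bus-width specs above. As a quick sanity check, here is a minimal Python sketch (the constants are simply the table values, nothing else is assumed):

```python
# Derive the GTX 780's headline throughput numbers from its raw specs.
MEM_DATA_RATE_MHZ = 6008   # effective GDDR5 data rate
MEM_BUS_BITS = 384         # memory interface width
CORE_CLOCK_MHZ = 863       # base graphics clock
TEXTURE_UNITS = 192

# Bandwidth = data rate * bus width in bytes
bandwidth_gbs = MEM_DATA_RATE_MHZ * 1e6 * (MEM_BUS_BITS / 8) / 1e9
# Bilinear fill rate = core clock * texture units
texel_rate_gts = CORE_CLOCK_MHZ * 1e6 * TEXTURE_UNITS / 1e9

print(f"Memory bandwidth: {bandwidth_gbs:.1f} GB/s")      # → 288.4 GB/s
print(f"Bilinear texel rate: {texel_rate_gts:.1f} GT/s")  # → 165.7 GT/s
```

Both results match the official spec sheet, which confirms the quoted figures are the theoretical peaks at the base clock.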

What is new here is support for GPU Boost 2.0, introduced with the GTX TITAN, and even though the new GTX 780 brings higher power consumption along with more performance, it still manages to operate silently under load. It seems, however, that the silent operation is something of a compromise with Boost performance, as the GPU runs at higher temperatures under load and thus can rarely sustain the maximum GPU Boost clock. Playing a bit with the fan settings and GPU Boost 2.0's default 80-degree temperature target for the Boost frequency can help you get the full potential out of the GTX 780, even if you don't plan to overclock it.
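The interaction described above can be reduced to a simple rule: the card only holds its Boost clock while the GPU stays under the temperature target. This is a hypothetical, much-simplified sketch of that behavior (the real algorithm also weighs power draw and voltage; the clock values are the reference base clock and boost bin):

```python
BASE_CLOCK_MHZ = 863   # reference base clock
BOOST_CLOCK_MHZ = 900  # reference boost clock

def target_clock(temp_c, temp_target_c=80):
    """Clock the card aims for at a given GPU temperature (simplified).

    Below the temperature target there is thermal headroom, so the GPU
    can hold its Boost bin; at or above the target it backs off toward
    the base clock, which is why a hot, quiet card rarely sustains the
    maximum Boost frequency.
    """
    if temp_c < temp_target_c:
        return BOOST_CLOCK_MHZ
    return BASE_CLOCK_MHZ

print(target_clock(70))  # cool GPU → 900
print(target_clock(85))  # hot GPU → 863
```

Raising the temperature target or the fan speed simply keeps the first branch active for longer, which is why a more aggressive fan curve recovers Boost performance at the cost of noise.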

I’ve been playing with a GeForce GTX 780 card for the last few days, so you can expect some benchmarks of the card’s performance in stereoscopic 3D mode using 3D Vision in the next few days. For now I can tell you that I really like what I’m getting in terms of performance from the GTX 780, and I’m definitely planning to replace my two trustworthy GTX 580 GPUs running in SLI with it. I decided to skip the GTX 680 as an upgrade, and while the GTX TITAN was a bit more expensive than it was worth, the price of the GTX 780 and the performance it offers now make it a great product for playing games in high-resolution 2D or in Full HD stereoscopic 3D mode. For multi-monitor setups, both in 2D and in stereo 3D, going for two cards would still probably be the better option, in order to avoid having to make compromises eventually.

Hopefully the first full-cover water cooling blocks for the GeForce GTX 780 will not take long to appear on the market, because combining the new GPU with a good water cooling solution should be a perfect match: silent, cool operation and very high performance, since both the GTX 780 GPU and its video memory are very overclocker-friendly; all you need to do is keep the card properly cooled. But even with the stock air cooler you can get good overclocking results, as it has plenty of headroom, as long as you don’t mind the cooling running a bit noisier. The new GeForce GTX 780 hits the right spot for a really high-performance GPU that is still quite affordable, an ideal solution for the more demanding stereoscopic 3D gamers who already have powerful 3D Vision-ready computers set up for S3D gaming.


5 responses so far ↓

  • 1 djnforce9 // May 23, 2013 at 17:39

    Very nice! I wondered when the successor would be announced, although the GTX 680 (which is what I have) is still not that old (got it just last year, soon after it came out) and probably not going to be considered “under-powered” any time soon. It’s still a very powerful graphics card and no current game has been able to max it out yet, so I have no idea what its full capabilities even are (ok, maybe the Elemental demo shows me, but it will be quite some time before actual games start looking like that).

    I also wonder if and when the next iteration of DirectX will come about as that will also bring about yet another new line of graphics cards. DirectX 11.1 already does quite a lot mind you.

  • 2 Daniel Stange // May 23, 2013 at 23:52

    I think I’m going to wait for the 800 series, as I hear that is when they are going to make a larger change to the architecture of the card. This still seems like a really great card though.

  • 3 eqzitara // May 24, 2013 at 04:14

    I’m waiting till they actually sponsor games before upgrading my Nvidia GPU. Seriously, since getting the 600 series, the only Nvidia-sponsored games where it mattered (as a non-MMO player) were Max Payne 3, Metro: Last Light, Assassin’s Creed 3 and Borderlands 2. Buying a 700 series right now just sounds frustrating.
    They are exceeding their grasp with ideas like GeForce Experience, Shield and ShadowPlay. It seems like they are forgetting what matters.

  • 4 sammaz // May 24, 2013 at 06:09

    :(

    Just bought an EVGA 680 4GB with backplate 2 months ago… $450.00

    What are prices on this?

  • 5 Reaper // May 24, 2013 at 07:55

    Bloody, now that some time has passed… Oculus Rift dev kit, still being used or collecting dust?

    If being used, what are your thoughts now that it has been a while and had more time to adjust?
    Low rez still bothersome?
