3D Vision Blog

A normal user's look into the world of 3D Stereo Technologies


AMD FreeSync is Now Official and Should Be Open For Everyone

March 19th, 2015 · 5 Comments · General 3D News


AMD FreeSync technology is apparently now official, bringing an alternative to Nvidia’s G-Sync. Both technologies are implementations built around the industry-standard DisplayPort 1.2a specification, and more specifically around DisplayPort Adaptive-Sync. AMD’s implementation however does not rely on an expensive proprietary hardware module like Nvidia’s (the G-Sync module itself), so it should not add to the price of the display. In theory AMD FreeSync should work on all DisplayPort 1.2a-equipped monitors if you have a compatible AMD GPU, though the company is not very clear on that subject. The list of compatible AMD GPUs with gaming support for FreeSync includes the AMD Radeon R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260 (the status of the 7800 and 7900 series or of the 280X is not very clear).
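To give a rough idea of what Adaptive-Sync actually changes compared to a fixed refresh rate, here is a small conceptual sketch (just the timing idea, not AMD’s or VESA’s actual implementation, and the refresh range values are made up): the display refreshes when the GPU finishes a frame, as long as the frame time stays within the panel’s supported range, instead of refreshing on a fixed schedule.

# Conceptual sketch of a variable refresh cycle (the Adaptive-Sync idea), purely
# illustrative and not AMD's or VESA's actual implementation.
import time

MIN_REFRESH_HZ = 40            # hypothetical lower bound of the panel's supported range
MAX_REFRESH_HZ = 144           # hypothetical upper bound of the panel's supported range
MIN_FRAME_TIME = 1.0 / MAX_REFRESH_HZ
MAX_FRAME_TIME = 1.0 / MIN_REFRESH_HZ

def present_frame(render_frame):
    """Start scanout as soon as the frame is ready, clamped to the panel's range."""
    start = time.perf_counter()
    frame = render_frame()                     # the GPU takes however long it takes
    elapsed = time.perf_counter() - start
    if elapsed < MIN_FRAME_TIME:
        time.sleep(MIN_FRAME_TIME - elapsed)   # frame ready too soon, wait for the panel
    # if elapsed > MAX_FRAME_TIME the panel would simply repeat the previous frame
    return frame                               # no fixed 16.7 ms refresh grid involved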


There is still no official WHQL driver available, but apparently AMD FreeSync requires the Catalyst 15.2 Beta drivers or newer to be supported. AMD has released a list of 11 gaming-oriented monitors from multiple partners including Acer, BenQ, LG Electronics, Nixeus, Samsung and ViewSonic that come in different sizes and with different features. What AMD is still lacking compared to Nvidia is support for stereoscopic 3D gaming along with FreeSync – there are multiple Nvidia G-Sync compatible models that also support stereoscopic 3D gaming. Should that matter, however, when Nvidia has apparently been abandoning stereoscopic 3D support for some time already, and the company is doing this for the second time since it was founded (history repeating itself)? We are already eager to see what AMD has in store for us with their FreeSync implementation…

Update: After trying out the Acer XB280HK 4K G-Sync monitor with an AMD Radeon R9 280X and R9 290X I can say that I’m not very happy with both AMD and Nvidia. The G-Sync monitor works just fine on Nvidia hardware, with G-Sync and without it. On the Radeon 280X (not officially compatible with FreeSync according to AMD!) the monitor works just fine, but there is no option to enable FreeSync in the drivers, as expected. Connecting the monitor to an AMD FreeSync-compatible GPU, namely the Radeon R9 290X, the drivers still show no option to enable FreeSync, nor is the display detected as capable of supporting it. The problem with the Sapphire R9 290X 8GB and the Acer XB280HK monitor is that the display does not work properly in this combination: there is a picture, but the monitor constantly goes blank for a moment at irregular intervals, just as if it were losing the input signal and then getting it back – this happens in both 2D and 3D mode. The tests were performed using the AMD Catalyst 15.3.1 Beta drivers supplied by AMD for trying out the new FreeSync feature.

Update 2: It seems that if you want to be able to use AMD’s FreeSync technology you will still have to buy a new display that features a DisplayPort 1.2a interface, and also buy a new graphics card if you are using an R9 280X, one of the most popular GPUs from AMD. It will not work on your older hardware, as most likely you don’t have a DP 1.2a-capable monitor anyway, unless you bought a very recently announced model, so you might want to wait for one of the new gaming models that are officially compatible with FreeSync as listed by AMD. Also, since Nvidia’s G-Sync technology uses the DisplayPort 1.2 interface, the officially licensed G-Sync monitors will apparently not work with FreeSync either.


The New Nvidia GeForce GTX 680 (Kepler) Finally Making an Appearance

March 22nd, 2012 · 16 Comments · GeForce 3D Vision


Nvidia has just officially announced their new flagship GPU, the GeForce GTX 680, based on the new Kepler architecture and manufactured using a 28nm process. The video cards based on the new graphics processor offer increased performance over the previous Fermi generation and the flagship GTX 580, as well as numerous improvements and new features. Below I’ve prepared a short list of the specifications of the new GTX 680 GPU as compared to the previous GTX 580 single-GPU flagship from Nvidia, so you can easily see what has changed.


Nvidia GeForce GTX 680 Specifications:

Graphics Card: GeForce GTX 680 2GB
Graphics Processing Clusters: 4
Streaming Multiprocessors: 8 (-8)
CUDA Cores: 1536 (+1024)
Texture Units: 128 (+64)
ROP Units: 32 (-16)
Graphics Clock: 1006 MHz (+234)
GPU Boost Clock: 1058 MHz
Memory Clock (Data rate): 6008 MHz (+2000)
L2 Cache Size: 512KB (-256)
Total Video Memory: 2048MB GDDR5 (+512)
Memory Interface: 256-bit (-128)
Total Memory Bandwidth: 192.26 GB/s (-0.14)
Texture Filtering Rate (Bilinear): 128.8 GigaTexels/sec (+79.4)
Fabrication Process: 28 nm
Transistor Count: 3.54 Billion (+0.54)
Connectors: Dual-Link DVI-I, Dual-Link DVI-D, HDMI 1.4 High Speed, DisplayPort 1.2
Form Factor: Dual Slot
Power Connectors: 2x 6-pin
Thermal Design Power (TDP): 195 Watts (-49)
Thermal Threshold: 98 degrees C
Bus Interface: PCI Express 3.0

* The numbers in parentheses show the change as compared to the specs of the GTX 580.



I don’t want to go into too much detail about the changes in the new Kepler architecture, as I’m sure not that many people are actually interested in such technical details. I just want to mention a few things and then move on to the more interesting part, namely the new features that the GTX 680 offers over the previous generation. There has been a significant change in the basic modules that build up the GPU in Kepler, and as a result you get more CUDA cores and there is no longer a separate shader clock – there is just one clock frequency for the GPU, although there are changes in how it functions as well. And while the number of CUDA cores has essentially been tripled, you should not expect to get three times the performance of a GTX 580 with a single GTX 680, as there are other important things that are responsible for the overall performance a video card can provide… there are the Texture and ROP units as well, and then there is the memory frequency and bandwidth. Looking at the specs of the new GTX 680 you may notice that the operating frequency of the memory chips has been increased significantly, but the width of the memory bus has been reduced, so essentially the memory bandwidth remains pretty much the same as it was with the GTX 580. So instead of tripling the performance you should expect something more like 1.5x up to 2x the performance of the previous generation of GPUs, depending on the usage scenario of course, although that will require some extra testing to confirm, especially when playing in stereo 3D mode.
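As a quick sanity check of the bandwidth claim, the theoretical memory bandwidth is simply the effective data rate multiplied by the bus width, so the numbers from the spec list above work out like this:

# Quick check of the GTX 680 memory bandwidth from the spec list above.
data_rate = 6008e6            # effective GDDR5 data rate in transfers per second
bus_width_bytes = 256 / 8     # 256-bit memory interface
bandwidth_gbs = data_rate * bus_width_bytes / 1e9
print(round(bandwidth_gbs, 2))  # ~192.26 GB/s, nearly the same as the GTX 580's ~192.4 GB/s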

And now a bit about some of the new features. One interesting thing is the GPU Boost functionality that is supposed to control the operating frequency of your graphics processor in real time, so that it can maximize the performance you get whenever you need it. It automatically increases the working frequency of the GPU when a given application is not fully loading the graphics card, so you can squeeze out some extra performance. And since GPU Boost cannot be disabled by the user, it will ultimately change the way you overclock the video card, especially considering that there are no longer two different frequencies for the GPU. And while you cannot disable GPU Boost, you can control how it works, allowing you to get the most out of your video card in terms of performance even when you overclock it. But thanks to the GPU Boost function and the extra electronics used to monitor the current utilization, temperature and power consumption of the GTX 680, you also get some neat new extras such as the ability to limit the maximum framerate in a 3D application to let’s say 60 or 120 fps (NVIDIA Frame Rate Target). So you can look at GPU Boost not only as something that can help you get the most out of your GPU, but also as a function that can help you save power and resources when you don’t actually need them, because when you limit the maximum framerate there is usually no need for the video card to use all of its processing power to maintain it, and thus it will run cooler and quieter.
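Just to illustrate the idea of capping the framerate (a minimal conceptual sketch, not how NVIDIA’s Frame Rate Target is actually implemented in the driver): once a frame is done faster than the target budget, the rest of the frame time is simply spent idling, which is where the power and heat savings come from.

# Minimal conceptual frame-rate cap, not NVIDIA's actual Frame Rate Target implementation.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS     # about 16.7 ms per frame at 60 fps

def run_capped(render_frame, num_frames=600):
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()                            # the actual GPU work for the frame
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)    # GPU sits idle here, running cooler and quieter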

Another new thing is the improvement in the anti-aliasing modes that you get at your disposal in order to get rid of the jaggies and get a smoother-looking image in games. Aside from the FXAA mode that is also supported, the new GTX 680 introduces two new TXAA modes that bring better-quality AA than MSAA with a smaller performance hit. Another interesting new feature is Adaptive VSync, which can help you smooth out the transitions when the framerate drops below a certain level – something that with normal VSync usually leads to stuttering as the framerate gets halved, and with no VSync to tearing of the image. And while Adaptive VSync may not be able to completely eliminate the tearing when the framerate drops significantly, it can help reduce it greatly, making it not so apparent and even hardly noticeable in most cases if you are not paying special attention. So this is another good thing if you are a gamer going for the GTX 680 – and going for a GTX 680 if you are not a gamer may seem like a bit of a pointless thing to do anyway.
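The decision Adaptive VSync makes can be summed up in a couple of lines (again just a simplified sketch of the idea, not Nvidia’s driver logic): keep VSync on while the framerate can match the refresh rate, and drop it the moment it cannot, trading a bit of tearing for not having the framerate forced down.

# Simplified sketch of the Adaptive VSync decision, not Nvidia's actual driver logic.
REFRESH_HZ = 60

def vsync_enabled(current_fps):
    """VSync stays on only while the GPU can keep up with the display's refresh rate."""
    return current_fps >= REFRESH_HZ

print(vsync_enabled(75))   # True  - frames are synced, no tearing
print(vsync_enabled(45))   # False - VSync is dropped instead of halving the framerate to 30 fps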



And here comes another very interesting new feature – the single-GPU 3D Vision Surround. Since the GeForce GTX 680 is capable of driving four independent monitors at the same time, you are now able to create a 3D Vision Surround setup with just a single video card; there is no more need for at least two GPUs in SLI to drive 3D Vision Surround. Have in mind though that the GTX 680 has Dual-Link DVI-I, Dual-Link DVI-D, DisplayPort 1.2 and HDMI 1.4 High Speed interfaces. Obviously you can’t use the HDMI 1.4 HS interface for a 3D Vision Surround setup, so the third display needs to be connected either through the DisplayPort or with a DP to DL-DVI adapter. The HDMI 1.4 High Speed interface should be capable of providing more than the 1080p 24Hz 3D mode that the normal HDMI 1.4 interface currently supports, however you would also require a 3D monitor supporting it, and apparently there are still no such consumer products available.

There are also some improvements in the Surround support, for example you can use a fourth accessory display together with the Surround setup for showing your email or something else while playing, although switching to that monitor can be a bit tricky. Also, you finally get the taskbar displayed only on the center display when using a Surround setup, and the ability to maximize windows on a single display only and not across all three (user selectable); these are actually part of the software improvements, so you should be getting them on older hardware as well. There is also a new Bezel Peek function that allows you to use a hotkey to briefly see in-game menus or objects that may appear hidden due to the use of bezel correction, faster display acceleration when using only a single display in a Surround setup, as well as an improvement in the list of active resolutions you get when using a Surround setup, so you will not be bothered by a huge list of resolutions that you need to go through. One thing that I almost missed is the DirectX 11.1 support, but should you actually care that it is supported by the hardware? Not really at the moment, as it is nothing major for now.

The new GeForce GTX 680 from Nvidia is definitely a good improvement not only in terms of performance, but also in terms of new features that can help you get the most out of your gaming experience, including in stereoscopic 3D mode. It is more powerful and more energy efficient than the previous generation and brings some useful new features that are surely going to be interesting for gamers. The new GeForce GTX 680 should already be available at a price of about $499 USD, and I hope to be able to get the card soon in order to test it and provide you with some benchmarks of the 680 in stereoscopic 3D mode, so stay tuned for more on that…


AMD Has Announced The New Radeon HD 7970 High-End GPUs

December 22nd, 2011 · 5 Comments · General 3D News


It seems that AMD wanted to kind of make gamers around the world happy right before the holidays by announcing their next-generation Radeon HD 7970 graphics processor, based on a 28nm production process. And while the announcement is now a fact, the video cards based on this new GPU are still not available on the market; they should start appearing sometime in January next year, so don’t be too eager… yet. The new GPU should be the first one based on 28nm technology to hit the market, the first with support for the new PCI Express 3.0 and the first one supporting DirectX 11.1 (expected with Windows 8). But these are the more general “firsts”; there are also some interesting things regarding stereoscopic 3D support that will be available in the new Series 7000 GPUs from AMD…

If you remember, not long ago AMD released the Catalyst 12.1 Preview driver that finally added support for CrossFireX (multi-GPU configurations) in stereo 3D mode, as well as support for the optional HDMI 1.4a 1080p 30Hz 3D mode. And while these features are still only available in a beta driver, they should be finding their place in the next official Catalyst release, which should also have support for the new Radeon HD 7970 GPUs. But in the new graphics processors AMD has prepared more new features for gamers using AMD-based hardware to play in stereoscopic 3D mode – hardware-based features that kind of build on top of the newly introduced software-based ones.

With the new video cards based on the Radeon HD 7970 graphics processor you should be able to use multi-monitor setups (Eyefinity) in stereoscopic 3D mode as well, and with a single video card (over DisplayPort 1.2). So with this AMD is finally kind of catching up with Nvidia by offering an alternative to 3D Vision Surround, a technology that has been available for a while. And while you should be able to run multiple monitors in stereoscopic 3D mode with Eyefinity from a single video card, you would probably want to have two of them for the best experience, especially for playing games, and with the addition of CrossFireX support in stereo 3D mode you should finally be able to do that.

The other interesting new feature coming with the Radeon HD 7970 video cards will be the support for the higher-bandwidth 3 GHz HDMI specification that will allow you to get a 1080p 3D mode with 60Hz per eye. And while this sounds like a very nice feature to have, you should not be too eager to try it out, as this is support only on the video card’s side; you would also need a 3D monitor or a 3D HDTV that supports this mode. Unlike the optional HDMI 1.4a 30Hz 3D mode that some 3D-capable products support even now, for the new 1080p 60Hz 3D mode it will take some time before display products supporting it start coming out on the market.
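To put some rough numbers on why the extra HDMI bandwidth is needed for this mode, here is a back-of-the-envelope estimate only, ignoring blanking intervals and protocol overhead:

# Back-of-the-envelope pixel rate for 1080p stereo 3D at 60Hz per eye
# (ignores blanking intervals and HDMI protocol overhead).
width, height = 1920, 1080
per_eye_hz = 60
eyes = 2

pixels_per_second = width * height * per_eye_hz * eyes
print(pixels_per_second / 1e6)   # ~248.8 Mpixels/s, roughly double what 1080p 60Hz 2D needs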

And now we are hitting one of the still problematic areas for AMD, namely the lack of a wider choice of compatible 3D-capable active displays; as for compatibility with passive 3D solutions, things are pretty much OK. But hopefully next year we’ll start seeing some positive development in that area as well. I’m saying hopefully, because AMD is still a bit slow in convincing partners to make compatible hardware, just like Nvidia was when 3D Vision was initially announced… so it just takes more time and effort. But the big question now is how soon and with what Nvidia will respond to the new AMD Radeon HD 7970 GPU – what do you think?

For more information and the specifications of the new Radeon HD 7970 GPUs…
