AMD FreeSync technology is apparently now official, bringing an alternative to Nvidia’s G-Sync. Both technologies are built on the industry-standard DisplayPort 1.2a specification, and more specifically on its DisplayPort Adaptive-Sync feature. AMD’s implementation, however, does not rely on an expensive proprietary hardware module like Nvidia’s (the G-Sync module itself), so it should not additionally increase the price of the display. In theory AMD FreeSync should work on all DisplayPort 1.2a-equipped monitors if you have a compatible AMD GPU, though the company is not very clear on that subject. The list of AMD GPUs with gaming support for FreeSync includes the Radeon R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260 (the status of the 7800 and 7900 series or the R9 280X is not very clear).
There is still no official WHQL driver available, but apparently AMD FreeSync requires the Catalyst 15.2 Beta drivers or newer to be supported. AMD has released a list of 11 gaming-oriented monitors from multiple partners, including Acer, BenQ, LG Electronics, Nixeus, Samsung and ViewSonic, that come in different sizes and with different features. What AMD is still lacking compared to Nvidia is support for stereoscopic 3D gaming alongside FreeSync – there are multiple Nvidia G-Sync compatible models that also support stereoscopic 3D gaming. Should that matter, however, when Nvidia has apparently been abandoning stereoscopic 3D support for some time already – and the company is doing this for a second time since it was founded (history repeating itself)? We are already eager to see what AMD has in store for us with their FreeSync implementation…
Update: After trying out the Acer XB280HK 4K G-Sync monitor with the AMD Radeon R9 280X and R9 290X, I can say that I’m not very happy with either AMD or Nvidia. The G-Sync monitor works just fine on Nvidia hardware, both with G-Sync and without it. On the Radeon R9 280X (not officially compatible with FreeSync according to AMD!) the monitor works just fine, but there is no option to enable FreeSync in the drivers, as expected. Connecting the monitor to a FreeSync-compatible GPU, namely the Radeon R9 290X, the drivers still show no option to enable FreeSync, nor is the display detected as capable of supporting it. The bigger problem with the Sapphire R9 290X 8GB and the Acer XB280HK is that the display does not work properly in this combination: there is a picture, but the monitor constantly goes blank for a moment at irregular intervals, just as if it is losing the input signal and then getting it back – this happens in both 2D and 3D mode. The tests were performed using the AMD Catalyst 15.3.1 Beta drivers supplied by AMD for trying out the new FreeSync feature.
Update 2: It seems that if you want to use AMD’s FreeSync technology you will still have to buy a new display featuring a DisplayPort 1.2a interface, and also a new graphics card if you are using the R9 280X, one of AMD’s most popular GPUs. It will not work on your older hardware, as most likely you don’t have a DP 1.2a-capable monitor anyway unless you bought a very recently announced model, so you might want to wait for one of the new gaming models officially listed by AMD as FreeSync-compatible. Also, since Nvidia’s G-Sync technology uses the DisplayPort 1.2 interface, the officially licensed G-Sync monitors will apparently not work with FreeSync either.
Interesting new development in the world of VR: a new open platform for Virtual Reality gaming has been announced – the Open-Source Virtual Reality (OSVR) platform, with the goal of pushing the VR gaming experience forward. OSVR should provide both hardware and software support at every level of virtual reality gaming. Starting with some of the most popular game engines, including Unity 3D and Unreal Engine 4, OSVR also works with device plugins from hardware market leaders like Bosch and Razer, as well as the latest from Sixense and Leap Motion. Moreover, OSVR is designed to support all VR devices, including the Oculus Rift DK2 and Vrvana’s Totem headset. Razer has designed an OSVR Hacker Dev Kit scheduled to ship in June 2015 at a price of just $199.99 USD, to allow more people access to VR-capable open-source hardware. The dev kit is supposed to be equipped with a 5.5-inch Full HD 60 Hz display and a face mask design similar to that of the Oculus Rift, with a high FOV and maybe even better optics than those used in the DK2.
The list of current supporters of OSVR is already quite big and will most likely continue to grow; it includes HMD manufacturers such as Sensics, who specialize in high-end professional solutions, as well as game developers, input device manufacturers and others. While Oculus is making nice progress, it seems that the VR revolution cannot be left in the hands of just a single company, and since it can take some time, it is nice to see the industry joining hands to make VR technology available to more people and at a more affordable price – something that is a must if we really want to see VR gaming achieve mainstream adoption in a few years. The OSVR initiative is definitely something to keep your eye on if you are interested in virtual reality gaming; it will be interesting to see which other manufacturers of VR solutions will also join in.
Today Nvidia launched their new high-end Maxwell GPUs – the GTX 980 and GTX 970 – and while they offer a nice performance boost over the previous generation at reduced power usage, making them really attractive upgrades, these products came with an interesting VR-related announcement as well. Apparently Nvidia is already working together with companies developing Virtual Reality products such as the Oculus Rift in order to provide users with a better experience. While we may need some more time before VR becomes more mainstream and reaches a really great level of experience, it seems that things are already moving at a good pace. It is also interesting to note that the Oculus Rift and other VR headsets may actually turn out to be the saviors of the 3D Vision technology as well, but we’ll have to see about that.
With VR Direct, Nvidia is trying to address multiple aspects of the VR experience, such as lowering latency, improving quality, and providing more content that will work well on your Oculus Rift or other VR headset without having to be specially made for it. With a VR headset, latency is much more important than with a traditional display, as any perceived delay can throw off the VR experience and potentially cause the user to get motion sickness.
The standard VR pipeline, from when you move your head to when you actually see the response on your VR display, takes about 50 milliseconds. Nvidia’s goal is to reduce this latency as much as possible so gamers feel even more immersed. A large portion of this is the time it takes the GPU to render the scene, together with OS overhead (about 32 ms), and through optimizations here they have managed to cut 10 ms of latency out of the standard VR pipeline. By using the new MFAA anti-aliasing method rather than MSAA, which delivers similar image quality at a lower performance cost, they have reduced GPU render time by an additional 4 ms, and they go even further with another technique they are working on called asynchronous warp. Rather than requiring the GPU to re-render each frame from scratch, with asynchronous warp the GPU takes the last scene rendered and updates it based on the latest head position info taken from the VR sensor; head tracking input is literally sampled moments before you see it. By warping the rendered image late in the pipeline to more closely match head position, discontinuities between head movement and action on screen are minimized, dramatically reducing latency even further. Altogether they have managed to reduce latency from 50 ms down to 25 ms, and that should dramatically improve the VR experience for the user.
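The core idea of asynchronous warp can be sketched in a few lines. This is a simplified, hypothetical illustration, not Nvidia’s implementation: a small head yaw is approximated by shifting the last rendered frame horizontally instead of re-rendering it, whereas a real timewarp reprojects the image in 3D using the full head pose.

```python
import numpy as np

def async_warp(last_frame: np.ndarray, yaw_delta_deg: float,
               pixels_per_degree: float = 10.0) -> np.ndarray:
    """Approximate a small head yaw by shifting the last rendered frame.

    Turning the head right makes the scene appear to move left, so a
    positive yaw delta shifts the image left. Edge pixels simply wrap
    here; a real warp would fill the exposed border differently.
    """
    shift = int(round(yaw_delta_deg * pixels_per_degree))
    return np.roll(last_frame, -shift, axis=1)

# Last rendered frame: a vertical stripe at column 2 on a tiny "display".
frame = np.zeros((4, 8), dtype=np.uint8)
frame[:, 2] = 255

# Head turned 0.1 degrees right just before scan-out: warp instead of re-render.
warped = async_warp(frame, yaw_delta_deg=0.1)  # stripe moves one pixel left
```

The point of the technique is that this shift is far cheaper than rendering a new frame, so it can be applied at the very last moment with the freshest head-tracking sample.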
Besides latency, another major obstacle that must be overcome to provide an immersive VR experience is performance: not only is a high frame rate needed, it’s also important that the frames are delivered to the user’s eyes in a smooth fashion. While one GPU can be used to drive an Oculus Rift, enthusiasts will want two GPUs to ensure the best performance with maximum game settings enabled. That would be especially true if we move to a screen resolution higher than the Full HD currently used on the Oculus Rift DK2, or if there is a separate 1080p screen for each eye. Traditionally Nvidia SLI relies on alternate frame rendering (AFR), where each GPU renders alternating frames that are presented to the user. For VR scenarios, however, Nvidia is implementing a new VR SLI profile where each GPU renders for one display: the left-eye display is handled by one GPU, while the second GPU is responsible for the right-eye display. This solution should provide lower latency and ultimately better performance for the user.
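The difference between the two scheduling schemes can be illustrated with a tiny sketch. The GPU names and helper functions here are purely illustrative assumptions, not Nvidia’s actual API:

```python
# Contrast classic SLI alternate frame rendering (AFR) with the per-eye
# GPU split described for VR SLI.

def afr_schedule(num_frames: int, gpus=("GPU0", "GPU1")):
    """AFR: whole frames alternate between the GPUs in round-robin order."""
    return [(frame, gpus[frame % len(gpus)]) for frame in range(num_frames)]

def vr_sli_schedule(num_frames: int, gpus=("GPU0", "GPU1")):
    """VR SLI sketch: each GPU always renders the same eye, both eyes of
    one frame being rendered in parallel on different GPUs."""
    return [(frame, eye, gpu)
            for frame in range(num_frames)
            for eye, gpu in zip(("left", "right"), gpus)]

afr = afr_schedule(4)        # frame 0 on GPU0, frame 1 on GPU1, ...
vr = vr_sli_schedule(2)      # each frame split: left eye GPU0, right eye GPU1
```

The latency benefit follows from the structure: in AFR a given frame waits for one GPU’s full render time, while in the per-eye split both halves of the same frame are produced simultaneously.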
Display resolution is another critical feature for VR. With the displays in a VR headset resting extremely close to the user’s eyes, a higher resolution can remarkably improve the VR experience. Dynamic Super Resolution (DSR) can be used to provide improved image quality on today’s existing VR displays by rendering the game at a higher resolution and then scaling it down to the native VR display resolution, preserving more detail and producing a smoother image.
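The render-high-then-filter-down idea behind DSR can be sketched as a simple supersampling step. Note the assumptions: real DSR uses a Gaussian filter and supports non-integer factors, while this sketch uses a plain box-filter average over integer-sized pixel blocks, and the `downsample` helper is hypothetical.

```python
import numpy as np

def downsample(image: np.ndarray, factor: int) -> np.ndarray:
    """Box-filter downscale: average each factor x factor block of the
    high-resolution render into one native-resolution pixel."""
    h, w = image.shape
    return image.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Stand-in for a frame rendered at 2x the native resolution in each axis.
hi_res = np.arange(16, dtype=float).reshape(4, 4)

# Scale back down to native resolution; each output pixel blends 4 samples.
native = downsample(hi_res, 2)
```

Because every output pixel is a blend of several rendered samples, edges and fine detail are smoothed out rather than simply dropped, which is where the quality gain over rendering natively comes from.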
Another challenge VR must overcome to be more widely adopted is the lack of content. Outside of a handful of tech demos, there aren’t that many applications that support VR headsets. To solve this issue, Nvidia is leveraging their extensive experience with the 3D Vision technology to bring VR support to existing games that are already compatible with 3D Vision, adapting them to work with VR. They plan to use GeForce Experience to optimize game settings and handle configuration automatically, and aside from converting the games to the output format required by the VR headset and providing stereoscopic 3D rendering, they will also map mouse and keyboard commands to VR inputs like head movement, similar to what was done with the Gamepad Mapper on the Nvidia Shield Portable.
All of the above sounds very interesting and promising, but what is still missing is information on when all of these features under the VR Direct umbrella will actually be available for users to try out – for example, developers that already have their hands on the Oculus Rift DK2.