3D Vision Blog

A normal user's look into the world of 3D Stereo Technologies


First Impressions from the HTC VIVE and Comparing it to the Oculus Rift

July 7th, 2016 · 1 Comment · 3D / AR / VR / HMD


Purchasing an HTC VIVE is still pretty much as hard as getting an Oculus Rift, though thankfully being a Kickstarter backer secured me an Oculus Rift earlier. So while I still don't have an HTC VIVE (it is also still not sold in many countries around the world), I've had the chance to try the device and play a bit with it at a recent event where Nvidia was demonstrating their VR Funhouse demo on a GTX 1080 equipped system. Having now played around with VR demos on both devices, I can compare them better and give you an idea of what to expect and what the differences are at the moment…

The HTC VIVE is a bit bulkier and heavier VR headset than the Oculus Rift, but that does not seem to be a problem when using it. The VIVE headset does not come with headphones attached, so you will need to find a good pair that won't interfere with the rest of the headset and will be comfortable. Even though the Rift comes with some sort of headphones attached, you'd probably want to detach them and use a separate headset with it anyway. The only other major difference is the presence of a camera on the front of the HTC Vive; I have yet to see it in action, but in theory it could allow for some nice AR experiences, or simply let you switch to a real-world view without having to remove the headset.


The OLED displays inside the two competing products seem to be pretty much the same – 1080×1200 per eye with a 90 Hz refresh rate – and with very similar lenses you get about 110 degrees FOV. The end result is pretty much the same in terms of graphics quality and experience, and the negative side effects are minimized on both devices (it is now much harder to get dizzy using these VR headsets). So where is the major difference then? It is in the fact that the HTC Vive currently ships with a controller for each hand of the user, while with the Oculus Rift you don't get these yet – the Touch controllers are supposed to start shipping sometime by the end of the year.

The presence or lack of hand controllers at this point in time pretty much defines the major difference. For the moment the Oculus Rift provides more stationary VR experiences where you probably sit in your chair while using the device, while the HTC Vive focuses on more interactive and “active” experiences where you move around. Of course, when Oculus ships its Touch controllers things will probably even out for both devices, especially once developers see that there is not much point in making exclusive titles for one of them while the VR market is still so small.

The HTC Vive does seem a bit more bothersome to set up at this point, and it requires more free space around your computer so that you will be able to move around freely without hitting obstacles. It comes with two base stations that are used to set up a “working perimeter” and track your movement in the real world, which is then translated into movement in the virtual world. The Oculus Rift currently uses only one sensor, but the Touch controllers will come with an additional sensor for better tracking, so setting things up will probably look much the same for both once the hand controllers arrive. The HTC Vive controllers do seem a bit large and not so comfortable, while the Oculus Touch ones seem more compact and could turn out to be the more convenient solution, but we'll have to wait and see.


A bit about the Nvidia VR Funhouse demo that I've tried. It is a fun little demonstration of the level of interaction you can achieve in the virtual world, and it is OK as a demonstration, but you probably will not want to replay it often. Not that you can, since it is not yet publicly available, although it seems that Nvidia will be releasing it as a demo and also in source code form for developers. The VR Funhouse demo is quite heavy, as it employs a lot of Nvidia technologies to make it look and feel as realistic as possible, apart maybe from the graphics, which are a bit more cartoon-ish. Still, the experience feels quite nice as you run through the various short games inside the demo – hitting stuff, breaking stuff, shooting stuff, throwing stuff. It is easy and fun, though you might replay some of the first games in order to get the hang of things initially. The only thing that felt a bit weird was the placement of virtual objects relative to the hand controllers – in some of the games inside VR Funhouse the positioning on top of the controllers made it hard to get a good feel for where exactly you would throw the object.

The other demo I tried, besides VR Funhouse, was Tilt Brush by Google, which lets you paint in 3D space. It is quite impressive once you get the hang of it, though using the two controllers for choosing tools and colors may need some getting used to. Since this demo also relies on hand controllers it is only available on the HTC Vive, but it will most likely also support the Oculus Rift once the Touch controllers become available for it as well.

In short, there is a bit of a difference at the moment in what you can do with the Oculus Rift and the HTC Vive, but once the Rift gets its Touch controllers both should be pretty much on the same level. The visual quality and experience with the two headsets is pretty much the same, though the resolution is still a bit lower than it needs to be for the user not to be bothered by visible pixels and jaggy edges, for example. For this to change we'll need some more time, because graphics processing power needs to catch up a bit more to handle higher resolution displays at high framerates without problems. If you ask which one you should get, the HTC VIVE or the Oculus Rift, my answer will probably be whichever one you are actually able to get first, as both are still hard to get and you need to wait before your order is fulfilled. For the moment the HTC VIVE offers a more active and interactive approach to VR experiences, while the Oculus Rift is the more passive choice, but that difference will probably be gone in a couple of months anyway. Price wise both should be pretty much the same when you add the Oculus Touch controllers into the calculation, so again buy the one that you can get your hands on faster…
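
As a rough back-of-the-envelope illustration of why pixels and jaggy edges are still noticeable, here is a minimal sketch that estimates the angular pixel density from the nominal per-eye specs mentioned above. Treating the roughly 110-degree figure as the horizontal FOV and ignoring lens distortion are simplifying assumptions, so take the result only as an order-of-magnitude estimate.

```cpp
#include <cstdio>

int main() {
    // Nominal per-eye figures quoted for both headsets (assumed values).
    const double horizontal_pixels = 1080.0;  // per-eye horizontal resolution
    const double fov_degrees       = 110.0;   // treated here as horizontal FOV

    // Very rough pixels-per-degree estimate, ignoring lens distortion.
    double ppd = horizontal_pixels / fov_degrees;
    std::printf("Roughly %.1f pixels per degree (a 20/20 eye resolves about 60)\n", ppd);
    return 0;
}
```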


Nvidia has Released GameWorks VR and DesignWorks VR SDKs for VR Developers

November 20th, 2015 · 1 Comment · 3D / AR / VR / HMD

Today Nvidia has officially released the 1.0 versions of two powerful VR software development kits (SDKs) – Nvidia GameWorks VR and Nvidia DesignWorks VR – targeted at headset, game, and application VR developers, so that this relatively new category of display devices can offer better performance and user experience. Delivering good VR games and experiences is a complex challenge: immersive VR can require multiple times the graphics processing power of traditional 3D apps and games, so you not only need a good GPU in terms of raw performance, but also one that is optimized for VR. With these SDKs, developers on Nvidia hardware should now have the tools to create amazing VR experiences, increase performance, reduce latency, improve hardware compatibility and accelerate 360-degree video broadcasts. Both SDKs deliver a comprehensive set of APIs and libraries for headset and app developers, including the new Multi-Res Shading technology. Available publicly for the first time, Multi-Res Shading is an innovative rendering technique that increases performance by as much as 50 percent while maintaining image quality. The 1.0 SDK releases also add support for the new Windows 10 operating system.

GameWorks VR
For game and application developers, the GameWorks VR SDK includes:
– Multi-Res Shading — an innovative rendering technique for VR in which each part of an image is rendered at a resolution that best matches the pixel density of the warped image required by the headset. It uses the NVIDIA Maxwell chip architecture’s multi-projection capability to render multiple-scaled viewports in a single pass, delivering substantial performance improvements (a rough numeric illustration of the idea follows after this list).
– VR SLI — provides increased performance for VR applications where multiple GPUs can be assigned a specific eye to dramatically accelerate stereo rendering.
– Context Priority — provides control over GPU scheduling to support advanced VR features such as asynchronous time warp, which cuts latency and quickly adjusts images as gamers move their heads, without the need to re-render a new frame.
– Direct Mode — treats VR headsets as head-mounted displays accessible only to VR applications, rather than a typical Windows monitor, providing better plug and play support and compatibility for VR headsets.
– Front Buffer Rendering — enables the GPU to render directly to the front buffer to reduce latency.
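
To make the Multi-Res Shading idea a bit more concrete, below is a minimal, purely illustrative sketch (not Nvidia's actual API) that splits an assumed 1512×1680 eye buffer into an equal 3×3 viewport grid, keeps the centre cell at full resolution and renders the outer cells at a reduced scale, then reports how much pixel work that saves. The grid split and the 0.7 scale factor are assumptions picked only to show the mechanism.

```cpp
#include <cstdio>

// Illustrative only: a 3x3 viewport grid where the centre cell keeps full
// resolution and the outer cells are rendered at a reduced scale, mimicking
// the idea behind multi-resolution shading.
int main() {
    const int eye_width = 1512, eye_height = 1680;  // assumed per-eye render target
    const double outer_scale = 0.7;                 // outer cells at 70% per axis (assumption)

    const double cell_w = eye_width / 3.0;
    const double cell_h = eye_height / 3.0;
    const double full_pixels = double(eye_width) * eye_height;

    double shaded_pixels = 0.0;
    for (int row = 0; row < 3; ++row) {
        for (int col = 0; col < 3; ++col) {
            bool centre = (row == 1 && col == 1);
            double scale = centre ? 1.0 : outer_scale;
            shaded_pixels += (cell_w * scale) * (cell_h * scale);
        }
    }

    std::printf("Shaded %.0f of %.0f pixels (%.0f%% of the full-resolution work)\n",
                shaded_pixels, full_pixels, 100.0 * shaded_pixels / full_pixels);
    return 0;
}
```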

DesignWorks VR
For developers of professional VR applications in markets such as manufacturing, media and entertainment, oil and gas, and medical imaging, NVIDIA DesignWorks VR builds on the core GameWorks VR SDK with the addition of powerful tools, such as:
– Warp and Blend — new APIs that provide application-independent geometry corrections and intensity adjustments across entire desktops to create seamless VR CAVE environments, without introducing any latency (a generic sketch of the intensity-blending idea follows after this list).
– Synchronization — techniques to prevent tearing and image misalignment while creating one large desktop that is driven from multiple GPUs or clusters. Various technologies like Frame Lock, Stereo Lock, Swap Groups and Swap Barriers are available to help developers design seamless and expansive VR CAVE and cluster environments.
– GPU Affinity — provides dramatic performance improvements by managing the placement of graphics and rendering workloads across multiple GPUs.
– Direct for Video — enabling VR and augmented reality environments such as head-mounted displays, CAVES/immersive displays and cluster solutions.
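
As a generic illustration of the intensity-adjustment half of Warp and Blend (this is a textbook edge-blend ramp, not Nvidia's API), the sketch below attenuates two adjacent projectors across an assumed 200-pixel overlap band so their combined brightness stays roughly constant across the seam. The overlap position and width are made-up values.

```cpp
#include <cstdio>

// Generic edge-blend ramp (illustrative): fade the left projector out and the
// right projector in across the overlap band so the summed intensity stays ~1.
double blend_weight(int x, int overlap_start, int overlap_width, bool left_projector) {
    if (x < overlap_start) return left_projector ? 1.0 : 0.0;
    if (x >= overlap_start + overlap_width) return left_projector ? 0.0 : 1.0;
    double t = double(x - overlap_start) / overlap_width;  // 0..1 across the band
    return left_projector ? 1.0 - t : t;
}

int main() {
    const int overlap_start = 1720, overlap_width = 200;  // assumed 200 px overlap
    for (int x = 1700; x <= 1940; x += 60) {
        double l = blend_weight(x, overlap_start, overlap_width, true);
        double r = blend_weight(x, overlap_start, overlap_width, false);
        std::printf("x=%4d  left=%.2f  right=%.2f  sum=%.2f\n", x, l, r, l + r);
    }
    return 0;
}
```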

VR developers can download the GameWorks VR SDK at https://developer.nvidia.com/gameworksVR.
DesignWorks VR can be accessed by registering at https://developer.nvidia.com/designworks-vr.

AMD has also been more active on VR support lately with the recent announcement of its AMD LiquidVR Technology for Developers. One of the key goals of LiquidVR is to reduce unwanted processing latency (motion-to-photon latency) and deliver a consistent frame rate. AMD recently released the Alpha version of its LiquidVR SDK to select technology partners. The LiquidVR SDK is a platform designed to simplify and optimize VR development.

The four major features of LiquidVR SDK include:
– Asynchronous Shaders: more efficient GPU resource management.
– Affinity Multi-GPU: faster multi-GPU performance.
– Latest Data Latch: reduced motion-to-photon latency.
– Direct-To-Display: seamless plug and play experience.

The AMD LiquidVR SDK is not yet publicly available to all interested developers, but you can get more details about it on the AMD LiquidVR Technology for Developers page.

Now the big question that remains is how soon users will get their hands on the new VR headset hardware, such as the consumer version of the Oculus Rift that should be released sometime in Q1 2016, or alternatives such as the HTC VIVE and others that might be coming with their own hardware. The developer hardware available so far – most notably the two generations of Oculus Rift dev kits, which had wider availability – has sparked interest and demand among many users who simply cannot wait to get their hands on the hardware, experience the promised great VR experiences, and play great games in a new, more realistic way.


Nvidia Introducing VR Direct Technology with New GPU Launch

September 19th, 2014 · 4 Comments · 3D / AR / VR / HMD, General 3D News


Today Nvidia has launched their new Maxwell high-end GPUs – the GTX 980 and GTX 970 – and while they offer a nice performance boost over the previous generation at reduced power usage, making them really attractive upgrades, these products came with an interesting VR-related announcement as well. Apparently Nvidia is already working together with companies developing Virtual Reality products, such as Oculus with the Rift, in order to provide users with a better experience. While we may need some more time before VR becomes more mainstream and reaches a really great level of experience, it seems that things are already moving at a good pace. It is also interesting to note that the Oculus Rift and other VR headsets may actually turn out to be the saviors of the 3D Vision technology as well, but we'll have to see about that.


With VR Direct Nvidia is trying to address multiple things related to the VR experience, such as lowering latency, improving quality and providing more content that will work well on your Oculus Rift or other VR headset without having to be specially made for it. With a VR headset, latency is much more important than on a traditional display, as any perceived delay can throw off the VR experience and potentially cause the user to get motion sickness.

The standard VR pipeline, from when you move your head to when you actually see the response on your VR display, is about 50 milliseconds. Nvidia's goal is to reduce this latency as much as possible so gamers feel even more immersed. A large portion of this is the time it takes the GPU to render the scene plus OS overhead (about 32 ms), and as a result of this effort they have managed to cut 10 ms of latency out of the standard VR pipeline. Using the new MFAA anti-aliasing method rather than MSAA to maintain image quality, they have managed to reduce GPU render time by an additional 4 ms, and they go even further with another technique they are also working on, called asynchronous warp. Rather than requiring the GPU to re-render each frame from scratch, with asynchronous warp the GPU takes the last scene rendered and updates it based on the latest head position info taken from the VR sensor; head-tracking input is literally sampled moments before you see it. By warping the rendered image late in the pipeline to more closely match head position, discontinuities between head movement and action on screen are minimized, dramatically reducing latency even further. In total they have managed to reduce latency from 50 ms down to 25 ms, and that should dramatically improve the VR experience for the user.
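
To tally the quoted numbers, here is a minimal sketch of the claimed latency budget. The individual savings are the figures given above; the contribution attributed to asynchronous warp is simply whatever remainder is needed to land at 25 ms, since Nvidia did not break that part down.

```cpp
#include <cstdio>

int main() {
    double latency_ms = 50.0;                // quoted baseline motion-to-photon latency
    const double pipeline_saving_ms = 10.0;  // quoted saving from the standard VR pipeline
    const double mfaa_saving_ms = 4.0;       // quoted saving from MFAA instead of MSAA
    const double target_ms = 25.0;           // quoted final figure

    latency_ms -= pipeline_saving_ms;        // 40 ms
    latency_ms -= mfaa_saving_ms;            // 36 ms
    double async_warp_saving_ms = latency_ms - target_ms;  // remainder, ~11 ms

    std::printf("After pipeline and MFAA savings: %.0f ms\n", latency_ms);
    std::printf("Asynchronous warp needs to cover the remaining ~%.0f ms to reach %.0f ms\n",
                async_warp_saving_ms, target_ms);
    return 0;
}
```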

Besides latency, another major obstacle that must be overcome to provide an immersive VR experience is performance; not only is a high frame rate needed, it's also important that the frames are delivered to the user's eyes smoothly. While one GPU can be used to drive an Oculus Rift, enthusiasts will want two GPUs to ensure the best performance with maximum game settings enabled. That would be especially true if we move to a screen resolution higher than the Full HD currently used on the Oculus Rift DK2, or get a separate 1080p screen for each eye. Traditionally Nvidia SLI relies on alternate frame rendering (AFR), where each GPU renders alternating frames that are presented to the user. For VR scenarios, however, Nvidia is implementing a new VR SLI profile where each GPU renders for one display: the display for the left eye is handled by one GPU, while the second GPU is responsible for the display for the right eye. This solution should provide lower latency and ultimately better performance for the user.
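
The per-eye split can be pictured with the minimal sketch below; render_eye is only a hypothetical stand-in for submitting one eye's work to a specific GPU, since the actual VR SLI profile lives inside Nvidia's driver and is not exposed like this.

```cpp
#include <cstdio>
#include <thread>

// Hypothetical placeholder for submitting one eye's command stream to one GPU;
// in real VR SLI the driver handles this device assignment.
void render_eye(int gpu_index, const char* eye) {
    std::printf("GPU %d renders the %s eye view\n", gpu_index, eye);
}

int main() {
    // Per-eye split instead of alternate-frame rendering: each GPU owns one eye
    // of every frame, so both halves of a stereo frame finish together.
    std::thread left(render_eye, 0, "left");
    std::thread right(render_eye, 1, "right");
    left.join();
    right.join();
    return 0;
}
```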

Display resolution is another critical factor for VR. With the displays in a VR headset resting extremely close to the user's eyes, higher resolution can remarkably improve the VR experience. Dynamic Super Resolution (DSR) can be used to improve image quality on today's existing VR displays by rendering the game at a higher resolution and then scaling it down to the native VR display resolution, preserving more detail and producing a smoother image.
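
As a small sketch of the resolution math involved (the 4x factor and the per-eye panel size are assumptions for illustration, not a statement of what DSR would actually use for a headset), note that DSR factors scale the total pixel count, so each axis grows by the square root of the factor:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Assumed native per-eye panel resolution and an illustrative DSR factor.
    const int native_w = 1080, native_h = 1200;
    const double dsr_factor = 4.0;           // 4x the pixel count (assumption)

    // DSR factors scale total pixel count, so each axis grows by sqrt(factor).
    const double axis_scale = std::sqrt(dsr_factor);
    const int render_w = int(native_w * axis_scale + 0.5);
    const int render_h = int(native_h * axis_scale + 0.5);

    std::printf("Render at %dx%d, then filter down to the native %dx%d panel\n",
                render_w, render_h, native_w, native_h);
    return 0;
}
```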

Another challenge VR must overcome to be more widely adopted is the lack of content. Outside of a handful of tech demos, there aren't that many applications that support VR headsets. To address this, Nvidia is leveraging its extensive experience with the 3D Vision technology to bring VR support to existing games that are already compatible with 3D Vision and adapt them to work with VR. They plan to use GeForce Experience to optimize game settings and handle configuration automatically, and aside from converting the games to the output format required by the VR headset and providing stereoscopic 3D rendering, they will also map mouse and keyboard commands to VR inputs like head movement, similar to what was done with the Gamepad Mapper on the Nvidia Shield Portable.
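
The mapping of head movement to mouse look can be sketched as below; this is purely a hypothetical illustration of the idea, and the 0.022 degrees-per-count sensitivity is just a common FPS default used as an assumption, not anything Nvidia has published.

```cpp
#include <cstdio>

// Hypothetical illustration: convert head-tracking deltas into mouse counts for
// a game that only understands mouse look.
struct MouseDelta { int dx, dy; };

MouseDelta head_to_mouse(double yaw_delta_deg, double pitch_delta_deg,
                         double degrees_per_count) {
    return { int(yaw_delta_deg / degrees_per_count),
             int(-pitch_delta_deg / degrees_per_count) };  // invert pitch for mouse Y
}

int main() {
    const double degrees_per_count = 0.022;  // common FPS sensitivity, used as an assumption
    MouseDelta d = head_to_mouse(1.5, -0.4, degrees_per_count);  // sample head motion this frame
    std::printf("Inject mouse delta: dx=%d, dy=%d\n", d.dx, d.dy);
    return 0;
}
```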

All of the above sounds very interesting and promising, but what is still missing is information on when all of these features under the VR Direct technology will actually be available to try out – for example by developers that already have their hands on an Oculus Rift DK2.
