FOVE is one of the first companies to ship a headset built primarily around eye tracking. It works by illuminating the eye from the side with invisible infrared light and, using a dark-pupil tracking system, capturing images from the front. In theory, since everyone's eyes reflect the same colour of light under IR illumination, the image processing becomes much easier. With some top-notch features compared to contemporary HMDs, it looks pretty attractive at a price of $600. But is eye tracking really necessary in the first place? And should it be implemented as a standalone tracking method? It turns out that eye tracking, though not mandatory for VR, enhances the virtual experience and blurs the line between the virtual and real worlds, making the virtual environment feel more realistic.
I must admit that when I first thought about this idea, I had almost eliminated the need for eye tracking. I was under the impression that eye tracking wasn't needed at all, for one simple reason. Consider gazing at the real environment in front of you. With your head fixed, you have a fixed field of view; you change that field of view only by moving your head. Moving your eyes merely lets you focus on objects within your current field of view. You still can't see objects outside your FOV without turning your head. The head-tracking gear currently available mimics exactly this behaviour of the real world. So why do we need eye tracking when head tracking does the job perfectly?

The answer lies in its advantages. In the example above, I missed one subtle point: when you focus on something, your brain sharpens that object for you, while the rest of the background appears slightly blurred. This has major computational and graphical benefits. From a VR headset's point of view, it means the computer does not need to render the entire environment at the highest resolution for a good VR experience. It can render at full detail only the objects the user is currently focused on, without compromising the experience, and the way to know which objects those are is to track where the eye is looking. This saves computational resources and lets applications run smoothly.

Beyond this, eye tracking provides other advantages. Avatars whose eye movements reflect the actual eye movements of the wearer strengthen the connection the wearer feels with their avatar.
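The rendering idea above (often called foveated rendering) can be sketched in a few lines. This is only an illustrative model, not FOVE's actual implementation: the tier boundaries (5 and 15 degrees of eccentricity) and the shading-rate values are my own assumed numbers.

```python
import math

# Hypothetical foveated-rendering sketch: shading detail falls off with
# angular distance (eccentricity) from the tracked gaze point.
# The 5-degree and 15-degree tier boundaries are assumptions for
# illustration, not FOVE's real parameters.

def eccentricity_deg(gaze, pixel):
    """Angular distance in degrees between the gaze direction and a
    pixel's view direction, both given as (x, y) angles in degrees."""
    dx = pixel[0] - gaze[0]
    dy = pixel[1] - gaze[1]
    return math.hypot(dx, dy)

def shading_rate(gaze, pixel):
    """Fraction of full shading resolution to spend on this pixel."""
    e = eccentricity_deg(gaze, pixel)
    if e < 5.0:       # foveal region: full detail
        return 1.0
    elif e < 15.0:    # mid-periphery: half detail
        return 0.5
    else:             # far periphery: quarter detail
        return 0.25

gaze = (0.0, 0.0)  # the user is looking straight ahead
print(shading_rate(gaze, (2.0, 1.0)))    # near the gaze point -> 1.0
print(shading_rate(gaze, (10.0, 3.0)))   # mid-periphery -> 0.5
print(shading_rate(gaze, (30.0, 0.0)))   # far periphery -> 0.25
```

The point of the sketch is the shape of the function, not the numbers: most of the frame falls in the periphery, so shading it at a quarter of full resolution is where the computational savings come from.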
With eye tracking, you could gaze at someone in the virtual world and they could respond appropriately to your gaze. This is one of the strong points that make a realistic VR experience more compelling.
Users with physical disabilities could greatly benefit from this technology.
However, like every other technology, FOVE comes with its drawbacks.

Firstly, our eyes, though they let us see, do not determine what falls within our field of view; our eyes cannot change our FOV. Our FOV changes with the orientation of our head, which is what lets us see more of the scene.

Secondly, eye gestures are limited, and that limits the functionality of devices that rely solely on eye tracking. We can't do as much with our eyes as we can with our hands, which HMDs make use of via controllers. Moreover, there is no standard for what a particular eye gesture means, so it would be difficult to translate all the controls possible with HMDs and controllers into eye controls, and eye movements could easily be misinterpreted. For example, how would you simulate a back-and-forth motion in the virtual environment using your eyes? FOVE uses gaze as a gesture for selecting items from a menu, but can it really be sure that by gazing the user intended to select that item, rather than simply wanting a better view of the text?

Thirdly, using eye gestures can be frustrating. Spending long periods on an eye-tracking device would strain the eyes, and the wearer's interest in using the device would decrease over time. Medical concerns like the convergence-accommodation conflict [4] that shadow the VR industry could also be magnified when the eyes are this central to the VR experience.

A video I came across on YouTube demonstrating Tobii's eye-tracking gear shows something I'd like to call an unfair advantage in gaming. The video snippet is linked below: The player uses his eyes to aim at the target to be attacked, but even in real life we know that where you look (though you would look at the target) doesn't by itself decide whether you're going to hit the object.
It depends on your throwing skill: the throw is tracked through wand controllers, and the path the projectile takes is governed by the physics engine in Unity. You get a hit only if the physics works out, not just by looking at the target with an intent to shoot it.

In conclusion, I'd say that eye tracking is needed in virtual, and possibly even augmented, reality, but it should not be implemented as a standalone technology; it should be integrated with the well-established head tracking that HMDs use. In my short research on this topic, I've come to the conclusion that eye tracking implemented on its own would not achieve the same results as head tracking, but it would do wonders if the two were integrated.

Resources:
1. FOVE: https://www.getfove.com/
2. Hands-On with FOVE Eye Tracking VR Headset: https://www.youtube.com/watch?v=gLNZyC67Srw
3. https://abeer-cs491.weebly.com/students-choice.html
4. http://www.news.com.au/technology/gadgets/wearables/health-warning-as-immersive-virtual-reality-craze-linked-to-vision-problems/news-story/a67849532e82d857be7a3524c91ef11e
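As a footnote to the throwing example discussed above, the idea that gaze sets the aim but physics decides the hit can be sketched with ideal projectile motion. This is a toy model, not Unity's physics engine; the target distance, throw speeds, and hit tolerance are illustrative assumptions.

```python
import math

# Toy sketch of the point above: gaze picks the target, but whether the
# projectile lands on it depends on the throw, here reduced to launch
# speed and angle under ideal ballistics. All numbers are assumptions.

G = 9.81  # gravitational acceleration, m/s^2

def landing_distance(speed, angle_deg):
    """Range of a projectile launched from ground level (no drag)."""
    a = math.radians(angle_deg)
    return speed ** 2 * math.sin(2 * a) / G

def hits_target(speed, angle_deg, target_x, tolerance=0.5):
    """True if the projectile lands within `tolerance` metres of the
    target the player is gazing at."""
    return abs(landing_distance(speed, angle_deg) - target_x) <= tolerance

target = 10.0  # the object the player is looking at, 10 m away
print(hits_target(5.0, 45.0, target))   # too weak a throw: False
print(hits_target(9.9, 45.0, target))   # the right speed: True
```

Both throws are "aimed" at the same gazed-at target; only the one with the right launch speed connects, which is exactly why gaze alone shouldn't decide a hit.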