Due to rapid advances in Virtual Reality (VR) and Augmented Reality (AR), and the resurgence of head-mounted displays that enhance the AR/VR experience, interaction within these environments has become a major challenge. A complete interaction system is now needed that avoids joysticks and other accompanying devices or peripherals, in favour of a more natural interaction inside the application. Ideally, this kind of interaction should cover the natural, continuous movements of the user's hands and fingers with few or no limitations, giving users a sensation of realism and letting them identify their own hands with the virtual ones. Several areas of research pursue this goal, all aiming at the most natural and robust method for tracking users' hands and fingers inside technological applications. Depending on the hardware used, different solutions based on alternative technologies are possible:
Computer Vision techniques
This approach uses cameras and image analysis to track the user's hand and finger movements inside the application. The most common solutions use infrared (IR) cameras to capture and recognize different hand gestures, but there are other alternatives as well, all based on computer vision with different kinds of cameras:
- IR cameras situated on the wrist to control the hand position
- Depth of field cameras placed on the shoulder to detect finger touches
- Fisheye lens cameras attached to the chest to detect whole-body movement
- Infrared proximity sensors to obtain and transmit different types of inputs
There are also commercial products, such as the Leap Motion, in which a small sensor placed on the head-mounted display (or connected to the computer) allows the application to track and detect hand and finger movements.
Advantages and Problems
The main advantage of this technology is that it can accurately track hand movements, hand postures, finger movements and finger gestures. Its main problems are sensitivity to changes in lighting conditions and occlusion issues, since the hands and fingers must stay in the camera's line of sight. These characteristics can limit its use in certain environments: dark spaces, hands moving in wide areas outside the camera's field of view, or hands entering covered spaces such as boxes or pockets.
As alternatives to Computer Vision technology, several research studies have tried to take advantage of sound or electric-field analysis. These alternatives require sensors placed on the user's body to obtain the information needed to track the position of specific parts of the body:
- Electromyography (EMG), used for example to sense tongue movements
- Analysis of bio-acoustic sounds propagated through the body, which can identify specific finger taps
- Analysis of electric fields, which can give information about body and hand postures
Advantages and Problems
In all of these cases, the most important problem is that they cannot sense the full range of natural movement in an interaction. These approaches can only track discrete, pre-defined gestures and therefore limit the user's interaction with the final application.
Another possibility for controlling and tracking the user's hands for natural interaction is instrumenting the environment itself, for example with radio signals or Doppler-shifted signals.
Advantages and Problems
Because this intervention happens in the environment, outside the user's body, very accurate sensors with very precise calibration are needed to obtain good results. These systems can also suffer from occlusion issues caused by poor reflection of the signal from the user's body.
Magnetic Field Sensing
In this case, systems use magnetic field (MF) sensing to track continuous, accurate and occlusion-free finger movements. There have been several approaches in this area, such as:
- Polhemus: based on MF sensing, it can track objects with six degrees of freedom.
- Razer Hydra: a commercial solution that combines MF sensing with inertial measurement units (IMUs) to enable game-controller-based tracking.
These earlier systems have several problems caused by the characteristics of their hardware setup:
- They require strong base stations, needing large installations to generate the three-axis magnetic field used to track all the movement
- The base station needs to stay in a static position to avoid drifting issues
- The user must stay near the base station, so the range of movement is limited. These two magnetic approaches are therefore completely unusable in a portable or wearable application.
One new approach that tries to overcome the limitations of the first MF systems is Finexus, developed by the University of Washington together with Oculus Research.
The Finexus system uses four magnetic sensors to track fingernail-sized electromagnets placed on each fingertip. This approach does not need a direct line of sight between the electromagnets and the sensors, solving the occlusion problem of the Computer Vision methodology. The small size of the electromagnets also makes this a very good option for portable or wearable solutions. Since the system is placed on the hand and fingers, the maximum distance allowed between sensors and electromagnets is around 12 centimetres (the researchers are working to increase this distance to 25 centimetres). This rules out whole-body tracking, but it is enough for accurate, continuous and reliable tracking of hand and finger movement.
How does Finexus work?
Finexus can be understood by analogy with a GPS system. The system first calculates the distance between an electromagnet and each of the four magnetic sensors. From those distances it computes their intersection, which determines the position of the electromagnet in space (trilateration). As each electromagnet operates at a different frequency, each one can be identified individually.
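The trilateration step can be sketched as follows. This is a minimal illustration, not the Finexus implementation: the sensor layout and magnet position are hypothetical, and the distances are assumed to be already known. Subtracting the sphere equation of the first sensor from each of the other three turns the problem into a 3x3 linear system, solved here with plain Gaussian elimination.

```python
import math

def trilaterate(sensors, distances):
    """Recover a 3-D position from its distances to four known sensor positions.

    sensors: four (x, y, z) tuples; distances: the four measured distances.
    """
    p0, d0 = sensors[0], distances[0]
    A, b = [], []
    for pi, di in zip(sensors[1:], distances[1:]):
        # |x - pi|^2 = di^2 minus |x - p0|^2 = d0^2 gives the linear equation
        # 2*(pi - p0) . x = (d0^2 - di^2) + |pi|^2 - |p0|^2
        A.append([2 * (pi[k] - p0[k]) for k in range(3)])
        b.append(d0 ** 2 - di ** 2
                 + sum(pi[k] ** 2 for k in range(3))
                 - sum(p0[k] ** 2 for k in range(3)))
    # Gaussian elimination with partial pivoting on the 3x3 system A x = b
    M = [row + [rhs] for row, rhs in zip(A, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in range(2, -1, -1):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

# Example with a hypothetical sensor layout (units: cm)
sensors = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10)]
magnet = (3.0, 4.0, 5.0)
distances = [math.dist(s, magnet) for s in sensors]
position = trilaterate(sensors, distances)  # recovers approximately [3.0, 4.0, 5.0]
```

In the real system the distances themselves come from the measured field strengths; this sketch only shows the geometric step that turns four distances into one position.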
Finexus can simultaneously track multiple fingertips thanks to AC-driven electromagnets operating at different frequencies. Bandpass filters centered on those frequencies are then applied, making it possible to extract and differentiate the magnetic field coming from each individual electromagnet.
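The frequency-separation idea can be illustrated with a toy example. The frequencies, sample rate and amplitudes below are assumed for illustration, not taken from the Finexus paper, and a single-bin DFT correlation stands in for the bandpass filters: correlating the sensor signal against each drive frequency recovers the amplitude contributed by that electromagnet alone.

```python
import math

FS = 10_000   # sample rate in Hz (assumed)
N = 1_000     # samples per measurement window (assumed)

def amplitude_at(signal, freq):
    """Amplitude of the component of `signal` at `freq`.

    A single-bin DFT: equivalent to a very narrow bandpass filter centered
    on `freq` when the frequency falls on a DFT bin.
    """
    re = sum(s * math.cos(2 * math.pi * freq * n / FS) for n, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * n / FS) for n, s in enumerate(signal))
    return 2 * math.hypot(re, im) / len(signal)

# Two simulated electromagnets driven at distinct frequencies, with field
# amplitudes 1.0 and 0.5 as seen by one sensor; the sensor sees their sum.
f1, f2 = 1_000, 1_400
mixed = [1.0 * math.sin(2 * math.pi * f1 * n / FS)
         + 0.5 * math.sin(2 * math.pi * f2 * n / FS) for n in range(N)]

a1 = amplitude_at(mixed, f1)  # ~1.0: magnet 1's contribution, isolated
a2 = amplitude_at(mixed, f2)  # ~0.5: magnet 2's contribution, isolated
```

Because the two drive frequencies are distinct, each correlation rejects the other magnet's signal, which is what lets one sensor array serve several fingertips at once.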
The hardware used in the Finexus system includes a PCB, electromagnets and sensors. The PCB carries four magnetometers and three microcontrollers and is combined with two sensor boards that establish the coordinate system needed for the trilateration process. Each electromagnet is wound with a 300-turn coil of copper magnet wire around a ferrite core and has a cross-section area of 0.25 cm2.
Depending on the purpose of the electromagnetic hardware used, it is important to find the adequate components to get the best results in tracking. In the case of Finexus, they built their own hardware, ad-hoc, but there are different commercial solutions which can cover the specific necessities of the system to be developed.
The Finexus project uses a standard calibration procedure based on the Earth's magnetic field to calibrate the magnetometers. The procedure acquires data while the PCB is rotated randomly at a fixed location. During this process it is important to keep the PCB away from surrounding electronic devices to minimize noise. Calibrating the sensors minimizes system error and yields the best tracking results.
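A common form of this kind of calibration is hard-iron offset estimation, sketched below under simplifying assumptions (the paper does not specify its exact algorithm, and the field magnitude and bias values here are invented). While the board is rotated at a fixed spot, an ideal magnetometer's readings trace a sphere of radius equal to the Earth's field strength, centered on the sensor's constant bias; the midpoint of the minimum and maximum reading on each axis then estimates that bias.

```python
import math

def hard_iron_offset(samples):
    """Estimate the constant per-axis bias from readings taken while rotating.

    samples: iterable of (x, y, z) magnetometer readings. Assumes the
    rotation covered enough orientations to reach each axis's extremes.
    """
    xs, ys, zs = zip(*samples)
    return tuple((max(a) + min(a)) / 2 for a in (xs, ys, zs))

# Synthetic data: a 50 uT Earth field swept over the whole sphere of
# orientations, plus an invented sensor bias to be recovered.
bias = (12.0, -7.5, 3.0)
samples = []
for i in range(40):
    theta = math.pi * i / 39       # polar angle, includes both poles
    for j in range(40):
        phi = 2 * math.pi * j / 40  # azimuth
        samples.append((bias[0] + 50 * math.sin(theta) * math.cos(phi),
                        bias[1] + 50 * math.sin(theta) * math.sin(phi),
                        bias[2] + 50 * math.cos(theta)))

offset = hard_iron_offset(samples)  # recovers approximately (12.0, -7.5, 3.0)
```

Subtracting the estimated offset from every subsequent reading is what removes the systematic error the text refers to; production calibrations typically fit a full sphere or ellipsoid rather than using per-axis extremes.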
This tracking system enables interaction using the user's hands without occlusion problems, and its main application is in VR and AR systems. Within these two fields, the sector that stands to benefit most is computer gaming, with major possibilities for natural interaction between the user and the game in all kinds of environments.
Also, given Finexus's precise tracking capability, it could be introduced in areas such as multimedia, music, education or clinical medicine. Examples of its possible uses include:
- In-air writing for multiple purposes
- Gesture based gaming control
- Tracking to support complex, fine-grained inputs to a system (typing on a keyboard, painting, playing the piano, etc.)
Thanks to the Finexus system's resistance to ambient noise, it is now possible to have a hand and finger tracking system that delivers very reliable information, free of occlusion issues, with multipoint tracking and no interference.
References
- David Kim, Otmar Hilliges, Shahram Izadi, Alex Butler, Jiawen Chen, Iason Oikonomidis and Patrick Olivier. Digits: Freehand 3D Interactions Anywhere Using a Wrist-Worn Gloveless Sensor. In Proc. of UIST '12, pp. 167-176.
- Chris Harrison, Desney Tan and Dan Morris. Skinput: Appropriating the Body as an Input Surface. In Proc. of CHI '10, pp. 453-462.
- Ruth Ravichandran, Elliot Saba, Ke-Yu Chen, Mayank Goel, Sidhant Gupta and Shwetak N. Patel. WiBreathe: Estimating Respiration Rate Using Wireless Signals in Natural Settings in the Home. In Proc. of IEEE PerCom '15, pp. 131-139.
- Ke-Yu Chen, Shwetak N. Patel and Sean Keller. Finexus: Tracking Precise Motions of Multiple Fingertips Using Magnetic Sensing. In Proc. of CHI '16, May 7-12, 2016, Santa Clara, California, USA.