Unity and its motion capture possibilities

Premo Group

Lucía Vera


UNITY is one of the most widely used environments for developing games and graphic applications. Have you heard about it? Let’s review the basics of Unity and the future of this powerful tool.


Unity is a platform for developing all kinds of projects related to 2D and 3D graphics, offering a complete set of tools to create multipurpose applications. The tool emerged as a very powerful graphics engine for game development, but as its user base grew and its functionality improved, its areas of use and deployment expanded. Nowadays, Unity is one of the most widely used engines for games, Virtual and Augmented Reality applications, simulations, and the creation of advanced user interfaces and virtual experiences. The platform is based on an editor where it’s possible to create and set up all the components necessary for your scene. It’s available on Windows, Mac and Linux, and it makes it possible to add simple components and configure a basic scene. If you need more specific objects, you can design and model them in your modelling program, or with specific plugins inside Unity, and then import them into the Unity assets in an easy way.

The Unity editor is based on a scene, where you can see in graphic form all the entities included in your application. These entities are organized in a hierarchy, where you can see the list of components and their names. All the information included in a project is organized in assets (folders and the elements inside each folder), visible in the Project area (see the parts in Image 1).


Image 1: Unity editor parts

When a specific element in the hierarchy or in the scene is selected, the information associated with it appears in the Inspector window. Position, rotation, scale and texture are part of the information that can be viewed in the Inspector, but not only that.

Image 2: Inspector in the Unity Editor


It’s also possible to change the behaviour of an element in the scene by adding specific components in its Inspector, such as scripts, an animator, colliders, and many more. A script is a piece of code that can be created ad hoc by the developer, using one of the supported programming languages. Unity ships with a component to edit and create scripts, called MonoDevelop. If you prefer another code editor, for example Visual Studio, you can configure Unity to open scripts in that external tool for editing.
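As a simple illustration of how a script component works, the sketch below shows a minimal Unity behaviour that rotates whatever GameObject it is attached to. The class name and speed field are invented for this example; only the `MonoBehaviour` lifecycle (`Update` called once per frame) is standard Unity.

```csharp
using UnityEngine;

// Minimal example script: spins the GameObject it is attached to
// around its vertical axis. Attach it via the Inspector.
public class Rotator : MonoBehaviour
{
    // Public fields appear in the Inspector and can be tuned per object.
    public float degreesPerSecond = 45f;

    // Called by Unity once per rendered frame.
    void Update()
    {
        // Scale by Time.deltaTime so the speed is frame-rate independent.
        transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime);
    }
}
```

Dragging this script onto any object in the Hierarchy adds it as a component, exactly like the built-in ones mentioned above.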

unity code

Image 3: Part of a script written in C#.

To see how a scene is going to behave, you can enter Play mode inside Unity by using the play button. In this case, the application simulates how it will behave when run on a specific platform. This can be seen in the Game window included in the Editor. This window is interactive, and you can try specific interaction behaviours using the mouse (the behaviours need to be added to your elements in the scene). It’s also possible to add tactile interaction, using specific components in Unity, and then try it with specific touch hardware (screens or frames) or simulate the interaction using the mouse. All of this can be tested inside the Game window in the Editor.
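As a hedged sketch of the kind of mouse interaction that can be tested in the Game window, the script below toggles an object's visibility when it is clicked. It relies on Unity's standard `OnMouseDown` callback, which requires a Collider on the same GameObject; the class name is invented for this example.

```csharp
using UnityEngine;

// Example interaction behaviour: clicking the object in the Game
// window toggles its visibility. Requires a Collider component.
public class ClickToggle : MonoBehaviour
{
    private Renderer rend;

    void Start()
    {
        // Cache the Renderer attached to this GameObject.
        rend = GetComponent<Renderer>();
    }

    // Called by Unity when the object is clicked with the mouse.
    void OnMouseDown()
    {
        rend.enabled = !rend.enabled;
    }
}
```

In Play mode, clicking the object in the Game window makes it disappear and reappear, without building for any target platform.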

Unity rendering

Image 4: Scene window inside the Editor.

To the possibilities offered by Unity regarding 3D environments, we must add powerful tools to create advanced user interfaces for your applications or games, thanks to the specific components integrated in Unity for building interactive interfaces.

Unity also offers a big collection of assets in its Asset Store, where you can try to find the specific functionality that you need for your application, buy or download it (if it’s free) and import it into your project as new assets. After that, all the components included in the package will be available for use inside your project.


One of the most powerful characteristics of Unity is its support for developing applications for more than 25 different platforms, across mobile, desktop, console, TV, VR, AR and Web.


Image 5: Platforms supported by Unity


If you would like to learn more about Unity and all the functionalities available, you can check its latest features on the Unity web page.



It is easy to start using Unity. You only have to access its website and see the different plans that are available. If you’re a beginner, you can try the Personal version, available to use for free if your revenue or funding (raised or self-funded) does not exceed $100K per year. There are other options, depending on your profile and the benefits you would like to get from Unity.

After selecting the corresponding plan, the installation is very easy: you only have to follow the steps described in the installer. If you are a beginner and need to learn more about Unity and its usage, there are many tutorials online accessible from the Unity website. It’s also possible to find more tutorials on YouTube to help you develop specific functionalities inside Unity.



Unity is the most widely used tool to develop both Virtual and Augmented Reality experiences. It includes specific functionalities to help in the development of very realistic applications in these fields.

For VR, Unity includes:

  • High definition render pipeline for VR.
  • Extended Reality Interaction Toolkit.
  • Particle system.
  • Spatial audio.
  • Stereo instancing.
  • And the possibility of developing an application once and deploying it to multiple VR devices.

For AR, Unity includes:

  • AR Foundation: a specific framework for developing AR applications that run on mobile and wearable AR devices.
  • The possibility of using Unity as a library in your native mobile app.
  • Extended Reality Interaction Toolkit.
  • Responsive AR ads.
  • And new features that will be included in the coming versions.


Also, in the Asset Store you can find a big variety of components to help in the development of these kinds of applications. Unity has support for many head-mounted displays to increase the immersion of your experience. You only need to download and import the specific plugin for your immersive device and build the application for the platform where it will be running. The results obtained depend on the quality of your models and graphics, and on the “realistic” behaviour you have added to the virtual components in the scene.



Motion tracking is the process that allows us to capture and follow the position of specific objects in real space and use this information in our application (virtual, augmented or other). Using these kinds of systems, it’s possible not only to track generic objects, to control where they are and react depending on their location in space, but also to capture a human body’s motion and translate its movement to a virtual character in a synthetic setting. This can be done using specific motion tracking systems. In our blog we published a post where we analysed the different tracking systems available for virtual reality applications.



Depending on the system used, it’s possible to integrate the tracking capabilities inside a Unity project. Optical motion capture systems such as OptiTrack offer the option to get the tracking data from their tool, Motive, and use it in your application, or to integrate a specific plugin in a tool such as Unity or Unreal and get the data directly inside the project by streaming rigid body data (see Image 6).


Image 6: Motive motion tracking tool for OptiTrack systems.


Another example is the Leap Motion hardware, which captures the motion of your hands inside an application. Its main purpose is to integrate natural interaction into Virtual and Augmented Reality applications, using the user’s hands instead of controllers or other devices. This component can be easily integrated in Unity, adding powerful interaction to your application directly through hands and fingers. In this case, the device is a motion capture system with a specific purpose: tracking the movement of the user’s hands and fingers.


Image 7: Leap Motion device.


In the area of electromagnetic motion tracking systems, PREMO has an EMTS demo kit to add control of your objects’ movement inside an application. This system is able to position an object in 3D space, giving the absolute position and rotation on all three axes: a full six degrees of freedom (6DOF) tracking system. This hardware is called AmfiTrack and is available from the PREMO web page.


Image 8: a) Amfitrack components from PREMO group. b) 6 Degrees of freedom space.


This hardware is an embedded stand-alone system, with very high precision and low-cost components, offering a very interesting electromagnetic tracking system. Some of its specific features are:

  • Highly scalable/customizable 3D positioning/orientation electromagnetic tracking system.
  • High 6DOF accuracy performance @ low hardware cost.
  • Flexible distance between source and sensor.
  • Lowest hardware cost for consumer product applications.
  • Wireless system using worldwide approved 2.4GHz RF link.
  • Battery powered (low power consumption) system components.
  • 100% embedded software for position/orientation calculations (no external PC and software needed).
  • Flexible sample rate: Sub 1 Hz to x kHz.
  • Capability of multiple EMF source and sensor system (multiple systems can work within tracking range of each other simultaneously).
  • Python script for grabbing Real Time data to Windows or Linux available on request when purchasing dev kit.


Due to its attractive usability and applicability, a direct integration of the AmfiTrack system components is available in Unity. This allows any developer to integrate electromagnetic motion tracking systems in their Unity applications in a very easy way. These kinds of systems give very important advantages, due to their low cost, easy installation and absence of line-of-sight issues, compared to camera-based tracking systems.

It’s possible to try a Unity demo by downloading it from the AmfiTrack web page on PREMO’s website.

In this demo, it’s possible to see the quality of the hardware, the possibilities inside Unity and the precision of the data. A video with one of the first presentations of the product is also available there.



The AmfiTrack system has a fully developed driver component for Unity. Once this software is installed, it’s possible to link the 6DoF pose data from any of the sensors to any entity in Unity. The link is very easy to set up, just by using the Inspector panel. The demo in the video has the flashlight linked to one of the sensors and the camera to another.
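To give an idea of what linking pose data to an entity looks like in practice, here is an illustrative sketch only: `AmfiTrackSensor` and its fields are hypothetical stand-ins for whatever component the real AmfiTrack driver package provides, not its actual API.

```csharp
using UnityEngine;

// Hypothetical stand-in for the driver's sensor component; in the real
// driver these values would be fed by the tracking hardware.
public class AmfiTrackSensor : MonoBehaviour
{
    public Vector3 Position;
    public Quaternion Rotation = Quaternion.identity;
}

// Follows a sensor: assign the sensor reference in the Inspector,
// just as the driver's own link is set up through the Inspector panel.
public class TrackedObject : MonoBehaviour
{
    public AmfiTrackSensor sensor;

    void Update()
    {
        // Copy the sensor's 6DoF pose onto this GameObject every frame.
        transform.SetPositionAndRotation(sensor.Position, sensor.Rotation);
    }
}
```

In a real project each class would live in its own file; the point is simply that a 6DoF pose reduces to one position and one rotation applied to a Transform per frame.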

The data conversion and processing from the AmfiTrack USB hub is completely seamless to the Unity real-time 3D environment, keeping the developer focused on application development instead of worrying about low-level hardware details. This is a big step forward towards full OEM AmfiTrack system integration in final customer applications.

Due to its flexibility, Unity can be considered one of the most powerful tools to develop 2D and 3D applications, and it has become one of the most used tools in the game development industry and in VR and AR applications.

