How to make a VR application with Nuitrack
The year was 2017. The Earth was inhabited by strange creatures; in many ways they were no different from people, but a lot of them did not know how to program at all.
And so, my friend, since you have visited this page, you probably want to develop a VR application with full-body tracking for a mobile platform. You may think that this is too difficult and takes a lot of time, but you are mistaken... Now we will tell you the basic things you need to know to become a real Master Shifu of mobile VR.
What do we need?
First, the hardware part, which consists of:

• VicoVR sensor or Orbbec Persee; https://vicovr.com/
• A sufficiently powerful smartphone running Android, with a screen larger than 5 inches for comfortable viewing (for example, a Samsung Galaxy S6);
• A suitable headset. For complete happiness, you can use the Gear VR kit.
Second, the software part:
• The Unity game engine; https://store.unity.com/
• A development environment (MonoDevelop "out of the box" is enough);
• For VicoVR, you need to install VicoVR Manager on your phone, and Nuitrack Manager for Orbbec Persee; https://play.google.com/store/apps/details?id=com.vicovr.manager
• The NUITRACK™ SDK - the sweetest part: https://www.vicovr.com/developers/downloads/unity-3d-sdk
How does it work?
in layman's terms, or practice makes perfect
Using a color camera and a depth sensor, the device generates data in the form of a plain image and a depth matrix, i.e. the distances to each point in the sensor's field of view. The latter can be pictured as a black-and-white image where lighter shades correspond to closer points and darker ones to the most distant.
Crib:

The sensor should be installed so that you are fully in its view, preferably even with your hands raised. Clear the area around you so you don't kick your cat.


Next, this data is transmitted over Bluetooth to the smartphone, where NUITRACK™ enters the game: with its super advanced and secret algorithm it determines the key points of your body and forms a skeleton as a set of 3-component vectors with the origin at the sensor's location. Remember that the sensor recognizes up to 19 points of the skeleton. As a bonus, we get a lot of additional data that can be used at your own discretion, namely:

• skeleton - nuitrack.SkeletonTracker;
• depth matrix - nuitrack.DepthSensor;
• color image - nuitrack.UserFrame;
• gestures - nuitrack.GestureRecognizer;
• palm squeeze - nuitrack.HandTracker.

There is nothing to worry about: all these operations are performed automatically at a rate of 30 frames per second, so just take it and use it.
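
To give a first taste of how this data is consumed in Unity, here is a minimal sketch (assuming the NuitrackScripts prefab described later is already in the scene; CurrentUserTracker is the helper class used throughout the rest of this article) that reads the current skeleton every frame and logs the head position:

using UnityEngine;

public class SkeletonPeek : MonoBehaviour
{
    void Update()
    {
        // 0 means that no user is currently recognized by the sensor
        if (CurrentUserTracker.CurrentUser == 0)
            return;

        nuitrack.Skeleton skeleton = CurrentUserTracker.CurrentSkeleton;

        // Joint coordinates come in millimeters relative to the sensor
        nuitrack.Joint head = skeleton.GetJoint(nuitrack.JointType.Head);
        Debug.Log("Head (mm): " + head.Real.X + " " + head.Real.Y + " " + head.Real.Z);
    }
}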


The stones are different...
Now for an important point about VR applications. Many people think that VR gives complete immersion and that the developer can build incredibly epic scenes to surprise the user; however, there are pitfalls, and many tricks familiar to PC boyars will have to be forgotten when it comes to VR, namely:

  • do not shake the camera during the game, for example, to create the effect of speed or an explosion - it will quickly cause nausea,
  • do not move the camera through the world with varying speed and direction - this also causes dizziness,
  • do not rotate the camera beyond matching the rotation of the user's head, unless you want to blow up his vestibular system (we are not training cosmonauts),
  • keep in mind that the controls will be quite unusual, as everything has to be done with the hands, feet and eyes - think the interface and gameplay through with this in mind,
  • place the active elements of the interface and the game world in the area reachable by the user,
  • don't make buttons small, as they will be hard to hit,
  • make the elements interactive so that hands or feet do not sink into the geometry of other objects,
  • remember that the sensor's viewing angle is limited, so don't force the user to leave its scope,
  • since the field of view is much smaller than the space around, take care of hints that draw the user's attention to key points,
  • for applications with body tracking, try to keep the user facing the sensor - this ensures that all the limbs are in sight.

Get to the point...
The first step is to integrate the NUITRACK SDK into the Unity project, which is quite easy. In the Unity editor, select: Assets -> Import Package -> Custom Package... -> nuitrack.unitypackage -> Import. We will not elaborate on the details of the included components, since the documentation is available here: https://www.vicovr.com/developers/docs/unity3d-manual
Note:

Nuitrack does not track the rotation of the head due to technical limitations, so you need to use the GoogleVR plugin for simple headsets or the Oculus VR plugin for the Gear VR platform.
Further, for convenient conversion of data coming from nuitrack, there is a NuitrackUtils helper class in which the following extension methods are defined:

• Converting any nuitrack.Vector3 vector to a Unity UnityEngine.Vector3 vector. Note that Nuitrack is a universal system: some of its types are semantically similar to Unity types, but in fact there is no implicit conversion between them:
public static Vector3 ToVector3(this nuitrack.Vector3 v)
{
return new Vector3(v.X, v.Y, v.Z);
}
Crib:

Do not forget that nuitrack.Vector3 stores data in millimeters, so multiply the nuitrack.Vector3 vector by, for example, 0.001f to bring it in line with 1 meter = 1 unit in Unity (multiply rather than divide, because multiplication is faster - optimize your code starting from the smallest things).
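
For example, a one-line usage sketch (pos here is a hypothetical nuitrack.Vector3 received from the sensor):

Vector3 unityPos = pos.ToVector3() * 0.001f;   // millimeters -> meters (1 unit in Unity)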


• Getting the position of any joint in Unity coordinates:
public static Vector3 ToVector3(this nuitrack.Joint joint)
{
return new Vector3(joint.Real.X, joint.Real.Y, joint.Real.Z);
}

• Obtaining the rotation of any joint in the form of a quaternion, since in Unity it is much more convenient to work with quaternions than with Euler angles. This will be the main method for animating our game character: by computing the rotations of the joints relative to each other and applying the result to the game skeleton, we do not depend on the scale of the world.
public static Quaternion ToQuaternion(this nuitrack.Joint joint)
{
Vector3 jointUp = new Vector3(joint.Orient.Matrix[1], joint.Orient.Matrix[4], joint.Orient.Matrix[7]);   //Y(Up)
 
Vector3 jointForward = new Vector3(joint.Orient.Matrix[2], -joint.Orient.Matrix[5], joint.Orient.Matrix[8]);   //Z(Forward)
 
return Quaternion.LookRotation(jointForward, jointUp);
}

Not an example to follow...
A few words about today's hero, the example this article dissects: the game THEFT (in the NUITRACK SDK archive you can find a project with a proper example of using the capabilities of nuitrack). THEFT is a first-person runner with thriller elements, where the player flees from the police riding a hoverboard along the narrow streets of the city. The following game actions are available to the player:

1. moving left and right by tilting the body;
2. accelerating by pushing off with either foot;
3. shooting various items from the wrist gadget to scatter them (this will delay the police);
4. rolling over objects of the game world.

Several scenarios of using the skeleton can be highlighted here: direct interaction with the world in points 1 and 4, and indirect interaction in points 2 and 3. By direct interaction we mean the case when, for example, the player's hand can touch a game object.
Interactivity or beauty?
Let us also note that we distinguish between direct and indirect streaming of movements. In the first case the user's movements are fully transferred to the game character; in the second case only a specific animation is played that resembles the desired movement. The latter option can be useful when you need to avoid incorrect behavior of the character or its going beyond certain boundaries.
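
To illustrate the difference, here is a minimal sketch (the bone, offset and animator fields as well as the Animator trigger name are hypothetical, not part of the SDK): the direct branch copies the joint rotation onto the bone every frame, while the indirect branch only fires a pre-made animation when some condition is detected.

// Direct streaming: the bone repeats the user's joint rotation every frame
bone.rotation = CalibrationInfo.SensorOrientation * offset * joint.ToQuaternion();

// Indirect streaming: the user's movement only triggers a canned animation
if (IsPushOffGesture(joint))             // hypothetical gesture check
    animator.SetTrigger("PushOff");      // hypothetical Animator trigger name
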
And where is Johnny?...
Let's start with the scene setup. Any scene that references one of the nuitrack components should contain the NuitrackScripts object (it can be found in the Nuitrack\Prefabs folder). When the scene runs, it is automatically marked as DontDestroyOnLoad, so you never have to worry about its presence when moving to the next scene in the game.
Pay attention to the settings of the NuitrackManager script, where the appropriate flags can be used to enable or disable Nuitrack components.
To implement the behavior of the skeleton, the NUITRACK SDK provides the RiggedAvatar class (you can also use its more advanced analogue, VicoAvatar). Since all the joints in this class are hardcoded in the script, we modify RiggedAvatar by introducing a subclass JointGameObject, which for convenience holds the transform of a joint of your game skeleton together with the type of that joint from nuitrack.JointType. This lets you configure the skeleton easily from the editor.
The trick:

For your custom class to be editable in the editor through a public field, mark it with the System.Serializable attribute.
[System.Serializable]
public class JointGameObject
{
public nuitrack.JointType typeObj;		// Joint type
public Transform obj;				// Transform of the character's joint
 
public Quaternion offset;   			// Correction for the initial rotation
}
One small disclaimer: nuitrack recognizes 19 joints, but in nuitrack.JointType you will find 25 entries. This is normal, because some points are calculated additionally from the base ones.

The correction for the initial rotation (offset) is needed to align your skeleton with the nuitrack skeleton, because with the joints in the same positions your character's skeleton and the nuitrack skeleton may have different rotations relative to the parent, which can cause incidents like the one in the first picture.
Crib:

A pose in which the total rotation angle of each joint relative to the parent is 0 is the T-pose. For a correct comparison of the skeletons, put the character into the T-pose.
To implement calibration there are the TPoseCalibration and CalibrationInfo scripts, which are also attached to the NuitrackScripts object. Calibration should be done at the start of the game: after it you know that the user is in a predetermined, known position. Note that you can choose the key pose for calibration: the T-pose, or the X-pose with the arms bent at the elbows by 90 degrees.
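
Below is a hypothetical sketch of reacting to the calibration result; the singleton access, event name and signature are assumptions here, so check the actual TPoseCalibration class in your SDK version for the exact API.

public class CalibrationListener : MonoBehaviour
{
    void OnEnable()
    {
        // Assumed event: fired when the user has held the key pose long enough
        TPoseCalibration.Instance.onSuccess += OnCalibrationSuccess;
    }

    void OnDisable()
    {
        TPoseCalibration.Instance.onSuccess -= OnCalibrationSuccess;
    }

    void OnCalibrationSuccess(Quaternion sensorOrientation)
    {
        Debug.Log("Calibration done, the game can start");
    }
}
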
Hey, look at me! Do as I do, do as I do...
Let's start with the simplest way to bring our character to life. Remember that we use direct streaming for the hands. We make some modifications to the RiggedAvatar class and then analyze it.
public class RiggedAvatar : MonoBehaviour
{
    public JointGameObject[] jointPers;

    void Start()
    {
        // Remember the initial rotation of each joint as the correction (offset)
        for (int q = 0; q < jointPers.Length; q++)
            jointPers[q].offset = jointPers[q].obj.rotation;
    }

    void Update()
    {
        if (CurrentUserTracker.CurrentUser != 0)
        {
            nuitrack.Skeleton skeleton = CurrentUserTracker.CurrentSkeleton;

            for (int q = 0; q < jointPers.Length; q++)
            {
                nuitrack.Joint joint = skeleton.GetJoint(jointPers[q].typeObj);
                jointPers[q].obj.rotation = CalibrationInfo.SensorOrientation *
                    jointPers[q].offset * joint.ToQuaternion();
            }
        }
    }
}
The Start method is called when the scene loads (more details on the order of calls can be found here: https://docs.unity3d.com/ru/530/Manual/ExecutionOrder.html). In it we fill in the corrections (offsets) for the initial rotations of the character's joints.

In Update, the first condition checks that there is a recognized user. Nuitrack can recognize up to 10 users at a time, so CurrentUserTracker.CurrentUser returns 0 if there is no one in the sensor's view to recognize; otherwise it returns the number of the user who entered the view first, according to the queue rule (that is, when the current user leaves the scope, the next one who entered becomes the current user, regardless of when he entered).

CurrentUserTracker.CurrentSkeleton returns the skeleton data of the current user.
Crib:

CalibrationInfo.SensorOrientation corrects for the vertical tilt of the sensor; this allows you to get the correct position of the skeleton no matter how high you place the sensor.
Real heroes always go around…
Indirect streaming of movements is the stuff of legend, and each developer comes up with his own approach... Implementing this kind of skeleton behavior is a task as specific as the reason that gave birth to it. In some tasks you may only need a certain position of a joint or its velocity, while in THEFT, to activate the push-off animation, the user is required to perform a specific gesture.

In this scenario you can take the simplest, though not the smartest, route by adding an intermediate layer in the form of a gesture analyzer. To do this, a separate invisible half of a skeleton was created, to which the movements of the user's legs are transferred by direct streaming.
A SphereCollider and a Rigidbody are attached to the knee joint (indicated by the red circle in the figure), along with several BoxColliders with the "isTrigger" flag, in order to handle the event of the required point entering a specified area (indicated by the green rectangles in the first figure).

From there it is even simpler: the objects with the isTrigger flag have the ObjectRegistration script attached, which tells the control script ActionManager that a joint has entered the given area and passes an identifier of itself. The control script records the chain of messages and compares it with a template to activate the animation. There are several such patterns; for example, when the user pushes off with a forward swing of the foot, a strong push occurs. We will not dwell on this implementation in detail, as it is highly individual.
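
A minimal sketch of such a trigger zone is given below; ObjectRegistration and ActionManager are the project's own scripts, so their interface here is only an assumption, while OnTriggerEnter is the standard Unity callback.

public class ObjectRegistration : MonoBehaviour
{
    public string zoneId;                  // identifier of this trigger zone
    public ActionManager actionManager;    // control script collecting the message chain

    // Called by Unity when the knee collider enters this trigger volume
    void OnTriggerEnter(Collider other)
    {
        actionManager.Register(zoneId);    // hypothetical method: append this zone to the chain
    }
}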

Cunning of hand and no fraud...
Nothing is perfect, including Nuitrack... The determination of joint positions is quite precise, but due to the bandwidth limitations of Bluetooth (a problem not characteristic of the Orbbec Persee sensor) the size of the depth matrix is restricted, so there are problems with movements of small amplitude. This causes a lot of inconvenience when implementing aiming at objects far away from the player. You can help the user with auto-aiming when an object enters the target area.
In practice, three versions of the aiming system were developed:

1. By gaze - the most accurate, but not intuitive.
2. By hand only, using the current skeleton - the least accurate, but intuitive.
3. Mixed, where an averaged point between the direction of the gaze and the position of the hand is used.
The latter approach relies on the fact that the user usually looks at the object he is targeting, so the corrections from the gaze are not noticeable and the aiming looks natural.

To improve accuracy, we use the real positions of the joints rather than the final skeleton of the game character. Thinking back to the RiggedAvatar script discussed earlier, it is easy to add a separate array of joints to be used for aiming. The shoulder, elbow and wrist joints of the right hand are needed for the calculations.

Additional field:
public JointGameObject[] jointPersMove;
Getting the position of the joint:
// type is a nuitrack.JointType value, e.g. nuitrack.JointType.RightWrist
Vector3 cJoint = skeleton.GetJoint(type).ToVector3() * 0.001f;
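
Putting it together, a rough sketch of the mixed aiming could look like this (this is an illustration, not the THEFT code; Camera.main is assumed to follow the user's head, and the joint positions are taken from the current nuitrack skeleton):

// Positions of the right arm joints in Unity units (meters)
Vector3 shoulder = skeleton.GetJoint(nuitrack.JointType.RightShoulder).ToVector3() * 0.001f;
Vector3 wrist = skeleton.GetJoint(nuitrack.JointType.RightWrist).ToVector3() * 0.001f;

// Direction the arm points in and direction the user is looking in
Vector3 handDir = (wrist - shoulder).normalized;
Vector3 gazeDir = Camera.main.transform.forward;

// Mixed aiming: average the two directions so that hand jitter is smoothed by the gaze
Vector3 aimDir = (handDir + gazeDir).normalized;

In a real project the sensor-space joint positions would first need to be transformed into the world space of the camera rig before being mixed with the gaze direction.
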
The trick:

Unify the controls so that the application can be tested from the editor. You can use a "stub" that mimics the behavior of the skeleton for the nuitrack methods, triggered, for example, by pressing any key.
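
A sketch of such a stub (all the names here are hypothetical; the idea is simply to feed fake input when running without a sensor):

void Update()
{
#if UNITY_EDITOR
    // No sensor in the editor: mimic a push-off with the space bar
    if (Input.GetKeyDown(KeyCode.Space))
        animator.SetTrigger("PushOff");   // hypothetical trigger, same as on the device
#else
    // On the device: read the real nuitrack skeleton as described above
#endif
}
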
This is a basic example of what can be done using nuitrack. You will agree that controlling a game with the body is a whole new experience in developing and thinking through the gameplay. Further content optimization and settings tweaking still lie ahead, but that's another story...