MiVRy Unreal Behavior Tree Documentation

MiVRy Gesture Recognition in Unreal Behavior Trees

This article guides you through the use of the MiVRy Gesture Recognition plug-in in Unreal Behavior Trees.

Unreal Engine provides an efficient and intuitive way to control the behavior of NPCs with a so-called “Behavior Tree”.
Gestures are a natural way for the Player to interact with NPCs, such as giving commands or communicating emotions.
MiVRy makes the integration of gestures into Unreal Behavior Trees easy and seamless by providing both a Composite Node and a Decorator that you can use in your Behavior Tree to change the behavior of your NPCs when the Player performs a gesture.

Using the `MiVRy Composite`:

`Composite` Nodes in a Behavior Tree are the switches which decide what the NPC is going to do next.
The `MiVRy Composite` (C++ class name `UBTComposite_MiVRy`) is a Composite Node that will select an action based on the last gesture that the player performed – if any.
For example, you can connect two NPC actions to a `MiVRy Composite`, the first one to “wait and idle”, and the second one to “walk towards the player”.
Then you can set the `MiVRy Composite` to choose the first action (“wait and idle”) until the player performs a “come here” gesture, upon which it should select the second action (“walk towards player”).
This is done by associating recorded gestures with the child nodes (or sub-trees) connected to the `MiVRy Composite`.

To use the `MiVRy Composite`, open your BehaviorTree in the Unreal Editor and create a new `MiVRy Composite` node.
Connect the node to your existing behavior tree, so that it will be considered for deciding the next action of the NPC.

MiVRy Composite Node

MiVRy Composite Details

MiVRy Actor:
In the Details of the `MiVRy Composite` node, you must set a `MiVRy Actor` which tracks the user’s motions and identifies the user’s gestures. So if you don’t have a `MiVRy Actor` in your Level yet, please create one and set its Details according to your needs (most importantly, set the “Gesture Database File” and the “Trigger Input” for the left and/or right hand according to your input scheme).

Similarity Threshold:
The `MiVRy Composite` node also allows setting a “Similarity Threshold” for gesture identification.
The Similarity Threshold defines how closely the user’s motion must match a recorded gesture before the identified gesture is considered “good enough” to accept as a command.
This can be necessary, because MiVRy will always identify the “most likely” gesture. So if the user is making a motion that is different from all of the gestures that you recorded, it will still tell you which of the recorded gestures is “most similar” to the user’s motion.
So you may want to set a threshold for accepting gesture commands, based on how well they actually match with the recorded gesture. A value of “1.0” would mean that the user’s motion is the perfect average of all the recorded gesture samples. A value of “0.0” would mean that the user’s motion is nothing like the recorded gesture samples.

Mapping: Gesture ID -> Child:
The decision which gesture leads to which NPC behavior is defined in the `Mapping: Gesture ID -> Child` setting.
“Child” here means the Behavior Tree nodes connected at the bottom of the `MiVRy Composite` node.
The left number is the Gesture ID, the right number is the index of the child node in the Behavior Tree that you want to have executed, starting at “0” for the left-most child node and counting upwards.
For example, a mapping of “1 -> 0” would mean that when gesture “1” is identified, the left-most child in the Behavior Tree is executed. A mapping of “23 -> 1” would mean that whenever gesture “23” is identified, the second child in the Behavior Tree is executed.
A special case is the Gesture ID “-1”, which stands for “no gesture was identified”. You can use this mapping to select the NPC’s behavior for the times when the player is not doing any gestures.

MiVRy Composite Gesture Mapping

NOTE: Every gesture ID can be mapped to only one child node. This can lead to problems when adding a new mapping with the (+) button because the new mapping would default to “Gesture ID = 0”. If you have already created a mapping for the Gesture ID “0”, then the Unreal Editor will reject creating new mappings. To avoid this, temporarily set the “Gesture ID” of “0” to something else, for example “999”, and return it to “0” when you’re finished adding new mappings.

MiVRy Composite Gesture Mapping Problem

Display Gesture Mapping in Node:
To simplify setting up the mappings, you can activate the “Display Gesture Mapping in Node” setting.
Then you can see all the mappings in the Behavior Tree with the respective gesture name.
However, this can slow down the display of the Behavior Tree, so you may want to disable it after you have finished your setup.

Repeat Identified Gesture Child:
With the “Repeat Identified Gesture Child” setting, you can also decide which behavior the NPC selects after it has completed the action associated with the gesture.
For example, if you map a “come-here” gesture to the NPC behavior “walk-to-player” and activate the “Repeat Identified Gesture Child”, then the NPC will continue to follow the player around until a different gesture is performed. If you deactivate “Repeat Identified Gesture Child”, then the NPC will only walk to the player once and then return to its default behavior.

Repeat the ‘No-Gesture’ Child:
There is a separate option to “Repeat the ‘No-Gesture’ Child”. The ‘No-Gesture Child’ here means the child node mapped to the Gesture ID “-1” as described above.

Blackboard Keys for latest Gesture:
If you want to control the NPC behavior not only by which gesture was performed but also by the way it was performed, you can do so via Blackboard variables.
For example, if your player performs a “look there” gesture, you may want the direction of the gesture motion to control in which direction the NPC looks.
In the “Blackboard Keys for latest Gesture” section you can choose to save details about the performed gesture into Blackboard variables, which can then be used by other Behavior Tree nodes to control the NPC.
For example, you can create a Blackboard variable of the type “Vector” and name it “Direction”, and then in the `MiVRy Composite` choose to save the “Primary gesture direction” (of either hand) into that variable.
Then, in other Behavior Tree nodes you can read that variable to control the NPC’s behavior.

MiVRy Composite Blackboard Variables
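
For example, a custom C++ Behavior Tree task that reads the “Direction” vector written by the `MiVRy Composite` and turns the NPC to face that direction could look roughly like the following. This is a minimal sketch and not part of the MiVRy plug-in; the task class name and the assumption that the key is a Vector named “Direction” are purely illustrative.

```cpp
// BTTask_FaceGestureDirection.h -- hypothetical custom task (not part of MiVRy).
// Turns the controlled pawn to face the gesture direction stored by the MiVRy Composite.
#pragma once
#include "CoreMinimal.h"
#include "AIController.h"
#include "GameFramework/Pawn.h"
#include "BehaviorTree/BTTaskNode.h"
#include "BehaviorTree/BlackboardComponent.h"
#include "BTTask_FaceGestureDirection.generated.h"

UCLASS()
class UBTTask_FaceGestureDirection : public UBTTaskNode
{
    GENERATED_BODY()
public:
    virtual EBTNodeResult::Type ExecuteTask(UBehaviorTreeComponent& OwnerComp, uint8* NodeMemory) override
    {
        const UBlackboardComponent* Blackboard = OwnerComp.GetBlackboardComponent();
        AAIController* AIController = OwnerComp.GetAIOwner();
        APawn* Pawn = AIController ? AIController->GetPawn() : nullptr;
        if (!Blackboard || !Pawn)
        {
            return EBTNodeResult::Failed;
        }

        // Read the vector that the MiVRy Composite saved into the "Direction" Blackboard key.
        const FVector GestureDirection = Blackboard->GetValueAsVector(TEXT("Direction"));
        if (GestureDirection.IsNearlyZero())
        {
            return EBTNodeResult::Failed;
        }

        // Face the gesture direction (yaw only).
        FRotator NewRotation = GestureDirection.Rotation();
        NewRotation.Pitch = 0.0f;
        NewRotation.Roll = 0.0f;
        Pawn->SetActorRotation(NewRotation);
        return EBTNodeResult::Succeeded;
    }
};
```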

When child finishes:
Finally, you can also choose what to do when the child node selected by a gesture finishes, based on whether the NPC successfully performed the action or not.
For example, when a “come-here” gesture causes the NPC to walk to the player, you can select “Continue Execution” so that the NPC will wait for the next gesture. But if the NPC fails to walk to the player and aborts its action – for example when it sees an enemy – you can select “Stop Execution”. The `MiVRy Composite` node will then return to the next higher-level node so that the NPC behavior can be controlled by other nodes – for example those nodes defining how to engage the enemy.

Using the `MiVRy Decorator`:

Alternatively, you can attach a `MiVRy Decorator` to any node in your Behavior Tree to add Gesture Recognition to it.
A `Decorator` is a kind of “plug-in” to a node that filters whether the node should be evaluated at all.
The `MiVRy Decorator` will either allow or deny the evaluation of the node based on the gesture performed by the user.
For example, you can have a Behavior Tree node that controls an NPC to “walk around randomly” but may want that behavior to stop as soon as the player performs a “come-here” gesture.
To use a `MiVRy Decorator`, right-click on your node in the Behavior Tree and select “Add Decorator… -> MiVRy Decorator”.

MiVRy Decorator Add To Node

MiVRy Decorator

MiVRy Decorator Details

MiVRy Actor:
In the Details of the `MiVRy Decorator`, you must set a `MiVRy Actor` which tracks the user’s motions and identifies the user’s gestures. So if you don’t have a `MiVRy Actor` in your Level yet, please create one and set its Details according to your needs (most importantly, set the “Gesture Database File” and the “Trigger Input” for the left and/or right hand according to your input scheme).

Similarity Threshold:
The `MiVRy Decorator` also allows setting a “Similarity Threshold” for gesture identification.
The Similarity Threshold defines how closely the user’s motion must match a recorded gesture before the identified gesture is considered “good enough” to accept as a command.
This can be necessary, because MiVRy will always identify the “most likely” gesture. So if the user is making a motion that is different from all of the gestures that you recorded, it will still tell you which of the recorded gestures is “most similar” to the user’s motion.
So you may want to set a threshold for accepting gesture commands, based on how well they actually match with the recorded gesture. A value of “1.0” would mean that the user’s motion is the perfect average of all the recorded gesture samples. A value of “0.0” would mean that the user’s motion is nothing like the recorded gesture samples.

Use of Gesture ID list:
You can use the `MiVRy Decorator` to deny or allow the execution of NPC behavior (Behavior Tree nodes) based on a list of Gesture IDs which you consider “acceptable” or “reason to cancel the behavior”.
This list can be either a “Whitelist” (i.e. the gestures in the list are the ones that permit the NPC to perform this behavior) or a “Blacklist” (i.e. the gestures in the list are the ones that do NOT permit the NPC to perform this behavior).

Gesture ID list:
To set which gestures the `MiVRy Decorator` considers the “right” or “wrong” gestures to affect NPC behavior, add them to the “Gesture ID list”.
The gesture ID “-1” stands for “no gesture was performed”.
NOTE: Every gesture ID can be in the list only once. This can lead to problems when adding a new item with the (+) button because the new item would default to “Gesture ID = 0”. If you have already created an item for the Gesture ID “0”, then the Unreal Editor will reject creating new items. To avoid this, temporarily set the “Gesture ID” of “0” to something else, for example “999”, and return it to “0” when you’re finished adding new items.

Display Gestures In Node:
To simplify setting up the gesture list, you can activate the “Display Gestures In Node” setting.
Then you can see all the gestures in the Behavior Tree with the respective gesture name.
However, this can slow down the display of the Behavior Tree, so you may want to disable it after you have finished your setup.

Evaluate every gesture only once:
This setting chooses what happens after the performed gesture has been considered.
When activated, the `MiVRy Decorator` will evaluate each performed gesture only once and then return to acting as if no gesture had been performed.
When deactivated, the `MiVRy Decorator` will continue to use the last performed gesture as a reference for whether to allow or deny the NPC behavior.

How to use the Gesture Manager

You can get the GestureManager app at https://www.marui-plugin.com/download/mivry/MiVRy_GestureManager.zip

This app allows you to record and edit Gesture Database Files easily in VR.
Important: The GestureManager is written in Unity, so when you use Gesture Database Files created with the GestureManager in Unreal, be sure to check the “Unity Compatibility” option in the MiVRyActor or GestureRecognitionActor.
When you run the GestureManager, a floating panel will appear. You can move the panel by touching the ‘sticky’ red ball on its top and dragging it. To stop dragging the panel, just pull your controller away with a sudden “yanking” motion.
A video tutorial on how to use the GestureManager in VR is available on YouTube:
https://www.youtube.com/watch?v=xyqeacqpES8

Important input fields in the GestureManager:

Number of Parts: How many motions – at most – comprise a gesture. A gesture consisting of one single hand motion has one part. A two-handed gesture has two parts, one for the left hand and one for the right hand. It is also possible to use gesture combinations where one hand has to perform multiple sequential motions (such as writing three letters – the individual letters are parts of a combination triplet). The number you put in this field sets the maximum. You can still have combinations with fewer parts (for example: one-handed gestures among two-handed gestures).

Rotational Frame of Reference: How directions like “up”, “down”, “left”, “right”, “forward” and “back” are defined. For example, if a player is looking at the ceiling and performs a gesture in front of his face, in the “world” frame-of-reference the gesture was performed “upward”, because it was performed above the player’s head. But in the “head” frame-of-reference, the gesture was performed “forward”. This can decide which gesture is identified. For example, if you have a “punch the ceiling” gesture and a “punch the ground” gesture, you must choose a “world” frame-of-reference, but if you have a “touch my forehead” gesture and a “touch my chin” gesture, a “head” frame-of-reference may be more appropriate. The frame of reference can be selected separately for yaw (left-right / north-south), pitch (up/down) and roll (tilting the head).

Record Gesture Samples: This selects for which gesture you want to record new samples, or whether you want to test the identification instead (please note that new samples do not have any effect until the “training” has been performed). When you record samples, please make sure that you record the gesture in many different ways. For example, if the player should be allowed to perform the gesture with a small motion and a large motion, be sure to record both small and large samples. It can also help to record gesture samples from several people to ensure that particular habits of one person don’t affect the recognition for other players.

Start Training / Stop Training: This starts or interrupts the training process where the AI tries to learn your gestures. The “Performance” value which is updated during the training indicates how many of your gestures the AI can already correctly identify. Even when the training is stopped prematurely, the result is still preserved, so you can stop it as soon as you are satisfied. Sometimes the AI ‘misunderstands’ your intentions and the future recognition of gestures is not satisfactory. In this case, just re-run the training process. If the result still is not good, please record more gesture samples with greater variation to make it clearer to the AI what you intend.

How to use the MiVRyActor gesture recognition object

(1) Add the MiVRyActor to your level. (If you can’t find it in the Content Browser, check in the View Options that “Plugin Content” is enabled.)

(2) In the Details panel, choose the file path of the Gesture Database file to load (can be an absolute path or a relative path inside the project). If the file was created with the Unity version of MiVRy (including the GestureManager), make sure to check the “Unity Gesture Database File” checkbox.

(3) In the Details Panel, set the fields of the MiVRyActor:
– “Gesture Database File“:
The path to the gesture recognition database file to load.
This can be an absolute path (such as “C:\MyGestures\MyGestureFile.dat”) or relative to the project’s root directory (such as “Content/MyGestureFile.dat”).
– “Left Motion Controller” / “Right Motion Controller“: (Optional)
A motion controller component that will be used as the position of the left or right hand respectively.
– “Left Hand Actor” / “Right Hand Actor”: (Optional)
An actor to use as the left or right hand position in the level.
If you set neither the “Motion Controller” nor the “Hand Actor” fields, MiVRy will try to use Unreal’s AR functions to get the position of your motion controllers, which may not work with all VR plug-ins.
– “LeftTriggerInput” / “RightTriggerInput“: (Optional)
The name of the input in the Input Manager (in Project settings) which will be used to start/end the gesture.
If you don’t set these, you will have to use the Blueprint or C++ functions of the MiVRy actor to trigger the start and end of a gesture.
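
If you do trigger gestures from C++, you first need a reference to the MiVRyActor placed in your Level. The following is a minimal sketch of one way to obtain such a reference; the class name `AMiVRyActor`, the header name, and the surrounding pawn class are assumptions, and the actual start/end-gesture functions exposed by the plug-in are not reproduced here.

```cpp
// Hypothetical snippet inside your own pawn class (AMyVRPawn is not part of MiVRy).
#include "Kismet/GameplayStatics.h"
#include "MiVRyActor.h" // assumed header name of the plug-in's MiVRy actor class

void AMyVRPawn::BeginPlay()
{
    Super::BeginPlay();

    // Find the MiVRy Actor that was placed in the Level.
    TArray<AActor*> FoundActors;
    UGameplayStatics::GetAllActorsOfClass(GetWorld(), AMiVRyActor::StaticClass(), FoundActors);
    if (FoundActors.Num() > 0)
    {
        // Store the reference (e.g. in a UPROPERTY of this pawn) and later call the
        // plug-in's start/end-gesture functions on it when your trigger button is
        // pressed and released.
        MiVRyActor = Cast<AMiVRyActor>(FoundActors[0]);
    }
}
```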

How to use the GestureRecognitionActor (for one-handed gestures):

(1) Add a GestureRecognitionActor to your level.

(2) Use the “Create New Gesture” function to create new gestures.

GestureRecognitionActor Create Gesture

(3) Record a number of samples for each gesture by using the “startStroke”, “contdStroke” and “endStroke” functions for your registered gestures, each time inputting the location and rotation of the headset or controller respectively. (A combined sketch of the recording and identification workflow appears at the end of this section.)

GestureRecognitionActor Start Stroke

GestureRecognitionActor Continue Stroke

GestureRecognitionActor End Stroke

Repeat this multiple times for each gesture you want to identify.
We recommend recording at least 20 samples for each gesture.

(4) Start the training process by using the “startTraining” function.
You can optionally register delegates / callback events to receive updates on the learning progress.

GestureRecognitionActor TrainingCallbacks

You can stop the training process with the “stopTraining” function.
After training, you can check the gesture identification performance with the “recognitionScore” function.

(5) Now you can identify new gestures performed by the user in the same way as you were recording samples. Just set the “Record as Sample” parameter of the “startStroke” function to “-1”. The “endStroke” function will provide the ID and name of the identified gesture, together with a similarity measure (0 to 1) of how closely the gesture performance resembled the recorded gesture samples.

(6) You can save and load your gestures to a gesture database file with the “Save Gestures To Gesture Database File” function.

IMPORTANT: If you wish to use your gestures in a Unity app (for example with the Unity-based “GestureManager”), then make sure you enable the “Unity Compatibility Mode” in the Details Panel of the GestureRecognitionActor before you record any gesture samples!
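
Putting steps (2) to (5) together, the overall flow from C++ could look roughly like the sketch below. The function names are the ones used in this documentation, but the parameter lists shown here are simplified assumptions – please check the plug-in’s headers or Blueprint nodes for the exact signatures.

```cpp
#include "GestureRecognitionActor.h" // assumed header name

// Rough sketch of the recording / training / identification workflow, assuming a
// pointer to the GestureRecognitionActor in the level. Signatures are simplified.
void GestureWorkflowSketch(AGestureRecognitionActor* GestureActor,
                           const FVector& HeadsetPosition, const FRotator& HeadsetRotation,
                           const FVector& HandPosition, const FRotator& HandRotation)
{
    // (2) Register a new gesture; "createGesture" is an assumed name for the
    // "Create New Gesture" function. It returns the new gesture's ID.
    const int GestureId = GestureActor->createGesture(TEXT("come here"));

    // (3) Record one sample for that gesture:
    GestureActor->startStroke(HeadsetPosition, HeadsetRotation, GestureId); // record as a sample of this gesture
    // ... call contdStroke(HandPosition, HandRotation) every frame while the gesture is performed ...
    GestureActor->endStroke();
    // Repeat until you have ~20 or more samples per gesture.

    // (4) Train on the recorded samples (progress and completion can be received via delegates):
    GestureActor->startTraining();

    // (5) Later, identify instead of record by passing -1 as the "Record as Sample" parameter:
    GestureActor->startStroke(HeadsetPosition, HeadsetRotation, -1);
    // ... contdStroke(HandPosition, HandRotation) every frame ...
    // endStroke() then reports the ID and name of the identified gesture plus a 0..1 similarity.
}
```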

How to use the GestureCombinationsActor (for two-handed gestures or gesture combos):

(1) Place GestureCombinationsActor in your level and set the desired “Number of Parts” in the Details Panel. For two-handed gestures, this is usually “2”, but if you intend to use combos of multiple sequential gesture motions for one or two hands, you can choose a different number.

(2) Create a new Gesture Combination.

GestureCombinationsActor Create Combination

(3) Create new Gestures for each part, starting with part number “0”. (You can use “0” to mean “left hand” and gesture part “1” to mean right hand, or any other way to identify the different parts of the Gesture Combination.)

GestureCombinationsActor CreateGestures

(4) Then set the Gesture Combination to be the combination of those gestures.

GestureCombinationsActor Set Parts

(5) Record a number of samples for each gesture using the startStroke, contdStroke and endStroke functions for your registered gestures. See Section 6 Point 3 of this document for details. The gestures for the various parts can be recorded in any order (first left hand then right, or first right hand then left) or simultaneously. We recommend recording at least 20 samples for each gesture, and having different people perform each gesture in different ways.

(6) Start the training process with the startTraining function.
You can optionally register delegates / callback events to receive updates on the learning progress and the end of the training.

GestureCombinationsActor TrainingCallbacks

You can stop the training process by using the stopTraining function. After training, you can check the gesture identification performance with the recognitionScore function (a value of 1 means 100% correct recognition).

(7) Now you can identify new gestures performed by the user in the same way as you were recording samples by using the “startStroke”, “contdStroke”, and “endStroke” functions, just by setting the “Record as Sample” parameter to “-1”. Again, the order of performances (first left then right, first right then left, or simultaneously) does not matter.
After all parts (for example left and right hand, or just one hand when only one hand was gesturing) have been completed, use the “Identify Gesture Combination” function to find out which Gesture Combination was performed by the user.

GestureCombinationsActor Identify Combination

(8) Now you can save and load the gestures (and the trained AI) by using the “Save to Gesture Database File” and “Load Gesture Database File” functions. The path can be either absolute or relative within your project.

GestureCombinationsActor Save File
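
For reference, the setup steps (2) to (4) above could be expressed in C++ roughly as follows. As with the previous sketch, the function names follow this documentation, but the exact signatures are assumptions – consult the plug-in for the real ones.

```cpp
#include "GestureCombinationsActor.h" // assumed header name

// Rough sketch of setting up a two-part ("left hand" / "right hand") Gesture Combination.
void SetupTwoHandedWaveSketch(AGestureCombinationsActor* CombinationsActor)
{
    const int LeftHand = 0;   // part "0" used for the left hand (a convention chosen here)
    const int RightHand = 1;  // part "1" used for the right hand

    // (2) Create the Gesture Combination:
    const int WaveCombo = CombinationsActor->createGestureCombination(TEXT("Wave (both hands)"));

    // (3) Create one gesture per part:
    const int WaveLeft  = CombinationsActor->createGesture(LeftHand,  TEXT("wave (left hand)"));
    const int WaveRight = CombinationsActor->createGesture(RightHand, TEXT("wave (right hand)"));

    // (4) Associate the per-part gestures with the combination ("setCombinationPartGesture"
    // is an assumed name for the "set the Gesture Combination" step described above):
    CombinationsActor->setCombinationPartGesture(WaveCombo, LeftHand,  WaveLeft);
    CombinationsActor->setCombinationPartGesture(WaveCombo, RightHand, WaveRight);

    // Steps (5) to (7): record samples per part with startStroke/contdStroke/endStroke
    // (each taking the part index), run the training, and use "Identify Gesture Combination"
    // at runtime.
}
```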

Build Instructions

(1) Add the folder of your GestureDatabase files to be included in the build. In the “Project Settings”, go to the “Packaging” section, find the “Additional Non-Asset Directories to Copy” settings and add the folder where your GestureDatabase files are located.
(Important: not “Additional Non-Asset Directories to Package” – those end up in the PAK file.)

(2) If you’re building for the Quest, please follow the official guide: https://developer.oculus.com/documentation/unreal/unreal-quick-start-guide-quest/
Also, in the Project Settings, “Android” section, enable “Support for arm64” and disable “Support for armv7”.

Troubleshooting and Frequently Asked Questions

(1) Where and when in my own program do I have to create the MiVRyActor, GestureRecognitionActor or GestureCombinationActor?

You can add any of the actors before run-time or spawn them during run-time. You can also spawn several actors.

(2) How can I use combinations of one-handed and two-handed gestures?

Use a GestureCombinationsActor with two parts (for left hand and right hand) and then create Gesture Combinations that only define a gesture for one part (i.e. one hand). The following table shows an example:

Gesture Combination | Gesture on part 0 (“left hand”) | Gesture on part 1 (“right hand”)
Wave (both hands)   | Wave                            | Wave
Wave (left hand)    | Wave                            | (none)
Wave (right hand)   | (none)                          | Wave
Salute              | (none)                          | Salute
Clap hands          | Clap                            | Clap

You can also use the GestureManager to create such GestureCombinations and then load the resulting file with a MiVRy actor.

(3) How do I identify gestures without having to trigger the “start” and “end” of the gesture?

To identify gestures continuously without a clear “beginning” and “end”, use the “contdIdentify” function. You still have to call “startStroke” once (for example at the start of the level), and you have to continuously provide the controller position with the “contdStroke” function. Then, after the “contdStroke”, use the “contdIdentify” function to identify the currently performed gesture. Use the contdIdentificationPeriod value to control how long a time frame to consider in the identification. You can also use contdIdentificationSmoothing to prevent the identification result from jumping from one gesture ID to another too easily.
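
As a rough illustration, that flow could look like the sketch below, assuming a pointer to a GestureRecognitionActor; the property and function names are those used in this answer, while the exact signatures are simplified assumptions.

```cpp
#include "GestureRecognitionActor.h" // assumed header name

// Called once, for example at the start of the level.
void StartContinuousIdentificationSketch(AGestureRecognitionActor* GestureActor,
                                         const FVector& HeadsetPos, const FRotator& HeadsetRot)
{
    GestureActor->contdIdentificationPeriod = 1000;  // consider roughly the last 1000 ms of motion
    GestureActor->contdIdentificationSmoothing = 5;  // smooth over the last 5 identification results
    GestureActor->startStroke(HeadsetPos, HeadsetRot, -1); // -1 = identify, don't record a sample
}

// Called on every frame (e.g. from Tick) while identification should run.
void ContinuousIdentificationTickSketch(AGestureRecognitionActor* GestureActor,
                                        const FVector& ControllerPos, const FRotator& ControllerRot)
{
    GestureActor->contdStroke(ControllerPos, ControllerRot);
    const int GestureId = GestureActor->contdIdentify(); // ID of the gesture currently being performed
    // ... react to GestureId ...
}
```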

(4) How can I open and edit gesture database (.DAT) files?

Please use the “GestureManager” ( https://www.marui-plugin.com/documentation-mivry-unreal/#gesturemanager ) to open and edit “.DAT” gesture database files. Please note that the GestureManager is based on Unity, so if you record gestures in the GestureManager and want to use them in Unreal, enable the “Unity Compatibility Mode” in your MiVRyActor, GestureRecognitionActor, or GestureCombinationsActor.

(5) The Gesture Recognition library does not detect if a gesture is different from all recorded gestures. How do I find out if the user makes a completely new (or “wrong”) gesture?

The gesture recognition plug-in will always return the number of the gesture which is most similar to the one you just performed.
If you want to check if the gesture you made is different from all the recorded gestures, check the “Similarity” value that you receive when identifying a gesture. This is a value between “0” and “1”, where “zero” means that the gestures are completely different, and “one” means that the performed gesture is a perfect average of all the recorded samples.

Thus, a value of one indicates perfect similarity, while a low value close to zero indicates great differences between the performed gesture and the recorded gesture samples. You can use this value to judge whether the performed gesture is sufficiently similar to the recorded ones.

(6) I want to use Gesture Recognition in my commercial project. What commercial licensing options do you provide?

Please contact us at support@marui-plugin.com for details.

(7) Do I have to use the “startTraining” function every time I start my game? Does it have to keep running in the background while my app is running?

No, you only need to call startTraining after you have recorded new gesture data (samples) and want these new recordings to be used by the AI. However, you need to save the AI after training to a database file (.DAT) and load this file in your game before using the other gesture recognition functions.
While the training is running, you cannot use any of the other functions, so you cannot let training run in the background. You must start (and stop) training in between using the AI.

(8) How long should I let the training run to achieve optimal recognition performance?

Usually, the AI will reach its peak performance within one minute of training, but if you’re using a large number of gestures and samples, it may take longer. You can check the current recognition performance from the training callback functions and see if the performance still keeps increasing. If not, feel free to stop the training.

(9) Gestures aren’t recognized correctly when I look up/down/left/right or tilt my head.

You can choose whether the frame of reference for your gestures is the player’s point of view (“head”) or the real world or game world (“world”). For example, if the player is looking up to the sky when performing a gesture towards the sky, then from a “world” frame-of-reference the direction is “up”, but from the player’s “head” point-of-view, the direction is “forward”. Therefore, if you consider your gestures to be relative to the world “up” (sky) and “down” (ground) rather than the visual “upper end of the screen” and “lower end of the screen”, then change the frameOfReferenceUpDownPitch to FrameOfReference.World. The same setting is available for the yaw (compass direction) and head tilt.
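
A tiny sketch of that last step, on an assumed GestureRecognitionActor pointer; the property name is the one used above, while the exact enum spelling in the Unreal plug-in may differ.

```cpp
// Make "up"/"down" relative to the game world instead of the player's head.
void UseWorldFrameForPitchSketch(AGestureRecognitionActor* GestureActor)
{
    // Property and enum spelling as used in this documentation; check the plug-in headers.
    GestureActor->frameOfReferenceUpDownPitch = FrameOfReference::World;
    // Corresponding settings exist for yaw (compass direction) and head tilt (roll).
}
```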

(10) MiVRy identifies any motion as some gesture, even when it doesn’t resemble any of the recorded gestures. Why? How can I tell if no valid gesture motion was performed?

MiVRy will always tell you the “most likely” best guess as to which gesture was just performed, no matter how different the currently performed motion is from all recorded gestures. This is because we cannot decide for you how much difference is tolerable.
In order to disqualify “wrong” motions, you have two options:
(A) you can check the “similarity” value returned by MiVRy. This value describes how similar the gesture motion was compared to previous recordings on a scale from 0 (very different) to 1 (very similar).
(B) you can check the “probability” value. Especially when you compare the probability values for all recorded gestures (for example via the “endStrokeAndGetAllProbabilitiesAndSimilarities” function) and see that they are all very low and not very different from one another, you may want to decide that the current gesture performance was not valid.
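
As an illustration of combining both options, a small helper like the following could gate gesture commands. The helper itself is hypothetical (not part of MiVRy), the threshold values (0.5 and 0.1) are arbitrary examples you would tune for your own gestures, and the similarity/probability inputs are the values returned by identification functions such as “endStrokeAndGetAllProbabilitiesAndSimilarities”.

```cpp
#include "Containers/Array.h" // TArray

// Hypothetical helper: decide whether to accept an identification result,
// based on its similarity value and the per-gesture probability values.
bool IsGestureAcceptedSketch(float Similarity, const TArray<float>& Probabilities)
{
    // (A) Require a minimum similarity to the recorded samples (0.5 is an example threshold).
    if (Similarity < 0.5f)
    {
        return false;
    }

    // (B) Require the best probability to clearly stand out from the second best.
    float Best = 0.0f;
    float SecondBest = 0.0f;
    for (const float P : Probabilities)
    {
        if (P > Best)
        {
            SecondBest = Best;
            Best = P;
        }
        else if (P > SecondBest)
        {
            SecondBest = P;
        }
    }
    return (Best - SecondBest) > 0.1f; // example margin
}
```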

(11) What exactly does the “similarity” value of a gesture performance mean? How is it different from the probability value?

The “similarity” value expresses how much the identified gesture differs from the average of the recorded samples for that gesture. When you record several samples, MiVRy internally calculates a “mean” (“average”, “typical”) gesture motion based on those samples. It also calculates how much the recorded samples differ from this “mean” (i.e. the “variance” of the samples). The “similarity” value is then calculated based on this “mean”: if your newly performed gesture motion hits exactly this “average”, the similarity value will be one; the more it differs, the lower the “similarity” value will be, going towards zero.

How fast it falls depends on how similar the recorded samples were to each other. If all recorded samples looked exactly the same, then MiVRy will be very strict, and the “similarity” value will fall fast when the currently performed motion isn’t also exactly alike. If, however, the samples differed a lot, MiVRy will be more tolerant when calculating the “similarity” value and it will be higher. The value is always between 0 and 1.

This “similarity” is different from the “probability” values, which are estimates by the artificial intelligence (neural network). “Probability” may take many more considerations into account, for example whether there are other gestures that resemble the identified gesture (probability may drop, similarity is unaffected), or whether a multitude of distinct motions is lumped together as one “gesture” (for example: having a gesture “alphabet” which contains drawings of “A”, “B”, “C” etc. all lumped together as one gesture – then “similarity” will be calculated based on an “average” character that doesn’t resemble any sample, but the AI may still successfully understand what you mean and give high “probability” values).

(12) Instead of triggering the start and end of a gesture motion, I want MiVRy to constantly run in the background and detect gestures as they occur.

You can use the “Continuous Gesture Identification” feature of MiVRy. When using the “GestureRecognition” or “GestureCombinations” objects directly, use the “contdIdentify” function – you can call this function repeatedly (for example on every frame or when something in your app happens) and every time it will tell you which gesture is currently being performed. When using the Unity “Mivry” component or the Unreal Engine “MiVRyActor”, use the “Continuous Gesture Identification” switch.

Either way, two settings are important for Continuous Gesture Identification: “Continuous Gesture Period” and “Continuous Gesture Smoothing”. “Continuous Gesture Period” is the time frame (in milliseconds) that continuous gestures are expected to take. So if your gestures take one second to perform, set this to “1000” so that MiVRy will consider the last 1000 milliseconds when identifying the gesture. “Continuous Gesture Smoothing” is the number of samples (previous calls to “contdIdentify”) to use for smoothing continuous gesture identification results. When setting this to zero, each attempt to identify the gesture will stand alone, which may lead to sudden changes when switching from one gesture to another. If “Continuous Gesture Smoothing” is higher than zero, MiVRy will remember previous attempts to identify the gesture and will produce more stable output.

(13) What is the “Update Head Position Policy” / “Compensate Head Motion” setting?

This setting decides whether the AI should consider changes in head position during the gesturing.
During gesturing, the current position of the VR headset can/will be updated via the “updateHeadPosition” procedure.
This data is saved together with the motion data.
However, in many cases it is not advisable to take head motions during gesturing into account, because people may watch their hands while gesturing.
Following the moving hands with the head would then eliminate the hand motion relative to the headset (the hands would always be “in front of the headset”).
However, in some cases it may be useful to use the changing head position, for example if the user might be walking during a gesture.
You can choose whether or not the data provided via calls to the “updateHeadPosition” functions will be used with the UpdateHeadPositionPolicy setting (or a call to GestureRecognition.setUpdateHeadPositionPolicy()).
“UseLatest” will cause MiVRy to use the changing head position, thus compensating the relative head motion during gesturing.
“UseInitial” will not consider changes in head motion during gesturing, but only the head position at the start of the gesture.
Note that if you use a GestureRecognition or GestureCombinations object directly, you also need to provide the changing head position via “updateHeadPosition()” for this to have any effect.
Also note that the data provided via “updateHeadPosition” is stored regardless of the policy, even when it is not used later.
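
As a closing sketch, using a GestureRecognition object directly might look roughly like this; the function and enum names follow this section, while the object type and parameter lists are assumptions.

```cpp
// Rough sketch: compensate head motion during gesturing by using the latest head position.
// "IGestureRecognition" is an assumed name for the plug-in's GestureRecognition object type.
void ConfigureHeadMotionCompensationSketch(IGestureRecognition* Recognition,
                                           const FVector& HeadsetPos, const FRotator& HeadsetRot)
{
    // "UseLatest" compensates head motion during gesturing; "UseInitial" would keep
    // only the head position from the start of the gesture.
    Recognition->setUpdateHeadPositionPolicy(UpdateHeadPositionPolicy::UseLatest);

    // ... then, on every frame while a gesture is being performed:
    Recognition->updateHeadPosition(HeadsetPos, HeadsetRot);
}
```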