Gesture Recognition
MARUI supports 3D gestures: you can teach MARUI any motion that you perform with your controller and set it to trigger any Maya command or script.
For example, you could teach MARUI that a “waving” motion should create a new object, or that shaking your controller from side to side should duplicate the selected object.
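As an illustration, a command like the following could be entered in the Gesture Editor’s “Command” field (with “Python” selected) to duplicate the current selection when the gesture is recognized. This is a sketch only; it assumes Maya’s standard Python environment, where the `maya.cmds` module is available:

```python
# Hypothetical gesture command: duplicate whatever is currently selected.
# Runs only inside Maya, where the maya.cmds module is available.
import maya.cmds as cmds

cmds.duplicate()
```

Any one-liner or script that works in Maya’s Script Editor should equally work as a gesture command.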
The gestures are performed with the “Gesture” widget.
By default, this widget is available on the right controller when you hold down the “Alt” button,
but you can use the MARUI UI layout editor to map it to any button.
Gestures are managed in the MARUI Gesture Editor:
At the top of the Gesture Editor, you can see a list of the currently registered gestures.
Select a gesture from the list to see details about it in the bottom part of the Gesture Editor:
Name: This is an arbitrary name that you can give your gestures.
Command: This is the Maya command (or script) that will be triggered when you perform the gesture.
MEL/Python: Whether the command or script is a MEL or Python command.
How to create your own gestures:
Write the name of the gesture you want to create and the command it should trigger into the respective text fields and click the “Create New” button.
You can later change the name or the command by selecting the gesture in the list, changing the values in the text fields, and clicking “Update Selected”.
You can also delete gestures by selecting them in the list and clicking “Delete Selected”.
The field below shows how many samples (reference examples) MARUI has recorded for this gesture.
If you created a new gesture, this value will initially be 0.
In order for MARUI to recognize your new gesture, you must first record a number of samples.
To do this, click the “Record samples for this gesture” button. It will turn red to indicate that you are recording samples for this gesture.
Note: if the button is yellow, you are currently recording samples for another gesture, not the one that is currently selected in the list.
Now you can record samples for this gesture by pressing the “Gesture” widget button on your controller and performing the gesture in VR.
We recommend recording at least 10-20 samples per gesture, depending on how many different gestures you want to use. The more samples you record, the more reliable the gesture recognition will be.
After you have finished recording samples, press the “Record samples for this gesture” button again to stop recording (the button will turn grey again).
Once you have a number of samples for every gesture, you must tell MARUI to learn your gestures.
Press the “Learn gestures from recorded samples” button.
MARUI will take about one minute to learn your gestures.
The recognition reliability will be displayed above the button.
Once the learning process is completed, the button will turn grey again and MARUI will be able to recognize your newly created gesture.
Use the “Save” and “Load” buttons to save your gestures to hard disk before you exit Maya.
Guidelines for Recording Gestures
The reliability of the gesture recognition depends greatly on the way sample gestures are recorded. Here are guidelines for best results when recording gestures.
- Record at least 20 samples for every gesture (better to record 30~40)
- Record a similar number of samples for all gestures in one set.
Don’t have one gesture with only 20 samples and another gesture with over 100 samples. If one gesture has fewer samples than the others, record a few additional samples for that gesture.
- Record the gesture in as many circumstances as possible. Record one sample of performing the gesture under some or all of the following conditions:
– looking straight ahead
– looking left / right
– looking up / down
– performing the gesture very fast or very slow
– performing the gesture without looking
– performing the gesture with the other hand (if intended to do so later)
– have the gesture performed by different people (if intended to do so later)
– other conditions under which you might want to use the gesture
- The gesture learning/training process can sometimes get stuck on a bad result. If you’re not satisfied, you can try running the learning/training process again with the same samples.
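MARUI’s actual learning algorithm is not documented here, but the reason sample count and variety matter can be illustrated with a simple nearest-neighbor matcher over resampled, normalized controller trajectories. This is a hypothetical sketch, not MARUI’s implementation; all names and the toy gesture data are invented for illustration:

```python
import math

def resample(points, n=16):
    """Resample a 3D trajectory to n evenly spaced points (linear interpolation)."""
    dists = [0.0]
    for a, b in zip(points, points[1:]):
        dists.append(dists[-1] + math.dist(a, b))
    total = dists[-1] or 1.0
    out = []
    for i in range(n):
        target = total * i / (n - 1)
        j = 1
        while j < len(dists) - 1 and dists[j] < target:
            j += 1
        seg = (dists[j] - dists[j - 1]) or 1.0
        t = (target - dists[j - 1]) / seg
        a, b = points[j - 1], points[j]
        out.append(tuple(a[k] + t * (b[k] - a[k]) for k in range(3)))
    return out

def normalize(points):
    """Translate to the centroid and scale to unit size, so the position and
    overall size of the motion do not affect matching."""
    n = len(points)
    cx, cy, cz = (sum(p[k] for p in points) / n for k in range(3))
    centered = [(p[0] - cx, p[1] - cy, p[2] - cz) for p in points]
    scale = max(math.dist((0, 0, 0), p) for p in centered) or 1.0
    return [(x / scale, y / scale, z / scale) for x, y, z in centered]

def classify(motion, samples):
    """Return the gesture name whose recorded sample is closest to `motion`.
    `samples` maps gesture name -> list of recorded trajectories."""
    probe = normalize(resample(motion))
    best, best_d = None, float("inf")
    for name, trajs in samples.items():
        for traj in trajs:
            tmpl = normalize(resample(traj))
            d = sum(math.dist(a, b) for a, b in zip(probe, tmpl)) / len(probe)
            if d < best_d:
                best, best_d = name, d
    return best

# Two toy "recorded" gestures: a horizontal shake and a vertical wave.
samples = {
    "shake": [[(0, 0, 0), (1, 0, 0), (0, 0, 0), (1, 0, 0)]],
    "wave":  [[(0, 0, 0), (0, 1, 0), (0, 0, 0), (0, 1, 0)]],
}

# A slightly noisy horizontal motion is still recognized as "shake".
print(classify([(0, 0.05, 0), (0.9, 0, 0), (0.1, 0.05, 0), (1.1, 0, 0)], samples))
```

In a scheme like this, each new sample becomes another template the live motion can match against, which is why recording more samples, under more varied conditions (head direction, speed, hand), makes recognition more forgiving of natural variation.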