What if you could train any 3D gesture for your game or software in 30 seconds?
Introducing our 3D Gesture Recognition Library, a patented machine learning library designed to help game, app, and experience developers quickly and reliably program gesture input for Unity, Unreal, and (basically) any software in their pipeline. MiVRy works for VR, AR, Android, and any device with 2D or 3D input.
MiVRy for Oculus Quest Hand Tracking (APK, loadable through SideQuest)
How our patented machine learning library shaves days off your development time
The biggest bottleneck in production is often user input. A perfect example is the modern QWERTY keyboard layout – a relic designed to slow down typing speed because of the constraints of its era (typewriters jamming). One of the biggest promises of VR is the ability not only to perceive in 360 degrees, but also to use both controllers as full 3D input devices, allowing far greater fidelity. On mobile, limited touchscreen real estate means you may not be able to fit all the controls you want onto the screen – even though smartphone screens are bigger than ever! With MiVRy you can add simple or complex gestures to your game or app for VR, mobile (iOS or Android), or anything else you can think of.
Programming 3D gestures manually is extremely tedious. That’s where MiVRy’s AI steps in: instead of hand-coding each gesture, you hand the work over to an advanced neural network that can learn any gesture with 98% reliability after just 30 repetitions – about 30 seconds of recording.
This frees up precious development time so you can spend it on the things that matter instead of endlessly tweaking gestures. Any gesture – trained and implemented – in the time it takes your coffee to brew.
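To give a rough sense of what the training flow can look like in Unity, here is a minimal C# sketch. The GestureRecognition class and every method name in it are illustrative assumptions, not the documented MiVRy API – check the plug-in’s own samples for the exact calls.

```csharp
using UnityEngine;

// Illustrative sketch only: the GestureRecognition class and the method
// names used below (createGesture, startStroke, contdStroke, endStroke,
// numberOfGestureSamples, startTraining) are assumptions and may differ
// from MiVRy's actual plug-in API.
public class GestureTrainingSketch : MonoBehaviour
{
    private GestureRecognition gr = new GestureRecognition();
    private int gestureId = -1;

    void Start()
    {
        // Register a new gesture by name; subsequent strokes can be
        // recorded as training samples for this ID.
        gestureId = gr.createGesture("draw bow");
    }

    // Call when the user presses the trigger to begin one training sample.
    public void BeginSample(Vector3 controllerPos, Quaternion controllerRot)
    {
        gr.startStroke(controllerPos, controllerRot, gestureId);
    }

    // Call every frame while the trigger is held.
    public void UpdateSample(Vector3 controllerPos, Quaternion controllerRot)
    {
        gr.contdStroke(controllerPos, controllerRot);
    }

    // Call when the trigger is released. After roughly 30 samples,
    // kick off training of the neural network.
    public void EndSample()
    {
        gr.endStroke();
        if (gr.numberOfGestureSamples(gestureId) >= 30)
        {
            gr.startTraining();
        }
    }
}
```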
- Want to draw a bow and shoot arrows in your VR game? You can do that.
- Hoping to easily implement movement gestures into your Android app? Done in seconds.
- Want to let players program their own spell gestures that cast specific effects in your spellcaster game? Easy.
- How about a series of exercises for an iOS fitness app? Piece of cake.
Our 3D Gesture Recognition AI turns what would have taken dozens or perhaps hundreds of hours of manual programming time into something you can do in minutes.
Gestures can be either direction-specific (“swipe left” vs. “swipe right”) or direction-independent (“draw an arrow facing in any direction”) – either way, you receive the direction, position, and scale at which the user performed the gesture!
Draw a large 3D cube and there it will appear, at the appropriate scale and orientation.
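As a rough illustration of how the returned pose might be consumed in Unity, the sketch below spawns a cube where the gesture was drawn. The callback name and the way position, orientation, and scale are delivered are assumptions for illustration, not the library’s actual interface.

```csharp
using UnityEngine;

// Illustrative only: assumes some recognition callback hands us the pose
// at which the user performed the "cube" gesture.
public class CubeSpawnerSketch : MonoBehaviour
{
    public GameObject cubePrefab;

    // Hypothetical callback: invoked when a "cube" gesture is recognized,
    // together with the position, orientation, and scale of the drawing.
    public void OnCubeGestureRecognized(Vector3 position, Quaternion orientation, float scale)
    {
        // Place the cube exactly where the gesture was drawn,
        // matching its size and facing direction.
        GameObject cube = Instantiate(cubePrefab, position, orientation);
        cube.transform.localScale = Vector3.one * scale;
    }
}
```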
One-handed gestures, two-handed gestures, and multi-part sequential gestures are all supported. MiVRy works with any sort of 3D input – from VR controllers to a phone’s internal gyroscope and accelerometer.
Develop for mobile with MiVRy (Android)!
Check out the video to the right, demonstrating a quick test app we made to show MiVRy’s functionality on Android. Using MiVRy you can quickly and easily add 3D movement-based gestures to your game or app. Gesture recognition has never been easier for Android and iOS!
Additionally, MiVRy now supports continuous gesture recognition: it can “listen” for gestures while the user is moving and trigger a function only when a trained gesture is actually performed.
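To picture how continuous (“always listening”) recognition could be wired up in Unity, consider the sketch below. The recognizer type, the contdIdentify call, and the similarity threshold are all assumptions for illustration rather than MiVRy’s documented interface.

```csharp
using UnityEngine;

// Illustrative sketch of continuous recognition: feed the current pose
// every frame and react only when the recent motion matches a trained
// gesture. The GestureRecognition type and contdIdentify call below are
// assumed, not the verified MiVRy API.
public class ContinuousListenerSketch : MonoBehaviour
{
    public Transform controller;           // tracked controller or hand
    private GestureRecognition gr;         // assumed recognizer, pre-loaded with trained gestures
    private const double Threshold = 0.8;  // assumed similarity cutoff

    void Update()
    {
        double similarity;
        int gestureId = gr.contdIdentify(controller.position, controller.rotation, out similarity);

        // A non-negative ID means one of the trained gestures was matched.
        if (gestureId >= 0 && similarity >= Threshold)
        {
            Debug.Log($"Detected gesture #{gestureId} (similarity {similarity:F2})");
            // ...trigger the corresponding in-game action here.
        }
    }
}
```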
- Learns your gestures in a few seconds with high reliability
- Works with any VR or AR device
- Works on Smartphones (Android) with the internal motion sensor
- Real 3D gestures - all directions are possible
- One-handed, two-handed, and multi-part gestures supported
- Can be used with any 3D-tracked object: controllers, fingers, trackers
- Record your own gestures - simple and intuitive, without coding
- Easy to use - single C/C++/C#/Java class or no-code with Blueprint or Bolt
- Can have multiple sets of gestures simultaneously
- Continuous gesture recognition - "listen" for gestures while the user is moving
- High recognition fidelity
- Outputs the position, scale, and orientation at which the gesture was performed
- High performance (back-end written in optimized C/C++)
- No internet connection required - runs 100% on your device
- Unity plug-in - including a no-coding-required component and Bolt support
- Unreal-Engine plug-in - both C++ and Blueprint are supported
- Android Studio project plug-in (AAR)
Licensing
MiVRy v2 is free to use, including commercial use.
However, the free license is limited to 100 gesture recognitions (or 100 seconds of continuous gesture recognition) per session.
To unlock unlimited gesture recognition, please purchase a license.
The license is on a per-project basis.
License owners also receive premium support.