An iPhone app enabling hand, head, eye and body motion tracking.
Simo is the first approach that transforms a single off-the-shelf smartphone into a user motion tracking device and controller. Both the front and back cameras of the smartphone are used simultaneously to track the user’s hand, head, eye-gaze and body movements in real-world space and scale.
Simo is an ARKit iOS application made in Unity.
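Under the hood, the simultaneous front/back camera tracking corresponds to ARKit’s combined face and world tracking. The following is a minimal native Swift sketch of the session setup, assuming a standard ARSession; the shipped app drives the same ARKit capability from Unity.

```swift
import ARKit

// Minimal sketch: run a world-tracking session (back camera) with
// simultaneous user-face tracking (front camera).
func startSimultaneousTracking(on session: ARSession) {
    // Combined face + world tracking needs an A12+ device and iOS 13+.
    guard ARWorldTrackingConfiguration.supportsUserFaceTracking else {
        print("Simultaneous face and world tracking is not supported on this device")
        return
    }
    let config = ARWorldTrackingConfiguration()
    config.userFaceTrackingEnabled = true  // front camera tracks the user's face
    session.run(config)                    // back camera tracks the world
}
```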
Device/hand motion + touch inputs: Users can interact by performing 3D hand movements in 6DOF (translation + rotation) and can reliably segment and further enhance their gestures with touchscreen inputs (see the sketch after this list).
Head pose tracking: 6DOF head tracking. Example: This can be used for head-pointing.
Eye-gaze tracking: 6DOF eye-gaze tracking. Example: This can be used for eye-pointing.
Body pose tracking: 6DOF tracking of the torso (position + orientation). Example: This can be used for body-position or ego-centric interactions.
No specialized hardware required: No external trackers, markers or cameras are needed; everything runs on a single iPhone.
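Since the user holds the phone, its 6DOF pose can stand in for the hand pose, and the touch state can segment gestures. Below is a hedged Swift sketch of that idea; `HandGestureTracker` and `setTouching(_:)` are hypothetical names introduced here for illustration, not part of the Simo source.

```swift
import ARKit

// Sketch: treat the device's world-space pose as the hand pose and use
// touch state to segment 6DOF gestures. Names here are illustrative.
final class HandGestureTracker: NSObject, ARSessionDelegate {
    private(set) var isTouching = false           // drive from touchesBegan/touchesEnded
    private var gesturePath: [simd_float4x4] = [] // poses recorded while a finger is down

    // Called by the hosting view's touch callbacks.
    func setTouching(_ touching: Bool) {
        if touching && !isTouching { gesturePath.removeAll() } // start a new segment
        isTouching = touching
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let handPose = frame.camera.transform // back-camera pose == held-device pose
        if isTouching {
            gesturePath.append(handPose)      // touch input segments the gesture
        }
    }
}
```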
[Figure: Tracking areas of the front and back iPhone cameras.]

The Simo app tracks all of the following user motions simultaneously, in real time and at world scale:
Device/hand tracking.
Head pose tracking.
Eye-gaze tracking.
Body tracking.
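As an illustration of how the head, eye-gaze and body poses above can be read in native ARKit, the sketch below extracts them from the ARFaceAnchor that a combined face+world session delivers in world coordinates. The torso derivation is an assumption for illustration only (a fixed offset below the head), not Simo’s published method.

```swift
import ARKit
import simd

// Sketch: derive head, eye-gaze and an approximate torso pose from the
// ARFaceAnchor of a combined face + world tracking session.
func processFaceAnchor(_ face: ARFaceAnchor) {
    // Head pose: the face anchor's 6DOF transform in world space.
    let headPose = face.transform

    // Eye gaze: lookAtPoint is expressed relative to the face anchor;
    // lift it into world space to get a gaze target.
    let gazeTarget = headPose * simd_make_float4(face.lookAtPoint, 1)

    // Per-eye 6DOF transforms, also relative to the face anchor.
    let leftEyeWorld  = headPose * face.leftEyeTransform
    let rightEyeWorld = headPose * face.rightEyeTransform

    // Torso pose (illustrative assumption, not Simo's method): reuse the
    // head orientation and shift the position down the world up-axis.
    var torsoPose = headPose
    torsoPose.columns.3.y -= 0.45 // ~45 cm below the head

    _ = (gazeTarget, leftEyeWorld, rightEyeWorld, torsoPose)
}
```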
System requirements:
The project was last tested with Unity 2021.3+, Xcode 14.3, and iOS 16.4 on an iPhone 13 Pro.
This work is based on the publication “Simo: Interactions with Distant Displays by Smartphones with Simultaneous Face and World Tracking” in CHI EA ’20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. The publication also covers related work, user studies, applications, and future work directions.
Copyright (C) 2023 Teo Babic