Scientists can now steer a robot with hand gestures while wearing the Apple Vision Pro. A new app built by researchers lets users command robots using only head and hand movements via the Apple Vision Pro mixed reality headset. The approach could enable remote machine control in a range of scenarios, from playful demonstrations to navigating hazardous environments.
Dubbed “Tracking Streamer,” the app monitors human movements, particularly those of the head, wrists, and fingers. It then transmits this data wirelessly over Wi-Fi to a robot on the same network, which translates the signals into corresponding actions. The work was detailed in a concise paper released on March 9 on GitHub, an open-source code repository. The system tracks 26 points on each of the hands and wrists, along with separate data points on the head, including spatial information such as altitude.
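The robot-side half of such a pipeline can be pictured as a small network service that receives pose packets and hands them to a controller. The sketch below is purely illustrative: the packet schema, field names, and use of JSON over UDP are assumptions for clarity, not the app's actual protocol.

```python
import json
import socket

def parse_tracking_packet(raw: bytes) -> dict:
    """Decode one hypothetical JSON tracking packet into head pose and hand keypoints."""
    msg = json.loads(raw.decode("utf-8"))
    return {
        "head": msg["head"],            # e.g. [x, y, z] position of the headset
        "left_hand": msg["left_hand"],  # assumed: 26 keypoints per hand (wrist + fingers)
        "right_hand": msg["right_hand"],
    }

def serve(port: int = 9000) -> None:
    """Listen for tracking packets from a headset on the same Wi-Fi network."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        raw, addr = sock.recvfrom(65535)
        pose = parse_tracking_packet(raw)
        # A robot controller would map the pose to joint or gripper commands here.
        print(f"head at {pose['head']} from {addr}")

if __name__ == "__main__":
    # Simulate one incoming packet instead of opening a socket.
    sample = json.dumps({
        "head": [0.0, 1.6, 0.0],
        "left_hand": [[0.0, 1.2, 0.3]] * 26,
        "right_hand": [[0.1, 1.2, 0.3]] * 26,
    }).encode()
    pose = parse_tracking_packet(sample)
    print(len(pose["left_hand"]))  # 26 keypoints, matching the count described above
```

Separating parsing from the network loop keeps the pose-handling logic testable without a headset on the network.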
Younghyo Park, a doctoral candidate at MIT and the app’s developer, shared a demonstration video on X (formerly Twitter), showcasing the app’s functionality. In the video snippet, Gabe Margolis, a fellow MIT graduate student and co-author of the study, commands a quadrupedal robot using hand and body gestures. Margolis effortlessly directs the robot to approach a closed door, manipulate its handle with the gripper, and enter the room. Additionally, he gestures for the robot to retrieve a plastic lid and discard it into a bin, even prompting the robot to mimic his movements at one point.
Apple introduced its Vision Pro mixed reality headset in February 2024, sparking interest as footage emerged of individuals incorporating the device into their daily routines. Before its launch, researchers examined the suitability of such headsets for everyday use, highlighting issues such as latency, restricted peripheral vision, and distortions akin to fun-house mirrors. They cautioned against wearing the headsets while engaged in activities like driving or descending stairs.
In their paper, Park and Margolis speculated that some users might integrate the Apple Vision Pro seamlessly into their daily attire, akin to wearing glasses. Continued usage of the headset could furnish additional data, facilitating the refinement of algorithms for teaching robots to emulate human motion patterns.