eBay has created a new technology it's calling "HeadGaze," which tracks the user's head movement through the iPhone X's TrueDepth camera and ARKit so they can navigate the eBay app without touching the screen. The technology was created by Muratcan Cicek, an eBay intern with motor impairments who was looking for a way to shop online independently.
The eBay team built a model that tracks the user's head using 3D information from ARKit, creating a virtual cursor that follows the head's motion in every direction. Designated buttons on the screen are then activated once the cursor has dwelled on them for long enough. These buttons can perform actions like scrolling down, moving to another page, or selecting a product to purchase, all hands-free.
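To make that flow concrete, here is a minimal sketch of the general approach in Swift, not the HeadGaze library's actual API: an ARKit face-tracking session supplies the head pose, the pose is mapped to an on-screen cursor, and a button fires after the cursor dwells on it. The screen-mapping scale factors and one-second dwell time are illustrative assumptions.

```swift
import UIKit
import ARKit
import simd

// Sketch of head-driven cursor navigation (hypothetical, not the HeadGaze API):
// 1. Run an ARKit face-tracking session (requires a TrueDepth camera).
// 2. Convert the head's forward direction into a 2D cursor position.
// 3. Trigger a button when the cursor dwells on it past a threshold.
class HeadCursorViewController: UIViewController, ARSessionDelegate {
    private let session = ARSession()
    private let cursorView = UIView(frame: CGRect(x: 0, y: 0, width: 24, height: 24))

    private var dwellTarget: UIButton?
    private var dwellStart: Date?
    private let dwellThreshold: TimeInterval = 1.0  // assumed dwell time before "tap"

    override func viewDidLoad() {
        super.viewDidLoad()
        cursorView.backgroundColor = .systemBlue
        cursorView.layer.cornerRadius = 12
        cursorView.isUserInteractionEnabled = false  // never block hit testing
        view.addSubview(cursorView)

        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Derive yaw/pitch from the face anchor's forward axis and map them to
        // screen coordinates. The 600-point scale is an arbitrary assumption.
        let forward = simd_normalize(simd_make_float3(face.transform.columns.2))
        let yaw = atan2(forward.x, forward.z)
        let pitch = asin(forward.y)
        let point = CGPoint(x: view.bounds.midX + CGFloat(yaw) * 600,
                            y: view.bounds.midY - CGFloat(pitch) * 600)

        DispatchQueue.main.async {
            self.cursorView.center = point
            self.updateDwell(at: point)
        }
    }

    // Treat a sustained dwell over a button as a tap.
    private func updateDwell(at point: CGPoint) {
        guard let button = view.hitTest(point, with: nil) as? UIButton else {
            dwellTarget = nil
            dwellStart = nil
            return
        }
        if button !== dwellTarget {
            dwellTarget = button
            dwellStart = Date()
            return
        }
        if let start = dwellStart, Date().timeIntervalSince(start) >= dwellThreshold {
            button.sendActions(for: .touchUpInside)
            dwellTarget = nil
            dwellStart = nil
        }
    }
}
```

In a real integration the dwell targets would presumably be the scroll, page, and purchase controls the article describes, with visual feedback (such as a fill animation) so the user knows a selection is about to trigger.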
Cicek says that the technology's modular code design will let developers "easily integrate" HeadGaze's features into existing or future apps:
This year as part of my internship project at eBay, my team and I developed HeadGaze, a reusable technology library that tracks head movement on your iPhone X and starting today, the technology is available via open source on GitHub.com. The first of its kind, this technology uses Apple ARKit and the iPhone X camera to track your head motion so you can navigate your phone easily without using your hands.
HeadGaze enables you to scroll and interact on your phone with only subtle head movements. Think of all the ways that this could be brought to life. Tired of trying to scroll through a recipe on your phone screen with greasy fingers while cooking? Too messy to follow the how-to manual on your cell phone while you’re tinkering with the car engine under the hood? Too cold to remove your gloves to use your phone?
eBay has developed an app called "HeadSwipe" as a way to test the HeadGaze technology. HeadSwipe is focused on browsing and buying items in eBay's deals section, and can be navigated entirely through head motions on the iPhone X. Both HeadGaze and HeadSwipe are available as open source on GitHub.
Next, the team is looking into technology that tracks eye movements, with the potential to fuse the two experiences together in future apps.