Interfacing with the Azure Kinect through Unity3D

The goal of this project was to interface with Microsoft’s Azure Kinect and use its body tracking capabilities in a Unity3D application. The project also aimed to synchronize multiple Azure Kinect units to track bodies across a larger area. We wanted this functionality for an interactive exhibit at the University of Hawaii West Oahu campus called Create(x). When an Azure Kinect unit recognizes a body, it assigns the body a unique ID and exposes its joint information. Each body has 32 joints, each containing position and rotation data. When a person stands where two Azure Kinects can see them, two bodies are created, one per unit. When this happens, a third body is created, dubbed a “conjoined body,” that averages the joint information of the two tracked bodies. When the person steps out of view of one of the Azure Kinects, that unit’s body is destroyed along with the conjoined body, leaving only the single tracked body.
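The conjoined-body averaging described above can be sketched as follows. This is a minimal illustration in Python rather than the project's actual Unity/C# code; the `Joint` structure, the `conjoin` helper, and the quaternion-averaging approach are all assumptions for illustration, not the Azure Kinect SDK's API.

```python
from dataclasses import dataclass

NUM_JOINTS = 32  # Azure Kinect body tracking exposes 32 joints per body


@dataclass
class Joint:
    position: tuple  # (x, y, z)
    rotation: tuple  # quaternion (w, x, y, z)


def average_quat(a, b):
    # q and -q represent the same rotation, so flip one quaternion if the
    # pair points in opposite hemispheres, then average and renormalize.
    # (A simple approximation; adequate when the two rotations are close.)
    if sum(x * y for x, y in zip(a, b)) < 0:
        b = tuple(-c for c in b)
    avg = tuple((x + y) / 2 for x, y in zip(a, b))
    norm = sum(c * c for c in avg) ** 0.5
    return tuple(c / norm for c in avg)


def conjoin(body_a, body_b):
    """Build a 'conjoined body' by averaging each joint of two tracked bodies."""
    return [
        Joint(
            position=tuple((p + q) / 2 for p, q in zip(ja.position, jb.position)),
            rotation=average_quat(ja.rotation, jb.rotation),
        )
        for ja, jb in zip(body_a, body_b)
    ]
```

Component-wise quaternion averaging with sign alignment is a common shortcut when the two rotations are nearly identical, as they are here since both cameras observe the same person.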

This project taught me a great deal about body tracking and really fueled my interest in tracking applications. I worked on it alone, which prepared me to take on more projects independently. It also taught me much about different types of sensors and cameras and how to interface with them to produce a deliverable application.

Click here to learn more about the Create(x) exhibit.