# Using Skeleton Tracking in Unity

**Published 12/08/2025**

## Current Hardware Status

Microsoft originally made the Kinect during the Xbox 360 era, which is, unfortunately, a really long time ago at this point. They sold the Kinect and its successors until 2017, at which point the line was discontinued, and they didn't do a whole lot in this space until 2019, when the [Azure Kinect](https://en.wikipedia.org/wiki/Azure_Kinect) was released. Between the earlier Kinect models and the Azure Kinect, the hardware found quite a bit of success in commercial applications, which is where my own experience with it comes from.

Unfortunately, in 2023 the Azure Kinect was also discontinued. Before that, though, Microsoft had been working with the company Orbbec since 2021[^1], which brings us to the hardware currently available that can accomplish the same things the now-unavailable Kinect could. See [their website](https://www.orbbec.com/products/#scroll-tof-camera) for more details on the products they offer in this category.

## Skeleton/Body Tracking in Unity

One of the coolest features of the [Microsoft Kinect](https://en.wikipedia.org/wiki/Kinect) is/was the ability to track one or more players' skeletons within the view of the onboard cameras. You can use the position and motion information about all of the major joints tracked by the hardware to design inputs for a game or experience. [This article](https://pterneas.com/2018/04/30/orbbec-astra-nuitrack/) covers things in quite a bit more detail than I had planned to here. For reference, these are the joints the hardware can track, as enumerated in the K4AdotNet wrapper:

```csharp
namespace K4AdotNet.BodyTracking;

public enum JointType
{
    Pelvis, SpineNavel, SpineChest, Neck,
    ClavicleLeft, ShoulderLeft, ElbowLeft, WristLeft,
    HandLeft, HandTipLeft, ThumbLeft,
    ClavicleRight, ShoulderRight, ElbowRight, WristRight,
    HandRight, HandTipRight, ThumbRight,
    HipLeft, KneeLeft, AnkleLeft, FootLeft,
    HipRight, KneeRight, AnkleRight, FootRight,
    Head, Nose, EyeLeft, EarLeft, EyeRight, EarRight
}
```

Setting up the required dependencies isn't too hard, but it's a good idea to read through some of the documentation first to get familiar with it. For a minimal setup in a Unity project, you basically just need these DLLs to be available:

![[kinect_k4a_dlls.png]]

**Azure Kinect Wrapper:** https://github.com/bibigone/k4a.net
**Orbbec SDK:** https://github.com/orbbec/OrbbecSDK
**Orbbec Unity SDK:** https://github.com/orbbec/OrbbecUnitySDK

## Depth Mapping & Blob Tracking

For a past project, I made some handy code that can take the depth map of the hardware and turn it into blobs, which can then be individually tracked.[^2] This uses the Accord.NET Framework to detect and count blobs in the filtered frames.[^3]

The general idea is to take the depth map frame from the Orbbec camera and, using a predefined threshold, convert the texture from one containing many colors representing scene depth into one that is effectively 1-bit: black and white, where white indicates an object close enough to be considered a blob.
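First, a bit of setup: the filtered texture and its color buffer have to be allocated somewhere. I haven't included my exact setup code, so here's just a minimal sketch of what it might look like (the class name, `InitBlobTexture`, and the `RGB24` format are all assumptions on my part; any color format works, since the pixels only ever get written as pure black or pure white):

```csharp
using UnityEngine;

// Hypothetical container for the fields used in the thresholding loop below.
public class BlobTrackingSetup : MonoBehaviour
{
    private Texture2D _filteredDepthMapTex;
    private Color[] _blobTexColorsArray;

    private static readonly Color _black = Color.black;
    private static readonly Color _white = Color.white;

    // RGB24 is an assumption; we only ever write black or white pixels.
    private void InitBlobTexture(int depthWidth, int depthHeight)
    {
        _filteredDepthMapTex = new Texture2D(depthWidth, depthHeight, TextureFormat.RGB24, false);
        _blobTexColorsArray = new Color[depthWidth * depthHeight];
    }
}
```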
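The loop also leans on a small `GetPixelValue` helper that pulls a 16-bit depth sample out of the raw frame bytes. Again, I'm not showing my exact implementation, but assuming the frame exposes its data as a `byte[]` of little-endian `ushort`s in row-major order, it could look something like this:

```csharp
// A sketch of the GetPixelValue helper, assuming one little-endian
// ushort per pixel in row-major order.
private ushort GetPixelValue(byte[] data, int width, int row, int col, out int index)
{
    // Byte offset of pixel (row, col); the caller divides this by two
    // to get the pixel index into the color array.
    index = (row * width + col) * 2;
    return (ushort)(data[index] | (data[index + 1] << 8));
}
```

With those pieces in place, the per-pixel thresholding loop itself: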
```csharp
for (int i = 0; i < obDepthFrame.width; i++)
{
    for (int j = 0; j < obDepthFrame.height; j++)
    {
        try
        {
            ushort shortPixel = GetPixelValue(
                obDepthFrame.data, obDepthFrame.width, j, i, out int index);
            float normalized = (float)shortPixel / 65535f;

            // Black -> Closer (or invalid)
            // Filter out anything except stuff that's close (ideally)
            if (normalized == 0)
            {
                _blobTexColorsArray[index / 2] = _black;
            }
            else if (normalized < ThresholdValue)
            {
                // ThresholdValue is on the order of ~0.01
                _blobTexColorsArray[index / 2] = _white;
            }
            else
            {
                _blobTexColorsArray[index / 2] = _black;
            }
        }
        catch (Exception ex)
        {
            Debug.LogError($"({i},{j}) failed with exception. " + ex.Message);
        }
    }
}

_filteredDepthMapTex.SetPixels(_blobTexColorsArray);
_filteredDepthMapTex.Apply();
```

Once we have a 1-bit texture marking which pixels are close enough to count as part of a blob, we can analyze it to build up our blob tracking collection. The result could look something like the image in the [top answer here](https://stackoverflow.com/questions/65169869/detecting-and-counting-blobs-connected-objects-with-opencv#:~:text=an%20isolated%20blob.-,Input%3A,-python).

```csharp
// Requires: System, System.Collections, System.Drawing, System.IO,
// System.Linq, Accord.Imaging, and UnityEngine.
private IEnumerator AnalyzeBlobs(Texture2D filteredDepthMapTex)
{
    try
    {
        byte[] data = filteredDepthMapTex.EncodeToPNG();
        using MemoryStream memoryStream = new MemoryStream(data);
        using Bitmap bits = new Bitmap(memoryStream);

        // MinHeight/MinWidth only apply during ProcessImage when FilterBlobs
        // is enabled, hence the manual size check below.
        BlobCounter blobber = new BlobCounter
        {
            MinHeight = BlobMinHeight,
            MinWidth = BlobMinWidth
        };
        blobber.ProcessImage(bits);

        Rectangle[] rects = blobber.GetObjectsRectangles();
        Rectangle[] filtered = rects
            .Where(b => b.Width >= BlobMinWidth || b.Height >= BlobMinHeight)
            ?.ToArray();

        ValidAndTotalBlobs = new Vector2(
            filtered?.Length ?? 0,
            rects?.Length ?? 0);

        OnBlobsAnalyzed?.Invoke(filtered);
    }
    catch (Exception e)
    {
        Debug.LogException(e);
    }
    finally
    {
        _blobRoutineRunning = false;
    }

    yield return null;
}
```

There is quite a bit that I've left out, but hopefully you get the general appeal of hardware like this. There are a ton of possibilities for all kinds of different applications. Thanks for reading!

— C

#blog #development #games #sensors #unity #c_sharp #motion

[^1]: https://www.orbbec.com/microsoft-collaboration/
[^2]: https://lightbuzz.com/azure-kinect-masterclass-depth/
[^3]: https://github.com/accord-net/framework