Introduction

Notch can read both RGB and depth data from the Kinect 1, Kinect 2 and Kinect 4 depth sensors. You can use this data in a wide variety of ways: generating meshes, emitting particles, generating fields, and using the depth and colour images in video processing chains.

Setup

For Kinect 1 or Kinect 2 you will need the corresponding sensor with a Kinect for Windows Adapter. Notch uses the official Microsoft SDKs to integrate with the Kinect cameras, so you may also need to download the relevant drivers from Microsoft (Kinect 1 Drivers v1.8 and Kinect 1 SDK v1.8, Kinect 2 Drivers).

Connect your Kinect to your PC and open Notch Builder. Go to Devices > VideoIn/Camera/Kinect Settings to find the Video In settings. Depending on which Kinect sensor you have, tick “Kinect Enabled” underneath the corresponding device and hit Apply; you should see your Kinect light up. To improve performance, you can read only the colour or the depth channel from the Kinect camera if the other channel is not required.

Kinect 4 is engineered to require only a power connection and a USB connection (no adapter or separate driver download required), but a USB 3.0+ port is needed.
If you are experiencing unstable or glitchy frames with Kinect 4, try plugging it into a different USB 3.0+ port.

Notch currently uses version 1.4.0 of the Kinect 4 SDK, and you will need to update your device firmware before using it within Notch. Download the 1.4.0 MSI install package from https://github.com/microsoft/Azure-Kinect-Sensor-SDK/blob/develop/docs/usage.md and install the firmware on all of your Kinect 4 devices according to the instructions here: https://docs.microsoft.com/en-us/azure/kinect-dk/azure-kinect-firmware-tool.
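
The firmware update itself is done with the Azure Kinect Firmware Tool that ships with the SDK. As a rough sketch only, the Python snippet below wraps the two command-line calls involved; the tool path is the SDK's default install location and the firmware image name is a placeholder, so check the SDK's firmware folder and the Microsoft instructions linked above for the exact file on your machine.

import subprocess

# Default install path of the Azure Kinect SDK 1.4.0 firmware tool (adjust if yours differs).
tool = r"C:\Program Files\Azure Kinect SDK v1.4.0\tools\AzureKinectFirmwareTool.exe"

# Placeholder name: use the .bin image shipped in the SDK's firmware folder.
firmware = r"C:\Program Files\Azure Kinect SDK v1.4.0\firmware\AzureKinectDK_Fw_x.x.bin"

# Query the firmware currently on the attached device.
subprocess.run([tool, "-q"], check=True)

# Flash the new image; repeat for each of your Kinect 4 devices.
subprocess.run([tool, "-u", firmware], check=True)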

Kinect 4 Setup Tips

To track skeletons you must install the Azure Kinect Body Tracking SDK, which can be found here: https://docs.microsoft.com/bs-latn-ba/azure/Kinect-dk/body-sdk-download. Click on the “msi” link to download version 1.0.1. Once installed, you will need to copy some files to your Notch installation folder. Go to “C:\Program Files\Azure Kinect Body Tracking SDK\tools” and copy the following files:
cublas64_100.dll
cudart64_100.dll
cudnn64_7.dll
depthengine_2_0.dll
dnn_model_2_0.onnx
k4abt.dll
onnxruntime.dll
to “C:\Program Files\Notch”.
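
If you prefer to script this step, the following Python sketch performs the same copy. Run it from an elevated prompt, as both folders are under Program Files; the two paths are the defaults mentioned above and may differ on your machine.

import shutil
from pathlib import Path

# Default install locations from the instructions above; adjust if needed.
src = Path(r"C:\Program Files\Azure Kinect Body Tracking SDK\tools")
dst = Path(r"C:\Program Files\Notch")

files = [
    "cublas64_100.dll",
    "cudart64_100.dll",
    "cudnn64_7.dll",
    "depthengine_2_0.dll",
    "dnn_model_2_0.onnx",
    "k4abt.dll",
    "onnxruntime.dll",
]

for name in files:
    shutil.copy2(src / name, dst / name)  # copy2 preserves file timestamps
    print("copied", name)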

Some caveats when using Kinect 4 body tracking (k4abt):

  • AMD USB controllers have been seen to cause crashes within the Kinect 4 body tracking library, so an Intel motherboard is required for stability.
  • The Kinect 4 body tracking library is based on NVIDIA CUDA, so an NVIDIA GPU is required for body tracking.

Please check your machine specification against these guidelines before using body tracking.

Adding Kinect sources to your scene

Add a Depth Camera / Kinect Source node to the Nodegraph (Video Processing > Input Output). This is the source node for any Kinect-related outputs and contains both the RGB and depth images from the Kinect. Connecting it to a Kinect Mesh node will allow you to visualize what the Kinect sensor can see in 3D space. You need one Video Kinect Source for each physical Kinect camera in use.

By default, this node generates an alpha channel which is clipped using the depth image and the near and far planes set on the node. You may also output depth from this node as luminance in the RGB channel.

Nodes that use Kinect source

Video nodes can use the Kinect RGB or depth images as source inputs. However, the following nodes are specifically made for use with the Kinect:

Using multiple Kinects with Notch

While Notch supports multiple Kinect 1s (up to 4), the PC hardware / USB buses limit the number of Kinects that can run simultaneously. Users have successfully used up to 4 Kinect 1s on a single machine, but this depends on the PC hardware itself. Only one Kinect 1 camera may be used per USB bus lane; on a typical PC, multiple USB ports share a single lane, so it is usually necessary to add additional USB add-on cards to make 4 Kinects work.

Only one Kinect 2 is supported per PC. This is a driver restriction, not a Notch restriction.

Use of Kinect 1 and Kinect 2 is mutually exclusive – only one or the other's drivers may be installed on the PC at any one time, and only one or the other may be used in Notch at once, so it is not possible to mix Kinect 1 and Kinect 2 in one project.

Use of Kinect 2 and Kinect 4 simultaneously has been tested and works, as the Kinect 4 does not have the driver restriction that applies to Kinect 1.

Use of multiple Kinect 4s on the same machine is possible with appropriate USB controller cards (to ensure bandwidth to the motherboard isn't hindered), but this is currently an untested feature in Notch.

Skeletal data

Notch supports Microsoft skeletal data from Kinect 1 and 2. However, we have not found it to be suitable for production environments: the conditions required for Microsoft's skeletal tracking don't lend themselves to the stage or public spaces. Specifically, it requires a control pose at the beginning and the user must stay within a controlled space in view. Notch also does not support more than one skeleton at a time.

Recording Kinect data

Notch is able to record a stream of Kinect data for playback when there is no camera connected. To do this, use the “Capture Kinect1/2/4 Data” option in the Devices menu (Builder Pro only). Recording begins immediately and continues until the option is used again to switch it off. The resulting file may be loaded into Notch as a “Kinect Stream” resource. The Video Kinect Source node has a parameter that takes a pre-recorded stream, which is then used as a substitute for the live Kinect camera feed.

Considerations for the use of Kinect in performance environments

Using Kinect cameras takes care and planning. They were designed as gaming controllers, not for performance work. If you are inexperienced with their use, we strongly recommend you engage an experienced interaction/integration specialist. Specific considerations:

  • IR interference: Incandescent light sources (including the sun and some models of strobe) can significantly degrade the Kinect depth image, making it unusable.
  • Signal extension: Extending Kinect 1 USB cabling is known to be achievable; certain models of powered USB-over-Ethernet extenders have been found to extend the range beyond 50 metres. However, users still report significant technical challenges in extending Kinect 2. Kinect 4 has its own restrictions on USB extension, which should be researched before use.
  • Field of view: Both Kinect 1 and 2 sensors have very particular fields of view, and each version's depth and RGB FOVs differ.
  • Kinect 4 has a number of different FOV modes and colour sensor resolutions for near- and far-field sensing; choose the best configuration for your usage.

Troubleshooting

My Kinect camera doesn’t show up in Notch. What do I do?

  • Does the camera work outside of Notch? The Kinect SDKs (1/2/4) come with a number of example programs that test the cameras. “Kinect Studio” for Kinect 1 or 2 is worth trying, as is the Kinect 4 SDK's “Azure Kinect Viewer” app, which can be downloaded from here: https://github.com/microsoft/Azure-Kinect-Sensor-SDK/blob/develop/docs/usage.md
  • The status of the Kinect cameras is visible in Device Manager. They may be missing drivers, or the problem may be due to the USB connection or a lack of power.
  • Kinect 2 and Kinect 4 require USB 3.0+ (sometimes a USB 3.1 Gen 2 port can help with bandwidth), a compatible USB bus and a suitable software USB stack, which is available only in Windows 8 and up. Windows 7 cannot support the Kinect 2 or Kinect 4. Older PCs may also be unable to support the device.

My Kinect camera doesn’t show up in exported blocks. What do I do?

  • Using Kinect 4 in exported blocks requires k4a.dll and depthengine_2_0.dll (found in the Notch installation folder) to be copied to the media server.

The frame rate of the camera appears poor, even though Notch’s frame rate is fine.

  • The Kinect 1 and 2 are both capable of 30 Hz input. Kinect 1 is only capable of 30 Hz at 640×480 and below.
  • The frame rate of the Kinect is limited by the speed of your USB bus. Multiple cameras and slow/busy buses will affect this. If you have multiple cameras connected and a low frame rate, try disconnecting some cameras.
  • If your scene only requires depth data from the Kinect, try disabling the colour channel in the Video In / Kinect Settings dialogue. This will reduce the amount of bandwidth used.
  • If multiple programs try to use the Kinect camera at once, the frame rate of the camera seen in each program will be reduced. This is particularly relevant if running Kinects in Notch standalone at the same time as Builder, or running instances of the Notch plugin inside media servers. Only one active plugin may use the Kinect at any time to achieve full frame rate.
  • Kinect 2 has a low-light mode that drops the frame rate from 30 Hz to 15 Hz, so this will be triggered in low-light environments.
  • Kinect 4 has problems with certain USB controllers; refer to https://docs.microsoft.com/en-us/azure/kinect-dk/troubleshooting if you see inconsistent or unstable data in the SDK test apps.
  • Sometimes Kinect 4 sensors can get into a bad state, indicated by log messages referring to “libusb”. If this happens, disconnect the sensors from power and reconnect them, then go to Kinect Settings, disable Kinect 4, and re-enable it.