Introduction #
Notch can read both RGB and depth data from the Kinect 1, Kinect 2 and Kinect 4 depth sensors. You can use this data in a variety of ways: generating meshes, emitting particles, generating fields, and using the depth and colour images in video processing chains.
Setup #
For Kinect 1 and Kinect 2 you will need a Kinect 1 or Kinect 2 sensor together with a Kinect for Windows Adapter. Notch uses the official Microsoft SDKs to integrate with the Kinect cameras, so you may also need to download the relevant drivers from Microsoft: Kinect 1 Drivers v1.8 and Kinect 1 SDK v1.8, or the Kinect 2 Drivers.
Connect your Kinect to your PC and open Notch Builder. Go to Devices > VideoIn/Camera/Kinect Settings to find the Video In settings. Depending on which Kinect sensor you have, tick “Kinect Enabled” underneath the corresponding device and hit Apply; you should notice your Kinect light up. To improve performance, you can read only the colour or the depth channel from the Kinect camera if the other is not required.
Kinect 4 requires only a power connection and a USB connection (no adapter or separate driver download required); a USB 3.0+ port is needed. If you experience unstable or glitchy frames with Kinect 4, try plugging it into another USB 3.0+ port.
We currently use version 1.4.1 of the Kinect 4 SDK, so you will need to update your device firmware before using it within Notch (a sketch for verifying the update follows the steps below):
- Download & install the 1.4.1 msi install package
- Install the firmware to all of your Kinect 4 devices according to the instructions
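If you want to confirm the firmware update took effect before launching Notch, a minimal sketch using the Azure Kinect Sensor SDK C API is shown below; it simply opens the first device and prints the firmware versions it reports. This is a verification aid only, not part of Notch’s own setup.

```c
// Minimal sketch: query the firmware versions reported by a Kinect 4 (Azure Kinect)
// device using the Sensor SDK C API, so you can confirm the firmware update took
// effect before launching Notch.
#include <stdio.h>
#include <k4a/k4a.h>

int main(void)
{
    k4a_device_t device = NULL;
    if (k4a_device_open(K4A_DEVICE_DEFAULT, &device) != K4A_RESULT_SUCCEEDED)
    {
        printf("No Kinect 4 device found, or it is in use by another program.\n");
        return 1;
    }

    k4a_hardware_version_t version;
    if (k4a_device_get_version(device, &version) == K4A_RESULT_SUCCEEDED)
    {
        printf("RGB firmware:   %u.%u.%u\n",
               version.rgb.major, version.rgb.minor, version.rgb.iteration);
        printf("Depth firmware: %u.%u.%u\n",
               version.depth.major, version.depth.minor, version.depth.iteration);
        printf("Audio firmware: %u.%u.%u\n",
               version.audio.major, version.audio.minor, version.audio.iteration);
    }

    k4a_device_close(device);
    return 0;
}
```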
Kinect 4 Setup Tips #
Using Kinect 4 in exported blocks requires k4a.dll and depthengine_2_0.dll (found in the Notch installation folder) to be copied to the media server.
For Skeleton Tracking and Body Index Segmentation you must install the Azure Kinect Body Tracking SDK. Click on the “msi” link to download version 1.1.1.
Some caveats for using Kinect 4 body tracking (k4abt):
- AMD USB controllers have been seen to cause crashes within the Kinect 4 body tracking library, so an Intel motherboard is required for stability.
- The Kinect 4 body tracking library is based on the DirectML inferencing engine, which uses the ONNX machine learning model format. Whilst this now supports different GPU vendors for body tracking, there are minimum GPU requirements:
- AMD GCN 1st Gen (Radeon HD 7000 series) and above
- Intel Haswell (4th-gen core) HD Integrated Graphics and above
- NVIDIA Kepler (GTX 600 series) and above
- Qualcomm Adreno 600 and above
- See the Microsoft docs for more hardware requirement details.
Please check your machine specification against these guidelines before using body tracking.
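For reference, the sketch below shows the basic shape of the k4abt pipeline these requirements apply to: open the sensor, create a tracker with the default configuration, feed it a capture and read back how many bodies were tracked. It is a minimal illustration using the public Sensor SDK and Body Tracking SDK C APIs, not how Notch drives the library internally.

```c
// Minimal sketch of the Kinect 4 body tracking (k4abt) pipeline.
// Requires the Azure Kinect Sensor SDK and Body Tracking SDK to be installed.
#include <stdio.h>
#include <k4a/k4a.h>
#include <k4abt.h>

int main(void)
{
    k4a_device_t device = NULL;
    if (k4a_device_open(K4A_DEVICE_DEFAULT, &device) != K4A_RESULT_SUCCEEDED)
        return 1;

    // Body tracking needs the depth camera running.
    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED;
    config.camera_fps = K4A_FRAMES_PER_SECOND_30;
    if (k4a_device_start_cameras(device, &config) != K4A_RESULT_SUCCEEDED)
        return 1;

    k4a_calibration_t calibration;
    k4a_device_get_calibration(device, config.depth_mode, config.color_resolution, &calibration);

    // The default tracker configuration runs the ONNX model on the GPU.
    k4abt_tracker_t tracker = NULL;
    k4abt_tracker_configuration_t tracker_config = K4ABT_TRACKER_CONFIG_DEFAULT;
    if (k4abt_tracker_create(&calibration, tracker_config, &tracker) != K4A_RESULT_SUCCEEDED)
        return 1;

    k4a_capture_t capture = NULL;
    if (k4a_device_get_capture(device, &capture, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
    {
        k4abt_tracker_enqueue_capture(tracker, capture, 1000);
        k4a_capture_release(capture);

        k4abt_frame_t body_frame = NULL;
        if (k4abt_tracker_pop_result(tracker, &body_frame, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
        {
            printf("Tracked bodies: %u\n", k4abt_frame_get_num_bodies(body_frame));
            k4abt_frame_release(body_frame);
        }
    }

    k4abt_tracker_shutdown(tracker);
    k4abt_tracker_destroy(tracker);
    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}
```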
Adding Kinect sources to your scene #
You will need to activate the Kinect camera you want to use by going to Devices > VideoIn/Camera/Kinect Settings and enabling it; otherwise, it will not be initialized by Notch. This is by design, so that Notch does not lock the device from use by other software when you are not actively using it in Notch.
Add a Depth Camera / Kinect Source node to the Nodegraph (Video Processing > Input Output). This is the source node for any Kinect-related outputs and contains both the RGB and depth images from the Kinect. Connecting it to a Kinect Mesh node will allow you to visualize what the Kinect sensor can see in 3D space. You need one Video Kinect Source node for each physical Kinect camera in use.
By default, this node generates an alpha channel which is clipped using the depth image and the near and far planes set on the node. You may also output depth from this node as luminance in the RGB channel.
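As a rough illustration of the clipping described above (not Notch’s actual implementation), the sketch below shows how a raw depth value can be turned into an alpha value; the near/far plane distances in millimetres are hypothetical example values.

```c
// Conceptual sketch only: turning a depth reading into an alpha value by
// clipping against near/far planes, as the Kinect Source node's alpha output does.
#include <stdint.h>
#include <stdio.h>

/* depth_mm: raw depth value in millimetres (0 = no reading).
   near_mm / far_mm: hypothetical near/far plane values. */
static uint8_t depth_to_alpha(uint16_t depth_mm, uint16_t near_mm, uint16_t far_mm)
{
    if (depth_mm == 0 || depth_mm < near_mm || depth_mm > far_mm)
        return 0;    /* outside the clip range: fully transparent */
    return 255;      /* inside the clip range: fully opaque */
}

int main(void)
{
    /* Example: keep everything between 0.5 m and 3 m of the sensor. */
    printf("%u\n", depth_to_alpha(1200, 500, 3000)); /* 255: within range    */
    printf("%u\n", depth_to_alpha(4500, 500, 3000)); /* 0: beyond far plane  */
    return 0;
}
```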
Nodes that use Kinect source #
Video nodes can use the Kinect RGB or depth images as inputs. However, the following nodes are specifically made for use with the Kinect:
- Depth Camera / Kinect Mesh: generates a mesh from a Kinect source.
- Kinect1 Skeleton: generates skeletal data from a Kinect 1 camera, which can be used to drive a character animation live.
- Kinect2 Skeleton: generates skeletal data from a Kinect 2 camera, which can be used to drive a character animation live.
- Kinect4 Skeleton: generates skeletal data from a Kinect 4 camera, which can be used to drive a character animation live.
- Depth Camera / Kinect Colour Key.
- Depth Camera / Kinect Source: loads the colour and depth data from Kinect 1, Kinect 2 and Kinect 4.
Using multiple Kinects with Notch #
While Notch supports multiple Kinect 1s (up to 4), PC hardware and USB buses limit the number of Kinects that can run simultaneously. Users have successfully used up to 4 Kinect 1s on a single machine, but this depends on the PC hardware itself. Only one Kinect 1 camera may be used per USB bus lane; on a typical PC, multiple USB ports share a single lane, so it is usually necessary to add USB add-on cards to make up to 4 Kinects work.
Only one Kinect 2 is supported per PC. This is a driver restriction, not a Notch restriction.
Use of Kinect 1 and Kinect 2 is mutually exclusive: only one or the other’s drivers may be installed on the PC at any one time, and only one or the other may be used in Notch at once, so it is not possible to mix Kinect 1 and Kinect 2 in one project.
Using a Kinect 2 and a Kinect 4 simultaneously has been tested and works, as the Kinect 4 is not subject to the driver restriction that affects Kinect 1 and Kinect 2.
Multiple Kinect 4s can be used on the same machine with appropriate PCI Express based USB controller cards (to ensure bandwidth to the motherboard isn’t hindered). In our testing with an extra USB controller card, only one PCIe slot worked reliably, so be aware that testing your hardware setup is crucial to success when using multiple Kinect 4s on a single machine.
Skeletal data #
Notch supports the Microsoft skeletal data from Kinect 1, 2 and 4 (Azure). However, we have not found it to be suitable in production environments: the conditions required for Microsoft’s skeletal tracking don’t lend themselves to the stage or public spaces. Specifically, it requires a control pose at the start, and the user must stay within a controlled space in view of the camera. Notch also does not support more than one set of skeletal data at a time.
Recording Kinect stream data #
Notch is able to record a stream of Kinect data for playback when there is no camera connected. In order to do this, use the “Capture Kinect1/2/4 Data” option in the Devices menu (Builder Pro only). Recording begins immediately and continues until the option is used again to switch it off. The resulting file may be loaded into Notch as a “Kinect Stream” resource. The Video Kinect Source node has a parameter which takes a pre-recorded stream which will then be used as a substitute for the live Kinect camera feed.
Kinect recordings are currently limited to 15fps on all devices.
Output files for Kinect stream data #
For recorded Kinect stream data (colour and depth), a <filename>.kinect file is produced. Load it as a resource and drag it onto the Kinect Source node’s “Kinect Stream” parameter; it will then play back and override the real-time stream from the sensor.
For recorded Kinect skeleton data, a <filename>.kskel file is produced. Load it as a resource and apply it to the “Kinect Stream” parameter on the Kinect <1|2|4> Skeleton node; as with the stream data, this overrides real-time tracking of skeleton joints during playback.
Considerations for the use of Kinect in performance environments #
Using Kinect cameras takes care and planning: the Kinect was designed as a gaming controller, not for performance work. If you are inexperienced with their use, we strongly recommend you engage an experienced interaction/integration specialist. Specific considerations:
- IR Interference: Incandescent light sources (including the sun, and some models of strobe) significantly degrade the Kinect depth image, making it unusable.
- Signal extension: Extending Kinect 1 USB cabling is known to be achievable; certain models of powered USB-over-Ethernet extenders have been found to work over ranges of more than 50 metres. However, users still report significant technical challenges in extending Kinect 2. Kinect 4 has its own restrictions on USB extension and should be researched before use.
- Field of View: Both Kinect 1 and 2 sensors have very particular fields of view, and each version’s depth and RGB FOVs differ.
- Kinect 4 has a number of different FOV modes and colour sensor resolutions for near and far-field sensing; choose the option that best fits your usage (bandwidth cost / frame resolution vs real-time performance). A configuration sketch follows this list.
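For context, the sketch below uses the Azure Kinect Sensor SDK C API to show how that trade-off is expressed: depth FOV mode, colour resolution and frame rate are chosen together when the cameras are started. The specific mode choices are examples only; Notch exposes its own settings for this and does not require you to write any code.

```c
// Sketch of the Kinect 4 depth FOV / colour resolution trade-off using the Sensor
// SDK C API. Wide-FOV unbinned depth and the 3072p colour mode are limited to 15 fps
// by the sensor, so higher coverage or resolution costs frame rate and bandwidth.
#include <stdbool.h>
#include <k4a/k4a.h>

int main(void)
{
    k4a_device_t device = NULL;
    if (k4a_device_open(K4A_DEVICE_DEFAULT, &device) != K4A_RESULT_SUCCEEDED)
        return 1;

    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.depth_mode       = K4A_DEPTH_MODE_WFOV_2X2BINNED;   // wide FOV, near-field, 30 fps capable
    // config.depth_mode    = K4A_DEPTH_MODE_NFOV_UNBINNED;    // narrow FOV, longer range
    config.color_format     = K4A_IMAGE_FORMAT_COLOR_BGRA32;
    config.color_resolution = K4A_COLOR_RESOLUTION_720P;        // lower bandwidth than 2160p/3072p
    config.camera_fps       = K4A_FRAMES_PER_SECOND_30;
    config.synchronized_images_only = true;

    if (k4a_device_start_cameras(device, &config) != K4A_RESULT_SUCCEEDED)
        return 1;

    // ... capture frames here ...

    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}
```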
Troubleshooting #
My Kinect camera doesn’t show up in Notch. What do I do?
- Does the camera work outside of Notch? The Kinect SDKs (1/2/4) come with a number of example programs that test the cameras. “Kinect Studio” for Kinect 1 or 2 is worth trying, as is the “Azure Kinect Viewer” app included with the Kinect 4 SDK. A minimal device check for Kinect 4 is sketched after this list.
- The status of the Kinect cameras is visible in Device Manager. They may be missing drivers, or the problem may be due to the USB connection or a lack of power.
- Kinect 2 and Kinect 4 require USB 3.0+ (sometimes a USB 3.1 Gen 2 port can help with bandwidth), a compatible USB bus and a suitable software USB stack, which is available only in Windows 8 and up. Windows 7 cannot support Kinect 2 or Kinect 4. Older PCs may also be unable to support the device.
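If you want a quick programmatic check for Kinect 4 (complementing the Azure Kinect Viewer), the sketch below uses the Sensor SDK C API to count attached devices and attempt to open each one. Failures here point at driver, USB or power problems rather than Notch.

```c
// Quick check (Kinect 4 only): enumerate attached devices and try to open each one,
// similar in spirit to what the Azure Kinect Viewer does before streaming.
#include <stdio.h>
#include <k4a/k4a.h>

int main(void)
{
    uint32_t count = k4a_device_get_installed_count();
    printf("Kinect 4 devices detected: %u\n", count);

    for (uint32_t i = 0; i < count; i++)
    {
        k4a_device_t device = NULL;
        if (k4a_device_open(i, &device) != K4A_RESULT_SUCCEEDED)
        {
            printf("Device %u: failed to open (in use, or USB/power problem)\n", i);
            continue;
        }

        char serial[64];
        size_t serial_size = sizeof(serial);
        if (k4a_device_get_serialnum(device, serial, &serial_size) == K4A_BUFFER_RESULT_SUCCEEDED)
            printf("Device %u: serial %s opened OK\n", i, serial);

        k4a_device_close(device);
    }
    return 0;
}
```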
My Kinect camera doesn’t show up in exported blocks. What do I do?
- Using Kinect 4 in exported blocks requires k4a.dll and depthengine_2_0.dll (found in the Notch installation folder) to be copied to the media server.
The frame rate of the camera appears poor, even though Notch’s frame rate is fine.
- The Kinect 1 and 2 are both capable of 30Hz input. Kinect 1 is only capable of 30Hz at 640x480 and below.
- The frame rate of the Kinect is limited by the speed of your USB bus. Multiple cameras and slow/busy buses will affect this. If you have multiple cameras connected and low frame rate, try disconnecting some cameras.
- If your scene only requires depth data from the Kinect, try disabling the colour channel in the Video In / Kinect Settings dialogue. This will reduce the amount of bandwidth used.
- If multiple programs try to use the Kinect camera at once, the frame rate seen in each program will be reduced. This is particularly relevant if running Kinects in Notch standalone at the same time as Builder, or running instances of the Notch plugin inside media servers. Only one active plugin may use the Kinect at any time to achieve full frame rate.
- Kinect 2 has a low-light mode that drops the frame rate from 30Hz to 15Hz; this will be triggered in low-light environments.
- Kinect 4 has problems with certain USB controllers; refer to the Microsoft Development Kit troubleshooting docs if you see inconsistent or unstable data in the SDK test apps.
- Sometimes Kinect 4 sensors can get into a bad state; when this happens the log will contain messages referring to “libusb”. Disconnect the sensors from power and reconnect them, then go to Kinect Settings, disable Kinect 4 and re-enable it.