NextStage Pro
Frequently Asked Questions
What file formats does NextStage export?
Video
In the beta version of NextStage, videos are recorded as either an uncompressed .tga image sequence or an .avi video file using the CineForm codec.
HD Capture mode exports a 1280x720 HD image, with full RGBA color channels.
Tracking Data
NextStage Pro exports camera tracking data as a Collada .dae file.
Collada is an easily readable XML-based format and is supported by almost all major 3D animation packages.
In the beta release, rotation values are exported as Euler angles with a rotation order of YXZ. This is subject to change in future versions.
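If you need to convert those angles yourself, the short Python/NumPy sketch below shows one way to build a rotation matrix from them. It assumes "YXZ" means the Y rotation is applied first, then X, then Z (composed as Rz·Rx·Ry for column vectors); rotation-order conventions vary between packages, so double-check the result against your target application.

    import numpy as np

    def yxz_to_matrix(x_deg, y_deg, z_deg):
        """Build a rotation matrix from Euler angles with rotation order YXZ,
        read here as: Y applied first, then X, then Z."""
        x, y, z = np.radians([x_deg, y_deg, z_deg])
        cx, sx = np.cos(x), np.sin(x)
        cy, sy = np.cos(y), np.sin(y)
        cz, sz = np.cos(z), np.sin(z)

        rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
        ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
        rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])

        # Y first, then X, then Z (column-vector convention).
        return rz @ rx @ ry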
Audio
NextStage records audio from the Kinect as an uncompressed .wav file.
Please note that the audio from the Kinect is intended only as a reference track for syncing other video and audio recordings.
The microphones on the Kinect are not production quality. They record at a low sample rate of 16 kHz, are prone to audio artifacts, and automatically adjust gain levels.
What video specifications does NextStage support?
HD Color using the onboard camera
Format: Uncompressed .tga image sequence
Resolution: 1280x720 pixels
Channels: RGBA at 8 bits per channel
Frame Rate: 15-30fps*

Infrared Reference for External Syncing
Format: Uncompressed .tga image sequence
Resolution: 512x424 pixels
Channels: 8 bit black and white
Frame Rate: 30fps*

*Please note that the Kinect runs at exactly 30 Hz. It captures exactly 30 frames per second, which is different from the slightly slower 29.97fps that is commonly referred to as 30fps.
Why does NextStage only output 720p?
There are a couple of reasons for this.
While the onboard color camera is capable of capturing 1920x1080 video, the Kinect’s depth camera has a narrower horizontal field of view that does not cover the entire color image. This would limit alpha mattes generated by the depth camera to the center of the frame. Cropping the color image down to 1280x720, however, lets the depth camera cover the entire field of view.
Having a narrower field of view for the color camera also makes it significantly easier to track a wide variety of shot types. If you were to shoot a closeup of an actor’s face with the field of view of the full 1080p image, the depth camera would be completely obscured, making it impossible to track any markers.
Another reason is file size. Uncompressed 1080p would more than double the required hard drive write speed and storage space (1920x1080 is 2,073,600 pixels per frame versus 921,600 at 1280x720, roughly 2.25x the data).
Again, this is just the beta release; as more video formats and codecs are added, other output resolutions will be as well.
Why can’t I manually control the Kinect’s color camera?
Because the Kinect is a consumer-oriented device designed to let multiple applications access it at once, the Kinect development team made the decision to have the color camera automatically exposed.
There is no way for any one application to control the camera’s gain, exposure time, frame rate, or white balance.
Unfortunately, the auto exposure algorithm is extremely sensitive and “exposes to the right”. This leads to inconsistent exposure, flickering, overexposure, image noise, and an overabundance of motion blur.
To try and alleviate this issue, NextStage has an option to “compensate” for the automatic exposure. By reading the exposure values for a given frame, NextStage tries to reverse the Kinect’s exposure and set it to a custom value.
This is far from an ideal solution. While it does suppress flickering and give some amount of creative control to the user, it does not solve the other issues and can create unwanted artifacts in high contrast lighting scenarios.
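As a rough illustration of the general idea (not NextStage’s released source, which is linked in a later answer), compensation amounts to scaling each frame by the ratio between a user-chosen target exposure and the exposure the Kinect reports for that frame:

    import numpy as np

    def compensate_exposure(frame_rgb, reported_exposure, target_exposure):
        """Scale pixel values so a frame auto-exposed at `reported_exposure`
        approximates what it would look like at a fixed `target_exposure`.

        frame_rgb         -- uint8 image array (H x W x 3)
        reported_exposure -- exposure the camera reports for this frame
                             (relative units; both values just need the same scale)
        target_exposure   -- the exposure the user wants to lock to
        """
        # Exposure is roughly linear in light gathered, so the correction is a
        # simple gain ratio. Real footage is gamma-encoded, so a more careful
        # version would linearize first, scale, then re-encode.
        gain = target_exposure / reported_exposure
        compensated = frame_rgb.astype(np.float32) * gain
        # Values pushed past the top of the range get clipped, which is one
        # source of the artifacts mentioned above in high-contrast lighting.
        return np.clip(compensated, 0, 255).astype(np.uint8)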
The SDK has been out for over a year, and it does not seem likely that manual exposure controls will ever be enabled on the Kinect. If you want manual control of the Kinect’s color camera, please respectfully let the development team know on the Kinect V2 SDK Forums.
Why does my recording drop to 15 frames per second in low light?
This is another unfortunate result of the Kinect’s automatic exposure.
In low light (or whatever the Kinect perceives to be “low light”), the Kinect will drop the color camera from 30 Hz down to 15 Hz (15 frames per second) in order to brighten the image.
If this happens during a recording NextStage will interpret the missing frames as “dropped frames”, which the user can choose to interpolate or ignore.
Please note that this only happens when the Kinect’s color camera is enabled. The infrared and depth cameras always run at 30 Hz. In EXT mode, NextStage can track in complete darkness.
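As a rough illustration of the “interpolate” option mentioned above (not NextStage’s actual implementation), the Python/NumPy sketch below assumes you have per-frame timestamps and camera positions, rebuilds the exact 30fps timeline, and linearly interpolates positions across any gaps. Rotations would normally be interpolated separately, for example with quaternion slerp.

    import numpy as np

    def fill_dropped_frames(timestamps, positions, fps=30.0):
        """Detect gaps in a nominally 30fps recording and linearly interpolate
        camera positions for the missing frames.

        timestamps -- 1D array of capture times in seconds, one per real frame
        positions  -- (N, 3) array of camera positions, one per real frame
        Returns the dense timeline and a (M, 3) array sampled at 1/fps intervals.
        """
        frame_time = 1.0 / fps
        # The timeline the recording *should* have had at a constant frame rate.
        dense_times = np.arange(timestamps[0], timestamps[-1] + frame_time / 2, frame_time)
        # Linear interpolation per axis; dropped frames fall between real samples.
        filled = np.column_stack([
            np.interp(dense_times, timestamps, positions[:, axis])
            for axis in range(3)
        ])
        return dense_times, filled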
Why doesn’t NextStage support other sensors?
As of this moment, the 2nd generation Kinect sensor is the only readily available depth camera suitable for real-time match moving. There are unique capabilities of the Kinect’s time-of-flight camera (global shutter, flood fill infrared image, millimeter-precise depth data up to 4.5 meters away) that no other depth camera currently has.
However, the current interest in virtual and augmented reality devices is spurring the development of numerous new and affordable tracking systems. Some of these will likely make their way into NextStage in the near future.
Will NextStage work outside?
Unfortunately the Kinect sensor was not designed for outdoor use.
The infrared light contained in direct sunlight will temporarily “blind” the Kinect’s infrared camera, making it unable to capture depth information and therefore making NextStage unable to track position and rotation.
However, the Kinect is able to capture depth in indirect sunlight, and some people have even been able to use it outside under cloudy skies. There may still be a significant amount of infrared light in these situations, and interference may affect the quality of the tracking data.
Please keep in mind that the Kinect requires AC power to run and is not a weather-sealed device. NextStage is not responsible for any damage that may occur to your device, so proceed at your own risk.
Will NextStage work with infrared LEDs or other reflective materials?
It is theoretically possible to use active infrared lights such as LEDs and/or other reflective materials as markers that NextStage can identify.
However, they may not be as robust or reliable as retro-reflective markers. As of right now, retro-reflective markers are the only “officially recommended” marker type.
But if you do find another way to create infrared markers for NextStage we would love to hear from you.
Can NextStage track objects or non-stationary markers?
NextStage was designed to track the position and rotation of the Kinect Sensor within a room.
It uses the relationship between stationary markers to find its own position. If the position of these markers relative to one another changes, then NextStage is no longer able to identify those markers. This will lead to the tracking data losing quality or being lost entirely.
If the Kinect is stationary and all visible markers move as one rigid object, then it is possible to reverse the tracking data in an external application and track a single moving object relative to the Kinect.
However, NextStage relies on multiple markers spread out over a large space, with the immediate area surrounding each marker being a flat surface, in order to create a precise and accurate track. While object tracking is possible, it will more than likely not be as high quality.
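If you want to experiment with reversing a track this way, the Python/NumPy sketch below (not part of NextStage) assembles each frame’s exported camera rotation and position into a 4x4 matrix and inverts it, turning the apparent camera motion into object motion relative to the stationary Kinect.

    import numpy as np

    def camera_to_object_track(rotations, positions):
        """Turn a per-frame camera track (rotation matrices + positions) from a
        stationary Kinect into the motion of the rigid marker object instead.

        rotations -- list of 3x3 rotation matrices (camera orientation per frame)
        positions -- list of 3-vectors (camera position per frame)
        Returns a list of 4x4 matrices giving the object's pose relative to the Kinect.
        """
        object_poses = []
        for R, t in zip(rotations, positions):
            cam = np.eye(4)
            cam[:3, :3] = R
            cam[:3, 3] = t
            # If the camera appears to move while the sensor is actually fixed,
            # the object's true motion is the inverse of the apparent camera motion.
            object_poses.append(np.linalg.inv(cam))
        return object_poses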
Can I make multiple depth masks?
As of right now NextStage only supports a single depth mask.
For a single mask, NextStage needs to compare and align over 6.5 million 3D points per second. In order to keep the application running in real time, the mask is a three-dimensional box aligned to the three axes of the virtual camera space.
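As a rough illustration of why an axis-aligned box is cheap to evaluate (written in Python/NumPy for clarity; this is not NextStage’s own code), the test reduces to six comparisons per point and vectorizes easily:

    import numpy as np

    def box_mask(points, box_min, box_max):
        """Vectorized test of which 3D points fall inside an axis-aligned box.

        points  -- (N, 3) array of points in virtual camera space
        box_min -- (3,) lower corner of the mask box
        box_max -- (3,) upper corner of the mask box
        Returns a boolean array of length N (True = inside the mask).
        """
        # Keeping the box aligned to the camera-space axes means each point only
        # needs six comparisons, which is what makes masking millions of points
        # per second feasible in real time.
        return np.all((points >= box_min) & (points <= box_max), axis=1)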
Support for multiple depth masks is planned for a future update, as long as it does not significantly impact performance.
Can I use the tracking algorithm in my own application?
NextStage uses a proprietary tracking algorithm that is not available for licensing.
However, NextStage Pro can stream tracking data to other applications and DIY projects using the OSC (Open Sound Control) framework.
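NextStage’s exact OSC address patterns and message layout aren’t listed here, so the Python sketch below (using the python-osc package) only shows the general shape of a receiver that prints whatever arrives; the port is a placeholder to match to the one configured in NextStage Pro.

    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def print_message(address, *args):
        # Prints whatever NextStage sends so you can see the address pattern
        # and argument layout before wiring it into your own application.
        print(address, args)

    dispatcher = Dispatcher()
    # Catch every incoming OSC message; once you know the addresses in use,
    # dispatcher.map("/some/address", handler) can route them individually.
    dispatcher.set_default_handler(print_message)

    # The port is a placeholder; match it to the port configured in NextStage Pro.
    server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
    server.serve_forever()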
Can I use the auto exposure compensation in my own application?
The source code behind NextStage’s automatic exposure compensation has been made available here, free for anyone to use.
Have a question you don't see answered here? Send an email to Support@NextStagePro.com