Using the GUI program I introduced in the last post, we can acquire the RGB stream, the depth stream, and a real-time 3D reconstructed stream.
Also, by moving the device, we can build the 3D reconstruction in real time, as shown in the following demo. In other words, the 3D point cloud stitching process for Kinect data can be done in real time.
Note that the 3D stitching duration (i.e., the number of frames) is controllable.
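For readers curious how this kind of incremental stitching can look in code, here is a minimal sketch using PCL's ICP registration. It is only an illustration under my assumptions (the `stitchFrame` helper and the `maxFrames` limit are hypothetical names), not the actual implementation inside the GUI program.

```cpp
// Minimal sketch: incremental point cloud stitching with PCL's ICP.
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/registration/icp.h>

using CloudT = pcl::PointCloud<pcl::PointXYZRGB>;

// Align a newly grabbed Kinect frame to the accumulated model and merge it.
// Returns true if ICP converged and the frame was stitched in.
// (stitchFrame is a hypothetical helper for illustration only.)
bool stitchFrame(const CloudT::Ptr& frame, const CloudT::Ptr& model)
{
    pcl::IterativeClosestPoint<pcl::PointXYZRGB, pcl::PointXYZRGB> icp;
    icp.setInputSource(frame);   // new frame from the moving device
    icp.setInputTarget(model);   // point cloud accumulated so far

    CloudT aligned;
    icp.align(aligned);          // estimate and apply the rigid transform

    if (!icp.hasConverged())
        return false;

    *model += aligned;           // append the aligned points to the model
    return true;
}

// Calling stitchFrame() for each incoming frame in the capture loop and
// stopping after a user-chosen maxFrames would give the controllable
// stitching length mentioned above (maxFrames is a hypothetical parameter).
```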
Thanks for reading.
I will come back with more interesting topics next time :)
FYI, you can get the above GUI program from the link below.
Download Live 3D Point cloud for Kinect Demo
Wanbin Song