Kinect for Windows implementing hand gestures and 3D mapping

Some great things are in store for Kinect for Windows. Microsoft is close to releasing a hand-gesture recognition feature in its Kinect software that will let users manipulate their PCs far more efficiently. The new capability allows the Kinect to accurately detect and track hand movements, and it opens new possibilities for developers who want to build hand gestures into their apps and games.

So far, only a couple of simple hand gestures have been shown. There's the pinch-to-zoom capability we should all be familiar with: a Kinect developer demonstrated it in a maps application, gripping with both hands and zooming in and out with the corresponding movements. Rotating his arms rotated the map as well. The grip gesture was also shown controlling actions in games like Jetpack Joyride.
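The math behind a two-handed zoom-and-rotate gesture is straightforward: track the line between the two gripped hands, and map its change in length to zoom and its change in direction to rotation. Here's a minimal, library-agnostic sketch of that idea; the function name and coordinate convention are illustrative assumptions, not Kinect SDK calls.

```python
import math

def zoom_and_rotation(left_then, right_then, left_now, right_now):
    """Given (x, y) positions of two gripped hands at two moments,
    derive a zoom factor and a rotation angle for the view."""
    def span(a, b):
        return (b[0] - a[0], b[1] - a[1])

    d_then = span(left_then, right_then)
    d_now = span(left_now, right_now)

    # Hands moving apart -> ratio > 1 -> zoom in
    zoom = math.hypot(*d_now) / math.hypot(*d_then)

    # Rotation of the line joining the hands, in radians
    angle = math.atan2(d_now[1], d_now[0]) - math.atan2(d_then[1], d_then[0])
    return zoom, angle
```

For example, hands starting one unit apart and ending two units apart give a zoom factor of 2.0; swinging the right hand from beside the left to above it gives a quarter-turn rotation.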

Alongside hand-gesture recognition, Microsoft discussed Kinect Fusion. Kinect Fusion analyzes the surrounding environment and the objects in it to generate a 3D map: users can hold the Kinect and scan objects or rooms to build 3D models. The feature should prove very useful for 3D printing, scanning buildings, and scanning bodies. It may even provide a new road map for gaming and augmented-reality apps.
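The core idea behind this kind of scanning is that many noisy depth readings of the same surface, taken from different viewpoints, are merged into a single volumetric model, so the reconstruction gets smoother the longer you scan. A toy sketch of that per-voxel averaging, with illustrative names rather than the actual Fusion API:

```python
class DepthFusion:
    """Toy volumetric fusion: keep a running average of the depth
    samples that land in each voxel of a 3D grid."""

    def __init__(self):
        self.values = {}   # voxel -> averaged signed distance to surface
        self.weights = {}  # voxel -> number of samples integrated so far

    def integrate(self, voxel, signed_distance):
        w = self.weights.get(voxel, 0)
        v = self.values.get(voxel, 0.0)
        # Weighted running average: new samples refine the estimate
        # instead of replacing it, which suppresses sensor noise.
        self.values[voxel] = (v * w + signed_distance) / (w + 1)
        self.weights[voxel] = w + 1

fusion = DepthFusion()
for sample in (0.9, 1.1, 1.0):  # three noisy views of the same point
    fusion.integrate((4, 2, 7), sample)
# The averaged estimate converges toward the true distance, 1.0
```

A real pipeline also has to align each new frame with the model before integrating (camera tracking), but the averaging step above is what turns a stream of noisy frames into a clean mesh-ready volume.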

Both of these features will be included in an upcoming SDK, most likely to be released later this month, and developers can then get to work implementing them and bringing a whole new generation of apps to Kinect for Windows. Combined with the yet-to-be-announced next-gen Kinect sensor, they should bring a whole new user experience to PC users.

[via The Verge]