Microsoft's Kinect motion tracking technology has expanded beyond gaming into a number of other areas, most recently as a tool to help stroke victims. This week, the Massachusetts Institute of Technology released a new video showing one of the most interesting Kinect applications yet: a way for users to remotely interact with real-world objects via an interface that changes based on the user's movements.
The project is called inFORM and, as you can see in the video above, the Kinect hardware is placed above a person's head so it can track their hand and arm movements. That information is then sent to a remote location, where a table is fitted with a grid of square columns that can move up and down. The end result is that when the person moves their hands and arms, the Kinect camera sends that data to the table, and the columns rise and fall in sync with those movements.
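For readers curious what that pipeline might look like in code, here is a minimal sketch of the kind of mapping the article describes: a depth frame from the Kinect is downsampled to one value per column and converted into a pin height before being sent to the table. The grid size, depth range, pin travel and function names below are illustrative assumptions, not details of the actual inFORM system.

```python
# Hypothetical sketch of a depth-to-pin mapping; all constants are assumptions.
import numpy as np

PIN_GRID = (30, 30)      # assumed number of columns in the table's grid
DEPTH_NEAR_MM = 800      # assumed closest hand distance reported by the sensor
DEPTH_FAR_MM = 1500      # assumed farthest distance mapped onto the table
PIN_TRAVEL_MM = 100      # assumed vertical travel of each column

def depth_frame_to_pin_heights(depth_mm: np.ndarray) -> np.ndarray:
    """Downsample a Kinect depth frame (millimetres per pixel) to one height per pin."""
    rows, cols = PIN_GRID
    h, w = depth_mm.shape
    # Average the depth pixels that fall over each pin's cell.
    cells = depth_mm[: h - h % rows, : w - w % cols]
    cells = cells.reshape(rows, h // rows, cols, w // cols).mean(axis=(1, 3))
    # Closer hands (smaller depth) push pins higher; clamp to the pin's travel range.
    norm = (DEPTH_FAR_MM - cells) / (DEPTH_FAR_MM - DEPTH_NEAR_MM)
    return np.clip(norm, 0.0, 1.0) * PIN_TRAVEL_MM

# Example: a 480x640 frame at a constant 1.2 m produces a uniform mid-height grid.
frame = np.full((480, 640), 1200, dtype=np.float32)
heights = depth_frame_to_pin_heights(frame)  # shape (30, 30), values in mm
```

In a real system the resulting height grid would then be streamed to the table's actuator controllers; that transport layer is omitted here.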
The video shows the table's columns moving a ball around, but the system can also create other effects, such as the illusion of turning the pages of a book. The MIT researchers claim inFORM could have a number of practical applications, stating:
One area we are working on is Geospatial data, such as maps, GIS, terrain models and architectural models. Urban planners and Architects can view 3D designs physically and better understand, share and discuss their designs.
It's certainly a fascinating use of Kinect, and hopefully inFORM will graduate to become more than just a university research project.