A PhD student at MIT’s Personal Robotics Group has combined the Kinect motion controller with an iRobot Create platform to create a battery-powered robot that can see its environment and obey gestured commands. He used simultaneous localization and mapping (SLAM) code from OpenSLAM.org, visualization packages from the Mobile Robot Programming Toolkit, and his own interaction, human-detection, and gesture code.
The robot can generate detailed 3D maps of its surroundings and wirelessly send them to a host computer. It can also detect nearby humans and track their movements to understand where they want it to go.
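The 3D maps come from back-projecting each Kinect depth frame into a point cloud using the camera's pinhole intrinsics. As a rough illustration only (the intrinsic values below are placeholders, not the project's actual calibration, and the real pipeline runs through the SLAM and MRPT code mentioned above), the core depth-to-point conversion looks something like this:

```python
import numpy as np

# Placeholder pinhole intrinsics, roughly Kinect-like (hypothetical values,
# not taken from the project's calibration)
FX, FY = 594.2, 591.0   # focal lengths in pixels
CX, CY = 339.5, 242.7   # principal point in pixels

def depth_to_points(depth_m):
    """Back-project a depth image (in metres) into a 3D point cloud.

    A pixel (u, v) with depth z maps to camera-frame coordinates:
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop pixels with no depth reading

# Example: a flat wall 2 m away filling the whole 640x480 frame
depth = np.full((480, 640), 2.0)
cloud = depth_to_points(depth)
```

Each frame's cloud is then registered against previous frames by the SLAM code to build a consistent map, which is what gets streamed to the host computer.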
Kinect has grabbed the attention of hackers trying to enable its motion-sensing capabilities in environments that don’t include an Xbox. The result has been open source Kinect drivers, multitouch capabilities, interaction with Windows 7 and Mac OS X, and 3D camera applications.