Onboard Localization and SLAM
At its core, our lab focuses on deploying and testing multi-agent robot algorithms in the most distributed way possible. A major part of any control system is measurement, which is often performed away from the robot. We are interested in measuring the TurtleBots and Parrot AR.Drone 2.0s through "onboard localization": any means of estimating a robot's position within its environment where the calculations are done by the robot itself.

The Parrot AR.Drone 2.0 is equipped with a front-facing and a bottom-facing camera. We use these cameras to localize the quadcopter in real time through a method called SLAM (Simultaneous Localization and Mapping) {link to slam page}. In particular, we use ORB-SLAM, a recently developed system from Raúl Mur-Artal, José María Martínez Montiel, and Juan Domingo Tardós of the University of Zaragoza. ORB-SLAM works by extracting ORB (Oriented FAST and Rotated BRIEF) features from OpenCV and tracking them in real time to build a point cloud of the environment and an estimate of the camera's location.
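To make this concrete, here is a minimal sketch of the feature-extraction and matching front end, using OpenCV's standard Python bindings. The frame filenames are hypothetical placeholders; on the drone, the images would arrive from the camera stream (e.g. a ROS image topic). This is only the front end of what ORB-SLAM does; the full system adds mapping, loop closing, and relocalization on top of it.

```python
import cv2

# Hypothetical frame files standing in for two consecutive
# images from the drone's front camera.
prev_frame = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr_frame = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Extract ORB keypoints and descriptors in each frame. ORB-SLAM
# uses this same feature type, which OpenCV exposes directly.
orb = cv2.ORB_create(nfeatures=1000)
kp_prev, des_prev = orb.detectAndCompute(prev_frame, None)
kp_curr, des_curr = orb.detectAndCompute(curr_frame, None)

# ORB descriptors are binary, so they are compared by Hamming
# distance; cross-checking discards asymmetric matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_prev, des_curr), key=lambda m: m.distance)

print(f"tracked {len(matches)} ORB features between frames")
```

Tracking these matched features across frames is what lets the algorithm triangulate map points and estimate the camera's motion.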
Capabilities
ORB-SLAM lets us do something that is often difficult in robotics: onboard localization. Previously, we performed localization through robust but limited means, such as fixed cameras in the lab. With onboard localization, we are no longer restricted to deploying the robots in the lab environment; we are restricted only by what ORB-SLAM is capable of. ORB-SLAM performs best in small- to medium-scale environments, where camera motion produces enough parallax among the ORB features to resolve their depth. Distant scenery and slowly moving objects (such as swaying trees or drifting clouds) may confuse the algorithm, but it is robust enough to use on a busy street.
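The parallax requirement can be illustrated with the standard two-view geometry step: given matched ORB features from two frames (continuing the matching sketch above), the essential matrix recovers the camera's relative rotation and direction of translation. This is a simplified sketch of the underlying geometry, not ORB-SLAM's actual tracking code, and the intrinsics matrix K below is a made-up placeholder that would come from camera calibration in practice.

```python
import numpy as np

# Hypothetical pinhole intrinsics for the drone's front camera;
# real values would come from a calibration procedure.
K = np.array([[560.0,   0.0, 320.0],
              [  0.0, 560.0, 180.0],
              [  0.0,   0.0,   1.0]])

# Pixel coordinates of the matched ORB keypoints from the
# previous sketch, as Nx2 float arrays.
pts_prev = np.float32([kp_prev[m.queryIdx].pt for m in matches])
pts_curr = np.float32([kp_curr[m.trainIdx].pt for m in matches])

# With sufficient parallax, the essential matrix constrains the
# relative camera pose; RANSAC rejects mismatched features.
E, inliers = cv2.findEssentialMat(pts_prev, pts_curr, K,
                                  method=cv2.RANSAC, threshold=1.0)

# Decompose E into rotation R and a unit-scale translation t.
_, R, t, _ = cv2.recoverPose(E, pts_prev, pts_curr, K, mask=inliers)
```

When the camera barely moves relative to the scene, as with very distant features, the parallax vanishes, the translation becomes unobservable, and estimates like this degrade. That is exactly why distant scenery is a weak point for the method.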