We present three approaches to ego-motion estimation from Time-of-Flight (ToF) camera data. Ego-motion estimation is the process of estimating a camera's pose (position and orientation) relative to some initial pose from the camera's image sequence, and it facilitates robot localisation. The ToF camera's measurements are characterised by a number of error models. We apply Iterative Closest Point (ICP) to consecutive ToF range images to estimate the relative pose transform between frames, which is accumulated for ego-motion estimation; two ICP variants are implemented, namely point-to-point and point-to-plane. We also implement a feature-based approach that detects and tracks SIFT features in the amplitude images and uses their corresponding 3D points to estimate the relative transformation. These approaches are evaluated against ground-truth data provided by a motion capture system (Vicon). The SIFT-based ego-motion estimation was found to run faster than the ICP-based methods.
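At the core of both point-to-point ICP and the feature-based approach is the same sub-problem: recovering the rigid transform that best aligns matched 3D point pairs from two frames. The paper does not give its implementation; the following is a minimal sketch of the standard closed-form SVD solution (Kabsch/Umeyama), with illustrative names, shown here only to clarify that shared step.

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ≈ R @ src + t,
    given corresponding 3D points. src, dst: (N, 3) arrays of matched pairs.
    This is the closed-form SVD (Kabsch/Umeyama) solution, an assumption
    about the alignment step, not the authors' exact implementation."""
    src_c = src.mean(axis=0)          # centroid of the source points
    dst_c = dst.mean(axis=0)          # centroid of the destination points
    # Cross-covariance of the centred point sets
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Reflection correction keeps R a proper rotation (det(R) = +1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

In point-to-point ICP this solve is iterated with nearest-neighbour matching, while the SIFT-based approach obtains the correspondences once from tracked features, which is one reason it can be faster.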
Reference:
Ratshidaho, T., Tapamo, J.R., Claassens, J. and Govender, N. (2012). ToF camera ego-motion estimation. 4th CSIR Biennial Conference: Real problems relevant solutions, CSIR, Pretoria, 9-10 October 2012. http://hdl.handle.net/10204/6266