dc.contributor.author | Ratshidaho, T |
dc.contributor.author | Tapamo, JR |
dc.contributor.author | Claassens, J |
dc.contributor.author | Govender, Natasha |
dc.date.accessioned | 2012-11-01T08:17:38Z |
dc.date.available | 2012-11-01T08:17:38Z |
dc.date.issued | 2012-10 |
dc.identifier.citation | Ratshidaho, T, Tapamo, JR, Claassens, J and Govender, N. ToF camera ego-motion estimation. 4th CSIR Biennial Conference: Real problems relevant solutions, CSIR, Pretoria, 9-10 October 2012 | en_US
dc.identifier.uri | http://hdl.handle.net/10204/6266 |
dc.description | 4th CSIR Biennial Conference: Real problems relevant solutions, CSIR, Pretoria, 9-10 October 2012 | en_US
dc.description.abstract | We present three approaches to ego-motion estimation using Time-of-Flight (ToF) camera data. Ego-motion estimation is the process of estimating a camera’s pose (position and orientation) relative to some initial pose from the camera’s image sequences, and it facilitates robot localisation. The ToF camera is characterised by a number of error models. Iterative Closest Point (ICP) is applied to consecutive range images from the ToF camera to estimate the relative pose transform used for ego-motion estimation. We implemented two variants of ICP, namely point-to-point and point-to-plane. A feature-based ego-motion approach that detects and tracks SIFT features in the amplitude images and uses their corresponding 3D points to estimate the relative transformation is also implemented. These approaches are evaluated against ground-truth data provided by a motion capture system (Vicon). The SIFT-based ego-motion estimation was found to run faster than the ICP-based methods. | en_US
dc.language.iso | en | en_US
dc.subject | Ego-motion | en_US
dc.subject | Time-of-Flight | en_US
dc.subject | ToF | en_US
dc.subject | Autonomous robots | en_US
dc.subject | Iterative Closest Point | en_US
dc.subject | ICP | en_US
dc.title | ToF camera ego-motion estimation | en_US
dc.type | Conference Presentation | en_US
dc.identifier.apacitation | Ratshidaho, T., Tapamo, J., Claassens, J., & Govender, N. (2012). ToF camera ego-motion estimation. http://hdl.handle.net/10204/6266 | en_ZA
dc.identifier.chicagocitation | Ratshidaho, T, JR Tapamo, J Claassens, and Natasha Govender. "ToF camera ego-motion estimation." (2012): http://hdl.handle.net/10204/6266 | en_ZA
dc.identifier.vancouvercitation | Ratshidaho T, Tapamo JR, Claassens J, Govender N. ToF camera ego-motion estimation; 2012. http://hdl.handle.net/10204/6266 | en_ZA
dc.identifier.ris |
TY - Conference Presentation
AU - Ratshidaho, T
AU - Tapamo, JR
AU - Claassens, J
AU - Govender, Natasha
AB - We present three approaches to ego-motion estimation using Time-of-Flight (ToF) camera data. Ego-motion estimation is the process of estimating a camera’s pose (position and orientation) relative to some initial pose from the camera’s image sequences, and it facilitates robot localisation. The ToF camera is characterised by a number of error models. Iterative Closest Point (ICP) is applied to consecutive range images from the ToF camera to estimate the relative pose transform used for ego-motion estimation. We implemented two variants of ICP, namely point-to-point and point-to-plane. A feature-based ego-motion approach that detects and tracks SIFT features in the amplitude images and uses their corresponding 3D points to estimate the relative transformation is also implemented. These approaches are evaluated against ground-truth data provided by a motion capture system (Vicon). The SIFT-based ego-motion estimation was found to run faster than the ICP-based methods.
DA - 2012-10
DB - ResearchSpace
DP - CSIR
KW - Ego-motion
KW - Time-of-Flight
KW - ToF
KW - Autonomous robots
KW - Iterative Closest Point
KW - ICP
LK - https://researchspace.csir.co.za
PY - 2012
T1 - ToF camera ego-motion estimation
TI - ToF camera ego-motion estimation
UR - http://hdl.handle.net/10204/6266
ER -
|
en_ZA |