The neural circuits that control grasping and perform related visual processing have been studied extensively in macaque monkeys. We are developing a computational model of this system in order to better understand its function and to explore applications to robotics. We recently modelled the neural representation of three-dimensional object shapes, and we are currently extending the model to produce hand postures so that it can be tested on a robot. To train the extended model, we are developing a large database of object shapes and corresponding feasible grasps. Finally, further extensions are needed to account for the influence of higher-level goals on hand posture. This is essential because the same object must often be grasped in different ways for different purposes. The present paper focuses on a method of incorporating such higher-level goals. A proof-of-concept implementation exhibits several important behaviours, such as choosing among multiple approaches to the same goal.
Reference:
Kleinhans, A., Thill, S., Rosman, B. S., Detry, R., & Tripp, B. (2014). Modelling primate control of grasping for robotics applications. In: European Conference on Computer Vision (ECCV) Workshops, Zurich, Switzerland, 7 September 2014. IEEE. http://hdl.handle.net/10204/8099
Due to copyright restrictions, the attached PDF file only contains the abstract of the full text item. For access to the full text item, please consult the publisher's website.