Dr Christian Rauch
Thesis Title: Visual Articulated Tracking in Cluttered Environments
Examiners: Dr Hyung Jin Chang (University of Birmingham) and Prof. Bob Fisher (University of Edinburgh)
Manipulation tasks performed by humanoid robots often require precise estimation of the state of the manipulator and the object to be manipulated. This is particularly true for small or detailed objects. Using information provided by the robot's visual sensors enables more precise estimation than relying exclusively on kinematics. The project will explore state-of-the-art approaches to object tracking and pose estimation in the context of humanoid manipulation. The algorithms need to be adapted to the provided hardware platform, and different sensor modalities need to be taken into account. Drawbacks of these approaches will be identified, and solutions will be proposed and developed. Major concerns when using depth vision systems during manipulation tasks are self-occlusion by the manipulator, reflections, and the handling of ambiguous objects. In particular, tracking performance will need to be evaluated for feature-poor objects, such as those with little 3D structure or textureless surfaces.
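
The abstract argues that fusing visual observations with kinematics yields more precise state estimates than kinematics alone. The sketch below is a minimal illustration of one common way to realise this idea, a scalar Kalman-style update that combines a joint-angle prior from the kinematic model with a vision-based measurement; it is not taken from the thesis, and all names and numerical values are assumptions made purely for illustration.

    # Minimal illustrative sketch (not from the thesis): fusing a kinematic
    # joint-angle prior with a visual measurement via a scalar Kalman update.
    # All variable names and numbers are assumed for illustration only.

    def fuse(prior_mean: float, prior_var: float,
             meas_mean: float, meas_var: float) -> tuple[float, float]:
        """Combine a kinematic prior and a visual measurement of one joint angle.

        The gain weights the measurement by how uncertain the prior is relative
        to the measurement; the fused variance is smaller than either input
        variance, which is why adding vision sharpens the estimate.
        """
        gain = prior_var / (prior_var + meas_var)
        fused_mean = prior_mean + gain * (meas_mean - prior_mean)
        fused_var = (1.0 - gain) * prior_var
        return fused_mean, fused_var


    if __name__ == "__main__":
        # Assumed example: joint encoders report an elbow angle of 0.50 rad
        # (variance 0.02); a vision-based tracker reports 0.46 rad (variance 0.01).
        mean, var = fuse(0.50, 0.02, 0.46, 0.01)
        print(f"fused estimate: {mean:.3f} rad, variance {var:.4f}")

In this toy setting the fused estimate lies closer to the (less uncertain) visual measurement, and its variance drops below both inputs, which mirrors the claim that visual sensing improves on kinematics-only estimation.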