The destructive results of natural disasters, such as the earthquake in Japan, are not subject to control. The Fukushima catastrophe has shown that robots do not yet match human capabilities in search and rescue. At the time of writing, a robotics competition is taking place whose goal is to promote research on the operation of robots in human environments, so that future disasters like the Fukushima catastrophe can be handled better by robots. In this paper we present the perspective of the dexterity group that worked as part of the Israeli team "ROBIL", which participated in DARPA's Virtual Robotics Challenge (VRC), a competition in which teams compete for the best score. The competition was run on a cloud-based simulator that monitored the physical behavior of a humanoid in an unknown simulated environment. We present a detailed description of dexterity modules such as motion planning and grasping, covering both theory and implementation using the ROS node framework. In addition, we present the integration between the modules of the dexterity and vision workgroups, which was conducted by creating a common language within the workgroups for defining the motion of objects in space. Finally, we discuss the results and conclusions we drew from the dexterity implementation and integration during the three main project phases: planning, qualification, and competition.
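A minimal sketch can illustrate what such a "common language" between the vision and dexterity workgroups might look like: a shared 6-DOF pose representation (position plus quaternion orientation), modeled loosely on ROS's `geometry_msgs/Pose` message. All class and function names below are hypothetical illustrations, not taken from the ROBIL codebase.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Shared 6-DOF pose: the common vocabulary both workgroups agree on."""
    # Position in meters, expressed in a common world frame.
    x: float
    y: float
    z: float
    # Orientation as a unit quaternion (x, y, z, w); identity by default.
    qx: float = 0.0
    qy: float = 0.0
    qz: float = 0.0
    qw: float = 1.0

def vision_detect_object() -> Pose:
    """Stand-in for the vision module: reports where an object is."""
    return Pose(x=0.6, y=-0.1, z=0.9)

def dexterity_plan_grasp(target: Pose) -> list:
    """Stand-in for the dexterity module: a trivial two-waypoint
    approach toward the reported pose (a real planner would do
    collision-aware motion planning)."""
    approach = Pose(x=target.x - 0.1, y=target.y, z=target.z)
    return [approach, target]

# Because both modules speak in Pose objects, their output and input
# compose directly, with no per-module conversion layer.
waypoints = dexterity_plan_grasp(vision_detect_object())
print(len(waypoints))
```

The point of the shared type is that either workgroup can change its internals without breaking the interface, as long as both continue to exchange poses in the agreed world frame.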
|Conference|2013 IEEE International Symposium on Safety, Security, and Rescue Robotics, SSRR 2013|
|Period|21/10/13 → 26/10/13|