This book presents interdisciplinary research that pursues the mutual enrichment of neuroscience and robotics. Building on experimental work, and on the extensive literature regarding the two cortical pathways of visual processing - the dorsal and ventral streams - we define and implement, both computationally and on a real robot, a functional model of the brain areas involved in vision-based grasping actions. Grasping remains a largely unsolved problem in robotics, and we show how a bio-inspired approach successfully addresses some fundamental issues of the task. Our robotic system can safely perform grasping actions on a variety of unmodeled objects, demonstrating especially reliable visual and visuomotor skills. The computational model and the robotic experiments help validate theories about the mechanisms employed by the brain areas most directly involved in grasping actions. This book offers new insights and research hypotheses regarding these mechanisms, particularly concerning the interaction between the dorsal and ventral streams. Moreover, it helps establish a common research framework for neuroscientists and roboticists studying brain functions.
This book is devoted to the development of adequate spatial representations for robot motion planning. Drawing upon advanced heuristic techniques from AI and computational geometry, the authors introduce a general model for the spatial representation of physical objects. This model is then applied to two key problems in intelligent robotics: collision detection and motion planning. Throughout, the application to actual robot arms is kept in mind, rather than dealing only with simplified models. This monograph builds upon Angel del Pobil's PhD thesis, which won the 1992 Award of the Spanish Royal Academy of Doctors.