Authors: Jürgen Leitner; Mikhail Frank; Alexander Förster and Jürgen Schmidhuber
        
        
Affiliation: Dalle Molle Institute for Artificial Intelligence (IDSIA) / SUPSI / USI, Switzerland
        
        
        
        
        
Keyword(s): Humanoid Robot, Robot Control, Object Manipulation, Reactive Control, Collision Avoidance, Robot Perception, Eye-hand Coordination, Computer Vision.
        
        
            
Related Ontology Subjects/Areas/Topics: Evolutionary Computation and Control; Humanoid Robots; Informatics in Control, Automation and Robotics; Intelligent Control Systems and Optimization; Robotics and Automation; Vision, Recognition and Reconstruction
            
        
        
            
Abstract:
We propose a system featuring tight integration between the computer vision and robot control modules of a complex, high-DOF humanoid robot. Its functionality is showcased by having our iCub humanoid robot pick up objects from a table in front of it. An important feature is that the system can avoid obstacles (other objects detected in the visual stream) while reaching for the intended target object. Our integration also supports non-static environments, i.e., the reaching motion is adapted on the fly from the visual feedback received, e.g., when an obstacle is moved into the trajectory. Furthermore, we show that this system can be used in both autonomous and tele-operation scenarios.
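The abstract does not detail the controller itself. As a generic, hypothetical illustration of the on-the-fly adaptation it describes (not the paper's actual method), the sketch below computes each reaching step from the current obstacle positions using a simple potential-field rule, so moving an obstacle between steps immediately deflects the remaining trajectory. All function names, gains, and radii here are assumptions for illustration only.

```python
import numpy as np

def reactive_step(pos, target, obstacles, k_att=1.0, k_rep=0.5,
                  influence=0.3, step=0.05):
    """One illustrative control step: attract toward the target,
    repel from any obstacle inside its influence radius.
    (Hypothetical sketch, not the controller used in the paper.)"""
    # Attractive term pulls the end-effector toward the target.
    force = k_att * (target - pos)
    # Repulsive term pushes away from each nearby obstacle.
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 1e-9 < d < influence:
            force += k_rep * (1.0 / d - 1.0 / influence) * diff / d**3
    norm = np.linalg.norm(force)
    if norm > 1e-9:
        pos = pos + step * force / norm
    return pos

# Because obstacles are re-read on every call, a moving obstacle
# changes the commanded direction at the very next step.
pos = np.array([0.0, 0.0])
target = np.array([1.0, 0.0])
obstacles = [np.array([0.45, 0.02])]
pos = reactive_step(pos, target, obstacles)
```

Potential fields are only one of several reactive schemes compatible with the behavior the abstract describes; the key property shown is that the motion command is recomputed from live perception rather than executed from a fixed pre-planned trajectory.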