Top humanoid robot child: iCub

iCub is a robot child developed in a project coordinated by IIT-Istituto Italiano di Tecnologia. It is a standardized, open-source platform for research on embodied artificial intelligence (AI). iCub can crawl on all fours, sit up, balance, walk, interact physically with the environment and recognize objects. It is one of the few robots in the world with a sensitive full-body electronic skin that provides a sense of "touch".
Over the last 10 years, robotics research has benefited from a standardized open-source platform for research on embodied artificial intelligence (AI): the humanoid robot iCub. Created in Italy, it is available today in laboratories across Europe, the U.S., South Korea, Singapore and Japan, and more than 100 researchers worldwide contribute to developing its skills. Researchers at IIT-Istituto Italiano di Tecnologia discussed the importance of such a research platform in a paper published today in Science Robotics.
"iCub: the not-yet finished story of building a robot child" (Natale et al.) reviews the evolution of the iCub robot from its origins to date. It shows how hardware and software co-evolved according to research needs, and highlights the benefit for roboticists: reusing and improving one another's results toward the goal of creating more intelligent machines.
iCub was developed in 2004 as part of a European project coordinated by IIT, and the researchers have since released three versions: iCub 1.0, iCub 2.0 and iCub 3.0. The platform integrates state-of-the-art robotics research results, and research activities span the whole spectrum of AI-related areas, from control to machine learning, human-robot interaction, and language acquisition.
“In our review paper, we emphasize what we’ve learned so far by working on building a community of open source roboticists. Researchers needed a standard platform for humanoid robotics. Reusing software created by others, replicating experiments, and benchmarking are important elements of robotics research. The standard platform is the enabler,” said Giorgio Metta, vice scientific director at IIT-Istituto Italiano di Tecnologia and coordinator of the iCub Facility. “Hardware and software co-evolution is fundamental to keep both at the state of the art.”
iCub is one of the few robots in the world with a sensitive full-body electronic skin system to provide the sense of touch. The skin system consists of more than 4000 conformable sensors.
The current humanoid is the size of a five-year-old child and can crawl on all fours, sit up, balance, walk, interact physically with the environment and recognize objects.
The first iCub design (iCub 1.0) focused on the hands and their manipulation skills. The second version (iCub 2.0) targeted whole-body control: the legs gained elastic actuators and larger feet to improve walking stability; stereoscopic vision and faster eye and head movement allowed better visual perception of the world; and the skin system was integrated, providing the iCub with more than 4000 conformable sensors. This version made it possible to investigate human-robot interaction (HRI) in terms of joint attention in the form of gaze-cueing effects, and to develop whole-body controllers that enabled the robot to interact physically with the environment in tasks such as balancing, getting up, push recovery, and walking. The third and most recent version (iCub 3.0) further improved the legs to generate more natural steps and integrated two event-driven cameras to optimize vision and object recognition.
"iCub version 3.0 is in the making, but we are planning yet another revision, originating from requests to use the robot in the clinical setting, in particular to design HRI experiments and training for children diagnosed with autism spectrum disorder (ASD). Here, we need robustness because of the expected continuous physical interaction with children. Hence, we are designing new, simpler hands and a user-friendly interface for therapists to set up experiments," said Giorgio Metta.
iCub is able to stand up from a seat and balance on its legs while sensing human interaction.
During the development of the iCub, the software evolved as well. Presently, its middleware, YARP, consists of about 400,000 lines of code, providing the basic robot interfaces and extending to kinematics, dynamics and vision libraries. The repositories amount to more than 4 million lines of code. In the past year, more than 160 developers actively contributed by writing new code, debugging, and producing new complex experiments with the iCub, making the iCub project one of the largest open-source teams in the world.
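YARP's core idea is that robot modules communicate through named ports rather than direct function calls, so a vision module and a motor module can run as separate processes and be rewired at runtime. The sketch below is purely illustrative and does not use the real YARP API (whose actual classes, such as ports and network objects, are documented separately); it only mimics the named-port publish/subscribe pattern with hypothetical classes.

```python
# Illustrative sketch of YARP-style named ports (NOT the real YARP API):
# modules publish to and read from ports identified by string names,
# and a name server wires them together.

class Network:
    """Toy name server mapping port names to port objects."""
    _ports = {}

    @classmethod
    def register(cls, name, port):
        cls._ports[name] = port

    @classmethod
    def connect(cls, src, dst):
        # Route every message written to `src` into `dst`'s inbox.
        cls._ports[src].subscribers.append(cls._ports[dst])


class Port:
    """A named endpoint that forwards written messages to subscribers."""
    def __init__(self, name):
        self.name = name
        self.subscribers = []
        self.inbox = []
        Network.register(name, self)

    def write(self, message):
        for sub in self.subscribers:
            sub.inbox.append(message)

    def read(self):
        return self.inbox.pop(0) if self.inbox else None


# Usage: a camera module publishes frames to a vision module by port name.
camera = Port("/icub/cam/left")
vision = Port("/vision/in")
Network.connect("/icub/cam/left", "/vision/in")
camera.write({"frame_id": 1})
```

The point of the pattern is decoupling: the camera module never holds a reference to the vision module, only to its own port name, which is what lets large teams reuse each other's modules.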
Scientists predict that in the future robots will interact with us more and more in daily life.
A new research project underway in Italy is developing behavioural models that can be applied to humanoid robots and make them our partners in the workplace.
The AnDy Project’s field of application ranges from customer service, to healthcare, to the industrial sector.
Sensors
To ensure efficient collaboration, researchers at the Italian Institute of Technology (IIT) in Genoa are developing hardware and software that allow humanoid machines to evaluate and predict human behaviour.
Daniele Pucci is Principal Investigator at the AnDy Project:
“The robot has several sensors to understand how the human is moving,” he explains. “The presence of the human being is detected, first, by sight. Secondly, during the interaction, the robot is able to sense contact with the human being through his ‘skin’. Then, to allow the robot to be aware of the human’s actions, it needs to be equipped with sensors”.
These sensors are integrated into a special high-tech suit worn by the human subject. The sensors can detect human movements and share that information with the robot in a fraction of a second. The robot can then react almost in real-time.
"An algorithm calculates the intensity of our effort, what we call the human dynamics, and this information is transmitted to the robot," says Claudia Latella, a PhD Fellow at the Institute. "In the future we imagine that the robot will even be able to predict our movements and thus help us perform common actions."
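The AnDy algorithm itself is not described in the article, but the idea of turning wearable-sensor readings into an "effort" signal can be sketched with a textbook rigid-body model. The following is a minimal illustration under assumed parameters (a single elbow-like joint, a point-mass forearm), not the project's actual human-dynamics estimator.

```python
import math

# Illustrative sketch (NOT the AnDy algorithm): estimate the torque
# ("effort") at a single elbow-like joint from wearable-sensor readings,
# using the rigid-body relation  tau = I * alpha + m*g*l*cos(angle).

def joint_effort(angle_rad, angular_accel, mass_kg=1.5, length_m=0.3):
    """Torque required at the joint for a point-mass forearm model.

    angle_rad:     joint angle measured by the suit (0 = horizontal)
    angular_accel: angular acceleration from the suit's IMUs (rad/s^2)
    """
    g = 9.81
    inertia = mass_kg * length_m ** 2                     # point mass at distance l
    gravity_torque = mass_kg * g * length_m * math.cos(angle_rad)
    return inertia * angular_accel + gravity_torque


# A stream of suit measurements (joint angle, angular acceleration) can be
# mapped to an effort value each control cycle and sent to the robot:
readings = [(0.0, 0.0), (0.5, 2.0), (1.2, -1.0)]
efforts = [joint_effort(angle, accel) for angle, accel in readings]
```

In a real system this per-joint computation would run for the whole body at sensor rate, which is what allows the robot to react "almost in real-time" as described above.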
The name of the test robot is iCub. While learning from humans how to move, iCub can be helpful across a range of human activities: manufacturing, health-care and assisted living.
The size and range of movement of each robot will be adapted to its particular function.
“The vision that we have of the final application of this type of robot is the personal assistant, which can be adapted in different ways, from the rehabilitation robot, to the one for assisted living,” says Giorgio Metta, Vice Scientific Director at IIT. “Obviously, the technologies we have developed for this kind of robot can also be used in the industrial sector”.
Another feature under development is iCub’s ability to record additional information – such as recognising a new object – through vocal instruction, without the direct intervention of a programmer.
"Since we can talk with the robot, I can tell it, for example: 'Look at this, this is a smartphone', which allows the user to add new things the robot needs to know," Metta explains. "The robot acquires the image and builds a new category by itself: the category of the smartphone."
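One simple way to realise this kind of teaching by example is to store a feature vector for each labelled image and classify new views by nearest neighbour. The sketch below is a hypothetical illustration of that idea, not iCub's actual recognition pipeline; in particular, the `features` function here is a trivial stand-in (mean colour per channel) for a learned visual embedding.

```python
# Illustrative sketch: building a new object category from a spoken label
# plus one captured image, then recognising it by nearest neighbour.
# The feature extractor is a stand-in for a real visual embedding.

from collections import defaultdict


def features(image):
    """Hypothetical feature vector: mean value per RGB channel.

    `image` is a list of (r, g, b) pixel tuples.
    """
    n = len(image)
    return tuple(sum(px[c] for px in image) / n for c in range(3))


class CategoryMemory:
    def __init__(self):
        self.examples = defaultdict(list)   # label -> list of feature vectors

    def teach(self, label, image):
        """'Look at this, this is a smartphone' adds one labelled example."""
        self.examples[label].append(features(image))

    def recognize(self, image):
        """Return the label of the nearest stored example (1-NN)."""
        f = features(image)
        best_label, best_dist = None, float("inf")
        for label, vectors in self.examples.items():
            for v in vectors:
                dist = sum((a - b) ** 2 for a, b in zip(f, v))
                if dist < best_dist:
                    best_label, best_dist = label, dist
        return best_label


# Usage: the spoken label pairs with the acquired image in one teach() call.
memory = CategoryMemory()
memory.teach("smartphone", [(10, 10, 10), (12, 12, 12)])
memory.teach("cup", [(200, 180, 160)])
```

The appeal of this scheme is that adding a category needs no programmer intervention: a single utterance plus a single image extends the robot's vocabulary.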
Researchers working on this European project are also developing facial expressions for the humanoid, to allow it to be more empathetic with potential partners, patients or elderly people.
“We need to integrate the cognitive abilities of the robot — the fact that he recognises the presence of the human being — with his motor skills, the fact that he is able to walk, collaborate and interact with human beings. We expect to reach this goal in the next 10 to 15 years,” says Pucci.
According to the researchers, future human-robot collaboration will focus on assisting humans in the workplace; robots won't completely replace their human colleagues.
He has the friendly, inquisitive face of a child and is about the same height. He can get up, keep his balance while standing on one leg, detect the presence of people and wave his hand: these are just some of the capabilities of the robot iCub. He could be useful across a range of activities: manufacturing, health-care and helping older people live at home for longer.
The AnDy project runs for four years, until the end of 2020, with a budget of close to 4 million euros fully funded by the EU.