Humanoid robot
A humanoid robot is a robot whose overall appearance is based on that of the human body. In general, humanoid robots have a torso with a head, two arms and two legs, although some model only part of the body, for example from the waist up. Some humanoid robots also have a 'face', with 'eyes' and a 'mouth'. Androids are humanoid robots built to closely resemble humans.
Introduction
A humanoid robot is an autonomous robot: it can adapt to changes in its environment or in itself and continue to reach its goal. This is the main difference between humanoids and other kinds of robots, such as industrial robots, which perform tasks in highly structured environments. In this context, the capabilities of a humanoid robot may include, among others:
- self maintenance (recharge itself, swap batteries…)
- autonomous learning (learn or gain new capabilities without outside assistance, adjust strategies based on the surroundings and adapt to new situations)
- avoiding harmful situations to people, property and itself
- interacting safely with human beings and the environment
Like other mechanical robots, humanoids are built from the same basic components: sensing, actuation, and planning and control. Since they try to simulate human structure and behaviour, and since they are autonomous systems, humanoid robots are usually more complex than other kinds of robots.
This complexity affects all robotic scales (mechanical, spatial, time, power density, system and computational complexity), but it is most noticeable on the power-density and system-complexity scales. First, current humanoids are not strong enough even to jump, because their power-to-weight ratio is not as good as that of the human body. Second, although there are very good algorithms for the several areas of humanoid construction, it is very difficult to merge them all into one efficient system (the system complexity is very high). These are currently the main difficulties that humanoid robot development has to deal with.
Humanoid robots are created to imitate some of the physical and mental tasks that humans perform daily. Scientists and specialists from many different fields, including engineering, cognitive science and linguistics, combine their efforts to create a robot as human-like as possible. Their creators' goal is that the robot will one day be able to understand human intelligence, and to reason and act like humans. If humanoids achieve this, they could eventually work alongside humans. Another important benefit of developing androids is to understand the human body's biological and mental processes, from the seemingly simple act of walking to the concepts of consciousness and spirituality.
There are currently two essential ways to model a humanoid robot. The first models the robot as a set of rigid links connected by joints, a structure similar to that of industrial robots. Although this approach is used for most humanoid robots, a new one is emerging in research that draws on knowledge acquired in biomechanics; there, the humanoid robot's structure is modelled on the human skeleton.
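The rigid-link view lends itself to a simple calculation: the pose of the end of a chain of links follows from accumulating each joint's rotation. The sketch below is an illustrative planar two-link example, not the kinematics of any particular robot:

```python
import math

def forward_kinematics(lengths, angles):
    """Planar forward kinematics for a chain of rigid links.

    Link i has length lengths[i] and a revolute joint rotated by
    angles[i] (radians) relative to the previous link. Returns the
    (x, y) position of the end of the chain.
    """
    x, y, theta = 0.0, 0.0, 0.0
    for length, angle in zip(lengths, angles):
        theta += angle                  # accumulate joint rotations
        x += length * math.cos(theta)   # advance along the rotated link
        y += length * math.sin(theta)
    return x, y

# Two unit-length links with both joints at 90 degrees:
# the arm goes straight up, then doubles back to (-1, 1).
print(forward_kinematics([1.0, 1.0], [math.pi / 2, math.pi / 2]))
```

Real humanoids extend the same idea to three dimensions with many more joints, usually via homogeneous transformation matrices.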
Purpose
Humanoid robots are used as a research tool in several scientific areas.
Researchers need to understand the human body's structure and behaviour (biomechanics) to build and study humanoid robots; conversely, the attempt to simulate the human body leads to a better understanding of it.
Human cognition is a field of study which is focused on how humans learn from sensory information in order to acquire perceptual and motor skills. This knowledge is used to develop computational models of human behaviour and it has been improving over time.
Although the initial aim of humanoid research was to build better orthoses and prostheses for human beings, knowledge has since been transferred between the two disciplines. A few examples are powered leg prostheses for the neuromuscularly impaired, the ankle-foot orthosis, biologically realistic leg prostheses and forearm prostheses.
Besides research, humanoid robots are being developed to perform human tasks such as personal assistance, where they should be able to assist the sick and elderly, and dirty or dangerous jobs. Ordinary jobs, like being a receptionist or a worker on an automotive assembly line, are also suitable for humanoids.
They are also becoming increasingly popular as entertainers. For example, Ursula, a female robot, sings, dances and speaks to her audiences at Universal Studios. Several Disney theme park shows employ animatronic robots that look, move and speak much like human beings. Although these animatronics look so realistic that from a distance it can be hard to tell whether they are human, they have no cognition or physical autonomy.
Timeline of Humanoid Robots
- 1495 — Leonardo da Vinci designs a humanoid automaton that looks like an armored knight, known as Leonardo’s robot. [1]
- 1738 — Jacques de Vaucanson builds The Flute Player, a life-size figure of a shepherd that could play twelve songs on the flute and The Tambourine Player that played a flute and a drum or tambourine. [2]
- 1774 — Pierre Jacquet-Droz and his son Henri-Louis created the Draughtsman, the Musicienne and the Writer, a figure of a boy that could write messages up to 40 characters long. [3]
- 1921 — Czech writer Karel Čapek introduced the word "robot" in his play R.U.R. (Rossum's Universal Robots). The word comes from "robota", meaning "forced labour, drudgery" in Czech. [4]
- 1969 — D. E. Whitney published his article "Resolved motion rate control of manipulators and human prostheses".
- 1970 — Miomir Vukobratović proposed the Zero Moment Point, a theoretical model to explain biped locomotion. [5]
- 1972 — Miomir Vukobratović and his associates at the Mihajlo Pupin Institute built the first active anthropomorphic exoskeleton.
- 1973 — Wabot-1 is built at Waseda University in Tokyo. It was able to communicate with a person in Japanese and to measure distances and directions to objects using external receptors, artificial ears and eyes, and an artificial mouth. [6]
- 1980 — Marc Raibert established the MIT Leg Lab, which is dedicated to studying legged locomotion and building dynamic legged robots. [7]
- 1983 — Using MB Associates arms, “Greenman” was developed by Space and Naval Warfare Systems Center, San Diego. It had an exoskeletal master controller with kinematic equivalency and spatial correspondence of the torso, arms, and head. Its vision system consisted of two 525-line video cameras each having a 35 degree field of view and video camera eyepiece monitors mounted in an aviator's helmet. [8]
- 1984 — At Waseda University, the Wabot-2 is created, a musician humanoid robot able to communicate with a person, read a normal musical score with its eyes and play tunes of average difficulty on an electronic organ. [9]
- 1985 — Developed by Hitachi Ltd, WHL-11 is a biped robot capable of static walking on a flat surface at 13 seconds per step; it can also turn. [10]
- 1985 — WASUBOT is another musician robot from Waseda University. It performed a concerto with the NHK Symphony Orchestra at the opening ceremony of the International Science and Technology Exposition. [11]
- 1986 - 1993 — Honda developed seven biped robots which were designated E0 (Experimental Model 0) through E6. E0 was in 1986, E1 - E3 were done between 1987 and 1991, and E4 - E6 were done between 1991 and 1993. [12]
- 1989 — Manny was a full-scale anthropomorphic robot with 42 degrees of freedom developed at Battelle's Pacific Northwest Laboratories in Richland, Washington, for the US Army's Dugway Proving Ground in Utah. [13]
- 1990 — Tad McGeer showed that a biped mechanical structure with knees could walk passively down a sloping surface. [14]
- 1993-1997 — Honda developed P1 (Prototype Model 1) through P3, an evolution of the E series, with upper limbs. [15]
- 1995 — Hadaly was developed at Waseda University to study human-robot communication. It has three subsystems: a head-eye subsystem, a voice-control system for listening and speaking in Japanese, and a motion-control subsystem that uses the arms to point toward campus destinations.
- 1995 — Wabian is a human-size biped walking robot from Waseda University.
- 1996-1998 — Saika, a lightweight, human-size and low-cost humanoid robot, was developed at Tokyo University. Saika has a two-DOF neck, dual five-DOF upper arms, a torso and a head. Several types of hands and forearms were also under development. [16]
- 1997 — Hadaly-2, developed at Waseda University, is a humanoid robot which realizes interactive communication with humans. It communicates not only informationally, but also physically.
- 2000 — Honda creates its 11th bipedal humanoid robot, ASIMO. [17]
- 2001 — Sony unveils the Sony Dream Robot (SDR), a small humanoid entertainment robot.
- 2003 — After a few prototypes, Sony renamed its Sony Dream Robot to Qrio.
- 2004 — RoboSapien, an affordable, toy-like humanoid biomorphic robot designed by Mark Tilden, appears. [18]
Sensors
A sensor is a device that measures some attribute of the world. As one of the three primitives of robotics (alongside planning and control), sensing plays an important role in robotic paradigms.
Sensors can be classified according to the physical process by which they work or according to the type of measurement information they give as output. The second approach is used here.
Proprioceptive Sensors
Proprioceptive sensors sense the position, the orientation and the speed of the humanoid’s body and joints.
In human beings, the inner ear is used to maintain balance and orientation. Humanoid robots use accelerometers to measure acceleration, from which velocity can be calculated by integration; tilt sensors to measure inclination; force sensors placed in the robot's hands and feet to measure contact forces with the environment; position sensors, which indicate the actual position of the robot (and from which velocity can be calculated by differentiation); and sometimes speed sensors.
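The velocity-by-integration step can be sketched numerically. The function below is a minimal illustration using trapezoidal integration of accelerometer samples (the sample values and rate are made up for the example; in practice the estimate drifts and is usually fused with position measurements):

```python
def integrate_acceleration(samples, dt, v0=0.0):
    """Estimate velocity from accelerometer samples by trapezoidal
    integration.

    samples: accelerations in m/s^2, taken every dt seconds.
    v0: initial velocity in m/s.
    Returns the velocity estimate at each sample time.
    """
    v = v0
    velocities = [v]
    for a_prev, a_next in zip(samples, samples[1:]):
        v += 0.5 * (a_prev + a_next) * dt   # trapezoidal rule
        velocities.append(v)
    return velocities

# Constant 1 m/s^2 acceleration sampled at 100 Hz for 1 s
# yields a final velocity of about 1 m/s.
vels = integrate_acceleration([1.0] * 101, 0.01)
print(vels[-1])
```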
Exteroceptive Sensors
Exteroceptive sensors give the robot information about the surrounding environment, which in the case of humanoid robots is the real world. That information allows the robot to interact with the world. Exteroceptive sensors are classified according to their functionality.
Proximity sensors measure the relative distance (range) between the sensor and objects in the environment, performing the task that vision and tactile sensing do in human beings. For this, humanoid robots can use sonars, infrared sensors, or tactile sensors such as bump sensors, whiskers (or feelers), and capacitive or piezoresistive sensors. Tactile sensors also provide information about the forces and torques transferred between the robot and objects. Other kinds of proximity measurement include laser ranging, stereo cameras, and projecting a coloured line, grid or pattern of dots onto the environment and observing how the pattern is distorted.
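For sonar, the range computation itself is simple time-of-flight arithmetic: the pulse travels to the object and back, so the distance is half the round trip. A minimal sketch, assuming sound travels at roughly 343 m/s in air at room temperature:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (assumed)

def sonar_range(echo_time_s):
    """Range to an object from a sonar echo time.

    echo_time_s is the round-trip time in seconds, so the one-way
    distance is half the round-trip distance.
    """
    return SPEED_OF_SOUND * echo_time_s / 2.0

# A 10 ms round trip corresponds to an object about 1.7 m away.
print(sonar_range(0.01))
```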
Vision refers to processing data from any modality that uses the electromagnetic spectrum to produce an image. In the humanoid robot context it is used to recognize objects and determine their properties. Vision sensors work similarly to human eyes; most humanoid robots use CCD cameras as vision sensors.
Sound sensors allow humanoid robots to hear what is spoken, acting as the robot's ears. Microphones are usually used for this task.
Actuators
Actuators are the motors responsible for motion in the robot, in this case a humanoid robot.
Humanoid robots are constructed so as to mimic the human body, so they use actuators that perform like muscles and joints, though with a different structure. To achieve an effect similar to human motion, humanoids mainly use rotary actuators, which can be electric, pneumatic, hydraulic, piezoelectric or ultrasonic.
Hydraulic and electric actuators have very rigid behaviour and can only be made to act compliantly through relatively complex feedback control strategies. Electric actuators are better suited to high-speed, low-load applications, while hydraulic ones operate well at low speed and high load.
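One common way such feedback control produces compliant behaviour is to command the stiff actuator with a virtual spring-damper law (a basic impedance controller). The sketch below is illustrative only; the gains are arbitrary placeholders, not tuned for any real joint:

```python
def impedance_torque(q, qd, q_des, k=5.0, b=0.5):
    """Torque command that makes a stiff actuator behave like a
    spring-damper centred on the desired joint angle q_des.

    q: current joint angle (rad), qd: joint velocity (rad/s).
    k: virtual stiffness (Nm/rad), b: virtual damping (Nms/rad);
    both values are arbitrary and for illustration only.
    """
    return k * (q_des - q) - b * qd

# A joint displaced from its target is pushed back toward it,
# while the damping term resists fast motion.
print(impedance_torque(q=0.2, qd=0.0, q_des=0.0))
```

Because the "spring" exists only in software, its stiffness can be lowered for safe contact with people and raised for precise positioning.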
Piezoelectric actuators generate a small movement with a high force capability when voltage is applied. They can be used for ultra-precise positioning and for generating and handling high forces or pressures in static or dynamic situations.
Ultrasonic actuators are designed to produce movements in a micrometer order at ultrasonic frequencies (over 20 kHz). They are useful for controlling vibration, positioning applications and quick switching.
Pneumatic actuators operate on the basis of gas compressibility. As they are inflated they expand, and as they deflate they contract along the axis. If one end is fixed, the other will move in a linear trajectory. These actuators are intended for low-speed and low/medium-load applications. Pneumatic actuators include cylinders, bellows, pneumatic engines, pneumatic stepper motors and pneumatic artificial muscles.
Planning and Control
In planning and control, the essential difference between humanoids and other kinds of robots (such as industrial ones) is that the movement of the robot has to be human-like, using legged locomotion, especially biped gait. The ideal planning for humanoid movements during normal walking should result in minimum energy consumption, as happens in the human body. For this reason, studies of the dynamics and control of these kinds of structures are becoming more and more important.
To maintain dynamic balance during walking, a robot needs information about the contact forces and about its current and desired motion. The solution to this problem relies on a major concept, the Zero Moment Point (ZMP).
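Under the simplified cart-table (linear inverted pendulum) model often used in the ZMP literature, the ZMP along one axis follows directly from the centre-of-mass position, height and acceleration. A minimal sketch, assuming a point-mass robot whose centre of mass stays at constant height:

```python
G = 9.81  # gravitational acceleration, m/s^2

def zmp_x(com_x, com_ddx, com_height):
    """ZMP x-coordinate under the linear inverted pendulum model:

        x_zmp = x_com - (z_c / g) * x''_com

    where z_c is the (constant) centre-of-mass height. The robot
    remains dynamically balanced while the ZMP stays inside the
    support polygon formed by the feet.
    """
    return com_x - (com_height / G) * com_ddx

# A CoM 0.8 m high accelerating forward at 1 m/s^2 places the ZMP
# about 8 cm behind the CoM's ground projection.
print(zmp_x(0.0, 1.0, 0.8))
```

Real controllers invert this relation: they plan a ZMP trajectory inside the support polygon and solve for the CoM motion that realizes it.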
Another characteristic of humanoid robots is that they move through, gather information on (using sensors), and interact with the real world; they do not stay still like factory manipulators and other robots that work in highly structured environments. Planning and control therefore have to handle self-collision detection, path planning and obstacle avoidance to allow humanoids to move in complex environments.
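Path planning with obstacle avoidance can be illustrated by a breadth-first search over an occupancy grid, where blocked cells stand in for obstacles. This is a deliberately minimal sketch; real humanoid planners must also account for the robot's shape, dynamics and self-collisions:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Shortest path on a 4-connected occupancy grid via BFS.

    grid[r][c] == 1 marks an obstacle cell. Returns the list of
    (row, col) cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # visited set + back-pointers
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # reconstruct path by walking back
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# A wall in the middle column forces the path around the obstacle.
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(shortest_path(grid, (0, 0), (0, 2)))
```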
There are features of the human body that cannot yet be found in humanoids. They include structures with variable flexibility, which provide safety (to the robot itself and to people), and redundancy of movement, i.e. more degrees of freedom and therefore wider task availability. Although these characteristics are desirable in humanoid robots, they bring additional complexity and new problems to planning and control.
Ethical Concerns
The development of humanoid intelligence and capability raises some serious ethical questions. Most of them apply not only to humanoids, but to the robotic field in general.
Some people think that humanoids could continue to learn and evolve to a point where they break away from human command and possibly revolt, or that their "upbringing" could determine their "personality" (e.g. a selfish, tyrannical person producing a similar android). Although this is the first threat people think of when talking about humanoid robots, probably due to science-fiction books and films, there are other ethical concerns.
One of them is who should be awarded the patent for an invention made by a robot. Another is who is responsible when an intelligent machine fails, commits a crime, or does something it should not do.
If humanoids in the future have the ability to reason, be self-aware and have feelings, other kinds of questions will arise. What would be the difference between the rights of human beings and those of humanoid robots? Could a person destroy robots, or make them his or her slaves?
Other concerns are now emerging, mainly with the introduction of humanoid robots into tasks previously done only by human beings, such as being a security guard. Humanoid robots for factories are also being developed. This could cause people to lose their jobs to robots that work at lower cost and with higher productivity.
References
- Asada, H. and Slotine, J.-J. E. (1986). Robot Analysis and Control. Wiley. ISBN 0-471-83029-1.
- Arkin, Ronald C. (1998). Behavior-Based Robotics. MIT Press. ISBN 0-262-01165-4.
- Brady, M., Hollerbach, J.M., Johnson, T., Lozano-Perez, T. and Mason, M. (1982), Robot Motion: Planning and Control. MIT Press. ISBN 0-262-02182-X.
- Horn, Berthold, K. P. (1986). Robot Vision. MIT Press. ISBN 0-262-08159-8.
- Craig, J. J. (1986). Introduction to Robotics: Mechanics and Control. Addison Wesley. ISBN 0-201-09528-9.
- Everett, H. R. (1995). Sensors for Mobile Robots: Theory and Application. AK Peters. ISBN 1-56881-048-2.
- Kortenkamp, D., Bonasso, R., Murphy, R. (1998). Artificial Intelligence and Mobile Robots. MIT Press. ISBN 0-262-61137-6.
- Poole, D., Mackworth, A. and Goebel, R. (1998), Computational Intelligence: A Logical Approach. Oxford University Press. ISBN 0-19-510270-3.
- Russell, R. A. (1990). Robot Tactile Sensing. Prentice Hall. ISBN 0-13-781592-1.
- Russell, S. J. and Norvig, P. (1995). Artificial Intelligence: A Modern Approach. Prentice Hall. ISBN 0-13-790395-2.