The use of humanoid robots is becoming widespread all over the world. Humanoid robots are designed to mimic the human body and differ from other kinds of robots such as industrial ones in that their movement is human-like, based on legged locomotion, especially biped gait. They move about the “real world” and interact with it, performing a growing diversity of specialized and everyday tasks, unlike factory manipulators and other robots that work in highly structured environments.
Significantly, the Indian Space Research Organization (ISRO) plans to send a humanoid robot into space toward the end of 2020 as part of a crewless mission. The humanoid, called Vyommitra, is a legless female robot that will help ISRO prepare for Gaganyaan, its manned space flight mission and the country’s first attempt at sending humans into space, slated for 2022. Ahead of that crewed flight, ISRO will send Vyommitra, which can speak but has limited movement, into space.
What’s a humanoid robot?
A humanoid robot’s overall appearance and body shape are built to resemble the human body. Generally, humanoid robots have a torso with a head, arms, and legs, though some model only part of the body, for instance, from the waist up. Some have heads designed to replicate human facial features such as the eyes and mouth. Humanoid robots built to aesthetically resemble a male human are termed androids, while their female counterparts are called gynoids.
Leonardo da Vinci is credited with creating one of the earliest forms of humanoid in 1495. Modeled on a suit of armor, it could perform several human actions such as sitting, standing, and walking.
How do humanoids move, talk, and carry out actions?
Inventors and engineers study the structure and behavior of the human body (biomechanics) and attempt to simulate human cognition, which relies on sensory information to acquire perceptual and motor skills. High-grade sensors and actuators are deployed to enable humanoids to perform multiple functions. Based on computational models of human behavior, sensors help the robots perceive their environments, while cameras allow them to see. Motors or actuators placed at strategic points let these robots move and make gestures. Creating fully functional, realistic humanoids requires the following mechanisms:
Sensors
Sensors measure attributes of the physical world. Besides the essential requirements of planning and control, sensing plays a significant role in robotic paradigms. Sensors give humanoids the ability to touch, smell, see, hear, and balance themselves properly.
A hearing sensor helps humanoids hear, decipher, and carry out instructions, while touch sensors keep them from bumping into things and damaging themselves. Force sensors help them maintain balance and orientation, and heat and pain sensors let them detect impending harm or damage. There are also facial sensors that make humanoids capable of a wide range of expressions. Sensors can therefore be categorized by the physical process with which they work or by the type of sensory information they produce.
Scientists are continually working to make sensors more efficient for accomplishing multiple tasks. They have turned their attention to proprioceptive sensors (e.g., touch, muscle extension, limb position) that sense the position, orientation, and speed of the humanoid’s body and joints. Areas receiving increasing emphasis include accelerometers, which measure acceleration, from which velocity can be computed by integration; tilt sensors, which measure inclination; and force sensors placed in a robot’s hands and feet, which measure contact force with the environment.
Attention has also been given to position sensors that indicate the actual position of the robot (from which velocity can be calculated by differentiation), or to speed sensors directly. Tactile sensors provide information about forces and torques exchanged between the robot and other objects; they use arrays of tactels to report what has been touched.
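To make the integration and differentiation relationships above concrete, here is a minimal Python sketch of how a humanoid might estimate velocity from its proprioceptive sensors. The 100 Hz sampling rate and the sample readings are illustrative assumptions, not values from any particular robot.

```python
# Minimal sketch: estimating velocity from proprioceptive sensor data.
# The sample values and the 100 Hz sampling rate are illustrative only.

DT = 0.01  # seconds between samples (assumed 100 Hz sensor loop)

def velocity_from_acceleration(accel_samples, v0=0.0):
    """Integrate accelerometer readings (m/s^2) to estimate velocity (m/s)."""
    velocity, history = v0, []
    for a in accel_samples:
        velocity += a * DT          # simple rectangular (Euler) integration
        history.append(velocity)
    return history

def velocity_from_position(position_samples):
    """Differentiate position readings (m) to estimate velocity (m/s)."""
    return [(p1 - p0) / DT for p0, p1 in zip(position_samples, position_samples[1:])]

if __name__ == "__main__":
    accel = [0.0, 0.2, 0.4, 0.4, 0.1]        # made-up accelerometer trace
    pos = [0.00, 0.00, 0.002, 0.006, 0.010]  # made-up joint position trace
    print(velocity_from_acceleration(accel))
    print(velocity_from_position(pos))
```

In practice the two estimates are often fused, since integration drifts over time and differentiation amplifies sensor noise.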
In humanoid robots, vision sensors recognize objects and determine their properties. CCD cameras, which convert incoming light into an electronic image, serve as the sight faculty of humanoids. Microphones are usually deployed as sound sensors that allow the robots to hear speech and environmental sounds.
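As a rough illustration of what “recognize objects and determine their properties” can mean at the simplest level, the sketch below grabs one camera frame with OpenCV and reports the size and centroid of bright regions. The camera index and the brightness and area thresholds are assumptions chosen for the example; a real humanoid would use far more sophisticated recognition pipelines.

```python
# Illustrative OpenCV sketch: grab one camera frame and report coarse object
# properties (contour area and centroid). Camera index 0 and the thresholds
# are assumptions for the example.
import cv2

cap = cv2.VideoCapture(0)            # default camera, stand-in for the robot's eye
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        area = cv2.contourArea(c)
        if area > 500:               # ignore small specks
            m = cv2.moments(c)
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            print(f"object of area {area:.0f}px at ({cx}, {cy})")
```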
Actuators or motors
Actuators, the motors responsible for motion in a robot, help it move and make gestures akin to those of the flexible human body. Strong, efficient actuators can perform a wide range of actions, much like humans or even better.
Humanoid robots mainly use rotary actuators that reproduce human motions, much as muscles and joints do, though with a different structure. Actuators can be hydraulic, electric, piezoelectric, ultrasonic, or pneumatic.
Hydraulic actuators are suited to low-speed, high-load applications. Electric coreless-motor actuators are better suited to high-speed, low-load applications, though both can only be made to act in a compliant manner through appropriate control strategies.
Piezoelectric actuators, on the other hand, produce small movements with high force when a voltage is applied. They are suited to ultra-precise positioning and to generating and handling high forces or pressures in static or dynamic situations.
Ultrasonic actuators generate movements on the order of micrometers at ultrasonic frequencies (over 20 kHz). They can be used for vibration control, positioning applications, and quick switching.
Pneumatic actuators rely on the compressibility of gas. When inflated, they expand along their axis; when deflated, they contract. With one end fixed, the other moves along a linear trajectory. Intended for low-speed and low- to medium-load applications, pneumatic actuators include cylinders, bellows, pneumatic engines, pneumatic stepper motors, and pneumatic artificial muscles.
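Whatever the actuator technology, a joint is usually driven by a feedback loop that turns a position error into a motor command. The sketch below shows a minimal proportional-derivative (PD) position controller for a single electrically actuated joint; the gains, torque limit, joint inertia, and loop rate are assumed placeholder values, not tuned numbers from any real robot.

```python
# Minimal PD position controller for a single electrically actuated joint.
# Gains, torque limit, inertia, and time step are illustrative assumptions.

KP, KD = 40.0, 2.0      # proportional and derivative gains (assumed)
TORQUE_LIMIT = 5.0      # N*m, assumed actuator limit
DT = 0.002              # 500 Hz control loop (assumed)

def pd_torque(q_desired, q, q_dot):
    """Compute a joint torque command that drives q toward q_desired."""
    error = q_desired - q
    torque = KP * error - KD * q_dot
    return max(-TORQUE_LIMIT, min(TORQUE_LIMIT, torque))  # saturate to actuator limit

# Tiny simulation of the joint responding to the commanded torque.
q, q_dot, inertia = 0.0, 0.0, 0.05   # rad, rad/s, kg*m^2 (assumed)
for _ in range(1000):
    tau = pd_torque(q_desired=0.5, q=q, q_dot=q_dot)
    q_dot += (tau / inertia) * DT
    q += q_dot * DT
print(f"final joint angle: {q:.3f} rad")
```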
AI-based Interaction
After the mechanisms that imitate human body parts are put in place, inventors program the instructions and code that enable the humanoids to carry out specific functions. Powered by artificial intelligence (AI), they can glide around and reply when asked questions.
AI is critical to enhancing the level at which humanoid robots can interact with humans. It lets them decipher commands, questions, and indications, understand even random, ambiguous statements, and give replies laced with wit and sarcasm.
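At the simplest end of that spectrum, interaction starts with mapping an utterance to an action. The toy sketch below does this with keyword matching; the vocabulary and action names are invented for illustration, and a production system would instead combine speech recognition with a trained language model.

```python
# Toy command interpreter: maps a spoken/typed utterance to a robot action.
# The vocabulary and actions are invented for illustration only.

INTENTS = {
    "walk":  ("walk_forward", ["walk", "move", "go"]),
    "stop":  ("halt",         ["stop", "halt", "freeze"]),
    "greet": ("wave_hand",    ["hello", "hi", "greet"]),
}

def interpret(utterance: str) -> str:
    words = utterance.lower().split()
    for action, keywords in INTENTS.values():
        if any(k in words for k in keywords):
            return action
    return "ask_for_clarification"   # fall back when the request is ambiguous

print(interpret("Please walk to the door"))   # -> walk_forward
print(interpret("Tell me a joke"))            # -> ask_for_clarification
```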
Functions of humanoid robots
Initially, humanoids served as research and experimental tools in several scientific areas, such as the study of bipedal locomotion, and were used to explore ways of creating leg prostheses, ankle-foot orthoses, biologically realistic leg prostheses, and forearm prostheses for the neuromuscularly impaired. Some were created for entertainment purposes such as singing, playing music, dancing, and speaking to audiences.
Now, the aim of humanoids has extended beyond research and experimentation to functional purposes: performing human tasks, interacting with human tools and environments, and occupying various roles in the employment sector. They are an increasingly common feature in the workplace, where they can act as personal assistants, receptionists, front-desk officers, and automotive manufacturing-line workers. At home, they can assist the sick and elderly as household helps and nursing assistants, perform dirty or dangerous jobs, play and use tools, and operate equipment and vehicles designed for the human form.
These life-like robots could also prove useful in helping children, or any person who needs assistance with day-to-day tasks or interactions. Many studies point to the effectiveness of humanoid robots in supporting children with autism.
Various countries have decided to send humanoid robots on dangerous and distant space exploration missions, with no need to bring them back to Earth once the mission is completed. In essence, thanks to AI algorithms, such robots can take on many tasks a human being can.
Going forward
Scientists are striving to reduce the energy humanoids consume while moving. In this context, studies on the dynamics, control, and stabilization of walking biped robots have acquired crucial importance. Equally important is keeping the robot’s center of gravity over the center of its support area to maintain a stable posture.
Because a humanoid needs information about contact forces and its current and desired motion to maintain dynamic balance while walking, the Zero Moment Point (ZMP) has become an essential balancing criterion that is receiving inventors’ attention. They are also focusing on planning and control, so that humanoids can move through complex environments using self-collision detection, path planning, and obstacle avoidance.
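A common way to reason about the ZMP is the linear inverted pendulum approximation, in which the ZMP follows from the center-of-mass position, height, and acceleration as x_zmp = x_com - (z_com / g) * ẍ_com (and likewise for the y axis); balance is considered safe while the ZMP stays inside the foot support polygon. The sketch below computes this estimate; the center-of-mass state and the support-polygon bounds are made-up example values.

```python
# ZMP estimate under the linear inverted pendulum approximation:
#   x_zmp = x_com - (z_com / g) * x_com_ddot   (same form for the y axis).
# The CoM state and foot support bounds below are made-up example values.

G = 9.81  # m/s^2

def zmp(com_pos, com_acc, com_height):
    """Return the (x, y) zero moment point for the given CoM state."""
    x, y = com_pos
    ax, ay = com_acc
    return (x - (com_height / G) * ax,
            y - (com_height / G) * ay)

def is_balanced(zmp_xy, support_x=(-0.10, 0.15), support_y=(-0.05, 0.05)):
    """Dynamic balance holds while the ZMP stays inside the support polygon."""
    zx, zy = zmp_xy
    return support_x[0] <= zx <= support_x[1] and support_y[0] <= zy <= support_y[1]

point = zmp(com_pos=(0.02, 0.0), com_acc=(0.8, 0.1), com_height=0.85)
print(point, "balanced" if is_balanced(point) else "tipping risk")
```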
Humanoid robots incorporate structures with variable flexibility, which protect both the robot and the people around it, along with more degrees of freedom and wider task availability. To optimize these capabilities, scientists plan to further hone the planning and control strategies that govern how the robots function.
Engineers at MIT and the University of Illinois at Urbana-Champaign have developed a method to control balance in a two-legged, teleoperated robot. It marks an essential step toward enabling a humanoid to carry out high-impact tasks in challenging environments. The robot is controlled remotely by a human operator wearing a vest that relays information about the human’s motion and ground reaction forces to the robot. Through the vest, the human operator can direct the robot’s locomotion and feel its motions too. If the operator feels the robot starting to tip over, she can adjust her stance to rebalance both herself and the robot.
In Japan, Prof. Hiroshi Ishiguro of Osaka University and his team have developed a humanoid robot capable of human-like conversation. In the ERATO ISHIGURO Symbiotic Human-Robot Interaction project, they focused on the sense of affinity that emerges when the robot moves alongside a human. Toward this end, they developed a child-like android named “ibuki” that can walk together with a human on its built-in wheels.
Summing up
Humanoid robots can talk like us, walk like us, and express a wide range of emotions. Some can hold a conversation; others can recall your last interaction with them. With constant advancements in AI, humanoid robots are set to acquire ever more developed human attributes and competencies, and advanced android robotics is poised to dramatically enhance life going forward.