Asha: A humanoid caregiving robot
Highlights
An AI-powered robot
Social distancing, an outcome of the COVID-19 pandemic, has arguably had its deepest impact in caregiving environments, where human proximity is necessary to diagnose ailments, show empathy, treat patients, and provide care. Masks, gloves, and sanitizers are not always foolproof safeguards in such settings.
The idea that teleoperated humanoid robots powered by artificial intelligence (AI) can transform industries and societies is no longer science fiction.
The Humanoid Nurse
Let us introduce you to Asha, a humanoid nurse who can see, perceive, hear, touch, and interact much like a human being. TCS Research, along with a team of over 20 innovators comprising academicians, technology professionals, and students, has brought this AI entity to life.
The team, Aham Avatar, is a collaboration of individuals from TCS Research, the Indian Institute of Science (IISc), and Hanson Robotics. While Hanson Robotics built the socially intelligent humanoid platform, TCS Research and IISc integrated its mobility and immersive teleoperation capabilities.
The Asha prototype was conceived as an entry into the global four-year ANA Avatar XPRIZE robotics competition.
The Anatomy
Her humanoid avatar has two sides: the robot at the front end and a human operator at the back end. The physical robot acts as an interface between two human beings, one of whom is the operator. Asha is designed with two eyes (cameras), a torso, and facial features that express emotion. Her sensor-equipped, dexterous fingers can pick, hold, move, and hand over objects just as a human does. The standing robot is mounted on a small mobile platform.
TCS Research devised the AI and machine learning (ML) solutions, as well as the mobile platform, that let Asha move around a controlled environment. Hanson Robotics built the head and torso with sophisticated technologies, including high-speed cameras that help mimic human vision.
IISc designed the teleoperation so that the remote operator uses virtual-reality goggles to see through Asha’s eyes. The operator steers Asha through a remote console, moving her across a space in varied directions, while multiple sensors ensure she listens to commands, acts accordingly, and stays out of danger. The operator also wears a sensor-equipped glove that captures hand actions and conveys them to Asha, and a pair of foot pedals drives her within indoor spaces. Asha additionally has an emergency stop feature that immediately disables all operations if needed.
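To make that control flow concrete, here is a minimal sketch of what one cycle of such a teleoperation loop might look like. This is not Asha’s actual software: every name below (OperatorInput, RobotStub, teleop_step) is an assumption for illustration only, based solely on the devices the article describes.

```python
from dataclasses import dataclass

@dataclass
class OperatorInput:
    """One frame of operator input (all fields are illustrative)."""
    steer: float          # remote console: -1.0 (full left) to 1.0 (full right)
    pedal_speed: float    # foot pedals: 0.0 (stop) to 1.0 (full speed)
    glove_grip: float     # sensor glove: 0.0 (open hand) to 1.0 (closed fist)
    estop_pressed: bool   # emergency stop button

class RobotStub:
    """Stand-in for the real robot interface; methods just log the command."""
    def disable_all(self):
        print("E-STOP: all operations disabled")
    def set_base_velocity(self, speed, steering):
        print(f"base: speed={speed:.2f}, steering={steering:.2f}")
    def set_gripper(self, closure):
        print(f"gripper: closure={closure:.2f}")

def teleop_step(robot, inp: OperatorInput) -> None:
    """Map one frame of operator input to robot commands."""
    if inp.estop_pressed:
        robot.disable_all()  # the emergency stop overrides everything else
        return
    robot.set_base_velocity(speed=inp.pedal_speed, steering=inp.steer)
    robot.set_gripper(closure=inp.glove_grip)

# One normal frame, then an emergency stop:
teleop_step(RobotStub(), OperatorInput(0.1, 0.5, 0.0, estop_pressed=False))
teleop_step(RobotStub(), OperatorInput(0.0, 0.0, 0.0, estop_pressed=True))
```

Note how the emergency stop is checked before anything else, mirroring the article’s point that it disables operations immediately.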
Semi-autonomy is essential to ensure Asha does not collide with objects around her. The operator cannot always see every object in the robot’s environment, so Asha is programmed to sense obstacles and adjust her movement path to avoid collisions.
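The article does not describe the avoidance logic itself, but a simple command-override scheme illustrates the idea: the robot passes the operator’s motion command through unless a range sensor reports an obstacle inside a safety margin. The threshold value, sensor format, and function name below are all assumptions.

```python
SAFE_DISTANCE_M = 0.5  # assumed safety margin in metres; the real value is not published

def adjust_velocity(speed: float, steer: float, ranges: dict) -> tuple:
    """Override the operator's motion command when an obstacle is too close.

    `ranges` maps a direction ('front', 'left', 'right') to the nearest
    obstacle distance in metres -- an invented format for this sketch.
    """
    if ranges.get("front", float("inf")) < SAFE_DISTANCE_M:
        return 0.0, steer  # obstacle ahead: stop, but keep steering authority
    if ranges.get("left", float("inf")) < SAFE_DISTANCE_M:
        return speed, max(steer, 0.3)   # veer right, away from the obstacle
    if ranges.get("right", float("inf")) < SAFE_DISTANCE_M:
        return speed, min(steer, -0.3)  # veer left
    return speed, steer  # path is clear: pass the command through unchanged

# The operator drives straight ahead, unaware of a chair 0.3 m in front:
print(adjust_velocity(0.8, 0.0, {"front": 0.3, "left": 2.0, "right": 2.0}))
# (0.0, 0.0) -- Asha stops instead of colliding.
```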
Moreover, Asha is equipped with speech-to-text conversion, speech-driven synchronized lip movements, and emotion rendering. She has been programmed to differentiate between commands and voice responses, and she is engineered with application programming interface (API) calls that implement simple commands such as ‘smile’, ‘pick up’, or ‘hand over’.
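How might such API calls look in practice? A hedged sketch follows: the endpoint URL, payload shape, and parameter names are invented here, since the article only tells us that simple named commands are exposed as API calls.

```python
import json
from urllib import request

BASE_URL = "http://asha.local/api"  # hypothetical endpoint, for illustration only

def send_command(command: str, **params) -> dict:
    """POST a named command such as 'smile', 'pick_up', or 'hand_over'.

    The URL scheme and JSON payload are invented for this sketch; the
    article only says that such simple commands exist as API calls.
    """
    payload = json.dumps({"command": command, "params": params}).encode()
    req = request.Request(
        f"{BASE_URL}/commands",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

# Illustrative usage (would require a live robot endpoint):
# send_command("smile")
# send_command("pick_up", object_id="water_cup")
# send_command("hand_over", target="patient")
```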
Asha responds to voices too. When she is greeted with a “good evening”, for instance, the operator interprets the greeting and responds, and Asha replicates the response in her voice to the user.
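Spelled out as pseudocode, this interaction is a simple relay through the human operator. Every method below is a hypothetical placeholder, since the article describes only the flow of audio and text, not the software implementing it.

```python
def relay_turn(robot, operator):
    """One conversational turn routed through the human operator.

    All four methods are hypothetical placeholders; the article only
    describes the flow, not the code that implements it.
    """
    user_audio = robot.capture_audio()     # the user says "Good evening"
    operator.play(user_audio)              # the operator hears the greeting
    reply = operator.capture_reply_text()  # ...and speaks a reply, transcribed to text
    robot.speak(reply)                     # rendered in Asha's own voice, with
                                           # synchronized lip movements
```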
A Complete Caregiver
Despite being human-controlled, Asha has semi-autonomy marked by some degree of freedom. In a specific caregiving use case, Asha performs actions like checking the patient’s temperature with a touchless thermometer, inquiring about the person’s wellbeing, and offering help by bringing water.
The robot remains a work in progress, with new functionalities continually being tested and added. Ultimately, Team Aham Avatar aims to make Asha a viable ‘human’ caregiving solution for environments that do not allow for physical proximity, as was seen during the pandemic.