

Presenting Asha, the humanoid avatar with a healing ‘touch’ even in times of social distancing

Among the problems that COVID-19 created in the world is the loss of human touch. At no other time in modern history has the simple act of coming into contact with a fellow human being created so much fear and anxiety so pervasively. Social distancing has become our primary self-preservation strategy to save lives. This has possibly had the deepest impact in caregiving environments, where human proximity is necessary to diagnose ailments, show empathy, treat patients, and provide care. Current mechanisms such as masks, gloves, and sanitizers are not always foolproof in ensuring safety.

The ability to perceive emotions and show sensitivity toward pain is critical to delivering care. In wound management, for example, a robot must be sensitive to the patient’s pain and provide care without causing further injury.

Therefore, what if we could have proxies for caregivers? What if we could set up human-like substitutes that mimicked our actions without compromising on care or safety?

Teleoperated humanoid avatars can transform industries and societies and are no longer science fiction.

Welcome to the world of Asha, a humanoid nurse avatar who can see, perceive, hear, touch, and interact as human beings do. Tapping into its rich partner ecosystem, TCS Research, along with a team of over 20 innovators comprising academicians, technology professionals, and students, created Asha. The team, called Aham Avatar, is a collaborative effort of TCS Research, the Indian Institute of Science (IISc), and Hanson Robotics[1]. While Hanson Robotics built the socially intelligent humanoid platform, TCS Research and IISc integrated the mobility and immersive teleoperation capabilities.





The prototype for Asha was conceived as an entry into the global four-year ANA Avatar XPRIZE[2] robotic competition. The $10-million prize aims to create an avatar system (humanoid robot) that can transport human presence to a remote location in real-time.

Relying on its knowledge of industrial and scientific domains, TCS Research initiated the collaboration with IISc. Two research centres within IISc collaborated on the project: the Robert Bosch Centre for Cyber-Physical Systems (RBCCPS)[3] and ARTPARK (AI & Robotics Technology Park). RBCCPS is a research and academic centre under the Division of Interdisciplinary Research at IISc, focusing on foundational and applied research in robotics, including advanced machine-learning techniques. ARTPARK, on the other hand, is a joint initiative between IISc and Bengaluru-based AI Foundry, seed-funded by the Government of India via the Department of Science and Technology. It strives to translate cutting-edge artificial intelligence and robotics research into solutions that deliver societal, economic, and technological impact.


Drawing on its experience developing the globally acclaimed humanoid robot Sophia, Hanson Robotics built the physical prototype for Asha. From the outset, Asha was designed to render emotions and to be suitable for interactions with humans.

The humanoid avatar consists of two sides: the robot at the front end and a human operator at the back end. The physical robot acts as an interface between two human beings, one of whom is the operator. It is modeled as a woman, with two cameras for eyes, a torso, and facial features that allow it to speak and express emotions. Its dexterous, sensor-equipped fingers can pick up, hold, move, and hand over objects as a human would. Asha is a standing robot mounted on a small mobile platform. TCS Research developed the AI and ML solutions, along with the platform, that allow Asha to move around in a controlled environment. Hanson Robotics built the head and torso with sophisticated technologies, including high-speed cameras that help mimic human vision.


IISc designed the teleoperation so that the remote operator uses virtual-reality goggles to see through the robot’s eyes. The operator controls the robot’s movements through a remote console, which steers the humanoid in various directions. Multiple sensors ensure that the robot listens to commands, acts accordingly, and stays out of danger. The operator wears a sensor-equipped glove that captures hand actions and conveys them to Asha, while a pair of foot pedals moves the robot within indoor spaces. The robot also has an emergency-stop feature for immediate disabling in case of contingencies.
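The control loop described above can be sketched as a mapping from one frame of operator input (glove flex sensors, foot pedals, emergency stop) to a robot command. This is a minimal illustrative sketch; the field names, units, and limits are assumptions, not the team's actual interface.

```python
from dataclasses import dataclass

@dataclass
class OperatorInput:
    """One frame of operator data (all fields hypothetical)."""
    glove_flex: list        # per-finger flex readings, 0.0 (open) to 1.0 (closed)
    pedal_forward: float    # forward pedal depression, 0.0 to 1.0
    pedal_turn: float       # -1.0 (full left) to 1.0 (full right)
    emergency_stop: bool    # hardware e-stop pressed?

def to_robot_command(inp: OperatorInput, max_speed_mps: float = 0.5) -> dict:
    """Translate one operator input frame into actuator targets.

    If the emergency stop is pressed, every target is zeroed so the
    platform halts immediately, regardless of the other inputs.
    """
    if inp.emergency_stop:
        return {"finger_targets": [0.0] * len(inp.glove_flex),
                "linear_mps": 0.0, "angular_radps": 0.0, "halted": True}
    return {
        # Clamp noisy sensor readings into the valid joint range.
        "finger_targets": [min(max(f, 0.0), 1.0) for f in inp.glove_flex],
        "linear_mps": inp.pedal_forward * max_speed_mps,
        "angular_radps": inp.pedal_turn * 1.0,  # assumed 1 rad/s max turn rate
        "halted": False,
    }
```

In a real teleoperation stack this mapping would run at a fixed rate and publish to the robot's motor controllers; the sketch only shows the input-to-command translation and the e-stop override.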

Semi-autonomy is essential to ensure that Asha does not collide with other objects. Because the operator cannot always see every object in the robot’s environment, Asha is programmed to sense obstacles and adjust its movement path to avoid collisions.
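One simple form of this semi-autonomy is a safety layer that scales the operator's commanded speed by obstacle proximity. The sketch below assumes range-sensor readings in metres; the thresholds are illustrative, not the team's actual parameters.

```python
def safe_velocity(commanded_mps: float, sensor_ranges_m: list,
                  stop_dist_m: float = 0.3, slow_dist_m: float = 1.0) -> float:
    """Limit forward speed based on the nearest detected obstacle.

    - Closer than stop_dist_m: halt, overriding the operator's command.
    - Between stop_dist_m and slow_dist_m: scale the speed down linearly.
    - Beyond slow_dist_m: pass the commanded speed through unchanged.
    """
    nearest = min(sensor_ranges_m)
    if nearest <= stop_dist_m:
        return 0.0
    if nearest < slow_dist_m:
        # Linear ramp: 0 at stop_dist_m, 1 at slow_dist_m.
        scale = (nearest - stop_dist_m) / (slow_dist_m - stop_dist_m)
        return commanded_mps * scale
    return commanded_mps
```

The operator keeps authority over where the robot goes, while this layer guarantees it slows and stops near obstacles the operator may not see.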


Moreover, Asha is equipped with a speech-to-text conversion capability, speech-driven synchronized lip movements, and emotion rendering. It has been programmed to differentiate between commands and voice responses. Asha is engineered with different API calls to implement simple commands such as ‘smile’, ‘pick up’ and ‘hand over’ (of a particular object).
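The command-versus-speech distinction above can be sketched as a small dispatcher over transcribed text. The command keywords come from the article ('smile', 'pick up', 'hand over'), but the handler names and API-call strings are hypothetical placeholders for whatever calls the real system exposes.

```python
# Hypothetical handlers: each maps a recognized command to an API call string.
COMMAND_HANDLERS = {
    "smile": lambda arg: "api:set_expression(smile)",
    "pick up": lambda arg: f"api:grasp({arg})",
    "hand over": lambda arg: f"api:hand_over({arg})",
}

def route_utterance(text: str) -> tuple:
    """Classify a transcribed utterance as a command or free speech.

    Returns ("command", api_call) when the text starts with a known
    command keyword; otherwise ("speech", text), to be voiced by the
    robot with synchronized lip movements.
    """
    lowered = text.strip().lower()
    for keyword, handler in COMMAND_HANDLERS.items():
        if lowered.startswith(keyword):
            arg = lowered[len(keyword):].strip() or None
            return ("command", handler(arg))
    return ("speech", text.strip())
```

A keyword-prefix match is the simplest possible scheme; a production system would more likely use an intent classifier, but the routing structure is the same.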

Asha acts as the interface between the end user and the remote operator. A simple analogy is a video call between two people; here, the robot acts like the video-calling application, except that it has a physical form and can perform physical actions. Depending on the user’s request, the operator makes Asha offer a suitable response. For instance, if the user greets the robot, the operator says, “Good evening to you too,” and the robot repeats this in its own voice to the user. When the operator instructs Asha to smile by saying, “Asha is happy,” Asha speaks the phrase and smiles at the user. When the operator commands the robot to fetch objects for the user, it moves around to do so. While Asha is largely operator-controlled, its ‘semi-autonomy’ and several degrees of freedom let it move freely, as humans do, and keep away from danger. In a specific caregiving use case, Asha checks the patient’s temperature with a touchless thermometer, inquires about the person’s wellbeing, and offers help by bringing water.

Despite the challenges imposed by the pandemic, the team worked diligently for 18 months to deliver a prototype of Asha. New functionalities are being tested and added to improve the humanoid’s practical capabilities. Ultimately, the team aims to make it a viable solution for caregiving environments that are either remote or do not allow close-proximity human interaction during high-risk times such as an epidemic or pandemic.





