Human-Robot Interaction in Healthcare: supporting everyday tasks

Devaang Misra


Supervised by Carolina Fuentes Toro; Moderated by Yulia Cherdantseva

As the title suggests, the problem tackled here is supporting everyday tasks in a healthcare environment with the help of robots. The robot used is Pepper, a semi-humanoid robot manufactured by SoftBank Robotics. In a healthcare setting, Pepper can perform a range of everyday tasks: greeting hospital visitors, making small talk with patients and their families and keeping them entertained, relaying instructions to and from nurses, helping children and adults with special needs overcome anxiety about upcoming procedures, and educating visitors on complying with health regulations so that everyone stays safe. Pepper also has much to offer as entertainment for guests and patients, such as performing animations and holding conversations; this is especially helpful for patients with dementia, who benefit from regular activation through speech and entertainment. The overall aim is to simplify everyday tasks and reduce the load on medical staff, allowing them to focus on more critical work.

The main aim of this project is to design QiChat applications that perform everyday tasks. The focus is on Pepper's conversations with humans, with an emphasis on those conversations being:

• Conversational
• Cooperative (following the Gricean maxims of Quantity, Quality, Relation and Manner)
• Character-driven
• Context-driven
• Multichannel

To implement the tasks required for this project I will be using Choregraphe, a graphical environment for programming Pepper. Choregraphe provides a library of prebuilt, Python-based code blocks, which makes it easy to use and to customise. When Pepper is first started it requires some minor configuration; after that, it walks in an autonomous state until it detects a person. Pepper then analyses the person and can estimate attributes such as age, gender and mood. Once the age and gender parameters are set, a matching user-interaction scenario is initiated. When the interaction ends, or when the person stops responding, Pepper resets and returns to its autonomous walking state. My work involves designing the user-interaction scenarios to accommodate a healthcare environment, making Pepper more useful in such a setting.
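The detect–engage–reset loop described above can be pictured as a small state machine. The following Python snippet is a minimal, robot-free sketch of that loop: the class names, the `Person` perception stub, and the scenario names are all hypothetical placeholders, standing in for the NAOqi services and Choregraphe behaviours Pepper would actually use.

```python
from dataclasses import dataclass
from enum import Enum, auto


class State(Enum):
    AUTONOMOUS_WALK = auto()  # roaming until a person is detected
    INTERACTING = auto()      # running a user-interaction scenario


@dataclass
class Person:
    # Hypothetical stand-in for the traits Pepper's perception can estimate.
    age: int
    mood: str  # e.g. "calm", "anxious"


def pick_scenario(person: Person) -> str:
    """Choose a healthcare user-interaction scenario from perceived traits.

    The scenario names are illustrative, not real Choregraphe behaviours.
    """
    if person.mood == "anxious":
        return "reassure_before_procedure"
    if person.age < 16:
        return "child_friendly_greeting"
    return "visitor_greeting_and_directions"


class PepperLoop:
    def __init__(self) -> None:
        self.state = State.AUTONOMOUS_WALK

    def on_person_detected(self, person: Person) -> str:
        self.state = State.INTERACTING
        return pick_scenario(person)

    def on_interaction_finished(self) -> None:
        # Covers both a completed dialogue and a timeout with no response.
        self.state = State.AUTONOMOUS_WALK


loop = PepperLoop()
scenario = loop.on_person_detected(Person(age=8, mood="calm"))
print(scenario)         # child_friendly_greeting
loop.on_interaction_finished()
print(loop.state.name)  # AUTONOMOUS_WALK
```

On the real robot, the "person detected" event would come from Pepper's perception stack rather than a direct method call, but the state transitions would follow the same shape.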
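As a rough illustration of what a QiChat topic amounts to, a greeting rule can be prototyped off-robot as a table mapping trigger patterns to candidate replies. This is plain Python, not QiChat syntax, and the rules and wording are placeholders; the random choice among replies loosely mirrors the response variation QiChat offers, which helps keep the dialogue conversational rather than repetitive.

```python
import random
import re

# Hypothetical greeting rules for a hospital lobby, loosely mirroring the
# shape of a QiChat topic: each trigger pattern has one or more replies.
RULES = [
    (re.compile(r"\b(hello|hi|good morning)\b", re.I),
     ["Hello! Welcome to the clinic.",
      "Hi there! How can I help you today?"]),
    (re.compile(r"\bwhere is\b", re.I),
     ["Let me point you in the right direction."]),
]

FALLBACK = "I'm sorry, I didn't catch that. Could you rephrase?"


def reply(utterance, rng=None):
    """Return a reply for the first rule whose pattern matches, else a fallback."""
    rng = rng or random.Random()
    for pattern, responses in RULES:
        if pattern.search(utterance):
            return rng.choice(responses)
    return FALLBACK


print(reply("Good morning!", random.Random(0)))
print(reply("blub"))  # no rule matches, so this prints the fallback
```

A real QiChat topic adds much more on top of this (concepts, subrules, variables, events), but the trigger-and-reply structure is the core idea.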

Initial Plan (07/02/2022) [Zip Archive]

Final Report (13/05/2022) [Zip Archive]

Publication Form