
Assistive Robotics & Health Technologies Research

People with autism spectrum disorders commonly face three core challenges: impairments in social relationships, social communication, and imaginative thought. The first two share a common element, social interaction with people. The literature has shown that people with autism often prefer interacting with computers and animals over humans, in part because those interactions are simpler and more predictable. Recent research also suggests that interactive robots can facilitate their social interaction. Along this line, we have adopted a small iOS-based interactive robot, "Romo", as our research platform and developed an emotional interaction platform for children with autism. Building on our facial expression detection (client) and sonification (server) systems, we are creating a social interaction game. In addition, we use multiple Kinects in the room to monitor the overall interaction scene (e.g., distance, turn-taking, chasing). This project is supported by the NIH (National Institutes of Health) through the NRI (National Robotics Initiative) program.

Robots Students
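
The emotional interaction platform above pairs a facial-expression detection client with a sonification server. A minimal sketch of that client-server exchange, assuming a hypothetical emotion-to-sound mapping (the labels, frequencies, and tempos here are illustrative, not the project's actual values):

```python
import json
import socket
import threading

# Hypothetical mapping from a detected emotion label to sonification
# parameters (tone frequency in Hz, tempo in BPM); values are illustrative.
EMOTION_TO_SOUND = {
    "happy":   {"freq": 660, "tempo": 140},
    "sad":     {"freq": 220, "tempo": 70},
    "neutral": {"freq": 440, "tempo": 100},
}

def sonification_server(listener):
    """Accept one connection from the detection client and reply with
    the sound parameters the sonification engine should play."""
    conn, _ = listener.accept()
    with conn:
        label = conn.recv(64).decode().strip()
        params = EMOTION_TO_SOUND.get(label, EMOTION_TO_SOUND["neutral"])
        conn.sendall(json.dumps(params).encode())

def send_detected_emotion(label, port):
    """Client side: forward the label produced by the facial-expression
    detector and return the server's sonification parameters."""
    with socket.create_connection(("127.0.0.1", port)) as conn:
        conn.sendall(label.encode())
        return json.loads(conn.recv(1024).decode())

listener = socket.socket()
listener.bind(("127.0.0.1", 0))   # pick any free port
listener.listen(1)
port = listener.getsockname()[1]
thread = threading.Thread(target=sonification_server, args=(listener,))
thread.start()
params = send_detected_emotion("happy", port)
thread.join()
listener.close()
print(params)  # {'freq': 660, 'tempo': 140}
```

In the actual platform the client would stream labels continuously as the child's expression changes; this sketch shows a single request-reply cycle.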

The trend of integrating art and science is pervasive in formal education: STEM education is evolving into a STEAM (Science, Technology, Engineering, Art, and Math) movement by adding art and design to the equation. We have developed informal STEAM education programs specifically for underrepresented students in the local area, including female students, students from low-income families, and students with disabilities. Our third iteration is a 13-week afterschool program at Eastern Montgomery Elementary School built around effective human-robot interaction (HRI). The present project investigates the impact of HRI, through a child-robot theater platform, on children's interest in the STEM field.

Robot Musical Theater Afterschool Program at Eastern Montgomery Elementary School.

Robot Theatre
Robot Musical Theater 2021

As the influence of social robots in people’s daily lives continues to grow, research has examined people’s perception of robots’ characteristics, including sociability, trust, acceptance, and preference. Among many variables, we have focused on factors that influence user perception of robots’ emotive expressions. Robots’ facial expressions, voice (speech), body language, and posture have all been considered, and we have integrated multiple facets of user perception of robots during a conversational task by varying robot type, emotion, facial expression, and voice type. The results will provide design guidelines for emotional and trustworthy robots, especially those employing emotive expressions, and help facilitate the relationship between people and social robots such as assistive robots, voice assistants, and other conversational agents. This project is supported by the Northrop Grumman Undergraduate Research Experience Award.

Robot Voice

Affect detection systems procure direct feedback for a system by tracking the user’s reactions. In this case, the facial affect detection system monitors users’ facial expressions and detects the emotions displayed. Visualizing the detected emotions can aid in making informed decisions: it can be used to study the effects of the system on the user and to alter the system’s behavior accordingly. For instance, when a robot is interacting with a child, the visualizations can be monitored to change the course of the robot’s actions based on insights gained from the child’s emotions over the course of the interaction.

Affect Detection
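
The loop described above, from per-frame emotion scores to an adjusted robot action, can be sketched as follows. The emotion labels, smoothing window, and action policy are illustrative assumptions; any facial affect detector producing per-frame probabilities could feed it:

```python
from collections import deque

# Assumed emotion categories; a real detector may output a different set.
EMOTIONS = ("happy", "neutral", "distressed")

class AffectMonitor:
    """Smooth noisy per-frame emotion probabilities with a moving
    average and report the currently dominant emotion."""
    def __init__(self, window=5):
        self.history = deque(maxlen=window)

    def update(self, scores):
        """scores: dict mapping emotion label -> probability for one frame."""
        self.history.append(scores)
        avg = {e: sum(f[e] for f in self.history) / len(self.history)
               for e in EMOTIONS}
        return max(avg, key=avg.get)

def choose_robot_action(dominant):
    # Hypothetical policy: back off when the child appears distressed.
    return {"happy": "continue_game",
            "neutral": "prompt_interaction",
            "distressed": "pause_and_soothe"}[dominant]

monitor = AffectMonitor(window=3)
for frame in ({"happy": 0.7, "neutral": 0.2, "distressed": 0.1},
              {"happy": 0.1, "neutral": 0.2, "distressed": 0.7},
              {"happy": 0.0, "neutral": 0.1, "distressed": 0.9}):
    dominant = monitor.update(frame)
print(choose_robot_action(dominant))  # pause_and_soothe
```

Smoothing over a short window keeps a single misclassified frame from flipping the robot's behavior back and forth.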

The Smart Exercise application is an Android application paired with a wearable Bluetooth IMU sensor, designed to provide real-time auditory and visual feedback on users’ body motion while simultaneously collecting kinematic data on their performance. The application aims to help users improve physical and cognitive function, to increase their motivation to exercise, and to give researchers and physical therapists access to accurate data on their participants’ or patients’ exercise performance and completion without the need for expensive additional hardware.

Smart Exercise App
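
One kind of kinematic processing such an app might perform is counting exercise repetitions from the IMU's gyroscope stream. The sketch below uses simple threshold crossing; the thresholds and sample values are illustrative assumptions, not the app's actual parameters:

```python
def count_reps(gyro_z, up=1.0, down=-1.0):
    """Count repetitions in a stream of angular-velocity samples (rad/s).

    A rep is one swing above `up` (e.g., raising the arm) followed by a
    swing below `down` (lowering it). A feedback cue could be emitted at
    each completed rep.
    """
    reps, armed = 0, False
    for w in gyro_z:
        if w > up:
            armed = True        # upward phase of the movement observed
        elif w < down and armed:
            reps += 1           # full up-down cycle completed
            armed = False
    return reps

# Illustrative gyroscope trace containing two full repetitions.
samples = [0.1, 1.4, 0.3, -1.2, 0.0, 1.1, -1.3, 0.2]
print(count_reps(samples))  # 2
```

The two-threshold ("armed") scheme adds hysteresis, so sensor noise hovering around a single threshold is not double-counted.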

Performing independent physical exercise is critical to maintaining good health. However, it is especially hard for people with visual impairments to exercise without proper guidance. To address this problem, we have developed a Musical Exercise platform using the Microsoft Kinect for people with visual impairments. With the help of Musical Exercise's audio feedback, people with visual impairments can perform exercises with consistently good form. Our empirical assessment shows that the system is a usable exercise assistant. The results also confirm that a specific sound design (i.e., discrete) outperforms an alternative sound design, as well as no sound at all.

Musical Exercise
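
The "discrete" sound design can be illustrated with a minimal sketch: a tracked joint angle is quantized into a few zones, each mapped to one fixed tone, rather than sweeping pitch continuously. The zone boundaries and frequencies below are illustrative assumptions, not the platform's actual design:

```python
# (min_deg, max_deg, tone_freq_hz) zones for an assumed arm-raise exercise.
ZONES = [(0, 45, 262),     # low C
         (45, 90, 330),    # E
         (90, 135, 392),   # G
         (135, 181, 523)]  # high C: target range reached

def discrete_tone(angle_deg):
    """Return the single tone to play for a joint angle (degrees)."""
    for lo, hi, freq in ZONES:
        if lo <= angle_deg < hi:
            return freq
    raise ValueError("angle out of range")

# All angles within one zone give the same steady tone, so a user who
# cannot see the screen still hears exactly when they cross into the
# target range.
print(discrete_tone(30), discrete_tone(140))  # 262 523
```

A continuous design would instead interpolate pitch with angle; the discrete zones give clearer "you made it" boundaries, consistent with the study's finding.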

Many outdoor navigation systems for visually impaired people already exist. However, few attempts have been made to enhance their indoor navigation. Consider a blind traveler attending a conference and staying at a hotel: they may be unfamiliar with the layout of the new room and of the hotel as a whole. We have interviewed visually impaired people and identified current problems and plausible issues. Based on those findings, we have designed and developed an indoor navigation system, called "Personal Radar", using an ultrasonic belt (as the object detection technology) and tactile and auditory feedback (as the display technology). For the next version, we are exploring the application of a new lidar system.

Blindfold
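
A distance-to-feedback mapping of the kind a Personal-Radar-style belt might use can be sketched as follows: closer obstacles produce stronger vibration and faster beeps. The range bands and intensity levels are illustrative assumptions, not the system's actual calibration:

```python
def feedback_for_distance(cm):
    """Map an ultrasonic range reading (cm) to a feedback pair:
    (vibration intensity 0.0-1.0, beep interval in seconds)."""
    if cm < 30:
        return 1.0, 0.1    # very close: strong, near-continuous alert
    if cm < 100:
        return 0.6, 0.4    # within a step or two
    if cm < 200:
        return 0.3, 1.0    # obstacle ahead, no urgency
    return 0.0, None       # nothing in range: stay silent

print(feedback_for_distance(25))   # (1.0, 0.1)
print(feedback_for_distance(150))  # (0.3, 1.0)
```

Banded (rather than continuous) intensity keeps the tactile signal easy to interpret while walking, mirroring the discrete-feedback idea used in Musical Exercise.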

The primary purpose of this research project is to uncover collaboration patterns among neurotypical and neurodiverse individuals performing a task in both in-person and remote settings. Specifically, we are interested in how neurotypical adults and autistic adults collaborate with each other. We plan to collect data from various sensors (e.g., cameras, heart rate monitors) and use them to estimate individuals' cognitive, emotional, and engagement states during collaborative tasks and, eventually, to model collaborative behaviors. Outcomes of this research can contribute to workplace designs that provide individualized support for effective and enjoyable collaboration in neurodiverse teams. Conducted in collaboration with the Virginia Tech Autism Clinic & Center for Autism Research and the Department of Human Development and Family Science, this project is supported by the NIH R03 Grant Program.

Sample Task
Sample collaborative task using Minecraft