Overview

Description

Robots can help people around the house by operating appliances: washing machines, microwaves, light switches, or even television sets. This could especially help people with limited mobility, whether due to increasing age or to a physical disability. The problem is that these appliances were made for humans, not for robots, and robots struggle to perceive and manipulate what we call the "device controls": knobs, buttons, switches, and so on. This project uses a concept called "shared autonomy" to let humans help robots complete these difficult tasks. Shared autonomy means the robot performs some subtasks on its own (e.g., turning a knob to a specific angle) but asks for brief human assistance on others (e.g., finding the knob in the first place).
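To make the idea concrete, the following is a minimal Python sketch of such a shared-autonomy loop, assuming a simple split between subtasks the robot handles alone and subtasks that need a human hint; the names and structure are illustrative, not this project's actual code.

    # Hypothetical sketch of a shared-autonomy loop: the robot handles some
    # subtasks alone and asks a person for brief help with the others.
    # All names here are illustrative assumptions, not the project's code.
    from dataclasses import dataclass

    @dataclass
    class Subtask:
        name: str
        needs_human_help: bool  # True if the robot should ask a person first

    def request_human_help(subtask):
        # Stand-in for an interface where the user, e.g., clicks a knob in a camera image.
        print(f"Asking the user to help with: {subtask.name}")
        return {"pixel": (320, 240)}  # a hypothetical hint from the user

    def execute_autonomously(subtask, hint=None):
        # Stand-in for the robot's own controller, e.g., turning a knob to a target angle.
        print(f"Robot executing: {subtask.name} (hint={hint})")

    def run_task(subtasks):
        for subtask in subtasks:
            hint = request_human_help(subtask) if subtask.needs_human_help else None
            execute_autonomously(subtask, hint)

    run_task([
        Subtask("find the stove knob", needs_human_help=True),
        Subtask("turn the knob to the target angle", needs_human_help=False),
    ])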

Thus far, this project has yielded interfaces through which a human can help the robot locate knobs, light switches, and push-buttons on a device, after which the robot actuates those controls autonomously. The robot only needs to ask once for each control, because the locations are stored in a map of the device. Pilot studies indicate that the shared-autonomy approach is faster, less frustrating, and more frequently successful than a click-and-drag teleoperation approach.
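The sketch below illustrates the "ask once, then remember" behavior of such a device map, assuming control locations are stored as positions in the device's own frame; the class, method names, and pose format are assumptions, not this project's actual implementation.

    # Hypothetical sketch of a device map that stores each control's location
    # after the human identifies it once; names and formats are assumptions.
    class DeviceMap:
        """Remembers where each control is on a device so the human is asked only once."""

        def __init__(self, device_name):
            self.device_name = device_name
            self.controls = {}  # control name -> stored position in the device frame

        def locate(self, control_name, ask_human):
            # Ask the human only if this control has never been located before.
            if control_name not in self.controls:
                self.controls[control_name] = ask_human(control_name)
            return self.controls[control_name]

    def ask_human(control_name):
        # Stand-in for the pointing interface; returns a position in the device frame.
        print(f"Please show the robot where the {control_name} is.")
        return (0.12, -0.05, 0.30)

    stove = DeviceMap("stove")
    stove.locate("front-left knob", ask_human)  # first call asks the human
    stove.locate("front-left knob", ask_human)  # reuses the stored location, no question asked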

Future Plans

This project seeks to build a large semantic map of the robot's environment to help the robot perform useful tasks on its own. That means knowing where each room is (by name) within the house, where each device is within the room, and where each control is on the device. Future work will include re-detecting a device whenever the robot returns to it, both to eliminate the navigation error the robot accumulates while driving around and to account for small displacements of the device, e.g., if somebody bumped it. Re-finding devices in this way will make our framework more robust and practical in a potentially chaotic home environment.
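A rough sketch of this hierarchy and the re-detection step is shown below, assuming the map nests rooms, devices, and controls, and that control locations are expressed in each device's frame so a fresh device detection keeps them valid; the structure and names are assumptions for illustration, not the project's actual code.

    # Hypothetical sketch of a room -> device -> control semantic map with a
    # re-detection step; the structure and names are assumptions, not this project's code.
    from dataclasses import dataclass, field

    @dataclass
    class Device:
        pose: tuple                                    # device pose in the room frame (x, y, theta)
        controls: dict = field(default_factory=dict)   # control name -> position in the device frame

        def redetect(self, observed_pose):
            # Replace the stored pose with a fresh detection, absorbing both the
            # robot's accumulated navigation error and small displacements of the device.
            self.pose = observed_pose

    @dataclass
    class Room:
        devices: dict = field(default_factory=dict)    # device name -> Device

    house = {"kitchen": Room(devices={"stove": Device(pose=(1.0, 2.0, 0.0))})}
    house["kitchen"].devices["stove"].controls["front-left knob"] = (0.12, -0.05, 0.30)

    # On returning to the stove, a new detection refreshes its pose; stored control
    # positions stay valid because they are expressed in the device frame.
    house["kitchen"].devices["stove"].redetect(observed_pose=(1.03, 1.97, 0.02))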