Robot, bring me... my glasses: Mixed air and ground assistive teams

Robots that assist elderly or disabled persons, or indeed anyone in their day-to-day tasks, can lead to a huge improvement in quality of life. At ROCON we are pursuing domestic mobile manipulators, as well as UAVs for monitoring people. Our next goal is to integrate these two platforms into an overall framework that both monitors persons and assists them on the ground. This presents a wide range of opportunities for a team of students, starting from low-level vision and control tasks, through sensing and state estimation, to high-level control using machine learning and AI planning tools. Each student will work on one well-defined subtopic in these areas.

Specific tasks include:

  • Vision for detecting the position and state of interesting objects in the environment -- such as a light switch, an object to retrieve, or the person being assisted.
  • Higher-level sensing and estimation concerns tracking persons, objects, or the robot itself along their trajectories as they evolve over time, using filtering and estimation techniques.
  • Low-level control includes position and velocity control for the robot arm, UAV, or wheelchair: for example, the robot arm might plan its motion so as to turn off a light switch.
  • High-level control concerns the overall assistive task. For example, a UAV might fly so as to keep monitoring a group of persons, or the mobile manipulator may plan its trajectory so as to find and retrieve a lost object.
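To give a flavor of the tracking subtopic above, the sketch below shows a minimal constant-velocity Kalman filter estimating a person's 1-D position from noisy measurements. All numeric values (time step, noise covariances, walking speed) are illustrative assumptions, not parameters from the project.

```python
import numpy as np

dt = 0.1                                  # time step [s] (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition for (position, velocity)
H = np.array([[1.0, 0.0]])                # we measure position only
Q = 0.01 * np.eye(2)                      # process noise covariance (assumed)
R = np.array([[0.25]])                    # measurement noise covariance (assumed)

def kf_step(x, P, z):
    """One predict/update cycle: x is the state, P its covariance, z the measurement."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                         # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Track a simulated person walking at 1 m/s from noisy position readings.
rng = np.random.default_rng(0)
x_est = np.array([0.0, 0.0])
P_est = np.eye(2)
for k in range(1, 101):
    true_pos = 1.0 * k * dt
    z = np.array([true_pos + rng.normal(0.0, 0.5)])
    x_est, P_est = kf_step(x_est, P_est, z)

print(x_est)  # estimated position and velocity, near the true (10.0 m, 1.0 m/s)
```

The same predict/update structure extends directly to 2-D or 3-D tracking by enlarging the state vector and matrices; in the project, richer motion models and sensor models would replace these simple ones.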

Initial results, in which an assistive mobile manipulator turns off light switches, are showcased in the demo movie below. Interested students should get in touch with any of the contact persons below to set up a meeting.

Start date (duration): November 2016