
ANUBIS: A Natural User Robot Interface System

As complicated machines become more prevalent in everyday life, new methods of controlling them are needed. In an increasingly connected world, telepresence systems let people interact across great distances, ranging from simple business communication tools to remote surgical robots that demand extremely precise control. As these devices grow more capable and complex, natural control systems are emerging that allow users to smoothly and effortlessly operate these robotic avatars.

Design Objectives

We propose to build a user interface system to control a humanoid robot. The project is broken down into several design objectives:

  1. Kinect – use the Kinect to capture upper body motions and translate them into robot control commands
  2. Leap Motion – add the Leap Motion controller to capture fine-grained finger and hand motions
  3. Directional Control – add an appropriate system to capture natural walking movements and translate them into robot directional commands
  4. First Person View – add a camera transmission system to the robot and a headset to the operator to allow for immersive robot control

These design objectives can also be seen as project phases. The Kinect is the first step toward a viable proof of concept: its motion-capture components are the building blocks for interfacing with the robot and controlling its various joints, as sketched below. The Leap Motion is the piece most important to the user, because without it the user cannot naturally pick objects up. Directional control is the most complex of the pieces; it allows the user to drive the robot simply by walking in a natural manner. The first person view adds a sense of immersion while controlling the robot, and gives the user a better sense of the robot's surroundings and how to navigate them.
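As a minimal sketch of the Kinect phase, the snippet below turns three tracked joint positions (shoulder, elbow, wrist) into an elbow angle and clamps it to a servo's range. It assumes the Kinect pipeline has already supplied 3D joint coordinates; the hard-coded positions, the `joint_angle` and `angle_to_command` helpers, and the 0–180 degree servo range are illustrative placeholders, not the project's actual interface.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 3D points a-b-c,
    e.g. shoulder-elbow-wrist for the elbow angle."""
    u = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def angle_to_command(theta, servo_min=0, servo_max=180):
    """Clamp the measured human joint angle to the servo's mechanical range."""
    return int(min(max(theta, servo_min), servo_max))

# Example: joint positions (meters, camera space) for one captured frame.
shoulder, elbow, wrist = (0.20, 0.50, 2.0), (0.25, 0.25, 2.0), (0.50, 0.25, 2.0)
print(angle_to_command(joint_angle(shoulder, elbow, wrist)))  # ~90 degrees
```

The same angle-at-a-joint computation generalizes to shoulders, knees, and (with the Leap Motion) finger joints, so one small routine covers much of the motion-to-command translation.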

Our approach to solving this problem is to create software systems that use information from the hardware above to control a robotic avatar. These systems must communicate the user's motions and intentions to the robot while suppressing extra noise and unintentional actions. They may include, but are not limited to, gesture-based control, inverse kinematics, and software that corrects for user error.
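To illustrate the noise-suppression layer, the sketch below combines an exponential moving average with a dead band: small tracker jitter decays away, and a new joint command is issued only when the smoothed angle moves past a threshold. The filter constants here are assumptions to be tuned against the real sensors and robot.

```python
class JointFilter:
    """Smooths a stream of joint angles and ignores sub-threshold jitter."""

    def __init__(self, alpha=0.3, dead_band=2.0):
        self.alpha = alpha          # smoothing factor: lower = smoother but laggier
        self.dead_band = dead_band  # degrees of motion treated as noise
        self.smoothed = None
        self.last_sent = None

    def update(self, raw_angle):
        """Return a new command angle, or None if the change is just jitter."""
        if self.smoothed is None:
            self.smoothed = self.last_sent = raw_angle
            return raw_angle
        # Exponential moving average of the raw sensor reading.
        self.smoothed += self.alpha * (raw_angle - self.smoothed)
        # Only issue a command once the smoothed value escapes the dead band.
        if abs(self.smoothed - self.last_sent) >= self.dead_band:
            self.last_sent = self.smoothed
            return self.smoothed
        return None

# Example: noisy readings near 90 degrees produce no spurious commands,
# while a deliberate motion toward 96 degrees does.
f = JointFilter()
for reading in [90.0, 90.8, 89.5, 90.3, 95.0, 96.1]:
    cmd = f.update(reading)
    if cmd is not None:
        print(f"send {cmd:.1f}")
```

A per-joint filter like this sits naturally between the capture code and the robot's command interface, and the same pattern extends to gesture recognition, where a gesture is accepted only after it persists across several frames.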

We intend to complete the Senior Design Project by showcasing a hardware/software platform for controlling a humanoid robot through a natural user interface. The project will include appropriate documentation in the form of both written documents and demonstration videos.

It is important to note that this capstone project would not end with our group. Future capstone groups would be able to use the hardware and code we produce to expand upon the project, and a continuation would cost less since they could reuse the hardware we propose. The components of this project can be used not only together but also as building blocks for other designs, and the applications of a system like this extend beyond the laboratory setting.

The project will focus on creating a natural user interface for a humanoid robot. This is primarily a software project, since most of the hardware already exists in the necessary forms. The proposed work addresses a current, open problem, so it could further research in this area. The project will draw on experience the team gained in previous projects, giving the team a good foundation for the software setup. While most of the effort will go toward creating the software system, the functionality will be readily demonstrable on an articulated humanoid robot. In addition, many of the components could be reused in future projects, making this a possible starting point for many robotics and virtual reality projects.