Robot grasping test database and visualization

Name: Cindy Grimm
Affiliation: MIME/Robotics/EECS
Phone: 737-4914
E-mail: grimmc@onid.oregonstate.edu
Website:
Knowledge Required: One or more of: database design, UI design, visualization experience, systems implementation
Motivation: Right now you either have to own a robot arm/manipulator (which can cost upwards of $15K for a decent arm) or have access to a nearby facility that has one. You also need to maintain the arm and understand how to make it run; as anyone who’s worked with specialized hardware knows, this is a full-time job in and of itself. And suppose you want to test your algorithm on more than one arm… On top of this, designing specialized test equipment is difficult, which makes repeatable experiments a challenge. Our long-term (5-year) goal is to develop a test facility that gives remote users access both to robot arms of different types and to testing suites and benchmarks.

Our motivation for this senior design project is to develop prototypes for key parts of this type of system in order to prove feasibility.
Description: Design key components of a web-based software repository for storing robot grasping data. The system has a UI, a database, and a visualization component. The basic idea is to enable remote users to set up a grasp in a grasp test facility. The remote user provides an algorithm for reaching out and grasping an object (which may include feedback from, e.g., a camera or touch sensors). The system then runs the algorithm and records what happens using sensors in the grasping test hardware (robot joint angles, torques, videos, force sensors, IMUs). The remote user can then visualize what happened and “re-play” it. We also envision that the remote user can run the grasp test “live” by tele-operating the robot arm.
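One way to picture the record/re-play idea above: each sensor becomes a timestamped stream, and a trial recording merges those streams for playback. A minimal sketch — the class name and data layout are our own assumptions, not a fixed format:

```python
from dataclasses import dataclass, field

@dataclass
class TrialRecording:
    """Hypothetical container for one grasp trial.

    Each sensor stream is a list of (timestamp, value) samples,
    so that replay can step through all streams in time order.
    """
    streams: dict = field(default_factory=dict)  # name -> [(t, value), ...]

    def record(self, stream, t, value):
        self.streams.setdefault(stream, []).append((t, value))

    def replay(self):
        """Return (t, stream, value) events merged across all streams, in time order."""
        events = [(t, name, v)
                  for name, samples in self.streams.items()
                  for t, v in samples]
        return sorted(events)

# Toy usage: two streams recorded during one grasp attempt.
rec = TrialRecording()
rec.record("joint_angles", 0.0, [0.1, 0.2])
rec.record("force", 0.05, 3.2)
rec.record("joint_angles", 0.1, [0.15, 0.25])
print(rec.replay())
```

The point of merging by timestamp is that re-play (and tele-operation review) needs all sensors advancing on one shared clock, regardless of each sensor's native rate.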

There are obviously many components to this: the front end (specifying the object, providing the algorithm, testing it in a simulator), the data capture itself, developing the database with a web-based front end, and visualizing the results. Any one of these components could be a project, depending on the interests/skills of the group.

Objectives: The goal of this project is to create a first-draft, working prototype of one part of the proposed system.
Deliverables:
One of:
Specification and key-component implementation of the UI for setting up a test. This would be built on top of an existing simulator (OpenRave). The key concern is creating an interface that is flexible enough that users can “test drive” the system in a way that is agnostic to the underlying hardware (arm, data streams). We envision a Robot Operating System (ROS)-style “topic subscription” model.
Database design. In the end, we envision the test center generating a continuous stream of data that is stored in a way that is easily accessible on the web. This data is heterogeneous and consists of combinations of: arm and object tested, grasp tested, algorithm tested, and the results of the test, both as data streams (cameras, videos, sensors, actual movement of the arm and object) and as human-readable annotations (“This was a 3-fingered grasp of a slippery, hard object that failed because the object slipped out of the grasp, possibly because the grasp was not tight enough”). Part of the design is to envision the types of queries users might want to run and how to specify them.
Visualization design. Given a test, show the user what happened. This includes both 3D reconstructions (how the arm and object moved) and visualization of force/torque/IMU data on top of those reconstructions.
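As a concrete (hypothetical) starting point for the database deliverable, a first-cut relational schema might keep per-trial metadata and annotations queryable in SQL while pointing at the bulk sensor streams stored elsewhere. All table and column names below are illustrative assumptions, not a committed design:

```python
import sqlite3

# In-memory sketch of a possible schema; a real deployment would use a
# server-backed database behind the web front end.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trial (
    id INTEGER PRIMARY KEY,
    arm TEXT, object TEXT, algorithm TEXT,
    success INTEGER,      -- 1 = grasp held, 0 = object slipped/dropped
    annotation TEXT       -- free-form, human-readable notes
);
CREATE TABLE stream (
    trial_id INTEGER REFERENCES trial(id),
    name TEXT,            -- e.g. 'force', 'joint_angles', 'video'
    uri TEXT              -- pointer to bulk data stored outside the DB
);
""")
conn.execute("INSERT INTO trial VALUES (1, 'arm_A', 'mug', 'algo_v1', 0, 'slipped out')")
conn.execute("INSERT INTO stream VALUES (1, 'force', 'blob://trial1/force')")

# Example of the kind of query a remote user might run:
# "show me every failed grasp of this object."
rows = conn.execute(
    "SELECT id, annotation FROM trial WHERE object='mug' AND success=0"
).fetchall()
print(rows)
```

Keeping heavy data (video, high-rate sensor logs) out of the relational tables and referenced by URI is one way to handle the heterogeneity while keeping metadata queries fast.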
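For the visualization deliverable, one small but necessary piece is aligning sensor data with the 3D replay: force/torque/IMU samples rarely land exactly on the reconstruction’s frame times, so they must be resampled. A sketch, assuming each stream is a time-sorted list of (timestamp, value) pairs (our assumption, not a fixed format):

```python
import bisect

def sample_at(samples, t):
    """Linearly interpolate a time-sorted (timestamp, value) series at time t.

    Illustrative helper for overlaying force/torque/IMU readings on the
    frames of a 3D reconstruction; clamps outside the recorded range.
    """
    times = [s[0] for s in samples]
    i = bisect.bisect_left(times, t)
    if i == 0:
        return samples[0][1]
    if i == len(samples):
        return samples[-1][1]
    (t0, v0), (t1, v1) = samples[i - 1], samples[i]
    w = (t - t0) / (t1 - t0)
    return v0 + w * (v1 - v0)

# Toy data: a force stream sampled at its own rate, and the replay's frame times.
force = [(0.0, 0.0), (0.1, 2.0), (0.2, 2.5)]
frame_times = [0.0, 0.05, 0.15]
overlay = [sample_at(force, t) for t in frame_times]
print(overlay)  # one force value to draw on top of each 3D frame
```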
Other comments: There are already two existing simulators (GraspIt! and OpenRave) for specifying grasps, both of which are built in Python. We would like to integrate with the Robot Operating System (also in Python, and with its own visualization; OpenRave also integrates with ROS) as a general model for communication between the algorithms/UI and the robot arm itself/OpenRave.
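To illustrate the ROS-style “topic subscription” model without depending on a ROS install, here is a minimal in-process sketch; `TopicBus` and its methods are illustrative stand-ins for the pattern, not the actual rospy API:

```python
from collections import defaultdict

class TopicBus:
    """Toy publish/subscribe bus in the style of ROS topics.

    Producers publish messages to named topics; any number of
    consumers subscribe with callbacks, without either side
    knowing about the other (hardware-agnostic by design).
    """
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self._subs[topic]:
            cb(msg)

# Usage: a visualizer subscribes to joint angles; the arm (or a
# simulator standing in for it) publishes without knowing who listens.
bus = TopicBus()
angles = []
bus.subscribe("/arm/joint_angles", angles.append)
bus.publish("/arm/joint_angles", [0.1, 0.5, -0.3])
```

This decoupling is why the topic model fits the project’s goal: the same UI and algorithms can subscribe to data streams whether they come from OpenRave, a real arm, or a recorded trial.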

   D. Kevin McGrath
   Last modified: Thu Nov 16 11:32:03 2017