Robust Event Detection for Dextrous Manipulation

Events are detected using a combination of tactile sensing and context, as shown in the figure below. The context consists of information about the current phase (only certain kinds of events are feasible in each phase class) and information about the commanded motions, forces, etc. This information is combined to assess the probability that a particular event has occurred, as explained following the figure.
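As an illustration only, the sketch below shows one way the phase context might restrict which events need to be tested at any instant; the phase names, event names, and data structure are hypothetical and are not taken from the original system.

    # Hypothetical sketch: the current phase restricts which events are feasible,
    # so only those events' confidence functions need to be evaluated.
    FEASIBLE_EVENTS = {
        "free_motion": ["contact"],            # free motion can only end with a contact event
        "contact":     ["slip", "release"],    # example event classes within a contact phase
    }

    def candidate_events(phase):
        """Return the event classes worth testing, given the current phase."""
        return FEASIBLE_EVENTS.get(phase, [])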

Features are derived from the sensor and context information and clustered in a feature space. The example shown below is a two-dimensional feature space; the various events appear as clusters (regions) in this space.
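For concreteness, an event confidence could be derived from a feature vector's membership in the corresponding cluster. The Gaussian cluster model, feature axes, and numerical values below are placeholder assumptions, not those used in the lab's system.

    import numpy as np

    def cluster_confidence(features, mean, cov):
        """Unnormalized Gaussian membership of a feature vector in an event cluster."""
        diff = np.asarray(features, dtype=float) - np.asarray(mean, dtype=float)
        mahalanobis_sq = diff @ np.linalg.inv(cov) @ diff
        return float(np.exp(-0.5 * mahalanobis_sq))

    # Example: a hypothetical "contact" cluster in a 2-D feature space
    # (e.g., force rate vs. vibration energy; both axes are illustrative).
    contact_conf = cluster_confidence(
        features=[0.80, 0.10],
        mean=[0.90, 0.15],
        cov=np.diag([0.05, 0.02]),
    )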

Event confidence functions are computed from the feature space clusters and used to determine (i) which events could be occurring and (ii) which event(s) have probably occurred. By definition, a new phase begins as soon as an event is committed to, and new context information then applies. As explained in the framework project description, a preparatory action (deceleration) begins when the Contact Event confidence exceeds an "alert" threshold. The contact phase begins when the Contact Event confidence exceeds a commitment threshold.
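The two-threshold logic might be sketched as follows; the threshold values and the state fields are illustrative assumptions rather than the published implementation.

    ALERT_THRESHOLD = 0.5     # placeholder values; the actual thresholds are tuned
    COMMIT_THRESHOLD = 0.9

    def update_phase(contact_confidence, state):
        """Drive the preparatory action and phase transition from the Contact Event confidence."""
        if contact_confidence > COMMIT_THRESHOLD:
            state["phase"] = "contact"        # commit: a new phase (and new context) begins
        elif contact_confidence > ALERT_THRESHOLD:
            state["decelerating"] = True      # preparatory action: slow down before contact
        return state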

The following figure shows the results for several events during an object manipulation task. In each case the real event is listed at the top of the plot. The correct events are quickly identified, although in some cases another event briefly exceeds an "alert" threshold.

Details on the event detection scheme can be found in:

  • Marc Tremblay's PhD thesis


Marc Tremblay and Mark Cutkosky
May 1995