Haptic Exploration using Dextrous Robotic Hands

[The Exploratory Procedure] [Simulations] [Experimentation]
[Future Work] [Download the Paper]

The Exploratory Procedure

[Human Exploratory Procedures] [A Robotic Exploratory Procedure] Humans are very good at learning about unknown objects using the sense of touch. However, most robots are designed to operate in structured environments and are not equipped with the sensors and software needed to learn about such objects. Our research goal is to create an approach for haptic exploration by robotic hands.

Mimicking the human process of exploration, the robotic exploratory procedure is a sequence of phases (see a description of the Phase-Based Framework) in which fingers alternately manipulate and explore an object surface. The minimal configuration for this procedure in the plane is two active fingers and a passive "palm" or ground.

During the manipulation phase, the two active fingers grasp the object and rotate it, using rolling contacts to increase the workspace. For more information about rolling, please see our pages on Rolling Manipulation.

During the exploratory phases, one finger and the palm stably grasp the object while the other finger explores by rolling and sliding over the object surface. The state diagram below shows the phases and transitions involved in this process. The backward transition occurs when a phase must be bypassed because of workspace limitations or because a stable grasp cannot be found.

[Phases of the Exploratory Procedure]
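To make the phase sequence and the backward transition concrete, the sketch below encodes them as a simple state machine in Java (the language of our simulation). The phase names and the stableGraspFound/withinWorkspace predicates are illustrative assumptions, not the actual simulation code.

    // Sketch only: phase names and predicates are assumptions, not the lab's code.
    enum Phase { GRASP, MANIPULATE, EXPLORE_FINGER_ONE, EXPLORE_FINGER_TWO }

    class PhaseSequencer {
        // Hypothetical stand-ins for grasp planning and workspace checks.
        boolean stableGraspFound(Phase p) { return true; }
        boolean withinWorkspace(Phase p)  { return true; }

        // Advance to the next phase, or take the backward transition when the
        // next phase must be bypassed (workspace limit or no stable grasp).
        Phase next(Phase current) {
            Phase forward = forwardOf(current);
            if (!withinWorkspace(forward) || !stableGraspFound(forward)) {
                return backwardOf(current);
            }
            return forward;
        }

        Phase forwardOf(Phase p) {
            switch (p) {
                case GRASP:              return Phase.MANIPULATE;
                case MANIPULATE:         return Phase.EXPLORE_FINGER_ONE;
                case EXPLORE_FINGER_ONE: return Phase.EXPLORE_FINGER_TWO;
                default:                 return Phase.MANIPULATE; // loop back and re-manipulate
            }
        }

        Phase backwardOf(Phase p) {
            switch (p) {
                case EXPLORE_FINGER_TWO: return Phase.EXPLORE_FINGER_ONE;
                case EXPLORE_FINGER_ONE: return Phase.MANIPULATE;
                default:                 return Phase.GRASP;
            }
        }
    }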

Simulations

[A Picture of the Java Simulation] The algorithm was initially tested using simulations in MATLAB and Java. Simulation was necessary to ensure that the exploratory algorithm worked as an automated process, as well as to test the efficiency of the procedure for different object shapes given the workspace limitations of our robot fingers. Java was chosen because it is object-oriented and thus well suited to simulating the interactions between independent objects (the fingers and the unknown object). The picture on the right is an example snapshot of the simulation.
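As a rough sketch of what such an object-oriented structure might look like (the class names and the motion model below are assumptions for illustration, not the original simulation code):

    // Sketch of an object-oriented simulation skeleton; names and the motion
    // model are illustrative assumptions.
    import java.util.ArrayList;
    import java.util.List;

    class UnknownObject {
        // Planar contact points discovered so far.
        List<double[]> boundary = new ArrayList<double[]>();
    }

    class Finger {
        double x, y; // fingertip position in the plane
        Finger(double x, double y) { this.x = x; this.y = y; }

        // Roll/slide a small step along the object and record the contact point.
        void explore(UnknownObject obj, double step) {
            x += step; // placeholder motion model
            obj.boundary.add(new double[] { x, y });
        }
    }

    class SimulationDemo {
        public static void main(String[] args) {
            UnknownObject obj = new UnknownObject();
            Finger f1 = new Finger(0.0, 0.0);
            Finger f2 = new Finger(0.1, 0.0);
            for (int t = 0; t < 100; t++) {
                f1.explore(obj, 0.01); // fingers take turns exploring each step
                f2.explore(obj, 0.01);
            }
            System.out.println("Recorded " + obj.boundary.size() + " contact points");
        }
    }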

Experimentation

The algorithm was tested on Marvin, a two-fingered robot "hand" in the Dextrous Manipulation Laboratory. The figure on the left below shows this robot exploring a 10 cm diameter plastic softball with a 4 mm ridge feature. The robot has an estimate of the object's size, obtained when it first made contact with and grasped the object. The figure on the right shows a close-up of one of the fingertips we have used on the robot. This particular finger has piezoelectric sensors made from PVDF embedded in a silicone rubber skin, a strain gauge force sensor, and an optical contact sensor made from fiber optic cables.

[The Robot Exploring] [A Finger with Sensors]

The figure below shows sensor information gathered using the exploratory procedure. The tactile array sensor data clearly shows the shape of the feature, while the tangential force signal shows a change in force and stick-slip vibrations as the finger moves over the feature.

[Data gathered during Exploration]
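As an illustration of how such a tangential-force trace might be processed (the sliding-window approach, window size, and threshold are assumptions, not the method used in our experiments), a feature could be flagged wherever the force swings sharply:

    // Sketch only: flags samples where the tangential force varies sharply,
    // as stick-slip vibration over a surface feature would produce.
    class FeatureFlagger {
        static boolean[] flag(double[] tangentialForce, int window, double threshold) {
            boolean[] flags = new boolean[tangentialForce.length];
            for (int i = window; i < tangentialForce.length; i++) {
                double min = Double.POSITIVE_INFINITY;
                double max = Double.NEGATIVE_INFINITY;
                for (int j = i - window; j <= i; j++) {
                    min = Math.min(min, tangentialForce[j]);
                    max = Math.max(max, tangentialForce[j]);
                }
                flags[i] = (max - min) > threshold; // large swing => possible feature
            }
            return flags;
        }
    }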

Future Work

We believe that there is much work to be done in this field to improve both robotic exploration and recognition of unknown objects. First, we must develop robotic fingers and sensors well-suited for exploration. This includes improving finger workspace and fingertip shapes. We have also begun experiments with piezoelectric and optical contact sensors.

Second, more exploratory procedures for sensing different object properties should be implemented and a robust object model should be developed that can store the sensor data obtained during exploration. However, this model should not simply store raw sensor data; the model should extract salient information from the data to allow real-time modification of the exploratory procedure.

Third, the model should be displayed on a haptic interface using force and tactile feedback. One possible application of this work is a "remote geologist" that would pick up rocks during remote planetary exploration. It is logical that the models sent back to Earth should be displayed in a manner similar to the way in which they were obtained - why use only a visual display if the model is created from tactile data?

Download the Paper

Some of the research outlined on this page is described in more detail in a paper presented and published at the 1997 IEEE International Conference on Robotics and Automation. Please download a PDF version of "Haptic Exploration of Objects with Rolling and Sliding" by Allison M. Okamura, Michael L. Turner, and Mark R. Cutkosky.

For related work and other papers published by the DML, please see our Publications page. For information about the authors, please see the People page.


Back to the Dextrous Manipulation Lab home page or projects page.
Allison M. Okamura
May 4, 1997