Current Projects

  • Biomimetic Robots: [ Running ]   [ Climbing ]
    This research aims to develop a new class of biologically inspired robots that perform far more robustly in unstructured environments than today's robots. These robots will be substantially more compliant and stable than current designs, and will take advantage of new developments in materials, fabrication technologies, sensors, and actuators. Applications include autonomous or semi-autonomous tasks such as reconnaissance and de-mining for small, insect-like robots, and human-interaction tasks at larger scales.
  • [ Haptics Projects ]
    - Low-power cutaneous haptic feedback and sensing for portable, wearable applications.
    - Embedded fiber Bragg grating sensors for lightweight, robust, and highly sensitive robotic arms and fingers.
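
    Fiber Bragg grating sensors report strain as a shift in the wavelength they reflect. As a rough illustration of the readout involved (a minimal Python sketch under standard assumptions, not this project's actual pipeline), the snippet below converts a measured wavelength shift into axial strain using the first-order relation Δλ/λ_B ≈ (1 - p_e)·ε, with a typical photo-elastic coefficient p_e ≈ 0.22 for silica fiber and temperature effects ignored.

        # Illustrative sketch (not the project's code): convert a fiber Bragg
        # grating wavelength shift into axial strain, ignoring temperature.

        P_E = 0.22  # typical effective photo-elastic coefficient for silica fiber (assumed)

        def strain_from_wavelength_shift(lambda_bragg_nm: float, delta_lambda_nm: float) -> float:
            """First-order FBG relation: delta_lambda / lambda_B = (1 - p_e) * strain."""
            return delta_lambda_nm / (lambda_bragg_nm * (1.0 - P_E))

        # Example: a 1550 nm grating shifted by 0.012 nm reads roughly 10 microstrain.
        if __name__ == "__main__":
            eps = strain_from_wavelength_shift(1550.0, 0.012)
            print(f"strain = {eps * 1e6:.1f} microstrain")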

Previous Projects

Haptics Projects 

  • Tactile Sensing for Exploration and Dexterous Manipulation
    Tactile Sensing and Information Processing for Man and Machine Systems is a joint project with Professor Gregory Kovacs of the Center for Integrated Systems (CIS) Transducers Group at Stanford and with Professors Robert Howe and Roger Brockett at the Harvard Robotics Laboratory. The goal is to investigate mechanisms for acquiring and interpreting information from tactile sensors for use with robotic and teleoperated hands. For robotic hands we are also exploring how to program and control hands that use tactile sensors; for teleoperated hands we are exploring how best to relay haptic information to human operators. For part of this work we are collaborating with physiologists from Umeå University, Sweden. This project is sponsored by the Perceptual Sciences Division of the Office of Naval Research.
  • Haptic Exploration Stylus
    Haptic Exploration Stylus for Telegeology is a joint project with the Intelligent Mechanisms Group of the NASA Ames Research Center to investigate the use of a needle-like sensor for teleoperated and autonomous geological observations. The project involves developing a stylus sensor to collect haptic information useful to a telescientist, and then interpreting, storing, and displaying that information on a haptic replay device.
    • Interview with Prof. Vermeij, a blind paleontologist at U.C. Davis, who describes how he uses haptic feedback for field exploration.
  • Haptic Environment Identification with Friction
    Haptic Environment Identification is a joint project with Interval Corporation. Although friction is increasingly important in manipulation, the current state of the art in identifying, modeling, and displaying frictional properties through a haptic interface is primitive. We are therefore undertaking a series of experiments and analyses aimed at identifying the main frictional properties of small devices and displaying them through a haptic interface; a sketch of this kind of friction rendering appears after this list. See Christopher Richard's publications and his thesis.
  • Haptics in Education: The Haptic Paddle
    The haptic paddle is a one-degree-of-freedom force-feedback joystick developed to illustrate principles in an undergraduate mechanical engineering class (ME161: Dynamic Systems) at Stanford University. The goal was to create a demonstration tool that would be inexpensive, flexible, and fun for the students.
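
    Both the friction-display work and the haptic paddle come down to the same impedance-rendering pattern: read the handle position at a high servo rate, evaluate a virtual mechanical model, and command the resulting force to the motor. The Python sketch below is a hypothetical illustration of that loop for a one-degree-of-freedom device, rendering a spring-damper wall with Coulomb friction; the parameters, the toy handle simulation, and the 1 kHz rate are illustrative assumptions, not the actual course hardware or lab code.

        # Hypothetical sketch: a 1-DOF impedance-rendering loop of the kind a
        # force-feedback device such as the haptic paddle runs at ~1 kHz.
        # The device and the user's hand are replaced by a toy mass simulation
        # so the example is self-contained; all parameters are assumed values.

        K_WALL = 500.0    # virtual wall stiffness [N/m] (assumed)
        B_WALL = 2.0      # virtual wall damping [N*s/m] (assumed)
        MU_C   = 0.3      # Coulomb friction magnitude inside the wall [N] (assumed)
        WALL_X = 0.02     # wall location along the handle travel [m] (assumed)
        DT     = 0.001    # 1 kHz servo period, typical for haptic rendering

        def virtual_environment_force(x: float, v: float) -> float:
            """Force rendered to the handle: spring-damper wall plus Coulomb friction."""
            if x <= WALL_X:
                return 0.0                               # free space: no force
            f = -K_WALL * (x - WALL_X) - B_WALL * v      # push the handle back out
            if v > 0:
                f -= MU_C                                # friction opposes penetration...
            elif v < 0:
                f += MU_C                                # ...and withdrawal
            return f

        def simulate(steps: int = 3000, m: float = 0.05, f_user: float = 1.0):
            """Toy stand-in for the handle plus a finger pushing toward the wall."""
            x, v = 0.0, 0.0
            for _ in range(steps):
                f = virtual_environment_force(x, v)      # force the motor would output
                v += (f_user + f) / m * DT               # handle dynamics (semi-implicit Euler)
                x += v * DT
            return x, virtual_environment_force(x, v)

        if __name__ == "__main__":
            x_end, f_end = simulate()
            print(f"handle position x = {x_end * 1000:.1f} mm, rendered force = {f_end:.2f} N")

    In a real device the simulate() loop would be replaced by the sensor read and motor write; the rendering function and the high, steady servo rate are the essential part.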

Videos

  • (387K) Telemanipulating a two-fingered robot using Virtual Technologies' CyberGlove with
    force feedback. Forces sensed at the robotic fingers are replayed to the user through the ungrounded device. In tests, users could discriminate between objects differing in width by less than 0.5 cm.
  • Two short video clips of event-based manipulation with a two-fingered hand: 
    • (137K) Responding to an event in which an object is stripped from the grasp, causing a hard constraint to be violated and prompting a transition to a new phase in which the fingers retract.
    • (223K) Assembling an object into a corner. The corner location is unexpectedly moved so that an object/floor contact occurs before final assembly. This event triggers a transition to a sliding phase that brings the object into the corner.
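
    Both clips show the same event-driven pattern: the controller monitors contact forces and, when a discrete event is detected (the object stripped from the grasp, an unexpected object/floor contact), switches to a different control phase. The Python sketch below is a hypothetical illustration of that pattern; the phase names, thresholds, and force inputs are assumptions for illustration, not the controllers used in the videos.

        # Hypothetical sketch of event-driven phase switching for manipulation:
        # contact events inferred from force readings trigger transitions
        # between control phases. Thresholds and phase names are illustrative.
        from enum import Enum, auto

        class Phase(Enum):
            TRANSPORT = auto()   # carrying the grasped object toward the goal
            SLIDE     = auto()   # sliding along the floor after unexpected contact
            RETRACT   = auto()   # fingers retract after the object is stripped away

        GRASP_LOST_THRESHOLD = 0.2   # grip force [N] below which the grasp is lost (assumed)
        CONTACT_THRESHOLD    = 1.5   # normal force [N] signaling object/floor contact (assumed)

        def next_phase(phase: Phase, grip_force: float, contact_force: float) -> Phase:
            """Return the control phase to use given the latest force readings."""
            if phase is Phase.TRANSPORT:
                if grip_force < GRASP_LOST_THRESHOLD:
                    return Phase.RETRACT     # event: object stripped from the grasp
                if contact_force > CONTACT_THRESHOLD:
                    return Phase.SLIDE       # event: early contact, slide into the corner
            return phase                     # no event: keep the current phase

        # Example: an unexpected floor contact during transport switches to sliding.
        assert next_phase(Phase.TRANSPORT, grip_force=3.0, contact_force=2.0) is Phase.SLIDE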

Other Projects