The introduction of surgical robots in minimally invasive surgery has enhanced manual dexterity through the use of microprocessor-controlled mechanical wrists. They permit motion scaling for reducing gross hand movements and enable micro-scale tasks that are otherwise not possible. The high degree of freedom offered by robotic surgery, however, can introduce problems of complex instrument control and hand-eye coordination. The purpose of this project is to investigate the use of real-time binocular eye tracking for empowering the robots with human vision using knowledge acquired in situ, thus simplifying, as well as enhancing, robotic control in surgery. By utilizing the close relationship between horizontal disparity and depth perception, which varies with viewing distance, we demonstrate how vergence can be effectively used for recovering 3D depth at the fixation points and further be used for adaptive motion stabilization during surgery. A dedicated stereo viewer and eye tracking system has been developed, and experimental results involving normal subjects viewing real as well as synthetic scenes are presented. Detailed quantitative analysis demonstrates the strength and potential value of the method.
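The geometric relationship underlying the depth recovery described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a symmetric fixation point on the midline between the eyes, so each eye rotates inward by half the total vergence angle, and the function name, parameters, and numeric values are illustrative only.

```python
import math

def fixation_depth(ipd_mm: float, vergence_deg: float) -> float:
    """Estimate the depth of a symmetric fixation point from vergence.

    ipd_mm: inter-pupillary distance in millimetres (assumed known).
    vergence_deg: total vergence angle between the two lines of sight.
    Returns the perpendicular distance to the fixation point in mm.
    """
    half_angle = math.radians(vergence_deg) / 2.0
    # Right triangle: opposite side = ipd/2, angle at the eye = vergence/2,
    # so depth = (ipd/2) / tan(vergence/2).
    return (ipd_mm / 2.0) / math.tan(half_angle)

# Example: with a 65 mm IPD, a vergence angle of about 6.2 degrees
# corresponds to a fixation depth of roughly 600 mm.
print(round(fixation_depth(65.0, 6.2)))  # → 600
```

Note how depth sensitivity falls off with distance: the same angular measurement error produces a much larger depth error at far fixation, which is why vergence-based depth recovery is best suited to the close working distances typical of minimally invasive surgery.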