Eye-hand coordination

Eye-hand coordination (also known as hand–eye coordination) is the coordinated control of eye movement with hand movement: the processing of visual input to guide reaching and grasping, along with the use of proprioception of the hands to guide the eyes.

In microsurgery, surgeons use micro-instruments under high magnification to handle delicate tissues. These procedures demand highly skilled attentional and motor control for planning and implementing eye-hand coordination strategies. Eye-hand coordination in surgery has mostly been studied in open, laparoscopic, and robot-assisted procedures, because tools for automatic instrument detection in microsurgery have not been available.

Koskinen et al. introduced and investigated a method for simultaneous detection and processing of micro-instruments and gaze during microsurgery. They trained and evaluated a convolutional neural network to detect 17 microsurgical tools on a dataset of 7,500 frames from 20 videos of simulated and real surgical procedures. The model achieved a mean average precision at the 0.5 IoU threshold of 89.5–91.4% on validation data and 69.7–73.2% on test data from partially unseen surgical settings, with an average inference speed of 39.90 ± 1.2 frames per second. Whereas prior research had mostly evaluated surgical tool detection on homogeneous datasets with a limited number of tools, they demonstrated the feasibility of transfer learning and concluded that detectors that generalize reliably to new settings require data from several different surgical procedures.

In a case study, they applied the detector together with a microscope eye tracker to investigate tool use and eye-hand coordination during an intracranial vessel dissection task. The results show that tool kinematics differentiate microsurgical actions, and that gaze-to-microscissors distances are smaller during dissection than during other actions, when the surgeon has more space to maneuver. The presented detection pipeline provides the clinical and research communities with a valuable resource for automatic content extraction and objective skill assessment in various microsurgical environments 1).
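The gaze-to-instrument distance used in the case study can be pictured as a simple per-frame computation once a detector supplies labelled bounding boxes and the eye tracker supplies gaze coordinates in the same image frame. The sketch below is a minimal illustration under those assumptions; the function name, the data layout, and the use of the bounding-box centre as a proxy for the tool position are illustrative choices, not the authors' actual implementation.

```python
import numpy as np

def gaze_to_tool_distance(gaze_xy, detections, tool_class="microscissors"):
    """Euclidean distance (in pixels) from the gaze point to the nearest
    detected instance of a given tool class in one video frame.

    gaze_xy    : (x, y) gaze coordinates mapped into the video frame.
    detections : list of dicts like {"label": str, "box": (x1, y1, x2, y2)},
                 e.g. the per-frame output of an object detector.
    Returns None if the tool class is not detected in this frame.
    """
    gaze = np.asarray(gaze_xy, dtype=float)
    distances = []
    for det in detections:
        if det["label"] != tool_class:
            continue
        x1, y1, x2, y2 = det["box"]
        # Use the box centre as a simple proxy for the instrument position.
        center = np.array([(x1 + x2) / 2.0, (y1 + y2) / 2.0])
        distances.append(float(np.linalg.norm(gaze - center)))
    return min(distances) if distances else None


# Example with one synthetic frame: a gaze point and two hypothetical detections.
frame_detections = [
    {"label": "microscissors", "box": (480, 300, 560, 380)},
    {"label": "suction",       "box": (100, 120, 180, 220)},
]
print(gaze_to_tool_distance((500, 350), frame_detections))
```

Aggregating such per-frame distances over labelled action segments (dissection versus other actions) would yield the kind of comparison reported in the case study.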

1)
Koskinen J, Torkamani-Azar M, Hussein A, Huotarinen A, Bednarik R. Automated tool detection with deep learning for monitoring kinematics and eye-hand coordination in microsurgery. Comput Biol Med. 2022 Feb;141:105121. doi: 10.1016/j.compbiomed.2021.105121. Epub 2021 Dec 11. PMID: 34968859.