In [[microsurgery]], surgeons use [[micro instruments]] under high [[magnification]] to handle delicate [[tissue]]s. These [[procedure]]s require highly [[skill]]ed [[attention]]al and [[motor]] control for [[planning]] and implementing [[eye-hand coordination]] strategies. Eye-hand coordination in surgery has mostly been studied in open, laparoscopic, and robot-assisted procedures, because no [[tool]]s have been available for automatic instrument detection in microsurgery.

Koskinen et al. introduced and investigated a method for the simultaneous detection and processing of micro-instruments and gaze during microsurgery. They trained and evaluated a [[convolutional neural network]] for detecting 17 microsurgical tools on a dataset of 7,500 frames from 20 videos of simulated and real surgical procedures. Model evaluation yielded a mean average precision at the 0.5 IoU threshold (mAP@0.5) of 89.5-91.4% on validation data and 69.7-73.2% on test data from partially unseen surgical settings, with an average inference speed of 39.90 ± 1.2 frames per second. While prior research has mostly evaluated surgical tool detection on homogeneous datasets with a limited number of tools, they demonstrated the feasibility of transfer learning and concluded that detectors that generalize reliably to new settings require training data from several different surgical procedures.

In a case study, they applied the detector together with a [[microscope]] [[eye tracker]] to investigate tool use and eye-hand coordination during an intracranial [[vessel]] [[dissection]] [[task]]. The results show that tool [[kinematics]] differentiate microsurgical actions, and that gaze-to-microscissors distances are smaller during dissection than during other actions, when the surgeon has more space to maneuver. The presented detection pipeline provides the clinical and research communities with a valuable resource for automatic content extraction and objective skill assessment in various microsurgical environments ((Koskinen J, Torkamani-Azar M, Hussein A, Huotarinen A, Bednarik R. Automated tool detection with deep learning for monitoring kinematics and eye-hand coordination in microsurgery. Comput Biol Med. 2022 Feb;141:105121. doi: 10.1016/j.compbiomed.2021.105121. Epub 2021 Dec 11. PMID: 34968859.)).
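To make the reported accuracy metric concrete, the following is a minimal, illustrative Python sketch of the intersection-over-union (IoU) criterion that underlies mAP at the 0.5 threshold: a predicted bounding box counts as a correct detection only if its overlap with a same-class ground-truth box reaches 0.5. The boxes and values below are hypothetical examples, not taken from the study, and the sketch does not reproduce the authors' evaluation code.

<code python>
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# mAP@0.5 treats a predicted tool box as a true positive only when its IoU
# with a same-class ground-truth box is at least 0.5; precision is then
# averaged over recall levels and over the tool classes.
pred = (120, 80, 260, 200)  # hypothetical detector output for one tool
gt = (130, 90, 255, 210)    # hypothetical ground-truth annotation
print(f"IoU = {iou(pred, gt):.2f} -> counted as correct: {iou(pred, gt) >= 0.5}")
</code>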
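The case-study measures, tool kinematics and gaze-to-tool distance, can likewise be illustrated with a small hypothetical sketch that turns per-frame detections and synchronized gaze samples into a tool speed and a gaze-to-microscissors distance. The coordinates, frame rate, and function names are assumptions for illustration only and do not describe the authors' actual pipeline.

<code python>
import numpy as np

FPS = 25.0  # assumed video frame rate (Hz); the study's rate may differ

def box_centre(box):
    """Centre (x, y) of a bounding box given as (x1, y1, x2, y2) in pixels."""
    return ((box[0] + box[2]) / 2.0, (box[1] + box[3]) / 2.0)

def tool_speed(boxes, fps=FPS):
    """Frame-to-frame speed (pixels/s) of a tool, from its detected box centres."""
    centres = np.array([box_centre(b) for b in boxes], dtype=float)
    step = np.linalg.norm(np.diff(centres, axis=0), axis=1)  # pixels per frame
    return step * fps

def gaze_to_tool_distance(gaze_xy, box):
    """Euclidean distance (pixels) from a gaze sample to a tool's box centre."""
    cx, cy = box_centre(box)
    return float(np.hypot(gaze_xy[0] - cx, gaze_xy[1] - cy))

# Hypothetical per-frame microscissors detections and synchronous gaze samples.
scissor_boxes = [(480, 360, 560, 440), (484, 362, 564, 442), (492, 366, 572, 446)]
gaze_samples = [(512, 390), (508, 402), (530, 410)]

speeds = tool_speed(scissor_boxes)
distances = [gaze_to_tool_distance(g, b) for g, b in zip(gaze_samples, scissor_boxes)]
print(f"mean tool speed: {speeds.mean():.1f} px/s, "
      f"mean gaze-to-microscissors distance: {np.mean(distances):.1f} px")
</code>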