Robot Assisted Laparoscopic Hysterectomy

Sample phase segmentation of hysterectomy by a human annotator

Aims: Develop methods that automate segmentation of a surgical procedure into phases and assessment of technical skill within each phase

Description: This project aims to develop methods that can automatically 1) segment the surgical procedure into its constituent steps (phases), and 2) assess surgical skill within each step. We have collected a dataset of 30 robot-assisted laparoscopic hysterectomy procedures performed using the da Vinci system. The dataset contains endoscopic stereo video, instrument and console motion, and user events on the system.
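One of the signals described above, system events, can support phase detection with very simple logic (see publication 4 below). As a minimal sketch, assuming a hypothetical mapping from event types to phases (the real features and labels are described in the cited paper), consecutive events can be collapsed into labeled phase segments:

```python
from itertools import groupby

# Hypothetical mapping from da Vinci system events to procedure phases;
# the actual event features and phase labels are described in
# Malpani et al., Int J CARS 2016.
EVENT_TO_PHASE = {
    "camera_move": "exposure",
    "energy_on": "dissection",
    "energy_off": "dissection",
    "needle_driver_swap": "suturing",
}

def segment_into_phases(events):
    """Collapse a timestamped event stream into (phase, start, end) segments.

    `events` is a list of (timestamp, event_name) pairs sorted by time.
    Events with no known phase inherit the previous phase (carry-forward).
    """
    labeled = []
    current = None
    for t, name in events:
        phase = EVENT_TO_PHASE.get(name, current)
        current = phase
        if phase is not None:
            labeled.append((t, phase))
    # Merge runs of identical labels into phase segments.
    segments = []
    for phase, group in groupby(labeled, key=lambda x: x[1]):
        group = list(group)
        segments.append((phase, group[0][0], group[-1][0]))
    return segments
```

This is only an illustration of the segmentation output format (phase label with start and end times); the published method learns the event-to-phase relationship rather than hard-coding it.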

Funding: Link Foundation Fellowship, NSF NRI Award 1227277

People: Anand Malpani, S. Swaroop Vedula, Chi Chiung Grace Chen, Gregory D. Hager


  1. Berges AJ, Vedula SS, Chen CCG, Malpani A. Is There a Relationship Between Warm-Up Virtual Reality Simulation and Trainee Robot-Assisted Laparoscopic Hysterectomy Performance? Journal of Minimally Invasive Gynecology. 2018 Nov 1;25(7):S92. (under preparation)
  2. Malpani A, Martinez N, Vedula SS, Hager GD, Chen CCG. Automated Skill Classification using Time and Motion Efficiency Metrics in Vaginal Cuff Closure. Society of Gynecologic Surgeons, 44th Annual Scientific Meeting. Orlando, FL; 2018. (under revision)
  3. Malpani A, Arora S, Vedula SS, Chen CCG, Hager GD. Crowdsourcing surgical activity summaries for phase recognition. LABELS: International Workshop Large-Scale Annotation of Biomedical Data and Expert Labels Synthesis. Granada, Spain; 2018.
  4. Malpani A, Lea C, Chen CCG, Hager GD. System events: readily accessible features for surgical phase detection. Int J CARS. 2016 May 13;11(6):1201–1209.
  5. Chen CCG, Tanner E, Malpani A, Vedula SS, Fader AN, Scheib SA, Green IC, Hager GD. Warm-up before robotic hysterectomy does not improve trainee operative performance: a randomized trial. J Minim Invasive Gynecol. 2015 Nov;22(6, Supplement):S34.

Cataract Surgery

Aims: Develop technology to enable an automated co-pilot for eye surgeons.

Summary: This project aims to develop technology to enable a system that can function as an automated co-pilot for surgeons, using cataract surgery as a testbed procedure. Given video of the surgical field during a procedure, this system segments the procedure into constituent activities, assigns a skill rating to each activity, and provides commensurate feedback to the surgeon. The project relies upon multiple modes of data, including video images and verbal and textual surgical narratives.
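Video-based phase recognition (see publication 2 below) typically produces a noisy per-frame prediction that benefits from temporal smoothing before segment boundaries are reported. As a small sketch, assuming per-frame phase labels have already been produced by some classifier (the trained models in the cited work are not reproduced here), a majority-vote filter removes spurious single-frame flips:

```python
from collections import Counter

def smooth_predictions(frame_labels, window=5):
    """Majority-vote temporal smoothing of per-frame phase predictions.

    Each frame's label is replaced by the most common label within a
    centered window, which suppresses isolated misclassifications.
    """
    half = window // 2
    smoothed = []
    for i in range(len(frame_labels)):
        lo = max(0, i - half)
        hi = min(len(frame_labels), i + half + 1)
        smoothed.append(Counter(frame_labels[lo:hi]).most_common(1)[0][0])
    return smoothed
```

The window size here is arbitrary; in practice it would be tuned against annotated phase boundaries.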

Funding: The Wilmer Eye Institute Pooled Professors Fund 2016 (PI: Dr. Shameema Sikder), and an unrestricted research grant to The Wilmer Eye Institute from Research to Prevent Blindness

People: Shameema Sikder, S. Swaroop Vedula, Gregory D. Hager, Anand Malpani, Sidra Zafar, Tae Soo Kim


  1. Kim TS, O’Brien M, Zafar S, Hager GD, Sikder S, Vedula SS. Objective assessment of intraoperative technical skill in capsulorhexis using videos of cataract surgery. Int J CARS [Internet]. 2019 Apr 11 [cited 2019 Apr 19].
  2. Yu F, Silva Croso G, Kim TS, Song Z, Parker F, Hager GD, Reiter A, Vedula SS, Ali H, Sikder S. Assessment of Automated Identification of Phases in Videos of Cataract Surgery Using Machine Learning and Deep Learning Techniques. JAMA Netw Open [Internet]. 2019 Apr 5 [cited 2019 Dec 17];2(4). PMCID: PMC6450320
  3. Kim TS, Malpani A, Reiter A, Hager GD, Sikder S, Swaroop Vedula S. Crowdsourcing Annotation of Surgical Instruments in Videos of Cataract Surgery. In: Stoyanov D, Taylor Z, Balocco S, Sznitman R, Martel A, Maier-Hein L, Duong L, Zahnd G, Demirci S, Albarqouni S, Lee S-L, Moriconi S, Cheplygina V, Mateus D, Trucco E, Granger E, Jannin P, editors. Intravascular Imaging and Computer Assisted Stenting and Large-Scale Annotation of Biomedical Data and Expert Label Synthesis. Granada, Spain: Springer International Publishing; 2018. p. 121–130.

Nasal Septoplasty

Aims: Develop and validate algorithms for objective assessment of intraoperative technical skill and competency in nasal septoplasty.

Summary: This project involves three core research directions: first, to develop novel techniques to capture instrument motion data in nasal septoplasty; second, to develop and validate algorithms based on machine learning and deep learning techniques to automate assessment of surgical technical skill and competency; and third, to determine the association between intraoperative technical skill and patient outcomes. To date, research in this project has led to multiple algorithms for technical skill assessment, evidence that the technical complexity of a procedure is associated with surgeons’ perception of surgical success, and techniques to model surgical anatomy using instrument motion data, among other discoveries. Ongoing work addresses the impact of surgical anatomy on the complexity of the procedure and on intraoperative technical skill. This project relies upon surgical instrument motion data, structured surveys of surgeons, and CT images.
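Skill assessment from instrument motion data (see publication 2 below) typically begins with time-and-motion efficiency metrics such as total duration and path length. As a minimal sketch, assuming trajectories arrive as (t, x, y, z) samples and using hypothetical fixed thresholds (the published work trains classifiers on such metrics rather than using cutoffs), the idea looks like this:

```python
import math

def motion_metrics(trajectory):
    """Compute simple time-and-motion efficiency metrics from an
    instrument trajectory given as (t, x, y, z) samples sorted by time."""
    duration = trajectory[-1][0] - trajectory[0][0]
    # Path length: sum of Euclidean distances between consecutive positions.
    path_length = sum(
        math.dist(a[1:], b[1:]) for a, b in zip(trajectory, trajectory[1:])
    )
    return {"duration": duration, "path_length": path_length}

def classify_skill(metrics, max_duration=300.0, max_path=5.0):
    """Toy rule-based rating: 'expert' when both efficiency metrics fall
    below (hypothetical) thresholds, else 'novice'."""
    if metrics["duration"] <= max_duration and metrics["path_length"] <= max_path:
        return "expert"
    return "novice"
```

In the actual studies, metrics like these feed a learned classifier validated against expert ratings; the thresholds above are placeholders for illustration only.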

Funding: NIH R01 DE025265 (PI: Dr. Masaru Ishii) and NIH R21 DE022656 (PI: Dr. Masaru Ishii)

People: Masaru Ishii, S. Swaroop Vedula, Molly O’Brien, Anand Malpani, Narges Ahmidi, Lisa Ishii, Hajira Naz, Gregory D. Hager


  1. Tseng YW, Vedula SS, Malpani A, Ahmidi N, Boahene KDO, Papel ID, Kontis TC, Maxwell J, Wanamaker JR, Byrne PJ, Malekzadeh S, Hager GD, Ishii LE, Ishii M. Association Between Surgical Trainee Daytime Sleepiness and Intraoperative Technical Skill When Performing Septoplasty. JAMA Facial Plast Surg [Internet]. 2018 Oct 11 [cited 2018 Nov 15].
  2. Ahmidi N, Poddar P, Jones JD, Vedula SS, Ishii L, Hager GD, Ishii M. Automated objective surgical skill assessment in the operating room from unstructured tool motion in septoplasty. Int J CARS. 2015 Apr 17;1–11.