Computer-Assisted Medical Interventions (CAMI) Assistant

Description

The chair focuses on the use of Artificial Intelligence for developing computer assistance to medical and surgical interventions. This includes AI-based image processing and calibration, AI-based simulation, and the extraction of knowledge from intervention tracks in order to model intervention quality. Context-aware assistance and more autonomous assisting devices (including robots) are also within the chair's scope.
The chair operates in close collaboration with clinical teams and industrial partners.

Activities

Work to date has focused on deep learning for the automatic calibration of interventional CBCT imaging systems. Learning-based methods are also being studied for image processing and fusion, in order to enable real-time guidance of diagnostic or treatment procedures such as prostate biopsies and neurosurgery. In both cases, ultrasound imaging plays a predominant intraoperative role and raises major image-processing challenges. In addition, we are developing new robotic approaches based on continuum robots, with a medium-term objective of endoluminal use.
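For readers unfamiliar with geometric calibration of projective imaging systems such as CBCT: the classical baseline that learning-based methods aim to improve on is the Direct Linear Transform (DLT), which recovers a 3x4 projection matrix from known 3D-2D point correspondences. The sketch below is a toy numpy illustration of that baseline on noiseless synthetic data, not the chair's actual deep-learning pipeline; all variable names are illustrative.

```python
import numpy as np

def dlt_projection_matrix(pts3d, pts2d):
    """Estimate a 3x4 projection matrix P from 3D-2D correspondences (DLT)."""
    # Each correspondence contributes two rows of the homogeneous system A p = 0.
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(A, dtype=float)
    # The least-squares solution is the right singular vector associated
    # with the smallest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)

# Synthetic check: project points with a known matrix, then recover it.
rng = np.random.default_rng(0)
P_true = np.array([[800.0, 0.0, 320.0, 10.0],
                   [0.0, 800.0, 240.0, 5.0],
                   [0.0, 0.0, 1.0, 2.0]])
pts3d = rng.uniform(-1, 1, size=(12, 3))
homog = np.hstack([pts3d, np.ones((12, 1))])
proj = homog @ P_true.T
pts2d = proj[:, :2] / proj[:, 2:3]          # perspective division
P_est = dlt_projection_matrix(pts3d, pts2d)
P_est *= P_true[2, 3] / P_est[2, 3]         # DLT is defined up to scale
```

In the noiseless case the matrix is recovered exactly up to a global scale factor; learning-based calibration becomes interesting precisely when such clean correspondences are unavailable or the geometry drifts during the intervention.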

Chair events

Structuring activities on AI at the national level with CAMI partners. Working group on AI for Health with TIMC and CIC/CHU Grenoble Alpes.

Scientific publications

Journals

  • Calka, M., Perrier, P., Ohayon, J., Grivot-Boichon, C., Rochette, M., & Payan, Y. (2020). Machine-Learning based model order reduction of a biomechanical model of the human tongue. Computer Methods and Programs in Biomedicine, 105786.
    https://hal.archives-ouvertes.fr/hal-02966539/
     
  • Caggiari, S., Worsley, P. R., Payan, Y., Bucki, M., & Bader, D. L. (2020). Biomechanical monitoring and machine learning for the detection of lying postures. Clinical Biomechanics, 105181.
    https://arxiv.org/ftp/arxiv/papers/2010/2010.03804.pdf
     
  • Derathé, A., Reche, F., Moreau-Gaudry, A., Jannin, P., Gibaud, B., & Voros, S. (2020). Predicting the quality of surgical exposure using spatial and procedural features from laparoscopic videos. International Journal of Computer Assisted Radiology and Surgery, 15(1), 59-67. https://hal.archives-ouvertes.fr/hal-02353077
     
  • Carton, F. X., Chabanas, M., Le Lann, F., & Noble, J. H. (2020). Automatic segmentation of brain tumor resections in intraoperative ultrasound images using U-Net. Journal of Medical Imaging, 7(3), 031503. doi: 10.1117/1.JMI.7.3.031503

Conferences

  • Dupuy, T., Beitone, C., Troccaz, J., & Voros, S. (2021). 2D/3D deep registration for real-time prostate biopsy navigation. Accepted in SPIE Medical Imaging 2021: Image-Guided Procedures, Robotic Interventions, and Modeling.
     
  • Carton, F. X., Chabanas, M., Munkvold, B. K., Reinertsen, I., & Noble, J. H. (2020). Automatic segmentation of brain tumor in intraoperative ultrasound images using 3D U-Net. In SPIE Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions, and Modeling (Vol. 11315, p. 113150S), Houston, March 2020. doi: 10.1117/12.2549516
     
  • Lapouge, G., Younes, H., Poignet, P., Voros, S., & Troccaz, J. (2019). Needle Segmentation in 3D Ultrasound Volumes Based on Machine Learning for Needle Steering. Hamlyn Symposium on Medical Robotics, London, June 2019. https://hal.archives-ouvertes.fr/hal-02276986