Development of a sensor fusion method for AR-based surgical navigation
Current surgical navigation systems are mostly limited to displaying their results on external monitors near the patient, forcing the surgeon to repeatedly switch attention between the displayed planning and the surgical site. Augmented reality (AR) devices mitigate this problem by displaying the planning directly in the surgeon's field of view. The goal of this project was to demonstrate and evaluate the feasibility of using AR for surgical navigation. The work focused on the implementation of a visual-marker registration and pose estimation algorithm and on the development of an inertial-optical sensor fusion approach.

Materials
The work utilizes the Moverio BT-35E (EPSON) and the visual marker DENAMARK (mininavident AG). The BT-35E has a built-in camera and an inertial measurement unit (IMU). The proposed system is implemented in C++ on a Windows computer, which is connected to the AR device.

Methods
Figure 1: (a) visual marker DENAMARK and (b) AR device used: Moverio BT-35E.
The overall software concept is given in…