This virtual surgical simulator imports images acquired in a clinical setting with minimal user interaction and registers the anatomical structures they contain into a 3D rendering, delivering combined visual, haptic, and aural feedback. The research team's technology produces a three-dimensional display with realistic bone transparency, physics-based fluid simulation, and a constraint-based algorithm for bone-drill interaction haptics. By implementing and optimizing the graphics pipeline in CUDA and OpenGL, the simulator maintains real-time rendering of large data sets. Users can upload their own imaging data, and the software provides information for surgical planning and rehearsal. Current adaptations address cochlear implantation as well as nasal and sinus surgery.
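To illustrate the general technique behind real-time GPU rendering of patient scan data, the sketch below shows a minimal CUDA kernel that casts one orthographic ray per pixel through a CT volume and records the brightest sample (a maximum-intensity projection). This is a simplified stand-in, not the simulator's actual pipeline: the kernel name, volume dimensions, and flat-array layout are all illustrative assumptions, and a production renderer would add transfer functions, compositing for transparency, and OpenGL interop.

// Minimal sketch of GPU volume raycasting; all names and sizes are
// illustrative assumptions, not the simulator's implementation.
#include <cstdio>
#include <cuda_runtime.h>

// One thread per image pixel: march along z through the volume and keep
// the maximum sample, a simplified substitute for full transfer-function
// compositing with bone transparency.
__global__ void mipRender(const float* volume, float* image,
                          int nx, int ny, int nz)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= nx || y >= ny) return;

    float maxVal = 0.0f;
    for (int z = 0; z < nz; ++z) {
        float v = volume[(size_t)z * ny * nx + (size_t)y * nx + x];
        maxVal = fmaxf(maxVal, v);
    }
    image[(size_t)y * nx + x] = maxVal;
}

int main()
{
    const int nx = 256, ny = 256, nz = 256;  // hypothetical CT volume size
    size_t volBytes = (size_t)nx * ny * nz * sizeof(float);
    size_t imgBytes = (size_t)nx * ny * sizeof(float);

    float *dVol, *dImg;
    cudaMalloc(&dVol, volBytes);
    cudaMalloc(&dImg, imgBytes);
    cudaMemset(dVol, 0, volBytes);           // stand-in for uploaded scan data

    dim3 block(16, 16);
    dim3 grid((nx + block.x - 1) / block.x, (ny + block.y - 1) / block.y);
    mipRender<<<grid, block>>>(dVol, dImg, nx, ny, nz);
    cudaDeviceSynchronize();
    printf("rendered %dx%d projection\n", nx, ny);

    cudaFree(dVol);
    cudaFree(dImg);
    return 0;
}

Launching one lightweight thread per pixel in this way is what lets the GPU keep large volumes interactive: each ray is independent, so the workload parallelizes across thousands of CUDA cores with no synchronization inside the render pass.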
2021-022
Corris, Andrew
Andrew.Corris@nationwidechildrens.org
614-355-1604