Combining computer graphics, computer vision and virtual reality technology to surgical interventions

CASSPAR: Computer Assisted Surgery, Simulation and Planning using Augmented Reality

CASSPAR, an EPSRC (Engineering and Physical Sciences Research Council) funded project (10/2002-03/2006) in collaboration with the Department of Medical Physics and Bioengineering at University College London (UCL), aimed to apply computer graphics, computer vision and virtual reality technology to surgical interventions. On this page, we show some results and deliverables which originated from this project.

3D Visualisation: the 3DView software

The 3D visualisation of anatomical structures of interest is an important part of any surgical navigation or simulation application. In the early stages of the project we developed the 3DView software, which allows us to render 3D volumes from X-ray Computed Tomography (CT), Magnetic Resonance (MR) images or any other image modality stored in DICOM or raw format. Cryosections in RGB format can be rendered as well. Aside from the standard 3D volume rendered view and the three orthogonal views (sagittal, transverse and coronal), additional functionality includes maximum intensity projection (MIP), multi-planar reformatting (MPR), clip planes, transfer functions (RGBA), polygonisation for surface rendering and FEA, stereoscopic views (anaglyph and virtual window) and watershed segmentation. The images below show snapshots of two different 3DView modes: on the left a MIP of a CT head and on the right the segmented skin of the same head with an additional anaglyph stereo window. A demo version of 3DView can be downloaded here.
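The MIP mode mentioned above reduces a 3D volume to a 2D image by keeping, for each pixel, the brightest voxel encountered along the viewing direction. A minimal sketch of the idea using NumPy, for an axis-aligned viewing direction (the function name and the synthetic volume are illustrative, not part of 3DView's API):

```python
import numpy as np

def max_intensity_projection(volume: np.ndarray, axis: int = 0) -> np.ndarray:
    """Collapse a 3D intensity volume to a 2D image by taking the
    maximum voxel value along the chosen (viewing) axis."""
    return volume.max(axis=axis)

# Tiny synthetic CT-like volume, indexed (slice, row, column).
vol = np.zeros((4, 3, 3), dtype=np.float32)
vol[2, 1, 1] = 1000.0  # one bright "bone" voxel

# Project along the slice axis: the bright voxel dominates its ray.
mip = max_intensity_projection(vol, axis=0)
# mip is a 3x3 image whose centre pixel holds the value 1000.0
```

In a full renderer the projection is taken along arbitrary view rays rather than a grid axis, but the per-ray maximum operation is the same.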

[1] Maximum Intensity Projection (MIP) of a CT head
[2] Segmented skin surface of the same head with an additional anaglyph stereo window

Surgical Navigation with Augmented Reality: ARView

Surgical navigation (SN) systems provide a surgeon with global position information of a surgical tool inserted in the patient's body. This is typically useful in minimally invasive (aka keyhole) surgery, where endoscopes inserted in the body show local images but their global position and orientation is uncertain. Commercial SN systems are already used in many hospitals across the world, though not quite routinely, due to their high cost and insufficient accuracy for many critical surgical interventions. Augmented Reality (AR) draws heavily on the standard Virtual Reality (VR) concept and technology. It differs from the latter in that it 'augments' the real world with virtual objects and subjects, as opposed to generating an entirely virtual world. AR in combination with SN provides the surgeon with so-called X-ray vision. Virtual images of the patient, typically drawn from CT or MR images, are overlaid with intra-operatively obtained images from optical devices such as the surgical microscope or an endoscope. This allows the surgeon to see structures lying underneath the currently observed anatomy.

Three operations are needed to arrive at an AR overlay: calibration of the optical device (camera), registration of the pre-operative image data to the patient, and real-time tracking of the device's position and orientation.

The ARView software is an extended version of 3DView, which performs the above three operations and can be plugged into any existing surgical navigation system. A recent paper on this topic was published in the International Journal of Medical Robotics and Computer Assisted Surgery (IJMRCAS).
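Once the camera is calibrated and the virtual data registered to the patient, the overlay itself amounts to projecting registered 3D points into the camera image. A minimal sketch of this final step, assuming a pinhole camera model; the intrinsic matrix K and the rigid pose (R, t) shown here are hypothetical values standing in for the output of a real calibration and registration procedure:

```python
import numpy as np

def project_point(K: np.ndarray, R: np.ndarray, t: np.ndarray,
                  p_world: np.ndarray) -> np.ndarray:
    """Project a 3D point (patient/world coordinates) into pixel
    coordinates via a rigid pose (R, t) and pinhole intrinsics K."""
    p_cam = R @ p_world + t          # world -> camera coordinates
    uvw = K @ p_cam                  # camera -> homogeneous image coords
    return uvw[:2] / uvw[2]          # perspective division -> pixels

# Hypothetical intrinsics: 800-pixel focal length, principal point (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                        # hypothetical registration rotation
t = np.array([0.0, 0.0, 100.0])      # hypothetical translation (mm)

# A point on the camera's optical axis projects to the principal point.
uv = project_point(K, R, t, np.array([0.0, 0.0, 0.0]))
# uv -> (320.0, 240.0)
```

Drawing the projected virtual structures over the live endoscope frame, with the pose updated every frame by the tracker, yields the overlay described above.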

Also have a look at this video (avi 41 MB) showing the complete overlay process for a rigid endoscope in a lab setting.

Research Team:

Dr. Rudy Lapeer and Dr. Min Si Chen.