An "augmented-reality" aid for plastic and reconstructive surgeons

Stud Health Technol Inform. 1997;39:232-6.

Abstract

Starting from MR and CT images of a given patient, a single image representation of all parameters has been generated using false-color techniques in a standard UNIX and X11 environment. A transformation linking the MR and CT parameters to the RGB (red, green, blue) color components has been used. Moreover, an unsupervised segmentation method based on a number of neural and fuzzy models can directly produce segmented image volumes. Each image of the various sequences has been displayed interactively using a specifically designed application. The resulting images have been displayed on a stereo monitor, allowing three-dimensional rendering of the visual data through LCD shutter glasses. In addition, a 3-D control system based on low-frequency magnetic fields has been used, with a handheld Polhemus stylus serving as an electronic knife for dissecting the 3-D data set and defining flaps and grafts. Bone or soft-tissue contours can be analyzed, and sections can be removed from the model to reveal the underlying structures. Flaps and grafts obtained with these techniques can be fitted exactly, without repeated removal and recarving. Nuances of depth, tapering, and arc are carved directly into the bone, and the chance of asymmetry is markedly diminished. Anesthesia time is also reduced through more efficient use of operative time, which usually offsets the increased cost of imaging.
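As an informal illustration of the image-fusion and segmentation steps described above, the following Python sketch maps a coregistered MR/CT slice pair into a single false-color RGB image and then clusters the resulting per-pixel color vectors with a plain fuzzy c-means loop. The channel assignment, the cluster count, and the synthetic input arrays are assumptions for illustration only; the abstract does not disclose the paper's actual MR/CT-to-RGB transformation or its specific neural and fuzzy models.

    # Minimal sketch, assuming coregistered MR and CT slices of equal size.
    # The channel assignment and the fuzzy c-means clustering below are
    # illustrative stand-ins, not the paper's actual transformation or models.
    import numpy as np

    def normalize(img):
        """Rescale an image to the [0, 1] range."""
        img = img.astype(np.float64)
        lo, hi = img.min(), img.max()
        return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)

    def false_color_fusion(mr_slice, ct_slice):
        """Fuse MR and CT into one RGB image: MR drives the red channel,
        CT the green channel, and their mean the blue channel."""
        mr, ct = normalize(mr_slice), normalize(ct_slice)
        return np.stack([mr, ct, 0.5 * (mr + ct)], axis=-1)

    def fuzzy_cmeans(X, n_clusters=3, m=2.0, n_iter=100, eps=1e-5, seed=0):
        """Unsupervised fuzzy c-means over the rows of X (n_samples, n_features).
        Returns the membership matrix (n_clusters, n_samples) and the centers."""
        rng = np.random.default_rng(seed)
        u = rng.random((n_clusters, X.shape[0]))
        u /= u.sum(axis=0, keepdims=True)              # memberships sum to 1
        for _ in range(n_iter):
            um = u ** m
            centers = um @ X / um.sum(axis=1, keepdims=True)
            dist = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2)
            dist = np.fmax(dist, 1e-12)                # avoid division by zero
            new_u = dist ** (-2.0 / (m - 1.0))
            new_u /= new_u.sum(axis=0, keepdims=True)
            if np.abs(new_u - u).max() < eps:
                return new_u, centers
            u = new_u
        return u, centers

    # Synthetic 256x256 slices stand in for a real coregistered patient study.
    rng = np.random.default_rng(1)
    mr, ct = rng.random((256, 256)), rng.random((256, 256))
    fused = false_color_fusion(mr, ct)                 # (256, 256, 3) in [0, 1]
    memberships, _ = fuzzy_cmeans(fused.reshape(-1, 3))
    labels = memberships.argmax(axis=0).reshape(256, 256)
    print(fused.shape, labels.shape)                   # (256, 256, 3) (256, 256)

In a real workflow the synthetic arrays would be replaced by spatially registered MR and CT volumes, and the resulting per-pixel labels would feed the stereo 3-D rendering and interactive dissection stages described in the abstract.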

MeSH terms

  • Artificial Intelligence
  • Computer Simulation
  • Humans
  • Image Processing, Computer-Assisted / methods*
  • Surgery, Plastic / methods*
  • User-Computer Interface*