Stereoscopic augmented reality for single camera endoscopy: a virtual study

Yen Yu Wang, Atul Kumar, Kai Che Liu, Shih Wei Huang, Ching Chun Huang, Wei Chia Su, Fu Li Hsiao, Wen Nung Lie

Research output: Contribution to journal › Article

Abstract

Endoscopic surgery causes less tissue injury than open surgical techniques, thus promoting more rapid recuperation and reduced post-operative pain. Endoscopy, however, allows the surgeon to visualise only the anatomical surface of the surgical site, with a relatively narrow field of view. Moreover, the 2D video captured by a conventional endoscope does not provide depth perception of the surgical scene. In this study, these limitations were addressed with the development of an augmented reality (AR) system with stereoscopic visualisation. A phantom and its 3D CT model were used, respectively, to form the real and virtual parts of the AR scene. The virtual-environment camera pose was tracked using algorithms for image feature detection, feature matching and Perspective-n-Point applied to the endoscopic image and the image rendered from the 3D virtual model. The endoscope video frame and the virtual model-rendered image were superimposed to form the AR composite view. The depth buffer (z-buffer) of the rendering window was further used to generate a stereo pair of the AR image. The AR system produced a stereo composite view with well-aligned real and virtual components. The RMS error of the registration between real and virtual image contours was 9.6 ± 6.7 mm. Correlation coefficients between the depth map from the z-buffer and that from a depth camera ranged from 0.60 to 0.96 (p < 0.05). The AR system requires further improvement to be applicable at higher endoscope frame rates, and it needs to incorporate motion and deformation models before being applied to animals or patients.
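The abstract describes a pipeline of feature detection, feature matching, Perspective-n-Point (PnP) pose estimation and z-buffer-based stereo synthesis. The sketch below is a minimal illustration of that kind of pipeline, not the authors' implementation: it assumes OpenCV's ORB detector with brute-force Hamming matching (the abstract does not name the detector or matcher), a known intrinsic matrix K shared by the endoscope and the virtual camera, a z-buffer already converted to metric depth, and an arbitrary 4 mm stereo baseline; the function names and the naive pixel-shifting stereo synthesis are hypothetical.

```python
import cv2
import numpy as np

# Minimal sketch only. Assumptions (not stated in the paper): ORB features,
# a 3x3 intrinsic matrix K valid for both cameras, a z-buffer in metric
# depth units, and a 4 mm stereo baseline.

def estimate_camera_pose(endo_frame, rendered_img, zbuffer, K):
    """Track the virtual-camera pose: match features between the endoscope frame
    and the rendered virtual view, back-project rendered-view features to 3D
    using the z-buffer, then solve Perspective-n-Point."""
    gray_e = cv2.cvtColor(endo_frame, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(rendered_img, cv2.COLOR_BGR2GRAY)

    orb = cv2.ORB_create(nfeatures=2000)
    kp_e, des_e = orb.detectAndCompute(gray_e, None)
    kp_r, des_r = orb.detectAndCompute(gray_r, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_e, des_r)

    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    obj_pts, img_pts = [], []
    for m in matches:
        u, v = kp_r[m.trainIdx].pt                 # feature in the rendered view
        z = zbuffer[int(round(v)), int(round(u))]  # metric depth at that pixel
        if z <= 0:                                 # skip background pixels
            continue
        obj_pts.append([(u - cx) * z / fx, (v - cy) * z / fy, z])
        img_pts.append(kp_e[m.queryIdx].pt)        # matching endoscope pixel

    if len(obj_pts) < 4:                           # PnP needs at least 4 points
        return False, None, None
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.asarray(obj_pts, dtype=np.float64),
        np.asarray(img_pts, dtype=np.float64), K, None)
    return ok, rvec, tvec


def make_stereo_pair(ar_view, depth, K, baseline=0.004):
    """Synthesise a right-eye image from the AR composite view and its z-buffer
    depth map by depth-dependent horizontal pixel shifting (naive DIBR;
    occlusions and holes are not handled)."""
    h, w = depth.shape
    fx = K[0, 0]
    disparity = np.zeros_like(depth, dtype=np.float64)
    np.divide(fx * baseline, depth, out=disparity, where=depth > 0)

    right = np.zeros_like(ar_view)
    cols = np.arange(w)
    for v in range(h):
        shifted = np.clip((cols - disparity[v]).round().astype(int), 0, w - 1)
        right[v, shifted] = ar_view[v, cols]
    return ar_view, right                          # (left, right) AR images
```

On top of such a step, the superimposition of the endoscope frame with the rendered view and the reported contour RMS error and depth-map correlations would be evaluated; the authors' actual choices of detector, matcher and stereo-synthesis method are not specified in the abstract.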

Original language: English
Pages (from-to): 182-191
Number of pages: 10
Journal: Computer Methods in Biomechanics and Biomedical Engineering: Imaging and Visualization
Volume: 6
Issue number: 2
DOIs: 10.1080/21681163.2016.1197798
Publication status: Published - 2018 Mar 4

Fingerprint

Endoscopy
Endoscopes
Augmented reality
Buffers
Cameras
Depth Perception
Pain
Image acquisition
Wounds and Injuries
Composite materials
Surgery
Virtual reality
Animals
Visualization
Tissue

All Science Journal Classification (ASJC) codes

  • Computational Mechanics
  • Biomedical Engineering
  • Radiology, Nuclear Medicine and Imaging
  • Computer Science Applications

Cite this

Wang, Yen Yu ; Kumar, Atul ; Liu, Kai Che ; Huang, Shih Wei ; Huang, Ching Chun ; Su, Wei Chia ; Hsiao, Fu Li ; Lie, Wen Nung. / Stereoscopic augmented reality for single camera endoscopy: a virtual study. In: Computer Methods in Biomechanics and Biomedical Engineering: Imaging and Visualization. 2018 ; Vol. 6, No. 2. pp. 182-191.
@article{1123a857e3474d60a10981438b09fdbb,
title = "Stereoscopic augmented reality for single camera endoscopy: a virtual study",
abstract = "Endoscopic surgery causes less tissue injury than open surgical techniques, thus promoting more rapid recuperation and reduced post-operative pain. Endoscopy, however, allows the surgeon to visualise only the anatomical surface of the surgical site, with a relatively narrow field of view. Moreover, the 2D video captured by a conventional endoscope does not provide depth perception of the surgical scene. In this study, these limitations were addressed with the development of an augmented reality (AR) system with stereoscopic visualisation. A phantom and its 3D CT model were used, respectively, to form the real and virtual parts of the AR scene. The virtual-environment camera pose was tracked using algorithms for image feature detection, feature matching and Perspective-n-Point applied to the endoscopic image and the image rendered from the 3D virtual model. The endoscope video frame and the virtual model-rendered image were superimposed to form the AR composite view. The depth buffer (z-buffer) of the rendering window was further used to generate a stereo pair of the AR image. The AR system produced a stereo composite view with well-aligned real and virtual components. The RMS error of the registration between real and virtual image contours was 9.6 ± 6.7 mm. Correlation coefficients between the depth map from the z-buffer and that from a depth camera ranged from 0.60 to 0.96 (p < 0.05). The AR system requires further improvement to be applicable at higher endoscope frame rates, and it needs to incorporate motion and deformation models before being applied to animals or patients.",
author = "Wang, {Yen Yu} and Atul Kumar and Liu, {Kai Che} and Huang, {Shih Wei} and Huang, {Ching Chun} and Su, {Wei Chia} and Hsiao, {Fu Li} and Lie, {Wen Nung}",
year = "2018",
month = "3",
day = "4",
doi = "10.1080/21681163.2016.1197798",
language = "English",
volume = "6",
pages = "182--191",
journal = "Computer Methods in Biomechanics and Biomedical Engineering: Imaging and Visualization",
issn = "2168-1163",
publisher = "Taylor and Francis Ltd.",
number = "2",

}

Stereoscopic augmented reality for single camera endoscopy: a virtual study. / Wang, Yen Yu; Kumar, Atul; Liu, Kai Che; Huang, Shih Wei; Huang, Ching Chun; Su, Wei Chia; Hsiao, Fu Li; Lie, Wen Nung.

In: Computer Methods in Biomechanics and Biomedical Engineering: Imaging and Visualization, Vol. 6, No. 2, 04.03.2018, p. 182-191.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Stereoscopic augmented reality for single camera endoscopy

T2 - a virtual study

AU - Wang, Yen Yu

AU - Kumar, Atul

AU - Liu, Kai Che

AU - Huang, Shih Wei

AU - Huang, Ching Chun

AU - Su, Wei Chia

AU - Hsiao, Fu Li

AU - Lie, Wen Nung

PY - 2018/3/4

Y1 - 2018/3/4

N2 - Endoscopic surgery causes less tissue injury than open surgical techniques, thus promoting more rapid recuperation and reduced post-operative pain. Endoscopy, however, allows the surgeon to visualise only the anatomical surface of the surgical site, with a relatively narrow field of view. Moreover, the 2D video captured by a conventional endoscope does not provide depth perception of the surgical scene. In this study, these limitations were addressed with the development of an augmented reality (AR) system with stereoscopic visualisation. A phantom and its 3D CT model were used, respectively, to form the real and virtual parts of the AR scene. The virtual-environment camera pose was tracked using algorithms for image feature detection, feature matching and Perspective-n-Point applied to the endoscopic image and the image rendered from the 3D virtual model. The endoscope video frame and the virtual model-rendered image were superimposed to form the AR composite view. The depth buffer (z-buffer) of the rendering window was further used to generate a stereo pair of the AR image. The AR system produced a stereo composite view with well-aligned real and virtual components. The RMS error of the registration between real and virtual image contours was 9.6 ± 6.7 mm. Correlation coefficients between the depth map from the z-buffer and that from a depth camera ranged from 0.60 to 0.96 (p < 0.05). The AR system requires further improvement to be applicable at higher endoscope frame rates, and it needs to incorporate motion and deformation models before being applied to animals or patients.

AB - Endoscopic surgery causes less tissue injury than open surgical techniques, thus promoting more rapid recuperation and reduced post-operative pain. Endoscopy, however, allows the surgeon to visualise only the anatomical surface of the surgical site, with a relatively narrow field of view. Moreover, the 2D video captured by a conventional endoscope does not provide depth perception of the surgical scene. In this study, these limitations were addressed with the development of an augmented reality (AR) system with stereoscopic visualisation. A phantom and its 3D CT model were used, respectively, to form the real and virtual parts of the AR scene. The virtual-environment camera pose was tracked using algorithms for image feature detection, feature matching and Perspective-n-Point applied to the endoscopic image and the image rendered from the 3D virtual model. The endoscope video frame and the virtual model-rendered image were superimposed to form the AR composite view. The depth buffer (z-buffer) of the rendering window was further used to generate a stereo pair of the AR image. The AR system produced a stereo composite view with well-aligned real and virtual components. The RMS error of the registration between real and virtual image contours was 9.6 ± 6.7 mm. Correlation coefficients between the depth map from the z-buffer and that from a depth camera ranged from 0.60 to 0.96 (p < 0.05). The AR system requires further improvement to be applicable at higher endoscope frame rates, and it needs to incorporate motion and deformation models before being applied to animals or patients.

UR - http://www.scopus.com/inward/record.url?scp=85006222299&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85006222299&partnerID=8YFLogxK

U2 - 10.1080/21681163.2016.1197798

DO - 10.1080/21681163.2016.1197798

M3 - Article

AN - SCOPUS:85006222299

VL - 6

SP - 182

EP - 191

JO - Computer Methods in Biomechanics and Biomedical Engineering: Imaging and Visualization

JF - Computer Methods in Biomechanics and Biomedical Engineering: Imaging and Visualization

SN - 2168-1163

IS - 2

ER -