A navigation method for targeted prostate biopsy based on MRI-TRUS fusion

Jisu Hu, Qi Ma, Xusheng Qian, Zhiyong Zhou, Yakang Dai

Article ID: 2052
Vol 3, Issue 1, 2022
DOI: https://doi.org/10.54517/urr.v3i1.2052

Abstract

A navigation method is proposed to enable the fusion of magnetic resonance imaging (MRI) and transrectal ultrasound (TRUS) for targeted prostate biopsy. The method directly establishes the transformation between the preoperative MRI image and the intraoperative TRUS from a selected pair of coplanar MRI and TRUS images, without the use of 3D TRUS. According to the real-time spatial pose of the intraoperative TRUS probe, the corresponding reslice of the preoperative MRI image is computed and displayed together with the preoperative 3D planning model, and the planned lesion region is mapped onto the TRUS image to guide needle insertion. In the phantom experiment, the average error between the planned target point and the actual puncture position was (1.98 ± 0.28) mm. The experimental results show that this method achieves high targeting accuracy and has potential value in clinical applications.
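The paper itself does not include code; the Python/NumPy sketch below is only a rough illustration of the geometry the abstract describes. It estimates a rigid MRI-to-TRUS transform from corresponding landmarks picked on a selected coplanar image pair, maps a planned lesion target into the TRUS frame, and evaluates targeting error as a mean Euclidean distance, as in the phantom experiment. The SVD-based (Kabsch) solver, all variable names, and all coordinates are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def rigid_from_point_pairs(src, dst):
        """SVD-based least-squares rigid transform (Kabsch/Procrustes)
        mapping src landmarks onto dst landmarks.
        src, dst: (N, 3) arrays of corresponding points, N >= 3, non-collinear."""
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)      # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                 # correct an improper rotation
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        T = np.eye(4)                            # homogeneous 4x4 transform
        T[:3, :3] = R
        T[:3, 3] = dst_c - R @ src_c
        return T

    def map_point(T, p):
        """Apply a 4x4 homogeneous transform to a 3D point."""
        return (T @ np.append(p, 1.0))[:3]

    # Hypothetical landmarks on the selected coplanar MRI/TRUS image pair.
    mri_pts  = np.array([[10.0, 22.0, 5.0], [34.0, 18.0, 5.0],
                         [25.0, 40.0, 5.0], [15.0, 33.0, 5.0]])
    trus_pts = np.array([[12.1, 20.3, 4.8], [36.0, 16.8, 5.1],
                         [27.2, 38.5, 5.0], [17.0, 31.4, 4.9]])

    T_mri_to_trus = rigid_from_point_pairs(mri_pts, trus_pts)

    # Map a planned lesion target from MRI into the live TRUS frame.
    target_trus = map_point(T_mri_to_trus, np.array([20.0, 30.0, 5.0]))

    # Phantom-style evaluation: mean +/- std of Euclidean distances between
    # planned targets and actual puncture positions (coordinates made up).
    planned   = np.array([[20.0, 30.0, 5.0], [28.0, 25.0, 6.0]])
    punctured = np.array([[21.2, 31.1, 5.4], [29.0, 26.6, 6.9]])
    dists = np.linalg.norm(planned - punctured, axis=1)
    print(f"targeting error: {dists.mean():.2f} +/- {dists.std():.2f} mm")

In an actual system, the estimated transform would also be composed with the real-time probe pose reported by the electromagnetic tracker so that the MRI reslice follows the TRUS image plane; that composition is omitted here for brevity.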


Keywords

Prostate cancer; Prostate biopsy; Magnetic resonance imaging; Transrectal ultrasound; Electromagnetic tracking





Copyright (c) 2022 Jisu Hu, Qi Ma, Xusheng Qian, Zhiyong Zhou, Yakang Dai

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

