REVIEW ARTICLE
Year : 2022  |  Volume : 8  |  Issue : 1  |  Page : 23

A review of the development of intelligent delineation of radiotherapy contouring


1 Department of Nursing Administration, Army Medical University, Chongqing, China
2 Department of Oncology, Xinqiao Hospital, Army Medical University, Chongqing, China

Date of Submission07-May-2022
Date of Decision09-Aug-2022
Date of Acceptance16-Aug-2022
Date of Web Publication21-Oct-2022

Correspondence Address:
Jianguo Sun
Department of Oncology, Xinqiao Hospital, Army Medical University, No. 83 Xinqiao Street, Chongqing 400037
China

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/digm.digm_25_22

Abstract


To date, manual segmentation of radiotherapy contours has been time-consuming, labor-intensive, and inefficient. It is therefore imperative to develop novel technologies that improve the precision and repeatability of radiotherapy contour segmentation. Artificial intelligence (AI)-based delineation of tumor targets in radiotherapy has emerged, comprising methods based on template atlases, image segmentation, and deep learning. Intelligent delineation makes automatic contouring of organs at risk possible, saves operators' time, and reduces the heterogeneity of contours, greatly increasing the accuracy and quality of contour delineation in radiotherapy. All in all, AI-based automatic delineation in radiotherapy is flourishing; researchers should now work toward recognized standards and mature technologies to enable clinical application in the near future.

Keywords: Deep learning, Image segmentation, Intelligent delineation, Radiotherapy contouring, Template atlas


How to cite this article:
Ren R, Chen G, Yang F, Cui T, Zhong L, Zhang Y, Luo B, Zhao L, Qian J, Sun J. A review of the development of intelligent delineation of radiotherapy contouring. Digit Med 2022;8:23





Introduction


Cancer imposes a serious health and financial burden on people around the world. Radiotherapy, one of the three main treatment modalities for malignant tumors, treats tumors with artificial or natural radiation and is required by most cancer patients at some stage of their disease. According to the World Health Organization, 55% of malignant tumors are treated with surgery (27%), radiotherapy (22%), or chemotherapy (6%). Radiotherapy equipment has advanced to linear accelerators and to proton and heavy ion accelerators, and radiotherapy itself has evolved into modern three-dimensional and even four-dimensional techniques.[1],[2] In this process, the most important step is to identify the contouring site and delineate it before radiotherapy.

Nowadays, contouring in radiotherapy is mainly based on computed tomography (CT) and magnetic resonance imaging images, and relies on the clinical experience and skills of physicians.[3],[4] The results of manual segmentation depend largely on the operator's experience, and the process is time-consuming, labor-intensive, and inefficient.[5],[6],[7] First, medical images are highly complex and lack clear linear features. CT images in particular have blurred edges and poorly separable regions, resulting in low contrast between normal organs, tumors, and surrounding tissues. Tumors also have nonuniform internal density and variable, uncertain shapes, making them difficult for computers to identify. Second, planned target areas for the same medical record may differ between radiotherapy physicians, and target areas delineated by the same physician on the same CT images at different times also differ,[8] posing a huge challenge for accurate segmentation technology. Hence, it is hard to ensure the consistency and repeatability of delineation results. To ensure quality and reduce the differences in target delineation among physicians and radiotherapy units, there is an urgent need to develop intelligent delineation of radiotherapy contours.[9],[10]


Intelligent Radiotherapy


In this era of rapidly developing radiotherapy technology, one of the key goals is to make radiotherapy contouring fast and accurate.[11],[12] The continuing application and development of computer digitization and artificial intelligence (AI) in the medical field[13],[14] promise to revolutionize radiotherapy. As a branch of computer science, AI outperforms the human brain at heavy repetitive labor, mathematical calculation, and large-scale memorization, and does so faster. It is clearly superior at learning and remembering massive volumes of medical records, medical literature, clinical guidelines, drug instructions, images, pathological sections, delineations of radiotherapy target areas and organs at risk (OARs), and radiotherapy plan designs. It takes radiologists years or even decades to master this knowledge, and they must keep learning; an AI system only needs to be fed large amounts of medical information and data on which to train and learn in depth. Intelligent radiotherapy is shifting the radiotherapy process, currently centered on single institutions, toward "Internet plus radiotherapy" cloud platforms and the Internet of Things. Through the development of such radiotherapy systems, advanced remote network systems for intelligent radiotherapy have been built to promote the sharing of radiotherapy technology.


Intelligent Delineation of Radiotherapy Contouring


Nowadays, many domestic companies, such as Lianxin, Quanyu, Yinuo, and Xudong, have realized automatic delineation of OARs through AI technology, which greatly improves the automation and speed of radiotherapy contour delineation and reduces the heterogeneity of contours produced by different radiotherapy physicians.

Synchronously, AI technology provides a strong development impetus for semi-automatic or automatic delineation of radiotherapy contours,[15],[16],[17] collectively referred to as intelligent delineation of radiotherapy contouring. AI is characterized by its ability to update knowledge in a very short time.[18],[19] With it, automatic delineation of OARs[20],[21],[22] can be performed, greatly improving the automation and speed of contour delineation in radiotherapy[23] and reducing the heterogeneity of contours produced for different radiotherapy physicians or under different conditions.[24],[25],[26]

The main ways to realize intelligent radiotherapy contouring are shown in [Figure 1].
Figure 1: The classifications of radiotherapy delineation methods. ABAS: atlas-based auto-segmentation, CNN: convolutional neural network, CLAF-CNN: cross-layer attention fusion network.




Intelligent Delineation Method Based on Template Atlas


The growth of shared medical data and massive medical resources[27],[28] provides a solid foundation for constructing radiotherapy contour template atlas databases.[29],[30] The intelligent delineation method based on a template atlas library has gone through three stages: (1) Single-template deformation matching: a radiotherapy physician delineates the radiotherapy contours for one patient with a given tumor type and stores them as a template for that tumor type, which is then deformed and matched to the image data of other patients with the same tumor type to achieve automatic delineation. (2) Multi-template, single-target matching: a massive structure database of the same tumor type is built across multiple patients; when delineating new patient image data, the closest dataset is automatically selected from the database as the template and, after registration and deformation, used for automatic delineation. (3) Multi-template, multi-target matching: a massive structure database is again built across multiple patients, but the closest several datasets are automatically selected as templates from the database.
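The template-selection step in the multi-template stages above can be sketched as follows. This is a toy illustration only: the 1-D "images," the contour representation, and the use of a raw sum-of-squared-differences score are all invented for clarity, whereas real systems compare full 3-D volumes and apply deformable registration before reusing a template's contour.

```python
# Toy multi-template matching: pick the stored template whose image most
# resembles the new patient's image, then reuse its contour as the starting
# delineation. Real atlas software registers and deforms the template;
# here we simply return the closest template's contour.

def ssd(a, b):
    """Sum of squared differences between two equally sized images (flat lists)."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def select_template(patient_image, atlas):
    """atlas: list of (template_image, template_contour) pairs."""
    best = min(atlas, key=lambda entry: ssd(patient_image, entry[0]))
    return best[1]  # contour of the closest template

# Tiny 1-D "images": intensity profiles; a contour is the list of indices
# inside the target structure.
atlas = [
    ([0, 1, 9, 9, 1, 0], [2, 3]),   # template A: bright blob in the middle
    ([9, 9, 1, 0, 0, 0], [0, 1]),   # template B: bright blob at the left
]
patient = [0, 2, 8, 9, 2, 0]        # resembles template A
print(select_template(patient, atlas))  # -> [2, 3]
```

The design point is simply that the database lookup is a nearest-neighbor search under some image-similarity score; swapping SSD for mutual information or a learned metric changes the score function, not the structure of the search.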

Several software products implement atlas-based intelligent delineation of radiotherapy contours, such as atlas-based auto-segmentation (ABAS) software[31],[32] and the deformable image registration products MIM Maestro[33],[34],[35] and RayStation.[36],[37],[38] In operation, the software scans and selects templates, matches the individual patient's image data through deformable registration, and maps the template's contouring data onto the patient's images; the result is then manually verified and modified to complete the delineation process. For example, ABAS uses a set of representative patients with carefully delineated OARs as a reference set (i.e., an atlas) for contouring new patients,[39] which significantly reduces workload and improves consistency in many radiotherapy departments. Kim et al. found that, with ABAS, the dice similarity coefficient (DSC) and Hausdorff distance of the clinical target volume (CTV) were 0.79 and 9.70 mm, respectively, in patients with endometrial and cervical cancers.[40]
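The two metrics reported above can be stated concretely. The sketch below gives minimal, pure-Python versions: the DSC measures volumetric overlap (unitless, 1.0 is perfect), while the Hausdorff distance measures worst-case boundary disagreement (in the same units as the coordinates). The tiny point sets are illustrative; clinical implementations operate on full 3-D masks with physical voxel spacing.

```python
# Dice similarity coefficient and symmetric Hausdorff distance between two
# contours, each represented as a set of voxel coordinates.

def dice(a, b):
    """DSC = 2|A∩B| / (|A| + |B|)."""
    return 2 * len(a & b) / (len(a) + len(b))

def hausdorff(a, b):
    """Max over both directions of the farthest nearest-neighbor distance."""
    def d(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    def directed(s, t):
        return max(min(d(p, q) for q in t) for p in s)
    return max(directed(a, b), directed(b, a))

manual = {(0, 0), (0, 1), (1, 0), (1, 1)}  # physician's contour
auto = {(0, 0), (0, 1), (1, 0), (2, 2)}    # auto-segmented contour
print(round(dice(manual, auto), 2))        # -> 0.75
print(round(hausdorff(manual, auto), 2))   # -> 1.41
```

Note why the original sentence needed the unit fix: the DSC is a ratio of set sizes and carries no millimeters, whereas the Hausdorff distance is a geometric distance and does.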

However, the delineation method based on a template atlas has obvious shortcomings, as described in [Table 1]: (1) Building the template library requires large amounts of data, consuming workforce and time. (2) Selecting the best-matching data from the library also consumes considerable computing time. (3) The accuracy and standardization of the delineations in the library cannot be effectively guaranteed; if they are inaccurate, the delineation results will deviate greatly. (4) Deformable registration between the template and the individual patient's images can never be perfectly coincident, resulting in an unsatisfactory mapping effect. (5) The method mostly targets a single organ or tissue and cannot automatically segment multiple organs or tissues at the same time, limiting work efficiency. (6) Each radiotherapy unit's template library is based on its own historical data, making homogeneous target delineation across units difficult.
Table 1: The advantages and disadvantages of three different artificial intelligence-based intelligent delineation methods




Intelligent Delineation Method Based on Image Segmentation


Image segmentation is a research direction in image processing and computer vision that divides an image into homogeneous regions with their own characteristics and extracts the target object of interest.[41],[42] It is a key step in image processing and analysis. At present, image segmentation is widely used in many fields, such as medical image analysis, image compression, and image retrieval.[43],[44],[45]

The premise of radiotherapy is to clearly distinguish the radiotherapy target area from surrounding normal tissues and organs, and medical image segmentation plays an indispensable role in realizing automatic contouring.[46] In radiotherapy contouring, image segmentation divides the image into multiple regions according to the similar properties within each region (grayscale, color, texture, brightness, contrast), corresponding to different organs and tumors. Through image segmentation, the computer can identify tumor regions and organs at risk. Over its long development, image segmentation in radiotherapy has progressed from manual, through semi-automatic, to automatic segmentation. The main semi-automatic segmentation methods fall into the following categories: (1) Region-based methods: the image is divided into different regions by grayscale, texture, and other image-space features; typical algorithms include threshold segmentation and clustering. (2) Boundary-based methods: the boundary of the target is determined from differences in gradient information. (3) Graph-theory-based methods: image segmentation is cast as the minimum-cut problem of graph theory by mapping the image onto a directed graph. (4) Methods based on energy functionals. Automatic segmentation methods are mainly region-based, mostly exploiting the similarity of features within the same object; according to the processing algorithm, they can be divided into threshold methods, region growing, and region splitting and merging.
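Two of the methods named above, thresholding and region growing, are small enough to sketch directly. The 2-D "CT" array, seed point, and tolerance below are invented for illustration; clinical tools operate on full volumes with calibrated Hounsfield units.

```python
# Grayscale thresholding and seeded region growing, the two classic
# region-based segmentation algorithms mentioned in the text.

def threshold(image, t):
    """Label every pixel with intensity above t as foreground (1)."""
    return [[1 if v > t else 0 for v in row] for row in image]

def region_grow(image, seed, tol):
    """Grow a region from seed, adding 4-connected neighbors whose
    intensity is within tol of the seed's intensity."""
    rows, cols = len(image), len(image[0])
    base = image[seed[0]][seed[1]]
    region, stack = set(), [seed]
    while stack:
        r, c = stack.pop()
        if (r, c) in region or not (0 <= r < rows and 0 <= c < cols):
            continue
        if abs(image[r][c] - base) <= tol:
            region.add((r, c))
            stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return region

ct = [[10, 12, 50],
      [11, 13, 52],
      [10, 55, 54]]
print(threshold(ct, 40))                   # -> [[0, 0, 1], [0, 0, 1], [0, 1, 1]]
print(sorted(region_grow(ct, (0, 0), 5)))  # the connected low-intensity region
```

The contrast between the two outputs illustrates the text's taxonomy: thresholding labels pixels purely by intensity, while region growing additionally enforces spatial connectivity to the seed.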

In general, traditional image segmentation methods can achieve good segmentation results. However, the segmentation effect is not independent of the operator's prior knowledge, so results can be affected by subjective factors in the hands of relatively inexperienced operators, and segmentation accuracy cannot be assured. Moreover, such image segmentation remains at the semi-automatic stage, which can hardly meet the requirements of the current era, as shown in [Table 1]. Recently, fully automatic segmentation methods have emerged that carry out the entire segmentation process by computer, without subjective human intervention. Dr. Choonsik Lee proposed that automatic segmentation would facilitate the development of radiotherapy prescriptive criteria for mitigating cardiovascular complications.[47] Liu et al. found that their group's model (U-ResNet) could improve the efficiency and accuracy of delineation, performing as well as segmentations generated by oncologists.[48] Gan et al. applied a hybrid 2D/3D convolutional neural network (CNN) to automatic lung tumor delineation on CT images and showed that the method achieved good lung tumor segmentation.[49] Dr. Gregory Sharp's group achieved accurate segmentation of three important head-and-neck structures within a clinically acceptable time using a proposed hybrid approach, compared with a segmentation approach based on multiple atlases combined with label fusion.[50] Liu et al. showed that a cross-layer attention fusion network (CLAF-CNN) could obtain accurate segmentation results for OARs, with the potential to improve the efficiency of radiotherapy planning for nasopharyngeal and lung cancer.[51] Another study showed that an asymmetric CNN model with two encoding paths, from preimplant MR images (masked by the MR-based high-risk CTV) and postimplant CT images, was successfully developed for automatic segmentation of the CT-based high-risk CTV in tandem-and-ovoid brachytherapy patients.


Intelligent Delineation Method Based on Deep Learning


Deep learning based on AI has been called a cornerstone of the fourth industrial revolution in human history. In radiotherapy contouring, AI makes automatic delineation possible and has already realized the automatic delineation and extraction of most OARs.[6],[22],[52] Deep learning technology can simplify and standardize the treatment process, decrease the workload of medical workers, and enhance efficiency.[53],[54] It is also applied to image generation, optimization of treatment planning, prognosis assessment, and the prediction of toxicity and side effects.[55],[56],[57],[58] Implementing deep learning usually requires the following steps.

First, a large dataset of images with annotated tumor target areas and OARs must be constructed: the radiotherapy physician delineates the radiotherapy target area and OARs and marks the tumor area and surrounding organs at risk in the treatment planning system (TPS); once a considerable amount has accumulated, a labeled dataset is formed. Second, the manually annotated results require data cleaning. Third, an algorithm model for target area delineation is built based on deep learning. Fourth, continuous learning and training are performed to improve the accuracy and efficiency of radiotherapy contouring.
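The four steps above can be compressed into a toy stand-in: a single-feature logistic "segmenter" trained on annotated pixels. Everything here is illustrative: real systems train CNNs (e.g., U-Net variants) on large curated volumes, not a one-weight model on six invented intensity values, but the skeleton (annotated data, then iterative training, then inference) is the same.

```python
# Toy deep-learning pipeline: annotate (steps 1-2), train (steps 3-4),
# then segment new pixels by thresholding the predicted probability.
import math

def train(pixels, labels, epochs=500, lr=0.5):
    """Fit weight/bias so p(tumor | intensity) matches the annotations."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(pixels, labels):
            p = 1 / (1 + math.exp(-(w * x + b)))
            w += lr * (y - p) * x   # gradient ascent on the log-likelihood
            b += lr * (y - p)
    return w, b

# "Physician-annotated, cleaned" training data: (intensity, tumor label).
pixels = [0.1, 0.2, 0.15, 0.8, 0.9, 0.85]
labels = [0, 0, 0, 1, 1, 1]
w, b = train(pixels, labels)

def segment(x):
    """Inference: classify a new pixel intensity as tumor (1) or not (0)."""
    return 1 if 1 / (1 + math.exp(-(w * x + b))) > 0.5 else 0

print([segment(x) for x in [0.05, 0.95]])  # -> [0, 1]
```

The continuous-training point in step four corresponds to simply resuming the loop as new annotated cases arrive, which is why data cleaning in step two matters: the optimizer fits whatever labels it is given.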

Using AI to achieve automatic delineation of radiotherapy contours is of great significance.[59] First, the results of quantitative image analysis by computer are objective and undisturbed by human factors; well-trained computers may even outperform humans in accuracy. Second, once machine learning is completed, processing results are obtained in a short time, dramatically reducing the working time of radiotherapy contouring.[42],[60],[61] It has been reported that deep learning contouring yields equal or significantly improved quantitative performance measures for the glandular and upper digestive tract OARs.[6] Song et al. employed two different convolutional neural networks (CNNs) and observed that CNNs at various feature resolution levels delineate rectal cancer CTVs and OARs well.[22] Zhou et al. found that deep learning-based auto-segmentation improved CTV contouring accuracy, reduced contouring time, and improved clinical efficiency in treating cervical cancer.[62],[63],[64] Lin et al. designed a deep learning contouring tool for auto-segmenting the primary gross tumor volume of nasopharyngeal carcinoma on magnetic resonance (MR) images and demonstrated high accuracy and shortened contouring time in 203 patients.[63] Dr. Trebeschi's group applied deep learning to the segmentation of rectal cancer on multiparametric MR images and obtained a DSC of 69%.[64]

As shown in [Table 1], automatic delineation of radiotherapy targets still lacks recognized standards, mature technologies, and effective extraction capabilities.[9],[65],[66] A key challenge is building large medical databases with a sufficient variety of representative examples, because deep learning requires many training samples for effective clinical practice. Moreover, there are intra- and inter-observer differences in the contours of target volumes (TVs) and OARs. Furthermore, deep learning models trained on biased datasets or on inconsistently annotated input images will inevitably produce questionable results.

Although various technical studies are in progress, most software and models for auto-delineation of TVs and OARs in radiotherapy have not yet been used in the clinic. With the development of AI and deep learning, automatic delineation of radiotherapy target areas will be realized in the near future, making contouring more objective, reasonable, accurate, repeatable, fast, and convenient.


Conclusions


All in all, intelligent delineation of radiotherapy contours will bring huge change to radiation treatment in the future. Existing methods comprise atlas-based delineation, image segmentation, and deep learning. Compared with manual delineation, they offer shorter times, higher accuracy, and reduced variability. Intelligent contour segmentation has already been explored in cervical cancer, prostate cancer, head-and-neck cancer, and other sites, yet in routine clinical practice most auto-delineation models are not being used. There is still a long way to go before clinicians could be replaced by AI with patient safety ensured. In the coming years, it will be necessary to increase the acceptance and implementation of auto-delineation software in clinics. In conclusion, AI-based delineation models for radiotherapy contouring can be expected to improve the efficiency of segmentation.

Financial support and sponsorship

This study was supported by grants from The National Key Research and Development Project (2016YFC0106400).

Conflicts of interest

Jianguo Sun is an Associate Editor of the journal. The article was subject to the journal's standard procedures, with peer review handled independently of this editor and his research groups.



 
References

1. Sokol O, Scifoni E, Tinganelli W, Kraft-Weyrather W, Wiedemann J, Maier A, et al. Oxygen beams for therapy: Advanced biological treatment planning and experimental verification. Phys Med Biol 2017;62:7798-813.
2. Senthilkumar K, Maria Das KJ. Comparison of biological-based and dose volume-based intensity-modulated radiotherapy plans generated using the same treatment planning system. J Cancer Res Ther 2019;15:S33-8.
3. Mahantshetty U, Poetter R, Beriwal S, Grover S, Lavanya G, Rai B, et al. IBS-GEC ESTRO-ABS recommendations for CT based contouring in image guided adaptive brachytherapy for cervical cancer. Radiother Oncol 2021;160:273-84.
4. Dong Y, Liu Y, Chen J, Li W, Li Y, Zhao Q, et al. Comparison of postoperative CT- and preoperative MRI-based breast tumor bed contours in prone position for radiotherapy after breast-conserving surgery. Eur Radiol 2021;31:345-55.
5. Xia X, Wang J, Li Y, Peng J, Fan J, Zhang J, et al. An artificial intelligence-based full-process solution for radiotherapy: A proof of concept study on rectal cancer. Front Oncol 2020;10:616721.
6. van Dijk LV, Van den Bosch L, Aljabar P, Peressutti D, Both S, Steenbakkers R, et al. Improving automatic delineation for head and neck organs at risk by deep learning contouring. Radiother Oncol 2020;142:115-23.
7. Daisne JF, Blumhofer A. Atlas-based automatic segmentation of head and neck organs at risk and nodal target volumes: A clinical validation. Radiat Oncol 2013;8:154.
8. Chen MY, Woodruff MA, Dasgupta P, Rukin NJ. Variability in accuracy of prostate cancer segmentation among radiologists, urologists, and scientists. Cancer Med 2020;9:7172-82.
9. Hamidi M, Mahendran P, Denecke K. Towards a digital lean hospital: Concept for a digital patient board and its integration with a hospital information system. Stud Health Technol Inform 2019;264:606-10.
10. Urago Y, Okamoto H, Kaneda T, Murakami N, Kashihara T, Takemori M, et al. Evaluation of auto-segmentation accuracy of cloud-based artificial intelligence and atlas-based models. Radiat Oncol 2021;16:175.
11. Smith AG, Petersen J, Terrones-Campos C, Berthelsen AK, Forbes NJ, Darkner S, et al. RootPainter3D: Interactive-machine-learning enables rapid and accurate contouring for radiotherapy. Med Phys 2022;49:461-73.
12. Macomber MW, Phillips M, Tarapov I, Jena R, Nori A, Carter D, et al. Autosegmentation of prostate anatomy for radiation treatment planning using deep decision forests of radiomic features. Phys Med Biol 2018;63:235002.
13. Mintz Y, Brodie R. Introduction to artificial intelligence in medicine. Minim Invasive Ther Allied Technol 2019;28:73-81.
14. Amisha, Malik P, Pathania M, Rathaur VK. Overview of artificial intelligence in medicine. J Family Med Prim Care 2019;8:2328-31.
15. Kano Y, Ikushima H, Sasaki M, Haga A. Automatic contour segmentation of cervical cancer using artificial intelligence. J Radiat Res 2021;62:934-44.
16. Shapey J, Wang G, Dorent R, Dimitriadis A, Li W, Paddick I, et al. An artificial intelligence framework for automatic segmentation and volumetry of vestibular schwannomas from contrast-enhanced T1-weighted and high-resolution T2-weighted MRI. J Neurosurg 2019;134:171-9.
17. Quon JL, Han M, Kim LH, Koran ME, Chen LC, Lee EH, et al. Artificial intelligence for automatic cerebral ventricle segmentation and volume calculation: A clinical tool for the evaluation of pediatric hydrocephalus. J Neurosurg Pediatr 2020;27:131-8.
18. Chouard T, Venema L. Machine intelligence. Nature 2015;521:435.
19. Mertz L. Updating diagnoses for speed and accuracy: Using AI, cameras, assays, and more. IEEE Pulse 2020;11:20-4.
20. Wu QN, Wang YL, Quan H, Wang JJ, Gu SS, Yang W, et al. Research on automatic segmentation of pelvic organs at risk based on fusion network model based on limited training samples (in Chinese). J Biomed Eng 2020;37:311-6.
21. Fan J, Wang J, Chen Z, Hu C, Zhang Z, Hu W. Automatic treatment planning based on three-dimensional dose distribution predicted from deep learning technique. Med Phys 2019;46:370-81.
22. Song Y, Hu J, Wu Q, Xu F, Nie S, Zhao Y, et al. Automatic delineation of the clinical target volume and organs at risk by deep learning for rectal cancer postoperative radiotherapy. Radiother Oncol 2020;145:186-92.
23. Mattiucci GC, Boldrini L, Chiloiro G, D'Agostino GR, Chiesa S, De Rose F, et al. Automatic delineation for replanning in nasopharynx radiotherapy: What is the agreement among experts to be considered as benchmark? Acta Oncol 2013;52:1417-22.
24. Cunha CE, Fernandes R, Santos CX, Boccaletti KW, Pellizzon AC, Barbosa JH. Viability of mobile applications for remote support of radiotherapy patients. Rev Assoc Med Bras (1992) 2019;65:1321-6.
25. Maguire R, Ream E, Richardson A, Connaghan J, Johnston B, Kotronoulas G, et al. Development of a novel remote patient monitoring system: The advanced symptom management system for radiotherapy to improve the symptom experience of patients with lung cancer receiving radiotherapy. Cancer Nurs 2015;38:E37-47.
26. Mu W, Chen Z, Liang Y, Shen W, Yang F, Dai R, et al. Staging of cervical cancer based on tumor heterogeneity characterized by texture features on (18)F-FDG PET images. Phys Med Biol 2015;60:5123-39.
27. Goldberg LR, Crocombe LA. Advances in medical education and practice: Role of massive open online courses. Adv Med Educ Pract 2017;8:603-9.
28. Pereira T, Morgado J, Silva F, Pelter MM, Dias VR, Barros R, et al. Sharing biomedical data: Strengthening AI development in healthcare. Healthcare (Basel) 2021;9:827.
29. Beaton L, Nica L, Tyldesley S, Sek K, Ayre G, Aparicio M, et al. PET/CT of breast cancer regional nodal recurrences: An evaluation of contouring atlases. Radiat Oncol 2020;15:136.
30. Borm KJ, Voppichler J, Düsberg M, Oechsner M, Vag T, Weber W, et al. FDG/PET-CT-based lymph node atlas in breast cancer patients. Int J Radiat Oncol Biol Phys 2019;103:574-82.
31. Greenham S, Dean J, Fu CK, Goman J, Mulligan J, Tune D, et al. Evaluation of atlas-based auto-segmentation software in prostate cancer patients. J Med Radiat Sci 2014;61:151-8.
32. Loi G, Fusella M, Vecchi C, Menna S, Rosica F, Gino E, et al. Computed tomography to cone beam computed tomography deformable image registration for contour propagation using head and neck, patient-based computational phantoms: A multicenter study. Pract Radiat Oncol 2020;10:125-32.
33. Fukumitsu N, Nitta K, Terunuma T, Okumura T, Numajiri H, Oshiro Y, et al. Registration error of the liver CT using deformable image registration of MIM Maestro and Velocity AI. BMC Med Imaging 2017;17:30.
34. Sun L, Hu W, Lai S, Shi L, Chen J. In vivo 3-D dose verification using PET/CT images after carbon-ion radiation therapy. Front Oncol 2021;11:621394.
35. Casati M, Piffer S, Calusi S, Marrazzo L, Simontacchi G, Di Cataldo V, et al. Methodological approach to create an atlas using a commercial auto-contouring software. J Appl Clin Med Phys 2020;21:219-30.
36. Verbeek N, Wulff J, Janson M, Bäumer C, Zahid S, Timmermann B, et al. Experiments and Monte Carlo simulations on multiple Coulomb scattering of protons. Med Phys 2021;48:3186-99.
37. Han EY, Kim GY, Rebueno N, Yeboa DN, Briere TM. End-to-end testing of automatic plan optimization using RayStation scripting for hypofractionated multimetastatic brain stereotactic radiosurgery. Med Dosim 2019;44:e44-50.
38. Yang Y, Shao K, Zhang J, Chen M, Chen Y, Shan G. Automatic planning for nasopharyngeal carcinoma based on progressive optimization in RayStation treatment planning system. Technol Cancer Res Treat 2020;19:1-8.
39. Han X, Hoogeman MS, Levendag PC, Hibbard LS, Teguh DN, Voet P, et al. Atlas-based auto-segmentation of head and neck CT images. Med Image Comput Comput Assist Interv 2008;5242:434-41.
40. Kim N, Chang JS, Kim YB, Kim JS. Atlas-based autosegmentation for postoperative radiotherapy planning in endometrial and cervical cancers. Radiat Oncol 2020;15:106.
41. Cardenas CE, Yang J, Anderson BM, Court LE, Brock KB. Advances in auto-segmentation. Semin Radiat Oncol 2019;29:185-97.
42. Li Q, Kim J, Balagurunathan Y, Liu Y, Latifi K, Stringfield O, et al. Imaging features from pretreatment CT scans are associated with clinical outcomes in nonsmall-cell lung cancer patients treated with stereotactic body radiotherapy. Med Phys 2017;44:4341-9.
43. An FP, Liu ZW. Medical image segmentation algorithm based on feedback mechanism CNN. Contrast Media Mol Imaging 2019;2019:6134942.
44. Aldemir E, Gezer NS, Tohumoglu G, Barış M, Kavur AE, Dicle O, et al. Reversible 3D compression of segmented medical volumes: Usability analysis for teleradiology and storage. Med Phys 2020;47:1727-37.
45. Heinrich MP, Blendowski M, Oktay O. TernaryNet: Faster deep model inference without GPUs for medical 3D segmentation using sparse and binary convolutions. Int J Comput Assist Radiol Surg 2018;13:1311-20.
46. Fu Y, Lei Y, Wang T, Curran WJ, Liu T, Yang X. A review of deep learning based methods for medical image multi-organ segmentation. Phys Med 2021;85:107-22.
47. Jung JW, Mille MM, Ky B, Kenworthy W, Lee C, Yeom YS, et al. Application of an automatic segmentation method for evaluating cardiac structure doses received by breast radiotherapy patients. Phys Imaging Radiat Oncol 2021;19:138-44.
48. Liu Z, Liu F, Chen W, Tao Y, Liu X, Zhang F, et al. Automatic segmentation of clinical target volume and organs-at-risk for breast conservative radiotherapy using a convolutional neural network. Cancer Manag Res 2021;13:8209-17.
49. Gan W, Wang H, Gu H, Duan Y, Shao Y, Chen H, et al. Automatic segmentation of lung tumors on CT images based on a 2D & 3D hybrid convolutional neural network. Br J Radiol 2021;94:20210038.
50. Fritscher KD, Peroni M, Zaffino P, Spadea MF, Schubert R, Sharp G. Automatic segmentation of head and neck CT images for radiotherapy treatment planning using multiple atlases, statistical appearance models, and geodesic active contours. Med Phys 2014;41:051910.
51. Liu Z, Sun C, Wang H, Li Z, Gao Y, Lei W, et al. Automatic segmentation of organs-at-risks of nasopharynx cancer and lung cancer by cross-layer attention fusion network with TELD-Loss. Med Phys 2021;48:6987-7002.
52. Chen M, Wu S, Zhao W, Zhou Y, Zhou Y, Wang G. Application of deep learning to auto-delineation of target volumes and organs at risk in radiotherapy. Cancer Radiother 2022;26:494-501.
53. Min H, Dowling J, Jameson MG, Cloak K, Faustino J, Sidhom M, et al. Automatic radiotherapy delineation quality assurance on prostate MRI with deep learning in a multicentre clinical trial. Phys Med Biol 2021;66:195008.
54. Dai Z, Zhang Y, Zhu L, Tan J, Yang G, Zhang B, et al. Geometric and dosimetric evaluation of deep learning-based automatic delineation on CBCT-synthesized CT and planning CT for breast cancer adaptive radiotherapy: A multi-institutional study. Front Oncol 2021;11:725507.
55. Liu Y, Lei Y, Wang Y, Wang T, Ren L, Lin L, et al. MRI-based treatment planning for proton radiotherapy: Dosimetric validation of a deep learning-based liver synthetic CT generation method. Phys Med Biol 2019;64:145015.
56. Nguyen D, Long T, Jia X, Lu W, Gu X, Iqbal Z, et al. A feasibility study for predicting optimal radiation therapy dose distributions of prostate cancer patients from patient anatomy using deep learning. Sci Rep 2019;9:1076.
    
57.
Li H, Boimel P, Janopaul-Naylor J, Zhong H, Xiao Y, Ben-Josef E, et al. DeepConvolutional Neural Networks for imaging based survival analysis of RectalCancer patients. Int J Radiat Oncol Biol Phys 2017;99:183.  Back to cited text no. 57
    
58.
Ibragimov B, Toesca D, Chang D, Yuan Y, Koong A, Xing L. Development of deep neural network for individualized hepatobiliary toxicity prediction after liver SBRT. Med Phys 2018;45:4763-74.  Back to cited text no. 58
    
59.
Wang C, Zhu X, Hong JC, Zheng D. Artificial intelligence in radiotherapy treatment planning: Present and future. Technol Cancer Res Treat 2019;18:1-11.  Back to cited text no. 59
    
60.
Chow JCL. Internet-based computer technology on radiotherapy. Rep Pract Oncol Radiother 2017;22:455-62.  Back to cited text no. 60
    
61.
Huynh E, Coroller TP, Narayan V, Agrawal V, Hou Y, Romano J, et al. CT-based radiomic analysis of stereotactic body radiation therapy patients with lung cancer. Radiother Oncol 2016;120:258-66.  Back to cited text no. 61
    
62.
Ma CY, Zhou JY, Xu XT, Guo J, Han MF, Gao YZ, et al. Deep learning-based auto-segmentation of clinical target volumes for radiotherapy treatment of cervical cancer. J Appl Clin Med Phys 2022;23:e13470.  Back to cited text no. 62
    
63.
Lin L, Dou Q, Jin YM, Zhou GQ, Tang YQ, Chen WL, et al. Deep learning for automated contouring of primary tumor volumes by MRI for nasopharyngeal carcinoma. Radiology 2019;291:677-86.  Back to cited text no. 63
    
64.
Trebeschi S, van Griethuysen JJM, Lambregts DMJ, Lahaye MJ, Parmar C, Bakers FCH, et al. Deep learning for fully-automated localization and segmentation of rectal cancer on multiparametric MR. Sci Rep 2017;7:5301.  Back to cited text no. 64
    
65.
Alleyne-Mike K, Sylvester P, Henderson-Suite V, Mohoyodeen T. Radiotherapy in the Caribbean: A spotlight on the human resource and equipment challenges among CARICOM nations. Hum Resour Health 2020;18:49.  Back to cited text no. 65
    
66.
Numasaki H, Teshima T, Okuda Y, Ogawa K, Japanese Society for Radiation Oncology Database Committee. Japanese structure survey of radiation oncology in 2013. J Radiat Res 2020;61:799-816.  Back to cited text no. 66
    


Figures

  [Figure 1]

Tables

  [Table 1]