A Two-Stage Network for Segmentation of Vertebrae and Intervertebral Discs: Integration of Efficient Local-Global Fusion Using 3D Transformer and 2D CNN
Communications in Computer and Information Science, ISSN: 1865-0937, Vol: 1964 CCIS, Page: 467-479
2024
Conference Paper Description
In computer-aided diagnosis (CAD) of spinal diseases, multi-label segmentation of vertebrae and intervertebral discs (IVDs) is a fundamental task. However, the distinctive characteristics of the spinal structure pose considerable challenges to segmentation and impede its practical applicability in clinical settings. Convolutional neural networks have been widely used for this task, but their limited receptive field restricts their capacity to capture long-range spatial correlations; as a result, vertebral boundaries are delineated less accurately and segmentation quality deteriorates noticeably. To address this limitation, we propose a novel two-stage network that incorporates both 3D Transformers and 2D CNNs. By synergistically leveraging the strength of Transformers in integrating long-range dependencies and the ability of CNNs to learn global and local features, the proposed approach shows promising potential for improving segmentation of vertebrae and intervertebral discs. Moreover, we introduce a graph convolution module into the network architecture to exploit the inherent spatial dependencies in MRI scans of spinal structures, extracting semantic feature representations and further improving segmentation. The proposed method is evaluated on the MRSpineSeg Challenge dataset of T2-weighted MR images.
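The paper's own implementation details are not given here; as a rough, generic illustration of the graph-convolution idea the abstract mentions (standard symmetric-normalized GCN propagation over a toy chain of "vertebra" nodes, not the authors' actual module), one propagation step might look like:

```python
import numpy as np

def graph_conv(H, A, W):
    """One generic graph-convolution step:
    H' = ReLU( D^{-1/2} (A + I) D^{-1/2} H W )

    H: (N, F) node features, A: (N, N) adjacency, W: (F, F') weights.
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)                     # node degrees incl. self-loop
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # symmetric normalization
    H_next = D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W
    return np.maximum(H_next, 0.0)            # ReLU activation

# Toy example: four nodes connected in a chain, mimicking the
# top-to-bottom adjacency of spinal structures (hypothetical sizes).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.random.default_rng(0).normal(size=(4, 8))   # node features
W = np.random.default_rng(1).normal(size=(8, 16))  # learnable weights
out = graph_conv(H, A, W)
print(out.shape)  # (4, 16)
```

The chain adjacency reflects the intuition the abstract appeals to: each vertebra or disc is spatially constrained by its neighbors, so message passing along this graph propagates that structural prior into the feature representations.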
Bibliographic Details
DOI: https://dx.doi.org/10.1007/978-981-99-8141-0_35
Springer: https://link.springer.com/chapter/10.1007/978-981-99-8141-0_35
Scopus: http://www.scopus.com/inward/record.url?partnerID=HzOxMe3b&scp=85178608106&origin=inward
Springer Science and Business Media LLC