Open Access
ARTICLE
Comparative Efficacy of Transformer and Recurrent Neural Networks in Automated Blood Clot Detection from Clinical Text
Vol. 1 No. 01 (2024) --- Section: Articles --- Published: 2024-12-20
Abstract
The accurate and timely identification of medical conditions from electronic health records (EHRs) is crucial for patient care, research, and public health surveillance. Blood clot detection, specifically, presents a significant challenge due to the nuanced, often implicit, mentions within unstructured clinical text. This study presents a comparative analysis of advanced neural network architectures—Bidirectional Encoder Representations from Transformers (BERT), Robustly Optimized BERT Pretraining Approach (RoBERTa), Text-to-Text Transfer Transformer (T5), and Recurrent Neural Networks (RNNs)—for their efficacy in identifying thrombus-related information from clinical narratives. Leveraging their distinct strengths in natural language understanding, we evaluate these models on a proprietary dataset of de-identified clinical notes, focusing on precision, recall, and F1-score. Our findings indicate that Transformer-based models, particularly those pre-trained on biomedical corpora, significantly outperform traditional RNNs, demonstrating superior ability to capture complex contextual dependencies vital for nuanced clinical concept extraction.
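The comparison above rests on precision, recall, and F1-score for the positive (thrombus-mention) class. A minimal sketch of how these metrics are computed is shown below; the labels are illustrative toy values, not data or results from the study.

```python
def precision_recall_f1(y_true, y_pred):
    """Precision, recall, and F1 for the positive (clot) class, labels in {0, 1}."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Hypothetical gold labels and model predictions over six clinical notes
y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 1, 1, 0]
p, r, f = precision_recall_f1(y_true, y_pred)
```

F1 is the harmonic mean of precision and recall, which is why it is the usual single summary figure when positive mentions are rare, as implicit thrombus references are in clinical narratives.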
References
Huang, K., Altosaar, J. and Ranganath, R., 2019. ClinicalBERT: Modeling clinical notes and predicting hospital readmission. arXiv preprint arXiv:1904.05342.
Lee, J., Yoon, W., Kim, S., Kim, D., Kim, S., So, C.H. and Kang, J., 2020. BioBERT: a pre-trained biomedical language representation model for biomedical text mining. Bioinformatics, 36(4), pp.1234-1240.
Si, Y., Wang, J., Xu, H. and Roberts, K., 2019. Enhancing clinical concept extraction with contextual embeddings. Journal of the American Medical Informatics Association, 26(11), pp.1297-1304.
Raffel, C., Shazeer, N., Roberts, A., Lee, K., Narang, S., Matena, M., Zhou, Y., Li, W. and Liu, P.J., 2020. Exploring the limits of transfer learning with a unified text-to-text transformer. Journal of Machine Learning Research, 21(140), pp.1-67.
Devlin, J., Chang, M.W., Lee, K. and Toutanova, K., 2019. BERT: Pre-training of deep bidirectional transformers for language understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers) (pp. 4171-4186).