Harnessing Artificial Intelligence for Early Disease Detection: Opportunities and Challenges in Modern Healthcare
DOI:
https://doi.org/10.62411/jcta.15367

Keywords:
Artificial Intelligence, Clinical Adoption, Early Disease Detection, Explainable AI, Healthcare Governance, Machine Learning, Predictive Analytics, Translational Review

Abstract
Artificial Intelligence (AI) is increasingly recognized as a transformative enabler of early disease detection, with the potential to improve diagnostic accuracy, support predictive risk stratification, and advance preventive healthcare. Despite rapid methodological progress, many existing reviews remain performance-centric, offering limited insight into generalizability, ethical governance, and real-world implementation constraints. This paper presents a narrative and integrative review with an adoption-focused, translational perspective, synthesizing recent developments in AI-driven early disease detection across oncology, cardiology, neurology, and infectious disease surveillance. Drawing on peer-reviewed literature published primarily between 2016 and 2025, the review examines reported performance gains alongside persistent limitations related to data heterogeneity, population bias, explainability, and regulatory fragmentation. Through cross-sectional synthesis, we identify three recurring gaps in prior reviews: (i) overgeneralization of AI’s diagnostic superiority, (ii) insufficient consideration of ethical and legal accountability, and (iii) a lack of actionable guidance for scalable clinical implementation. Integrating technical, ethical, and policy dimensions into a unified conceptual framework, this review demonstrates that while AI systems can consistently enhance diagnostic accuracy and early risk stratification in well-defined tasks, sustained clinical adoption depends on aligning technical performance with governance readiness, interpretability, and workflow integration. The analysis further highlights how implementation mechanisms—such as explainable AI, continuous post-deployment monitoring, and clinician-centered deployment strategies—mediate the translation of algorithmic innovation into real-world healthcare impact. 
Overall, this review provides a critical reference for researchers, clinicians, and policymakers seeking to translate AI innovation into safe, equitable, and trustworthy clinical practice.

References
Z. Obermeyer and E. J. Emanuel, “Predicting the Future — Big Data, Machine Learning, and Clinical Medicine,” N. Engl. J. Med., vol. 375, no. 13, pp. 1216–1219, Sep. 2016, doi: 10.1056/NEJMp1606181.
F. Jiang et al., “Artificial intelligence in healthcare: past, present and future,” Stroke Vasc. Neurol., vol. 2, no. 4, pp. 230–243, Dec. 2017, doi: 10.1136/svn-2017-000101.
A. Esteva et al., “A guide to deep learning in healthcare,” Nat. Med., vol. 25, no. 1, pp. 24–29, Jan. 2019, doi: 10.1038/s41591-018-0316-z.
E. J. Topol, “High-performance medicine: the convergence of human and artificial intelligence,” Nat. Med., vol. 25, no. 1, pp. 44–56, Jan. 2019, doi: 10.1038/s41591-018-0300-7.
M. D. Abràmoff, P. T. Lavin, M. Birch, N. Shah, and J. C. Folk, “Pivotal trial of an autonomous AI-based diagnostic system for detection of diabetic retinopathy in primary care offices,” npj Digit. Med., vol. 1, no. 1, p. 39, Aug. 2018, doi: 10.1038/s41746-018-0040-6.
S. M. McKinney et al., “International evaluation of an AI system for breast cancer screening,” Nature, vol. 577, no. 7788, pp. 89–94, Jan. 2020, doi: 10.1038/s41586-019-1799-6.
C. Damiani et al., “Evaluation of an AI Model to Assess Future Breast Cancer Risk,” Radiology, vol. 307, no. 5, Jun. 2023, doi: 10.1148/radiol.222679.
P. Rajpurkar, E. Chen, O. Banerjee, and E. J. Topol, “AI in health and medicine,” Nat. Med., vol. 28, no. 1, pp. 31–38, Jan. 2022, doi: 10.1038/s41591-021-01614-0.
Y. A. Fahim, I. W. Hasani, S. Kabba, and W. M. Ragab, “Artificial intelligence in healthcare and medicine: clinical applications, therapeutic advances, and future perspectives,” Eur. J. Med. Res., vol. 30, no. 1, p. 848, Sep. 2025, doi: 10.1186/s40001-025-03196-w.
Z. L. Teo et al., “Generative artificial intelligence in medicine,” Nat. Med., vol. 31, no. 10, pp. 3270–3282, Oct. 2025, doi: 10.1038/s41591-025-03983-2.
D. S. Stamoulis and C. Papachristopoulou, “Artificial Intelligence in Radiology, Emergency, and Remote Healthcare: A Snapshot of Present and Future Applications,” J. Futur. Artif. Intell. Technol., vol. 1, no. 3, pp. 228–234, Oct. 2024, doi: 10.62411/faith.3048-3719-38.
Z. Obermeyer, B. Powers, C. Vogeli, and S. Mullainathan, “Dissecting racial bias in an algorithm used to manage the health of populations,” Science, vol. 366, no. 6464, pp. 447–453, Oct. 2019, doi: 10.1126/science.aax2342.
T. Eche, L. H. Schwartz, F.-Z. Mokrane, and L. Dercle, “Toward Generalizability in the Deployment of Artificial Intelligence in Radiology: Role of Computation Stress Testing to Overcome Underspecification,” Radiol. Artif. Intell., vol. 3, no. 6, Nov. 2021, doi: 10.1148/ryai.2021210097.
N. Freyer, D. Groß, and M. Lipprandt, “The ethical requirement of explainability for AI-DSS in healthcare: a systematic review of reasons,” BMC Med. Ethics, vol. 25, no. 1, p. 104, Oct. 2024, doi: 10.1186/s12910-024-01103-2.
S. Tonekaboni, S. Joshi, M. D. McCradden, and A. Goldenberg, “What Clinicians Want: Contextualizing Explainable Machine Learning for Clinical End Use,” in Proceedings of Machine Learning Research, Aug. 2019, pp. 1–21. [Online]. Available: http://arxiv.org/abs/1905.05134
R. Rosenbacke, Å. Melhus, M. McKee, and D. Stuckler, “How Explainable Artificial Intelligence Can Increase or Decrease Clinicians’ Trust in AI Applications in Health Care: Systematic Review,” JMIR AI, vol. 3, p. e53207, Oct. 2024, doi: 10.2196/53207.
U.S. Food and Drug Administration, “Good Machine Learning Practice for Medical Device Development: Guiding Principles,” FDA.gov, 2025. https://www.fda.gov/medical-devices/software-medical-device-samd/good-machine-learning-practice-medical-device-development-guiding-principles
European Commission, “European Artificial Intelligence Act comes into force,” European Commission, 2024. https://digital-strategy.ec.europa.eu/en/news/european-artificial-intelligence-act-comes-force
S. Reddy, “Generative AI in healthcare: an implementation science informed translational path on application, integration and governance,” Implement. Sci., vol. 19, no. 1, p. 27, Mar. 2024, doi: 10.1186/s13012-024-01357-9.
A. Rajkomar, J. Dean, and I. Kohane, “Machine Learning in Medicine,” N. Engl. J. Med., vol. 380, no. 14, pp. 1347–1358, Apr. 2019, doi: 10.1056/NEJMra1814259.
Z. Sadeghi et al., “A review of Explainable Artificial Intelligence in healthcare,” Comput. Electr. Eng., vol. 118, p. 109370, Aug. 2024, doi: 10.1016/j.compeleceng.2024.109370.
J. L. Raya-Povedano, “AI in breast cancer screening: a critical overview of what we know,” Eur. Radiol., vol. 34, no. 7, pp. 4774–4775, Dec. 2023, doi: 10.1007/s00330-023-10530-5.
D. Windecker et al., “Generalizability of FDA-Approved AI-Enabled Medical Devices for Clinical Use,” JAMA Netw. Open, vol. 8, no. 4, p. e258052, Apr. 2025, doi: 10.1001/jamanetworkopen.2025.8052.
A. Rocha et al., “Edge AI for Internet of Medical Things: A literature review,” Comput. Electr. Eng., vol. 116, p. 109202, May 2024, doi: 10.1016/j.compeleceng.2024.109202.
J. Biamonte, P. Wittek, N. Pancotti, P. Rebentrost, N. Wiebe, and S. Lloyd, “Quantum machine learning,” Nature, vol. 549, no. 7671, pp. 195–202, Sep. 2017, doi: 10.1038/nature23474.
B. Björnsson et al., “Digital twins to personalize medicine,” Genome Med., vol. 12, no. 1, p. 4, Dec. 2020, doi: 10.1186/s13073-019-0701-3.
D. Ardila et al., “End-to-end lung cancer screening with three-dimensional deep learning on low-dose chest computed tomography,” Nat. Med., vol. 25, no. 6, pp. 954–961, Jun. 2019, doi: 10.1038/s41591-019-0447-x.
Y. Li et al., “BEHRT: Transformer for Electronic Health Records,” Sci. Rep., vol. 10, no. 1, p. 7155, Apr. 2020, doi: 10.1038/s41598-020-62922-y.
S.-J. Hahn, S. Kim, Y. S. Choi, J. Lee, and J. Kang, “Prediction of type 2 diabetes using genome-wide polygenic risk score and metabolic profiles: A machine learning analysis of population-based 10-year prospective cohort study,” eBioMedicine, vol. 86, p. 104383, Dec. 2022, doi: 10.1016/j.ebiom.2022.104383.
A. Y. Hannun et al., “Cardiologist-level arrhythmia detection and classification in ambulatory electrocardiograms using a deep neural network,” Nat. Med., vol. 25, no. 1, pp. 65–69, Jan. 2019, doi: 10.1038/s41591-018-0268-3.
S. Nemati, A. Holder, F. Razmi, M. D. Stanley, G. D. Clifford, and T. G. Buchman, “An Interpretable Machine Learning Model for Accurate Prediction of Sepsis in the ICU,” Crit. Care Med., vol. 46, no. 4, pp. 547–553, Apr. 2018, doi: 10.1097/CCM.0000000000002936.
P. Rajpurkar et al., “CheXNet: Radiologist-Level Pneumonia Detection on Chest X-Rays with Deep Learning,” arXiv, Dec. 2017. [Online]. Available: http://arxiv.org/abs/1711.05225
V. Gulshan et al., “Development and Validation of a Deep Learning Algorithm for Detection of Diabetic Retinopathy in Retinal Fundus Photographs,” JAMA, vol. 316, no. 22, p. 2402, Dec. 2016, doi: 10.1001/jama.2016.17216.
N. Coudray et al., “Classification and mutation prediction from non–small cell lung cancer histopathology images using deep learning,” Nat. Med., vol. 24, no. 10, pp. 1559–1567, Oct. 2018, doi: 10.1038/s41591-018-0177-5.
M. V. Perez et al., “Large-Scale Assessment of a Smartwatch to Identify Atrial Fibrillation,” N. Engl. J. Med., vol. 381, no. 20, pp. 1909–1917, Nov. 2019, doi: 10.1056/NEJMoa1901183.
J. T. Wu, K. Leung, and G. M. Leung, “Nowcasting and forecasting the potential domestic and international spread of the 2019-nCoV outbreak originating in Wuhan, China: a modelling study,” Lancet, vol. 395, no. 10225, pp. 689–697, Feb. 2020, doi: 10.1016/S0140-6736(20)30260-9.
S. M. McKinney et al., “Addendum: International evaluation of an AI system for breast cancer screening,” Nature, vol. 586, no. 7829, pp. E19–E19, Oct. 2020, doi: 10.1038/s41586-020-2679-9.
IBM Security, “Cost of a Data Breach Report 2023,” 2023. [Online]. Available: https://d110erj175o600.cloudfront.net/wp-content/uploads/2023/07/25111651/Cost-of-a-Data-Breach-Report-2023.pdf
African Union, “AU Data Policy Framework,” African Union, 2022. https://au.int/en/documents/20220728/au-data-policy-framework
European Union, “Regulation (EU) 2024/1689 of the European Parliament and of the Council of 26 June 2024 laying down harmonised rules on artificial intelligence (Artificial Intelligence Act),” EUR-Lex, 2024. https://eur-lex.europa.eu/eli/reg/2024/1689/oj/eng
M. Faiyazuddin et al., “The Impact of Artificial Intelligence on Healthcare: A Comprehensive Review of Advancements in Diagnostics, Treatment, and Operational Efficiency,” Health Sci. Rep., vol. 8, no. 1, Jan. 2025, doi: 10.1002/hsr2.70312.
J. B. Oluwagbemi, O. V. Oyetayo, and E. O. Ibam, “SysFungiNet: A Multi-Omics Data Fusion Framework with Explainable AI for Bioactive Prioritization,” J. Futur. Artif. Intell. Technol., vol. 2, no. 4, pp. 661–679, Jan. 2026, doi: 10.62411/faith.3048-3719-304.
T. R. Noviandy, G. M. Idroes, and I. Hardi, “An Interpretable Machine Learning Strategy for Antimalarial Drug Discovery with LightGBM and SHAP,” J. Futur. Artif. Intell. Technol., vol. 1, no. 2, pp. 84–95, Aug. 2024, doi: 10.62411/faith.2024-16.
N. D. Truong et al., “Convolutional neural networks for seizure prediction using intracranial and scalp electroencephalogram,” Neural Networks, vol. 105, pp. 104–111, Sep. 2018, doi: 10.1016/j.neunet.2018.04.018.
World Economic Forum, “The Future of Jobs Report 2023,” Geneva, 2023. [Online]. Available: https://www.weforum.org/reports/the-future-of-jobs-report-2023
Organisation for Economic Co-operation and Development, “AI principles,” OECD.org, 2024. https://www.oecd.org/en/topics/ai-principles.html
UNESCO, “Recommendation on the Ethics of Artificial Intelligence,” unesco.org, 2023. https://www.unesco.org/en/articles/recommendation-ethics-artificial-intelligence
World Health Organization, “Ethics and governance of artificial intelligence for health,” World Health Organization, 2021. https://www.who.int/publications/i/item/9789240029200
M. J. Sheller et al., “Federated learning in medicine: facilitating multi-institutional collaborations without sharing patient data,” Sci. Rep., vol. 10, no. 1, p. 12598, Jul. 2020, doi: 10.1038/s41598-020-69250-1.
Z. I. Attia et al., “An artificial intelligence-enabled ECG algorithm for the identification of patients with atrial fibrillation during sinus rhythm: a retrospective analysis of outcome prediction,” Lancet, vol. 394, no. 10201, pp. 861–867, Sep. 2019, doi: 10.1016/S0140-6736(19)31721-0.
License
Copyright (c) 2026 Achile Solomon Egbunu, Akindele Michael Okedoye

This work is licensed under a Creative Commons Attribution 4.0 International License.
