Decoding pan-cancer treatment outcomes using multimodal real-world data and explainable artificial intelligence

Abstract

Despite advances in precision oncology, clinical decision-making still relies on limited parameters and expert knowledge. To address this limitation, we combined multimodal real-world data and explainable artificial intelligence (xAI) to introduce novel AI-derived (AID) markers for clinical decision support. We used deep learning to model the outcome of 15,726 patients across 38 solid cancer entities based on 350 markers, including clinical records, image-derived body composition, and mutational tumor profiles. xAI determined the prognostic contribution of each clinical marker at the patient level and identified 114 key markers that accounted for 90% of the neural network’s decision process. Moreover, xAI enabled us to uncover 1,373 prognostic interactions between markers. Our approach was validated in an independent cohort of 3,288 lung cancer patients from a US nationwide electronic health record-derived database. These results show the potential of xAI to transform the assessment of clinical parameters and enable personalized, data-driven cancer care.
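To make the attribution idea concrete, the following is a minimal, illustrative sketch only: it uses a model-agnostic SHAP explainer on a simple stand-in regressor and synthetic data, not the study's deep neural network, its 350 multimodal markers, or its specific xAI method. All variable names and the toy outcome are hypothetical; the snippet merely shows how per-patient marker contributions could be computed and how a "markers covering ~90% of the attribution mass" set might be derived.

```python
# Illustrative sketch: per-patient marker attributions with a SHAP-style explainer.
# The actual study's network, markers, and xAI pipeline are NOT reproduced here.
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical data: rows = patients, columns = prognostic markers.
n_patients, n_markers = 500, 20
X = rng.normal(size=(n_patients, n_markers))
# Toy "risk" outcome with a main effect and a marker-marker interaction.
risk = 0.8 * X[:, 0] + 0.5 * X[:, 1] * X[:, 2] + rng.normal(scale=0.1, size=n_patients)

# Stand-in outcome model (the study used a deep neural network instead).
model = GradientBoostingRegressor().fit(X, risk)

# Model-agnostic SHAP attributions: one contribution per marker per patient.
background = shap.sample(X, 50)                    # reference distribution
explainer = shap.KernelExplainer(model.predict, background)
shap_values = explainer.shap_values(X[:10])        # attributions for 10 patients

# Rank markers by mean absolute contribution across patients and find how many
# are needed to cover ~90% of the total attribution mass.
global_importance = np.abs(shap_values).mean(axis=0)
ranked = np.argsort(global_importance)[::-1]
cumulative = np.cumsum(global_importance[ranked]) / global_importance.sum()
n_key = int(np.searchsorted(cumulative, 0.90) + 1)
print(f"{n_key} markers account for ~90% of total attribution mass")
```

In the same spirit, pairwise interaction attributions (e.g., SHAP interaction values for tree models) could surface marker combinations whose joint contribution differs from the sum of their individual effects, analogous to the prognostic interactions reported in the abstract.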

Publication
medRxiv
Julius Keyl
Medical Doctor
Michael Forsting
Chair Radiology Department
Boris Hadaschik
Chair Department of Urology
Ken Herrmann
Chair Department of Nuclear Medicine
Jan Egger
Team Lead AI-guided Therapies
Jens Kleesiek
Professor of Translational Image-guided Oncology