Analyzing breast cancer invasive disease event classification through explainable artificial intelligence

Front Med (Lausanne). 2023 Feb 2;10:1116354. doi: 10.3389/fmed.2023.1116354. eCollection 2023.

Abstract

Introduction: Recently, accurate machine learning and deep learning approaches have been applied to the investigation of breast cancer invasive disease events (IDEs), such as recurrence, contralateral breast cancer, and second cancers. However, such approaches remain poorly interpretable.

Methods: We therefore designed an Explainable Artificial Intelligence (XAI) framework to investigate IDEs within a cohort of 486 breast cancer patients enrolled at IRCCS Istituto Tumori "Giovanni Paolo II" in Bari, Italy. Using Shapley values, we identified the features driving IDE prediction over two follow-up periods commonly adopted in clinical practice: 5 and 10 years from the first tumor diagnosis.
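As context for the Shapley-value analysis described above, the sketch below shows how per-feature attributions for a binary IDE classifier can be obtained on tabular clinical data. It is a minimal illustration, not the authors' code: the file name, column names (age, tumor_diameter, etc.), the GradientBoostingClassifier model, and the SHAP TreeExplainer choice are all assumptions, since the abstract only states that Shapley values were computed for the two follow-up windows.

```python
# Minimal sketch (assumed, not the authors' implementation): SHAP feature
# attributions for a binary invasive-disease-event (IDE) classifier.
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Hypothetical clinical table: one row per patient, categorical columns
# assumed to be numerically encoded; the label marks whether an IDE
# occurred within the chosen follow-up window (5 or 10 years).
df = pd.read_csv("breast_cancer_cohort.csv")          # assumed file name
features = ["age", "tumor_diameter", "surgery_type", "multiplicity",
            "er", "ki67", "metastatic_lymph_nodes"]    # assumed columns
X, y = df[features], df["ide_within_5y"]               # or "ide_within_10y"

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

model = GradientBoostingClassifier(random_state=0)
model.fit(X_train, y_train)

# TreeExplainer computes Shapley values exactly for tree ensembles;
# ranking features by mean |SHAP value| highlights the "driving features"
# for the selected follow-up period.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)            # (n_samples, n_features)
shap.summary_plot(shap_values, X_test)
```

Repeating the fit and the SHAP computation separately for the 5-year and 10-year labels would then allow the two feature rankings to be compared, which mirrors the period-wise analysis described in the Methods.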

Results: Age, tumor diameter, surgery type, and multiplicity are predominant within the 5-year frame, while therapy-related features, including hormone and chemotherapy schemes, together with lymphovascular invasion, dominate the 10-year IDE prediction. Estrogen Receptor (ER), the proliferation marker Ki67, and metastatic lymph nodes affect both frames.

Discussion: Our framework thus aims to shorten the distance between AI and clinical practice.

Keywords: 10-year follow up; 5-year follow up; breast cancer; explainable AI; invasive disease events.