Research Article · Open Access
DOI: 10.1145/3638530.3664129

Improving Concordance Index in Regression-based Survival Analysis: Evolutionary Discovery of Loss Function for Neural Networks

Published: 01 August 2024

Abstract

In this work, we use an Evolutionary Algorithm (EA) to discover a novel Neural Network (NN) regression-based survival loss function with the aim of improving C-index performance. Our contribution is threefold. First, we propose SAGAloss, an evolutionary meta-learning algorithm for optimizing a neural-network regression-based loss function that maximizes the C-index; the algorithm consistently discovers specialized loss functions that outperform MSCE. Second, based on our analysis of the evolutionary search results, we highlight a non-intuitive insight: a non-zero gradient for the censored-cases part of the loss function matters, a property shown to be useful in improving concordance. Finally, building on this insight, we propose MSCESp, a novel survival regression loss function that can be used off-the-shelf and generally performs better than the Mean Squared Error for censored cases. We performed extensive experiments on 19 benchmark datasets to validate our findings.
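Since the paper's exact MSCESp formulation is not reproduced in this abstract, the NumPy sketch below only illustrates the two ideas it names: a regression survival loss whose censored-case term keeps a non-zero gradient, and Harrell's concordance index as the target metric. The function names, the `eps` parameter, and the specific censored-case term are assumptions for illustration, not the paper's definitions.

```python
import numpy as np

def mse_censored_sketch(pred, time, event, eps=0.1):
    """Hypothetical regression survival loss illustrating the abstract's insight.

    Uncensored samples (event == 1) get an ordinary squared error against the
    observed event time. Censored samples (event == 0) are conventionally
    penalized only when the prediction falls below the censoring time; the
    insight highlighted in the abstract is that a non-zero gradient for
    censored cases can improve concordance, so a small linear term (scaled by
    `eps`, an assumption) is kept even when pred > time.
    """
    pred = np.asarray(pred, dtype=float)
    time = np.asarray(time, dtype=float)
    event = np.asarray(event, dtype=float)

    # Uncensored: standard squared error.
    loss_event = event * (pred - time) ** 2

    # Censored: squared hinge below the censoring time, plus a small linear
    # term above it so the gradient never vanishes entirely.
    below = np.maximum(time - pred, 0.0) ** 2
    above = eps * np.maximum(pred - time, 0.0)
    loss_cens = (1.0 - event) * (below + above)

    return float(np.mean(loss_event + loss_cens))

def c_index(pred, time, event):
    """Naive Harrell's C-index for predicted survival times (higher = later).

    A pair (i, j) is comparable when subject i has an observed event and a
    shorter time than subject j; it is concordant when the model also
    predicts a shorter time for i. Ties in prediction count as 0.5.
    """
    num, den = 0.0, 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            if event[i] == 1 and time[i] < time[j]:
                den += 1
                if pred[i] < pred[j]:
                    num += 1.0
                elif pred[i] == pred[j]:
                    num += 0.5
    return num / den
```

For example, with `pred=[2, 5]`, `time=[2, 4]`, `event=[1, 0]`, the censored sample's prediction exceeds its censoring time, yet the `eps` term still contributes `0.1` to the loss, so the network receives a gradient for that sample rather than none.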



Published In

GECCO '24 Companion: Proceedings of the Genetic and Evolutionary Computation Conference Companion
July 2024, 2187 pages
ISBN: 9798400704956
DOI: 10.1145/3638530
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery, New York, NY, United States

      Author Tags

      1. neuroevolution
      2. evolutionary meta-learning
      3. loss function
      4. neural networks
      5. survival analysis
      6. concordance index
      7. genetic algorithms

Qualifiers

• Research-article

Conference

GECCO '24 Companion

Acceptance Rates

Overall Acceptance Rate: 1,669 of 4,410 submissions, 38%

Article Metrics

• Total Citations: 0
• Total Downloads: 46 (last 12 months: 46; last 6 weeks: 46)

Reflects downloads up to 14 Sep 2024
