Development and evaluation of a live birth prediction model for evaluating human blastocysts: a retrospective study
Abstract
Background: In infertility treatment, blastocyst morphological grading is commonly used in clinical practice for blastocyst evaluation and selection, but has shown limited predictive power for live birth outcomes of blastocysts. To improve live birth prediction, a number of artificial intelligence (AI) models have been developed. Most existing AI models for blastocyst evaluation used only images for live birth prediction, and the area under the receiver operating characteristic (ROC) curve (AUC) achieved by these models has plateaued at ~0.65.
Methods: This study proposed a multi-modal blastocyst evaluation method using both blastocyst images and patient couple's clinical features (e.g., maternal age, hormone profiles, endometrium thickness, and semen quality) to predict live birth outcomes of human blastocysts. To utilize the multi-modal data, we developed a new AI model consisting of a convolutional neural network (CNN) to process blastocyst images and a multi-layer perceptron to process patient couple's clinical features. The dataset used in this study consists of 17,580 blastocysts with known live birth outcomes, blastocyst images, and patient couple's clinical features.
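The two-branch design described above can be sketched as follows. This is a minimal illustrative PyTorch implementation, not the authors' actual network: the backbone, layer sizes, and the assumption of 16 clinical input features (matching the 16 predictive features reported in the Results) are placeholders.

```python
import torch
import torch.nn as nn

class LiveBirthNet(nn.Module):
    """Hypothetical multi-modal model: a CNN branch for blastocyst images
    and an MLP branch for the patient couple's clinical features, fused
    before a final live-birth classifier. All layer sizes are illustrative."""

    def __init__(self, n_clinical: int = 16):
        super().__init__()
        # Image branch: small CNN standing in for the paper's backbone
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),          # -> (B, 32)
        )
        # Clinical-feature branch: multi-layer perceptron
        self.mlp = nn.Sequential(
            nn.Linear(n_clinical, 32), nn.ReLU(),
            nn.Linear(32, 32), nn.ReLU(),                   # -> (B, 32)
        )
        # Fusion head: concatenated embeddings -> live-birth probability
        self.head = nn.Sequential(nn.Linear(32 + 32, 1), nn.Sigmoid())

    def forward(self, image: torch.Tensor, clinical: torch.Tensor) -> torch.Tensor:
        z = torch.cat([self.cnn(image), self.mlp(clinical)], dim=1)
        return self.head(z)

# Forward pass on dummy data: a batch of 4 images with 16 clinical features each
model = LiveBirthNet()
prob = model(torch.randn(4, 3, 224, 224), torch.randn(4, 16))
```

Fusing a learned image embedding with a clinical-feature embedding by concatenation is one common way to combine the two modalities before a shared classification head.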
Results: This study achieved an AUC of 0.77 for live birth prediction, which significantly outperforms related works in the literature. Sixteen out of 103 clinical features were identified as predictors of live birth outcomes and helped improve live birth prediction. Among these features, maternal age, the day of blastocyst transfer, antral follicle count, retrieved oocyte number, and endometrium thickness measured before transfer were the top five contributors to live birth prediction. Heatmaps showed that the CNN in the AI model mainly focused on image regions of the inner cell mass and trophectoderm (TE) for live birth prediction, and the contribution of TE-related features was greater in the CNN trained with the inclusion of patient couple's clinical features than in the CNN trained with blastocyst images alone.
Conclusions: The results suggest that the inclusion of patient couple's clinical features along with blastocyst images increases live birth prediction accuracy.
Funding: Natural Sciences and Engineering Research Council of Canada and the Canada Research Chairs Program.
Data availability
All processed data and code needed to reproduce the findings of this study are openly available in de-identified form at https://github.com/robotVisionHang/LiveBirthPrediction_Data_Code and attached to this manuscript. All code and software used to analyze the data can also be accessed through this link. Due to data privacy regulations governing patient data, raw data cannot be publicly shared. Interested researchers are welcome to contact the corresponding author with a concise project proposal stating the aims of using the data and how the data will be used. The proposal will be assessed first by Prof. Yu Sun and Prof. Ge Lin, and then by the Ethics Committee of the Reproductive and Genetic Hospital of CITIC-Xiangya. There are no restrictions on who can access the data.
Article and author information
Author details
Funding
Natural Sciences and Engineering Research Council of Canada
- Yu Sun
Canada Research Chairs
- Yu Sun
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Human subjects: Informed consent was not necessary because this study used retrospective and fully de-identified data, no medical intervention was performed on the subjects, and no biological samples were collected from patients. This study was approved by the Ethics Committee of the Reproductive and Genetic Hospital of CITIC-Xiangya (approval number: LL-SC-2021-008).
Copyright
© 2023, Liu et al.
This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 1,824 views
- 313 downloads
- 18 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.