Automatic Fetal Biometry Evaluation in Ultrasound Images Using a Deep Learning-Based Approach


Mostafa Ghelich Oghli 1, *, Shakiba Moradi 2, Reza Gerami 3, Ali Shabanzadeh 1

1 Intelligent Imaging Technology Research Center, Med Fanavarn Plus Co., Karaj, Iran

2 Sharif University of Technology, Tehran, Iran

3 Army University of Medical Sciences, Tehran, Iran

How to Cite: Ghelich Oghli M, Moradi S, Gerami R, Shabanzadeh A. Automatic Fetal Biometry Evaluation in Ultrasound Images Using a Deep Learning-Based Approach, Iran J Radiol. 2019 ; 16(Special Issue):e99138. doi: 10.5812/iranjradiol.99138.


Iranian Journal of Radiology: 16 (Special Issue); e99138
Published Online: December 10, 2019
Article Type: Abstract
Received: October 26, 2019
Accepted: December 10, 2019


Background: Two-dimensional (2D) fetal ultrasound biometry has been extensively used to establish (or confirm) the gestational age of the fetus, estimate its size and weight, and identify growth patterns and abnormalities. An ultrasound examination is routinely performed between 18 and 22 weeks of pregnancy to evaluate the growth of the fetus by measuring its head, abdomen, and femur. Automatic methods for fetal biometric measurement have recently been investigated to reduce intra- and inter-observer variability and to produce more accurate and reproducible measurements.

Objectives: In this paper, we propose a deep learning-based approach to calculate fetal biometry parameters automatically.

Patients and Methods: The fetal biometry parameters were derived from evaluation of the fetal head, abdomen, and femur. Head circumference (HC) and biparietal diameter (BPD) were measured on the fetal head, abdominal circumference (AC) on the fetal abdomen, and femur length (FL) on the fetal femur. Figure 1 shows these parameters in ultrasound images. Our dataset comprised three parts: (1) 1334 2D ultrasound images of the fetal head in the standard plane, publicly available from the automated measurement of fetal head circumference challenge; (2) 158 2D ultrasound images of the fetal abdomen in the standard plane, gathered by expert radiologists at Alvand Medical Imaging Center, Tehran, Iran; and (3) 315 2D ultrasound images of the fetal femur in the standard plane, gathered from two distinct centers: (i) Alvand Medical Imaging Center, Tehran, Iran, and (ii) Laleh Hospital, Tehran, Iran. We trained and evaluated a novel convolutional network for segmentation of the fetal head and abdomen. The proposed network, called MFP-Unet, combined Unet with a feature pyramid network (FPN); its architecture is depicted in Figure 2. For the fetal femur, a pre-processing step was applied: a superpixel algorithm removed darker parts of the image, since the femur is typically the brightest region of the ultrasound image, and an image saliency algorithm then highlighted the salient features of the image. Finally, MFP-Unet was trained on these pre-processed images to segment the femur. After segmentation, image analysis algorithms obtained all of the required measurements: an ellipse detection algorithm applied to the segmented fetal head was used to measure HC and BPD, the same algorithm applied to the segmented fetal abdomen was used to measure AC, and a skeletonization algorithm yielded the femur length.
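As a minimal illustration of the measurement step described above (a generic sketch, not the authors' implementation): once an ellipse has been fitted to a segmented head or abdomen, its circumference can be approximated from the semi-axes with Ramanujan's formula, and BPD can be taken as the minor axis of the head ellipse. The function names and the BPD convention here are assumptions for illustration; exact caliper placement varies in clinical practice.

```python
import math


def ellipse_circumference(a: float, b: float) -> float:
    """Approximate the perimeter of an ellipse with semi-axes a and b
    using Ramanujan's second approximation (exact for a circle)."""
    h = ((a - b) / (a + b)) ** 2
    return math.pi * (a + b) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))


def head_measurements(semi_major_mm: float, semi_minor_mm: float):
    """HC from the fitted ellipse perimeter; BPD taken as the minor axis.
    (Illustrative convention, not necessarily the study's definition.)"""
    hc = ellipse_circumference(semi_major_mm, semi_minor_mm)
    bpd = 2 * semi_minor_mm
    return hc, bpd


# Hypothetical fitted ellipse with semi-axes 45 mm and 35 mm
hc, bpd = head_measurements(45.0, 35.0)
print(f"HC = {hc:.1f} mm, BPD = {bpd:.1f} mm")
```

The same circumference computation would serve for AC on the abdomen ellipse; only the segmented region it is fitted to differs.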

Results: We used the mean absolute difference (MAD) and root mean square error (RMSE) to quantify measurement errors. The MAD and RMSE values were 0.23 mm and 0.11 mm for BPD, 0.13 mm and 0.09 mm for HC, 0.17 mm and 0.08 mm for AC, and 0.18 mm and 0.12 mm for FL, respectively. Table 1 shows the results. The correlation between automatic and manual measurements was evaluated with correlation graphs; the R values were 0.97, 0.91, and 0.97 for HC, BPD, and FL, respectively.
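For reference, the two error metrics reported above can be computed over paired automatic and manual measurements as follows. This is a generic sketch; the sample values are hypothetical and not the study data.

```python
import math


def mad(auto, manual):
    """Mean absolute difference between paired measurements."""
    return sum(abs(a - m) for a, m in zip(auto, manual)) / len(auto)


def rmse(auto, manual):
    """Root mean square error between paired measurements."""
    return math.sqrt(sum((a - m) ** 2 for a, m in zip(auto, manual)) / len(auto))


# Hypothetical paired HC measurements in mm (illustration only)
auto_hc = [170.2, 181.0, 165.4]
manual_hc = [170.0, 181.3, 165.2]
print(f"MAD = {mad(auto_hc, manual_hc):.3f} mm, RMSE = {rmse(auto_hc, manual_hc):.3f} mm")
```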

Conclusion: According to the results, the proposed algorithm is robust and useful for automatic fetal biometry evaluation, and it could be extended to nuchal translucency (NT) measurement, given an appropriate dataset.

To see figures and table, please refer to the PDF file.

Copyright © 2019, Author(s). This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License, which permits copying and redistributing the material for noncommercial purposes only, provided the original work is properly cited.