A study published in the American Journal of Clinical Nutrition found that over 80% of women were iron deficient by the third trimester of pregnancy, even in a high-resource setting. The longitudinal study of 641 women in Ireland, all pregnant for the first time, revealed that none were anemic in the first trimester, yet the vast majority became iron deficient as pregnancy progressed.
The researchers noted that the iron deficiency rates exceeded those seen in some low-resource settings.
Iron requirements increase nearly tenfold during pregnancy to support fetal development and maternal needs. Although physiological adaptations enhance iron absorption, they are often insufficient, particularly for the estimated 50% of women who begin pregnancy with depleted iron stores.
Iron deficiency, with or without anemia, has been associated with adverse maternal and infant outcomes, including postpartum depression, postpartum hemorrhage, preterm birth, low birth weight, small-for-gestational-age birth, and long-term neurodevelopmental challenges for the child.
Current screening practices vary, with no universally agreed-upon diagnostic criteria for iron deficiency in pregnancy. The US Preventive Services Task Force has cited insufficient evidence to recommend routine screening, while international organizations such as FIGO and the European Hematology Association advocate universal screening in the first trimester and among women of reproductive age.
The researchers, led by Elaine K. McCarthy, PhD, evaluated changes in iron biomarkers throughout pregnancy, established the prevalence of iron deficiency, and proposed early pregnancy benchmarks to predict third-trimester deficiency. They also examined how risk factors such as obesity and smoking affected iron status.
The study used data from 641 nulliparous women participating in the IMproved PRegnancy Outcomes via Early Detection (IMPROvED) consortium project. Blood samples were collected at 15, 20, and 33 weeks' gestation to assess iron status. Information on pregnancy, delivery, and neonatal outcomes was obtained via postpartum interviews and medical record review.
Although nearly 75% of participants took iron-containing supplements providing 15-17 mg of iron (the recommended daily allowance in Ireland and Europe), iron deficiency was highly prevalent. The authors proposed a ferritin threshold of ≤60 μg/L at 15 weeks' gestation as predictive of iron deficiency (defined as ferritin ≤15 μg/L) at 33 weeks.
The study also found that iron-containing supplements taken before pregnancy or in early pregnancy were associated with a reduced risk of deficiency throughout gestation, including in the third trimester.
An accompanying editorial by Michael Auerbach, MD, FACP, and Helain Landy, MD, also published in the American Journal of Clinical Nutrition, criticizes the current approach to anemia and iron deficiency in pregnancy. The editorialists call on professional organizations to revise guidelines to recommend universal screening and supplementation for iron deficiency in pregnant women, regardless of anemia status.
The study authors acknowledge the need for additional large-scale longitudinal studies examining iron status and inflammatory markers concurrently. They emphasize the importance of establishing consensus on early pregnancy iron biomarkers and thresholds that align with clinically meaningful health outcomes.
Disclosures were not available at the time of publishing.