Gibson RS & Friel JK
Principles of Nutritional
Assessment: Iron

3rd Edition     July 2024


Abstract

Dietary iron is present as heme and non­heme iron, each absorbed by dif­fer­ent mech­anisms. Once absorbed, the iron released can be stored or bound to plasma trans­ferrin for distribution to other tissues. Quantitatively, most iron is used by immature red blood cells in the bone marrow for hemo­globin (Hb) produc­tion. Senescent erythrocytes are degraded by macrophages, and the iron released from catabolized Hb re-enters the circu­lation. Absorption of iron is modulated in response to the level of body iron stores and by the amount of iron needed for erythro­poiesis. Hepatic hepcidin is the master regulator of iron homeo­stasis. Hepcidin levels are suppressed by iron depletion or increased iron demand (i.e., enhanced erythro­poiesis), thereby increasing absorp­tion and mobilization of iron from body stores into plasma. In contrast, when iron stores are replete (and during inflam­mation), hepcidin levels are increased, so iron cannot efflux into the circu­lation, thus preventing iron overload.

Anemia is the most common sign of iron deficiency in low- and middle-income countries, with infants, young children, and women of child-bearing age at greatest risk. Features of iron deficiency anemia (IDA) include impairments in work capacity and cognition and, during pregnancy, a possible increased risk of low birthweight and prematurity. Despite low specificity and sensitivity, a low Hb concentration is the most widely used test for IDA. Development of IDA occurs in three stages, the first being a decrease in iron stores, reflected by a low serum ferritin, followed by iron-deficient erythropoiesis. At this second stage, iron stores are exhausted (i.e., serum ferritin < 12µg/L), the iron supply to the erythropoietic cells is progressively reduced, and decreases in transferrin saturation occur. At the same time, serum soluble transferrin receptor (sTfR) and erythrocyte protoporphyrin increase. Only in the third stage is there a decline in Hb, decreases in hematocrit and red-cell indices, and frank microcytic, hypochromic anemia, confirmed by examination of a stained blood film.

This chapter describes how to assess the adequacy of dietary iron intakes, followed by details of the hematological parameters used to diagnose anemia, the serum biomarker used to identify iron depletion (serum ferritin), and those used to identify iron-deficient erythropoiesis (serum ferritin; serum iron; transferrin saturation; sTfR; and erythrocyte protoporphyrin). Advantages and limitations are discussed, together with details of the measurement and interpretive criteria for each biomarker. Use of a regression modeling approach to adjust for the effect of inflammation on serum ferritin and sTfR is highlighted, in view of the challenge of distinguishing between IDA and anemia of chronic disease. The final section emphasizes the simultaneous use of multiple iron biomarkers to provide a more valid assessment of iron status and minimize the misclassification that may occur when using a single measure. The advantages of using the “Total Body Iron Model”, based on serum ferritin and sTfR expressed as body iron (mg/kg), with a cutoff of < 0mg/kg to define iron deficiency, are described. Finally, details of emerging iron indicators, notably hepcidin, non-transferrin-bound iron, and some reticulocyte indices, are presented.

CITE AS: Gibson RS, and Friel JK. Principles of Nutritional Assessment: Iron. https://nutritionalassessment.org/iron/
Email: rosalind.gibson@otago.ac.nz
Licensed under CC-BY-4.0

17.1 Introduction and functions of iron

The assessment of the iron status of a population is critical: iron deficiency is a widely recognized micronutrient deficiency and a public health problem worldwide.

The iron content of the human body is carefully regulated; the body normally contains about 3 to 4g of iron. Of this, about 65% is present in hemoglobin (Hb), the oxygen-carrying pigment of the red blood cells, which transfers oxygen from the lungs to the tissues. Hemoglobin is made up of four heme subunits, each with a polypeptide chain of globin attached. Each molecule of heme consists of a protoporphyrin IX molecule with one iron atom. In addition, about 3.5% of body iron is present in myoglobin, the oxygen-binding storage protein found in muscle. The structure of myoglobin is similar to that of Hb, except that it contains only one heme unit and one globin chain.

17.1.1 Distribution of body iron in men and women

Table 17.1 shows the amount of iron and its distribution in the body in males and females. Trace amounts of iron are also associated with electron trans­port and several enzymes. Examples include the heme-con­taining cyto­chromes that serve as electron carriers within the cell, iron-sulfur proteins (flavo­proteins, heme-flavo­proteins) that are required for the first reaction in the electron trans­port chain, and hydrogen peroxidases (e.g., catalase and peroxidase).
Table 17.1. Distribution of body iron in men and women. Data from Elsayed et al. (2016).

Amount of iron | Male (mg) | Female (mg) | % of total
Total | 3000‑4000 | 2000‑3000 | 100
Hemoglobin | 2400 | 1700 | 65
Intracellular storage (ferritin and hemosiderin) | 1000 | 500 | 30
Myoglobin | 150 | 120 | 3.5
Transferrin-bound iron | 4 | 3 | 0.1
The cytochrome-P450 family of enzymes also contains heme; these enzymes are located in the microsomal membranes of liver cells and intestinal mucosal cells. Key functions of cytochrome-P450 enzymes involve detoxification of foreign substances in the liver, and synthesis of prostaglandins, steroid hormones, and bile acids (Yip and Dallman, 1996; Beard et al., 1996).

In addition to these functional forms, as much as 30% of total body iron is present as storage iron, found primarily in the liver. Smaller amounts occur in the reticulo-endothelial cells of the bone marrow and spleen; and in the muscle tissues. Of the storage iron, approx­imately two-thirds consists of ferritin, the soluble fraction of the non­heme iron stores. The remainder of storage iron is insoluble hemo­siderin. Small quantities of ferritin can be synthesized in all cells of the body, even those with no special iron storage function. Ferritin also appears in small concen­trations in the serum but is not involved in iron trans­port. In healthy individuals with no evidence of inflam­mation, serum ferritin concen­tration is closely corre­lated with the size of the iron store (Yip and Dallman, 1996).

Stored iron serves as a reservoir to supply cellular needs, mainly Hb produc­tion, and is especially impor­tant in the third trimester of preg­nancy. The size of the storage compo­nent is most strongly influenced by age, sex, body size, and either the magnitude of iron losses or the presence of diseases of iron overload (Brittenham et al., 1981).

Iron transport is carried out by the transport protein transferrin, which comprises only about 0.1% of total body iron (Table 17.1). Transferrin delivers iron to the tissues by means of cell membrane receptors specific for transferrin (i.e., TfR1 and TfR2). About 20‑30mg of iron cycles through the transport compartment each day.

17.1.2 Absorption and metabolism of iron

Four main factors operate in the body to maintain iron balance and prevent iron deficiency and iron overload: (a) intake, (b) absorption, (c) storage, and (d) loss of iron. The interrelationship of these factors has been described mathematically, so that the amount of storage iron can be predicted provided iron losses and the bioavailability of dietary iron are known (Hallberg et al., 1998).

In an iron-sufficient adult male, absorp­tion of iron from the diet only con­tributes about 1mg/d, while for menstruating women, about 1.5mg/d is absorbed to compensate for menstrual blood loss. There are two dif­fer­ent forms of iron in the diet, heme and non­heme, each absorbed primarily in the duodenum. Heme iron is derived mainly from Hb and myoglobin in red meat, poultry, and fish and is absorbed readily as the intact iron porphyrin. Absorption of heme iron ranges from 15‑35% and is little affected by dietary factors; absorp­tion is regulated primarily in response to total body iron. As demand for body iron increases there is a corresponding up­reg­ulation in the uptake of heme iron and also in the rate of transfer to the duodenal entero­cytes. Details of the exact mech­anism controlling heme iron absorp­tion are poorly under­stood (Anderson and Frazer, 2017). How­ever, once absorbed, iron (Fe2+) is released from heme iron by heme oxygenase, after which the iron follows the same pathway as non­heme iron, as described below (EFSA, 2015).
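These absorption figures and the regulated balance described above can be illustrated with a simple simulation. The sketch below is illustrative only (it is not the Hallberg et al. (1998) model): it assumes, hypothetically, that absorption efficiency falls linearly toward zero as stores approach a notional ceiling, and it shows how stores settle where absorbed iron just offsets losses (about 1mg/d for men, 1.5mg/d for menstruating women).

```python
def simulate_iron_stores(intake_mg_per_day, loss_mg_per_day,
                         max_absorption=0.18, store_ceiling_mg=1000.0,
                         days=3 * 365, initial_store_mg=0.0):
    """Toy whole-body iron balance: stores rise until absorption, which is
    down-regulated as stores fill, just offsets daily losses."""
    store = initial_store_mg
    for _ in range(days):
        # Hypothetical regulation rule: efficiency is highest when stores
        # are empty and falls linearly to zero at the ceiling.
        efficiency = max_absorption * max(0.0, 1.0 - store / store_ceiling_mg)
        store = max(0.0, store + intake_mg_per_day * efficiency - loss_mg_per_day)
    return store

# Adult man (~1.0 mg/d obligatory losses) vs. menstruating woman (~1.5 mg/d):
print(f"{simulate_iron_stores(16, 1.0):.0f} mg")   # approaches ~650 mg
print(f"{simulate_iron_stores(12, 1.5):.0f} mg")   # approaches ~300 mg
```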

Nonheme iron (i.e., inorganic iron) is found primarily in plant-based foods, but also in meat, eggs, and some dairy foods. Liver, iron-fortified foods, and the seeds of legumes, such as beans, contain nonheme iron in the form of ferritin, although the mechanism whereby ferritin iron is absorbed is poorly understood (EFSA, 2015). Plant ferritin is readily released during cooking and digestion (Lynch et al., 2018). The absorption of nonheme iron is less efficient than that of heme iron and can range from 2‑20%, depending on the iron status of the individual and the simultaneous ingestion of other dietary components; some inhibit and others enhance nonheme iron absorption. Most of the food components classed as inhibitors act by binding iron in the gastrointestinal tract, preventing its absorption. In contrast, the enhancers act by forming complexes that can be taken up by the intestinal iron transport proteins, thus preventing the iron from binding to the inhibitors. Alternatively, they act by reducing the reactive ferric (Fe3+) iron to its less reactive and more soluble ferrous (Fe2+) state.

Inhibitors of nonheme iron absorption include phytic acid, polyphenols, calcium, and peptides from partially digested proteins (e.g., soybean protein). Of these, phytic acid (myo-inositol hexaphosphate, IP6) is the main inhibitor of nonheme iron absorption from plant-based diets. However, traditional food processing methods such as soaking, germination, and fermentation can lower the phytic acid content, either by reducing the water-soluble phytate content, or through phytase hydrolysis, which converts phytic acid to lower myo-inositol phosphate forms (i.e., IP2, IP1) that no longer inhibit nonheme iron absorption (Gibson et al., 2018). Polyphenol compounds from beverages such as tea, coffee, cocoa, and red wine, from vegetables (spinach, aubergine), legumes (colored beans), and cereals such as red sorghum, also inhibit nonheme iron absorption. Their inhibitory effect is dose-dependent, the strength depending on the structure of the phenolic compounds, with the gallate-containing tea polyphenols having the largest inhibitory effect (Hurrell et al., 1999). Calcium inhibits both nonheme and heme iron absorption, although the inhibitory effect is weak and short-term, with a mixed effect on iron status and no reduction in Hb (Abioye et al., 2021). The mechanism is not well understood (Lynch et al., 2018).

In contrast, ascorbic acid and muscle tissue from meat, poultry, fish, and liver enhance nonheme iron absorption when eaten in meals containing inhibitors (Lynch et al., 2018). Of these, ascorbic acid is the most potent enhancer of nonheme iron absorption, through its ability to convert ferric (Fe3+) iron to ferrous (Fe2+) iron at low pH as well as its chelating properties (Conrad and Schade, 1968). The enhancing effect of ascorbic acid is dose-dependent and most marked when consumed with meals containing high levels of inhibitors, including phytate and polyphenols. Ascorbic acid can also enhance absorption of many iron fortification compounds (Hurrell et al., 2004), except NaFeEDTA (Troesch et al., 2009). The mechanism whereby muscle tissue enhances nonheme iron absorption is not clear. It may be linked to cysteine-containing peptides released during digestion, which have the potential to reduce ferric (Fe3+) iron to ferrous (Fe2+) iron and to chelate iron, as described for ascorbic acid (Taylor et al., 1986).

The intestinal absorption of nonheme iron has been extensively studied; for details see Anderson and Frazer (2017). Briefly, the nonheme iron in foods, which exists predominantly as ferric (Fe3+) iron, is solubilized in the gastrointestinal lumen, where it is first reduced by duodenal cytochrome b reductase (DCYTB) to ferrous iron (Fe2+) before being transported by the iron-import protein divalent metal-ion transporter 1 (DMT1) into the enterocytes of the small intestine. When demand for iron is low, the iron can be stored within the cell as the iron storage protein ferritin. When the demand for body iron is high, iron is either taken up by mitochondria for the synthesis of heme, or exported into the circulation by the iron export protein ferroportin 1 (FPN1). To leave the cell and bind to plasma transferrin, however, Fe2+ must be oxidized back to Fe3+. This oxidation is catalyzed by the membrane-bound ferroxidase hephaestin. Transferrin-bound iron is then distributed via the circulation throughout the body to sites of utilization (Figure 17.1).
Figure 17.1. Body iron homeostasis. Dietary iron is present as both heme and nonheme iron. The mechanisms underlying heme absorption are poorly understood. Dietary nonheme iron is first reduced to the ferrous form by DCYTB and then transported by DMT1 into the small intestinal enterocytes. When demand for iron is low, iron is stored as ferritin, mainly in the liver, whereas when demand is high, iron is exported into the plasma by FPN1, where it is transported bound to transferrin and delivered to the target tissues: bone marrow, liver, and other tissues, such as muscle. Quantitatively, most iron is used by immature red blood cells in the bone marrow for Hb production. Senescent erythrocytes are phagocytosed by macrophages, and the iron released from catabolized Hb re-enters the plasma. Hepcidin plays a critical role in regulating whole-body iron levels and is itself regulated by body iron demand.
DCYTB, duodenal cytochrome b;
DMT1, divalent metal-ion trans­porter 1;
FPN1, ferroportin1;
diferric Tf, diferric trans­ferrin.
Modified from Anderson and Frazer (2017) and Kondaiah et al. (2019).

Cells can take up iron in a variety of forms. All nucleated cells are capable of using transferrin-bound iron, although some cell types can take up iron in other forms, including non-transferrin-bound iron, or iron present within ferritin, heme, or Hb. Uptake and storage of iron in cells is tightly controlled by iron regulatory proteins. These ensure the iron supply to the cell is maximized when the cell has an iron deficit. Conversely, when the cell is iron replete, the supply of iron to the cell is restricted and storage is promoted. For more details of the regulation of iron homeostasis at the cellular level, see Anderson and Frazer (2017).

Whole-body iron levels are also strictly regulated. This regulation is mediated by the liver-derived peptide hepcidin, which is responsible in part for the control of dietary iron absorption as well as the export of iron from body stores. When iron stores are low, when iron utilization (e.g., for erythropoiesis) is increased, or when the plasma transferrin concentration is reduced, hepcidin production is suppressed, stimulating absorption and the delivery of iron to the plasma from storage sites. As a consequence, more iron enters the circulation. In contrast, when hepatic storage iron and circulating transferrin concentrations are high, the production of hepcidin is increased, thereby reducing intestinal iron absorption and inhibiting the release of iron from body stores (Figure 17.1; Anderson and Frazer, 2017). Hepcidin production is also stimulated by cytokines such as interleukins 1 and 6 induced by inflammation (see “Acute and chronic inflammation” in Section 17.7.1 for more details).

The primary storage sites for iron are the cells of the liver, spleen, and bone marrow. Iron is stored in the form of the soluble protein complex ferritin, or insoluble hemo­siderin, as noted earlier. Iron from both forms of storage iron can be mobilized efficiently when required else­where in the body (Anderson and Frazer, 2017). The amount of iron in the stores varies widely, depending on iron status and sex. When the iron supply exceeds the cell’s functional needs, ferritin produc­tion in the liver is increased, whereas when the supply is insufficient, iron bound to ferritin will be mobilized from the liver store.

Unlike many other micronutrients, there are no active pathways for excreting iron. Instead, mechanisms exist to maintain both cellular and whole-body iron levels within the optimal physiologic range, as noted above. Total daily iron losses are small and occur mainly in the feces (0.6mg/d), although very small amounts are also lost in desquamated skin cells and sweat (0.2‑0.3mg/d), and in urine (< 0.1mg/d). Most of the iron excreted in the feces is from unabsorbed dietary iron, although a small amount is systemic iron derived primarily from biliary secretions. Total daily iron losses are larger (≈ 1.3mg/d) in premenopausal women, because of the additional loss of iron in menstrual blood. On average, menstrual blood loss is 30‑40mL per cycle, or 0.4‑0.5mg iron per day, although in some women it is much greater (Yip and Dallman, 1996).
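The menstrual figures above can be reproduced with back-of-envelope arithmetic. The sketch below assumes a blood Hb concentration of 125g/L, an iron content of hemoglobin of 3.47mg per g Hb, and a 28-day cycle; none of these values is stated in the text, and the result is sensitive to them.

```python
# Back-of-envelope conversion of menstrual blood loss to daily iron loss.
# Assumed values (not given in the text): blood Hb 125 g/L, 3.47 mg iron
# per g of hemoglobin, 28-day cycle.
HB_G_PER_ML = 0.125
FE_MG_PER_G_HB = 3.47
CYCLE_DAYS = 28

for loss_ml in (30, 40):
    fe_mg_per_cycle = loss_ml * HB_G_PER_ML * FE_MG_PER_G_HB
    print(f"{loss_ml} mL/cycle -> {fe_mg_per_cycle / CYCLE_DAYS:.2f} mg iron/day")
# Prints ~0.46 and ~0.62 mg/d; with a slightly lower assumed Hb this brackets
# the 0.4-0.5 mg/d cited above.
```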

Most of the heme iron in erythro­cytes is recycled for Hb synthesis at the end of the erythro­cyte's functional lifetime (on average 120d). At this time, the erythro­cytes are phagocytosed by specialized macrophages in the spleen, liver, and bone marrow. The heme iron is released from catabolized Hb and is either returned to the plasma via ferroportin where it is bound to trans­ferrin, or it is incor­porated into ferritin for temporary storage, depending on the iron status of the individual. Smaller quantities of iron are exported by other cells, particularly hepatocytes. This process is termed “iron turnover”; each day 25mg systemic iron is recycled, representing 0.66% of the total iron content of the body.

17.1.3 Micro­nutrient interactions with the potential to affect iron absorp­tion and/or metabolism

Deficiencies of iron, vitamin A, iodine, and zinc are major public health problems in low-income countries. They often occur concurrently, especially among infants, young children, and women of reproductive age living in impoverished settings. These high-risk groups all have high micro­nutrient requirements, but frequently low intakes of poorly bioavailable micro­nutrients. Interactions have been described between these micro­nutrients, although the mech­anisms are not always fully under­stood.

Iron-vitamin A interactions. Vitamin A may affect several stages of iron metabolism, including erythro­poiesis, incor­poration of iron into Hb, and the mobil­ization of iron from ferritin stores (Zimmermann et al., 2006). Vitamin A is also said to affect iron absorp­tion, but whether the effect is determined by vitamin A and/or iron status is uncertain (Hurrell and Egli, 2010). Reports of a positive impact of vitamin A on Hb and some iron status indices have been observed in several vitamin A supple­men­tation / fortif­ication trials (Michelazzo et al., 2013), although the precise mech­anism remains uncertain. Based on these reports, vitamin A defi­ciency is now recognized as a contributor to anemia, especially in children (Calis et al., 2008).

Iron-iodine interactions. Numerous animal studies indicate that iron deficiency, with or without anemia, impairs thyroid metabolism (Hess et al., 2002). An iron-dependent enzyme, thyroperoxidase, catalyzes the two initial steps in thyroid hormone synthesis (Hess et al., 2002). Iron deficiency may also reduce thyroid hormone synthesis by inducing alterations in the thyroid hormone feedback system (Lynch et al., 2018). In intervention studies in Morocco and the Ivory Coast, administration of both iron and iodine to iron-deficient goitrous children decreased goiter rates more effectively than did the administration of iodine alone (Zimmermann, 2002).

Iron-zinc interactions. Iron and zinc appear to compete for absorp­tion, possibly via the shared divalent metal trans­porter 1 (DMT1) and/or via trans­porters in the apical membrane of small intestine entero­cytes; the mech­anism remains uncertain (Kondaiah et al., 2019). This inhibitory effect only occurs when thera­peutic doses (2‑3‑fold molar excess or higher, iron relative to zinc) are given to humans in aqueous solutions but not in a complex food matrix (Olivares et al., 2012). To date there is no evidence that this interaction is associated with any clinically signif­icant adverse effects (Fischer-Walker et al., 2005; Lim et al., 2013), obviating concerns regarding the use of both iron and zinc supple­ments in public health programs. For a compre­hen­sive review of the possible mech­anisms for this iron-zinc interaction, see Kondaiah et al. (2019).

Iron-cadmium and iron-lead. Iron deficiency is a risk factor for increased concentrations of cadmium and lead in the blood. For cadmium, this is likely due to increased absorption arising from the elevated levels of divalent metal-ion transporter 1 (DMT1) that occur in iron deficiency, but for lead, the mechanism is uncertain (EFSA, 2015).

17.1.4 Defi­ciency of iron in humans

Iron deficiency may arise from inadequate intakes of dietary iron, poor absorption, excessive iron losses, or a combination of these factors. Individuals at greatest risk for iron deficiency include those with high iron requirements for growth (i.e., infants, children, and pregnant women) and women of child-bearing age with high menstrual losses. Several risk factors have been identified. In industrialized countries, prematurity or low birthweight, the use of a non-iron-fortified formula, the introduction of cow's milk in the first year of life, and exclusive breastfeeding beyond 6mos without the regular feeding of iron-fortified complementary foods are all frequently observed causal factors. Among pregnant women, diets low in iron-rich foods, a short interpregnancy interval, and, increasingly, obesity are implicated (Kemper et al., 2017). Additional factors associated with an increased risk of non-anemic iron deficiency among women of child-bearing age with high menstrual losses may include a low intake of flesh foods, recent blood donation, nose bleeds, and a low body mass index (Heath et al., 2001).

In some impoverished settings, additional factors causing iron deficiency may include bleeding arising from exposure to parasitic helminths (e.g., hookworm) or schistosomiasis, and impaired absorption from infection / inflammation or, possibly, chronic Helicobacter pylori gastritis. In rare cases, disturbances in iron homeostasis arising from genetic mutations, such as those in the TMPRSS6 gene, may result in iron-refractory iron deficiency anemia (i.e., iron deficiency that is resistant to oral iron therapy) (Lynch et al., 2018).

Anemia is the most common symptom of iron defi­ciency. In the past, angular stomatitis, glossitis, sidero­penic dysphagia, and koilonychia (spoon nails) were also considered physical mani­festations of iron defi­ciency, although rarely observed today.
Table 17.2. Functional consequences of iron deficiency. Data from Lynch (2011).

Pregnancy:
    Increased risk of anemia and maternal morbidity
    Increased risk of prematurity and lower birth weight
    Higher infant mortality
Infants and young children:
    Motor and cognitive developmental delays in infancy
    Effects on emotional maturation
    Poorer academic achievement in school-age children who were iron deficient in early childhood
    Increased risk of severe morbidity from malaria in children < 5y
    Increased prevalence and duration of upper respiratory tract infections in children
All ages:
    Impaired physical performance and earning capacity
    Suboptimal response to iodine in populations with endemic goiter and increased risk of impaired thyroid function in the presence of iodine deficiency
    Increased risk of chronic lead poisoning in high-lead environments
    Increased risk of “restless leg syndrome”
These con­ditions probably resulted from the presence of iron defi­ciency along with other nutritional or environmental problems. Behav­ioral distur­bances are still considered symptoms of iron defi­ciency and may include pica, charac­ter­ized by the abnormal consumption of nonfood items such as dirt (geo­phagia) and ice (pago­phagia), and restless leg syndrome, a con­dition of uncon­trollable, uncom­fortable leg movement during sleep. Table 17.2 summarizes the main functional conse­quences of iron defi­ciency during the life cycle.

Some of the less-specific physiological manifestations associated with the functional consequences of iron deficiency include fatigue, anorexia, tiredness, shortness of breath, and impaired physical performance. These are all symptoms that result from the diminished oxygen-carrying capacity of the blood. The effects of iron deficiency on physical performance, notably exercise or work capacity, have been extensively reviewed by Haas and Brownlie (2001) and McClung and Murray-Kolb (2013). Their reviews of both animal and human studies reported a profound negative effect of iron deficiency anemia on work capacity; whether a comparable effect occurs in the absence of anemia is less certain.

Numerous human studies have also investigated the association between iron deficiency and cognitive function. Interpretation of these data is difficult because neither iron deficiency nor the cognitive function outcomes have been adequately characterized in some of the studies. In a systematic review by Hermoso et al. (2011), in which only randomized controlled trials with an adequate control group were included (n=14), a modest positive effect of iron supplementation on cognitive and psychomotor outcomes was reported in anemic infants and children supplemented with iron for at least 2mos.

Nevertheless, iron is known to be essential for the normal development and function of the brain. Animal research indicates that iron deficiency causes changes in neurotransmitter homeostasis and decreases in brain iron, notably in the basal ganglia. Hence, early identification and treatment of iron deficiency-induced brain dysfunction appears to be critical to recover neural function and thus prevent any long-term effects on brain development. However, such effects will depend on the timing and the degree of the iron deficiency. Long-term effects are most likely if the deficiency occurs during a period when the need for iron for neurodevelopment is high (Prado and Dewey, 2014).

Numerous studies have investigated associ­ations between maternal iron defi­ciency anemia or iron defi­ciency (without anemia) and risk of functional adverse preg­nancy outcomes such as prem­aturity and low birth­weight, and maternal and neonatal mortality. Most have been based on comparisons between the effects of daily oral supple­ments con­taining iron (usually with folic acid) versus no iron or placebo during preg­nancy. Despite reducing the risk of maternal anemia and iron defi­ciency, the effects of iron supple­mentation during preg­nancy on functional maternal and infant outcomes have been inconsistent. Some studies have shown signif­icant reductions, for example, in risk of low birth weight (Imdad and Bhutta, 2012; Dibley et al., 2012), while in others, the results on maternal and infant outcomes have been less clear (Peña‑Rosas et al., 2015).

Concern has been raised that some of these inconsistencies across studies may be linked to the definition of anemia applied, the timing of exposure, the pattern of changes in iron biomarkers during pregnancy, and the definitions of normal or “healthy” reference ranges for Hb applied for comparison. In an effort to address at least some of these concerns, Young et al. (2019) examined the associations of maternal Hb concentrations with a range of maternal and infant health outcomes, taking into account the timing of the Hb measurement, etiology of anemia, and Hb cutoff category. Results of their systematic review and meta-analysis confirmed that maternal Hb has an important role in relation to several maternal and child health outcomes; for more specific details for the outcomes examined, see Young et al. (2019). Nevertheless, in the studies included, very few distinguished between iron deficiency anemia and non-iron deficiency anemia in relation to maternal and child functional outcomes. Table 17.3 presents their meta-analysis summary estimates of associations of iron deficiency anemia (IDA) and non-IDA with birth outcomes: low birthweight (LBW), small for gestational age (SGA), and preterm birth (PTB).

Table 17.3. Meta-analysis summary estimates of association of IDA and non-IDA with birth outcomes. From Young et al. (2019). The number of studies in each category is: LBW/IDA, n=2; LBW/non-IDA, n=3; SGA/IDA, n=2; SGA/non-IDA, n=4; PTB/non-IDA, n=4.

Outcome | IDA OR (95% CI) | Non-IDA OR (95% CI)
LBW | 1.17 (0.95‑1.43) | 1.43 (0.82‑2.50)
SGA | 0.77 (0.68‑0.87) | 1.20 (0.85‑1.70)
PTB | not estimated | 1.07 (0.68‑1.70)

Note that the summary estimates for the association of low birthweight with IDA and with non-IDA were both positive, although not statistically significant, whereas for preterm birth the pooled estimate in non-IDA populations was non-significant and the data were insufficient to derive a pooled estimate for IDA. The odds of small for gestational age with IDA were decreased (2 studies), although there was no significant association between small for gestational age and non-IDA (4 studies). Given the complexity of the changes in iron homeostasis and associated iron biomarkers during pregnancy, together with the diverse factors associated with the etiology of anemia, research that includes some of the newer biomarkers, such as serum hepcidin, may help to resolve some of the inconsistencies in the results observed.

17.1.5 Food sources and dietary intakes

High-iron foods include liver, kidney, mussels, and red meat. Foods with a medium iron content include chicken, processed meat, fish, and legumes (non-heme iron only). Milk and milk products and many fruits and veg­etables are poor sources of dietary iron (USDA, 2019). Flesh foods are especially impor­tant because of their high content of bioavailable heme iron, and their enhancing effect on non­heme iron absorp­tion (Hallberg and Rossander, 1984). Ground water may also be an impor­tant source of iron in some countries. Studies have found positive associ­ations between consumption of ground water with a high iron content and bio­markers of iron status and Hb in Bangladesh (Merrill et al., 2011; Rahman et al., 2016).

In many industrialized countries, cereal products fortified or enriched with iron provide most of the dietary iron, followed by meat, poultry, and fish, and then vegetables and fruits (Gregory et al., 1990; McLennan and Podger, 1998; Russell et al., 1999). As an example, based on data from nine European countries, cereal products contributed more than 20% and up to 49% of the iron intake in all population groups except infants (EFSA, 2015). In the U.S. NHANES survey (2003‑2006), about 40% of the iron in the diets of individuals > 2y was contributed by foods fortified or enriched with iron (Fulgoni et al., 2011).

Data on iron intakes are available from several nationally representative surveys. In the U.S. NHANES III survey (1988‑1994), the median usual intakes of iron for adult women and men were 12mg/d and 16‑18mg/d, respectively (IOM, 2001). For adult Europeans (> 18y), average intakes ranged between 9.4 and 17.9mg/d (EFSA, 2015). In general, average iron intakes of males tend to be slightly higher than those of females because males generally consume larger quantities of food each day (EFSA, 2015).

The bioavailability of iron from mixed Western-style diets (for individuals with no iron stores) is estimated to range from 14‑18%, provided the diets contain ample quantities of flesh foods and ascorbic acid. In low-income countries, however, dietary iron absorption is probably only about 5‑12% for individuals with no iron stores, because in these plant-based diets the proportion of heme iron is low, while the content of absorption inhibitors is often significant (WHO/FAO, 2005).

The rise in the variety of food products fortified with iron has prompted research on the properties of iron fortificants. Factors affecting their choice include absorbability (i.e., the highest relative bioavailability (RBV) compared with ferrous sulfate), absence of unacceptable changes to the sensory properties (i.e., taste, color, texture) of the food vehicle, and the cost of the fortificant. The iron compounds used for fortification or for enrichment can be divided into three categories: water soluble; poorly water soluble but soluble in dilute acid; and water insoluble and poorly soluble in dilute acid. Almost all enter the common nonheme iron pool in the gastrointestinal tract and are absorbed like native nonheme iron compounds. The compounds in the third category (water insoluble and poorly soluble in dilute acid), however, do not fully enter the common nonheme iron pool, and their absorption is poor.

The compounds currently used as iron fortificants, in order of frequency of use, are: (1) ferrous sulfate; (2) ferrous fumarate; (3) encapsulated ferrous sulfate or encapsulated ferrous fumarate; (4) electrolytic iron (a pure form of iron powder with a small particle size) or ferric pyrophosphate; (5) sodium-iron-ethylene-diamine-tetra-acetic-acid (NaFeEDTA); and (6) iron amino acid chelates, particularly iron-glycinate chelates (Lynch et al., 2018). The newer iron fortificants such as NaFeEDTA and ferrous bisglycinate provide better protection against nonheme iron absorption inhibitors. Sodium iron EDTA is used in China to fortify wheat products, whereas ferrous bisglycinate, although expensive, is especially suited to the fortification of liquid whole milk and dairy products. For a detailed review of iron fortificants, see WHO (2006).

17.1.6 Effects of high intakes of iron

Cases of acute iron toxicity have been reported, mainly among children who accidentally ingest medicinal iron supple­ments (IOM, 2001). More common are adverse gastro­intestinal effects including nausea, vomiting and diarrhea following the administration of high doses of iron supple­ments, especially when they are taken without food (Brock et al., 1985). Excessively high intakes of iron supple­ments may induce copper defi­ciency by an adverse effect on copper absorp­tion, although this effect may only occur when copper status is low or marginal (Cohen et al., 1985). High intakes of iron supple­ments taken without food may reduce zinc absorp­tion (Sandström et al., 1985), but generally not when iron supple­ments or iron fortificants, such as “sprinkles”, are taken with food (Olivares et al., 2012; Esamai et al., 2014; Davidsson et al., 1995), as noted earlier. In anemic Cambodian women, how­ever, a daily multi-micro­nutrient supple­ment con­taining 60mg of iron consumed with food blunted the increase in serum zinc observed among the women receiving the same supple­ment without iron; no data on zinc absorp­tion were collected (Holmes et al., 2019).

Possible toxic effects of rising iron stores over time have been implicated in the pathogenesis of several common diseases of aging. How­ever, evidence for a causal relationship in the general popu­lation between the level of dietary iron intake or body iron content and risk for cancer (Turner and Lloyd, 2017) and coronary heart disease (Hunnicutt and Xun, 2014) remains incon­clusive. Moreover, there is no evidence that dietary iron is associated with arthritis, diabetes mellitus, or neuro­degenerative disease (EFSA, 2015).

Certain genetic mutations can cause iron overload. One of the best-known examples is hereditary hemochromatosis arising from a mutation in the HFE gene that results in a reduced or negligible expression of hepcidin. This results in an inability to limit absorption of dietary iron, so iron accumulates in many organs. Hemochromatosis is relatively common in Northern European populations (Beutler et al., 2002); heterozygotes for the mutation associated with hemochromatosis, however, do not appear to be at an increased risk of iron overload.

Iron overload has also been associated with other genetic defects such as atrans­ferrinemia, acerulo­plasmin­emia, and Friedreich ataxia. African iron overload, previously termed “Bantu cirrhosis”, is also thought to have a genetic compo­nent in addition to the excessive intakes of iron from food and beer prepared in iron utensils (EFSA, 2015). Iron overload may also result from the multiple blood transfusions used to treat certain inherited disorders such as iron-loading anemia (e.g., β‑thalassemia). In most iron overload syndromes, iron is sequestered in ferritin and hemo­siderin in all tissues throughout the body.

Concern has been raised that supplementation with iron given in non-physiological amounts can increase the risk of bacterial and protozoal infections (see “malaria” in Section 17.11.1 for more details). As approximately 40% of the world's population is exposed to malaria, which is endemic in over 100 countries, resolving this concern is essential. Neuberger et al. (2016) reviewed the available evidence in a meta-analysis of 35 randomized controlled trials (31,955 children) and concluded that oral iron supplementation does not increase the risk of clinical malaria when regular malaria prevention or management services are in place. Therefore, in resource-poor settings, routine iron supplementation can be given to children without screening for anemia or iron deficiency, provided regular malaria surveillance and treatment services are available. However, such services are not always available in low-income countries. For a review of the safety and benefits of iron supplements for pregnant women in low-income countries, consult Mwangi et al. (2017).

Recom­mendations by the World Health Organization (WHO) on the prevention and control of iron defi­ciency in infants and young children in malaria-endemic areas are available (WHO, 2007). Instead of using iron supple­ments, a safer strategy for improving iron status may be the use of lower quantities of iron provided within a food matrix (i.e., fortified food); see Prentice et al. (2017) for more details.

The U.S. Food and Nutrition Board has set the Tolerable Upper Intake Level (UL) based on the gastro­intestinal side effects associated with high levels of iron consumed on an empty stomach. The UL for iron for adults, including pregnant and lactating women ≥ 19y is 45mg/d, whereas the level for infants and children 1‑18y is 40mg/d (IOM, 2001). No UL has been set for iron by the European Food Safety Authority — EFSA (2006; 2015) or by WHO/FAO (2005).

17.1.7 Bio­markers of exposure

To estimate exposure to the intake of dietary iron, conventional dietary assessment methods must be used. The method selected depends on several factors, including the study objective, the setting, and available resources; for more details, see Chapter 3. Estimates of habitual (usual) iron intakes are needed in view of the length of time required by iron status biomarkers to reach a steady state in response to changes in dietary intakes. Usual iron intakes are also essential to estimate the prevalence of inadequate intakes for population groups and for the investigation of associations between dietary intakes and health outcomes. Details of the statistical adjustments for converting observed iron intakes to usual iron intakes can also be found in Chapter 3. Alternatively, for large-scale population studies, a food frequency questionnaire (FFQ) that is designed to assess intakes of iron-rich foods and iron-absorption modifiers over a longer time period may be used, provided the FFQ has been validated for the study setting (see Chapter 3, Section 3.1.6).
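For readers who want a feel for what the observed-to-usual adjustment does, the following is a minimal sketch of the classic within/between-person variance partition, assuming two 24-h recall replicates per person and approximately normal intakes; production analyses should use the more elaborate methods described in Chapter 3.

```python
import numpy as np

def shrink_to_usual(day1, day2):
    """Shrink each person's 2-day mean intake toward the group mean so the
    adjusted distribution reflects between-person variation only."""
    day1, day2 = np.asarray(day1, float), np.asarray(day2, float)
    person_mean = (day1 + day2) / 2
    grand_mean = person_mean.mean()
    s_within = np.var(day1 - day2, ddof=1) / 2       # within-person variance
    s_obs = np.var(person_mean, ddof=1)              # variance of 2-day means
    s_between = max(s_obs - s_within / 2, 0.0)       # between-person variance
    ratio = np.sqrt(s_between / s_obs) if s_obs > 0 else 0.0
    return grand_mean + ratio * (person_mean - grand_mean)

# Example with simulated iron intakes (mg/d) for 500 women:
rng = np.random.default_rng(1)
truth = rng.normal(12, 2.5, 500)          # usual intakes
obs1 = truth + rng.normal(0, 4, 500)      # day-to-day noise, recall 1
obs2 = truth + rng.normal(0, 4, 500)      # day-to-day noise, recall 2
usual = shrink_to_usual(obs1, obs2)
print(usual.std(), ((obs1 + obs2) / 2).std())   # adjusted SD is smaller
```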

As noted earlier, inadequate intakes of poorly available dietary iron are a major concern in the plant-based diets of many low-income countries. Bioavailability is defined as the extent to which the iron is absorbed from the diet and used for normal bodily functions (Lynch et al., 2018). Several algorithms have been developed to predict the bioavail­ability of iron based on dietary data that include the form of the iron (i.e., heme or non­heme) and the content of known absorp­tion modifiers in a meal. Of the algorithms available, the model of Hallberg and Hulthén (2000) is the most detailed, taking into account the effects of all known modifiers of non­heme iron absorp­tion as well as interactions, with adjust­ments for the iron status of the individual. How­ever, as with all the other algorithms developed, the model of Hallberg and Hulthén only predicts non­heme iron absorp­tion so assumptions about the intake and absorp­tion of heme iron must be made. Further­more, the model is difficult to apply, in part because of incomplete food compo­sition data for the absorp­tion modifiers. Indeed, all algorithms have limitations and often under­estimate bioavail­ability, with a 3-fold variation in estimates when using dif­fer­ent algorithms (Beard et al., 2007).

In view of the limitations of the algorithms for estimating iron bioavailability, FAO/WHO (2002) developed a qualitative approach to estimating iron absorption based on data on iron absorption from typical meals in Asia, India, Latin America, and Western countries. Applying the FAO/WHO approach, diets can be categorized as having low, intermediate, or high bioavailability based on the major food sources of iron, as shown in Box 17.1. The estimates of absorption given in Box 17.1 refer to nonanemic persons with normal iron transport (i.e., with normal Hb) but no iron stores. When individuals have iron deficiency anemia, absorption may be increased by 50% (i.e., to 7.5%, 15%, and 22.5%, respectively, for the low-, intermediate-, and high-bioavailability diets). Note that for Western-type diets, FAO and WHO now propose two categories of bioavailability (12% and 15%), depending on the meat content of the diet; see Gibson and Ferguson (2008) for further details.
Box 17.1. Using the FAO/WHO qualitative approach to estimate iron bioavailability from data on the major food sources of iron. Modified from FAO/WHO (2002).

Low bioavailability (≈ 5% absorbed): monotonous plant-based diets dominated by cereals, roots, and legumes, with negligible amounts of flesh foods or ascorbic acid-rich foods and a high content of absorption inhibitors (phytate, polyphenols).
Intermediate bioavailability (≈ 10% absorbed): mainly plant-based diets that include some flesh foods and/or ascorbic acid-rich foods.
High bioavailability (≈ 15% absorbed): diversified diets containing generous amounts of meat, poultry, or fish and foods rich in ascorbic acid.
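As a worked illustration of Box 17.1, the sketch below turns the qualitative categories into absorbed iron, applying the 50% uplift for individuals with iron deficiency anemia described above; the function name and example intakes are hypothetical.

```python
# Worked illustration of Box 17.1 (names and example values are hypothetical).
BIOAVAILABILITY = {"low": 0.05, "intermediate": 0.10, "high": 0.15}

def absorbed_iron_mg(intake_mg_per_day, diet_category, has_ida=False):
    """Absorbed iron = intake x bioavailability factor; +50% with IDA."""
    factor = BIOAVAILABILITY[diet_category]
    if has_ida:
        factor *= 1.5   # e.g., low: 5% -> 7.5%, as described in the text
    return intake_mg_per_day * factor

# A woman consuming 12 mg/d from a low-bioavailability plant-based diet:
print(absorbed_iron_mg(12, "low"))         # 0.60 mg/d absorbed
print(absorbed_iron_mg(12, "low", True))   # 0.90 mg/d with IDA
```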
Recently a new approach to calculate dietary iron bioavail­ability has been developed using a probability model. The approach is based on calculated iron requirements, data on daily total iron intake, and the distribution of serum ferritin values in the popu­lation sample; total iron absorp­tion at any level of iron status can be estimated (Dainty et al., 2014). Unlike the earlier algorithms, this new approach does not require data on iron absorp­tion modifiers or the heme content of the diets.

Using the approach described above, data from a representative group of adults in the U.K. National Diet and Nutrition Survey were used to estimate total iron absorption from a mixed Western-style diet, using a target serum ferritin concentration of 30µg/L for iron storage status. At this level, the predicted iron absorption for men and premenopausal women was 16% and 18%, respectively (EFSA, 2015). Note that an interactive tool is available (Fairweather-Tait et al., 2017) that could be adapted for use in populations consuming habitual plant-based diets with low iron bioavailability, provided the appropriate correction for the impact of inflammation on serum ferritin concentrations is performed when necessary (see “Acute and chronic inflammation” in Section 17.7.1 for more details). Nevertheless, because both iron requirements and dietary iron intake must be in a steady state for at least a year when applying this approach, it is not suitable for certain groups, such as children, pregnant women, or women immediately after the onset of menopause.

17.1.8 Inter­pretation of iron intakes

The final stage of the assessment of exposure to the intake of iron is to evaluate the usual intakes of the popu­lation or an individual in relation to the nutrient reference values (NRVs); see Chapter 8a for details of the derivation of the NRVs. To estimate the physio­logical requirements for iron, the factorial approach, based on estimates of the quantity of absorbed iron needed to replace iron losses, is used by most expert groups, including WHO/FAO (2005), IOM (2001), and EFSA (2015). Next, adjust­ments are made to the estimates of physio­logical requirements to yield a dietary iron requirement (i.e., Average Require­ment; AR) by taking into account the bioavail­ability of iron in the habitual diet. The chosen bioavail­ability factors (see Chapter 8a, Section 8a.8 for more details) are partly responsible for differences in the dietary iron requirements (i.e., ARs) across countries. There­fore, care must be taken to ensure that the dietary iron requirements chosen for the evaluation of iron intakes are appro­priate for the habitual diet of the popu­lation under study.

Some expert groups (e.g., IOM, 2001; EFSA, 2015) apply a fixed bioavail­ability factor for iron to estimate the dietary requirement (AR), even though the efficiency of absorp­tion may vary with life-stage and health status. Others, including WHO/FAO (2005), do not present average dietary requirements (ARs) for iron, instead providing only estimates of Recom­mended Intakes (RIs) from which the ARs for dietary iron can be derived. These calculated AR values are available in WHO (2006) and are the values that should be used to evaluate the preva­lence of inadequate intakes of popu­lation groups in a low-income setting. In the higher-income countries, how­ever, the AR values set by IOM (2001) or EFSA (2015) may be appro­priate. The full-probability approach must be applied to estimate the preva­lence of inadequate intake for iron for certain popu­lation subgroups, notably children (1‑8y), menstruating adolescents (14‑18y) and adult women, because their iron requirements are not symmetrical about the AR. For the details of the application of the full probability approach both manually and using appro­priate software, see Chapter 8b. For the manual calculation, the reader is also advised to consult Gibson and Ferguson (2008).
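A minimal sketch of the full-probability approach is shown below. It assumes a lognormal requirement distribution with hypothetical parameters (standing in for the skewed distribution of dietary iron requirements in menstruating women) and simulated usual intakes; the prevalence of inadequacy is the average, across individuals, of each person's risk that the requirement exceeds their usual intake.

```python
import numpy as np
from scipy.stats import lognorm

# Hypothetical skewed requirement distribution (not reference values):
req_median_mg, req_sigma = 8.1, 0.4
requirement = lognorm(s=req_sigma, scale=req_median_mg)

def prevalence_inadequate(usual_intakes_mg):
    """Full-probability approach: average each person's risk of inadequacy."""
    risks = 1.0 - requirement.cdf(usual_intakes_mg)   # P(requirement > intake)
    return risks.mean()

# Simulated usual iron intakes (mg/d) for 1000 menstruating women:
rng = np.random.default_rng(7)
usual = rng.normal(12, 3, 1000).clip(min=0)
print(f"Estimated prevalence of inadequacy: {prevalence_inadequate(usual):.1%}")
```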

To estimate iron bioavail­ability in the diet of an individual, the qualitative model of FAO/WHO (2002) (Box 17.1.) that takes into account the major food sources of iron and the presence of potential absorp­tion modifiers can be used. Alternatively, bioavail­ability can be calculated from an appro­priate iron algorithm. Inferences can then be made about the adequacy of the iron intake of the individual by comparing the difference between the estimate of the usual iron intake and the corres­ponding selected values for the AR and RI, taking the assumed bioavail­ability into account. For details on the guidelines for the qualitative inter­pretation of individual iron intakes, see Chapter 8b.

17.1.9 Introduction: bio­chem­ical bio­markers of iron status

In the past, iron deficiency was assumed to cause 50% of all cases of anemia worldwide; the proportion is now thought to be much lower. In a systematic review, Petry et al. (2016) estimated that only 25% of the anemia in preschool children, and 39% in non-pregnant women of reproductive age, is attributable to iron deficiency. In their review, unlike in many earlier studies, the probable exposure of a country to infection and chronic inflammatory conditions was taken into account. Their findings highlight the challenge of distinguishing between anemia due to iron deficiency and that due to inflammation (i.e., anemia of chronic disease). This is a particular concern in low-income countries and among the elderly, who are also at high risk for inflammation (see “Acute and chronic inflammation” in Section 17.7.1 for more details).

The gold standard test for the diagnosis of iron defi­ciency anemia is a stained bone marrow aspirate or biopsy. This provides a semi-quantitative estimate of the size of the body iron store. How­ever, this test is invasive and costly, and requires an experienced person to evaluate the stained specimen, and so is not commonly used (Lynch et al., 2018). Instead, several hematological parameters are used to diagnose anemia, along with serum-based iron bio­markers and bio­markers of inflam­mation to identify iron defi­ciency or iron defi­ciency anemia; these are discussed in detail in the following sections. For a summary table of each of these measures, and their advantages and limitations, the reader is referred to Lynch et al. (2018). Three stages in the devel­opment of iron defi­ciency anemia can be recognized and are best charac­ter­ized by the use of multiple bio­markers (Section 17.10). The three stages are the following:

Iron depletion, the first stage, is charac­ter­ized by a progressive reduction in the amount of storage iron in the liver. At this stage, the supply of iron to the functional compartment is not compro­mised so levels of trans­port iron and hemo­globin are normal. How­ever, the progressive depletion of iron stores will be reflected by a fall in serum ferritin concen­trations.

Iron-deficient erythro­poiesis, the second stage, is charac­ter­ized by the exhaustion of iron stores and is also referred to as “iron defi­ciency without anemia”. At this stage the iron supply to the erythropoietic cells is progressively reduced and decreases in trans­ferrin satu­ration occur (Section 17.6). At the same time, there are increases in serum trans­ferrin receptor (Section 17.9) and erythro­cyte proto­porphyrin concen­trations (Section 17.8). Hemo­globin levels may decline slightly at this stage, although they usually remain within the normal range.

Iron-defi­ciency anemia, the third and final stage of iron defi­ciency, is charac­ter­ized by the exhaustion of iron stores, declining levels of circu­lating iron, and presence of frank micro­cytic, hypo­chromic anemia (i.e., red blood cells are small and con­tain less Hb than normal). The main feature of this stage is a reduction in Hb (Section 17.2), arising from the restriction of iron supply to the bone marrow. Decreases in the hemato­crit (Section 17.3) and red-cell indices also occur (Section 17.4). Examination of a stained blood film allows confirm­ation of the presence of hypo­chromia (a lower Hb than normal) and micro­cytosis (abnormally small red-blood cells).
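The three stages can be expressed as a simple decision rule over the biomarkers just listed. In the sketch below, only the serum ferritin cutoff (< 12µg/L) comes from this chapter; the transferrin saturation and Hb cutoffs are assumed, typical adult values (cutoffs vary by age, sex, and pregnancy; see Sections 17.2 and 17.6), so this is illustrative rather than diagnostic.

```python
# Illustrative staging of iron deficiency from the three biomarkers above.
FERRITIN_CUTOFF_UG_L = 12.0   # from the text
TSAT_CUTOFF_PCT = 16.0        # assumed typical cutoff
HB_CUTOFF_G_L = 120.0         # assumed (e.g., non-pregnant women)

def iron_stage(ferritin_ug_l, tsat_pct, hb_g_l):
    depleted = ferritin_ug_l < FERRITIN_CUTOFF_UG_L
    restricted = tsat_pct < TSAT_CUTOFF_PCT
    anemic = hb_g_l < HB_CUTOFF_G_L
    if depleted and restricted and anemic:
        return "stage 3: iron-deficiency anemia"
    if depleted and restricted:
        return "stage 2: iron-deficient erythropoiesis"
    if depleted:
        return "stage 1: iron depletion"
    return "not iron deficient by these markers alone"

print(iron_stage(ferritin_ug_l=8, tsat_pct=12, hb_g_l=105))   # stage 3
```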

17.2 Hemo­globin

Iron is an essential compo­nent of the Hb mole­cule, the oxygen-carrying pigment of red blood cells. Each Hb mole­cule is a conjugate of a protein (globin) and four mole­cules of heme, as noted earlier. Anemia develops when circu­lating red blood cells are insufficient to meet physio­logical oxygen-carrying needs, and is identified by a Hb concen­tration below a defined cutoff-value. Measure­ment of the Hb concen­tration in whole blood is probably the most widely used test for iron-defi­ciency anemia, even though only a subset of anemic individuals in a popu­lation will have iron defi­ciency anemia. Anemia may also be due to malaria, infection, hemo­globin­opathies, or defi­ciency of several other micro­nutrients besides iron.
Table 17.4. Within-subject, day-to-day coefficient of variation (CV%) for Hb, serum ferritin, and serum iron in “healthy” subjects. M, male; F, female.

Reference | Hemoglobin | Serum ferritin | Serum iron
Dawkins et al. (1979) |  | 15 (M, F) |
Gallagher et al. (1989) | 1.6 (F) | 15 (F) |
Statland and Winkel (1977) |  |  | 29 (F)
Statland et al. (1976) |  |  | 27 (F)
Statland et al. (1978) | 3 (M, F) |  |
Pilon et al. (1981) |  | 15 (M, F) | 29 (M, F)
Romslo and Talstad (1988) |  | 13 (M, F) | 33 (M, F)
Borel et al. (1991) | 4 (M, F) | 14 (M) | 27 (M)
Borel et al. (1991) | 4 (F) | 26 (F) | 28 (F)
Hence, the specificity of Hb for iron defi­ciency anemia is poor. Hemo­globin is also an insen­sitive measure of iron defi­ciency because it only falls during the third stage of iron defi­ciency, when iron stores are exhausted and the supply of iron to the tissues is compro­mised. In addition, considerable overlap exists in the Hb values of normal non­anemic and iron-deficient individuals, further reducing its sensitivity for screening for iron defi­ciency. Conse­quently, Hb should not be used as the only measure of iron defi­ciency anemia in individuals, except in very specific circumstances (Section 17.2.4). Limi­tations of using Hb as a measure of iron defi­ciency are discussed below.

17.2.1 Factors affecting Hb concen­trations

Diurnal biological variation in Hb concentrations is significant (Van Wyck et al., 2011; Sennels et al., 2011): Hb values tend to be lower in the evening than in the morning, by up to 10g/L. Within-subject day-to-day variation, in contrast, is low (CV ≈ 3%) (Table 17.4).

Age is an important determinant, particularly for infants and children. Hemoglobin concentrations are higher (130‑180g/L) at birth (Lorenz et al., 2013) than at any other time in life, reflecting fetal adaptation to the oxygen-deficient environment of the uterus. After birth, Hb concentrations fall, reaching their lowest point at about 2mos (Figure 17.2). This fall is probably due to the sudden decrease in erythropoiesis arising from the increase in oxygen delivery to the tissues (Zierk et al., 2015).
Figure 17.2. Changes in median concen­tration Hb with age. Arrow shows the time point when 50% of the subjects reach Tanner stage PH2 and sex differences become signif­icant. Redrawn from Zierk et al. (2015).
Thereafter, Hb concen­trations gradually rise (Domellöf et al., 2002a), with a marked increase at puberty in males as a result of accelerated growth (Figure 17.2). The arrow shows the point when 50% of the popu­lation reach Tanner stage PH2 (i.e., the appearance of pubic hair), empha­sizing the sex-dependent changes after puberty (Zierk et al., 2015). Around the fifth decade of life, Hb levels decline. Several factors have been implicated in this decline, including the devel­opment of anemia of chronic disease with aging.

Sex differences in Hb are apparent at 6mos, with boys having slightly lower Hb concentrations than girls. Such differences appear to be greater in infants with a birth weight < 3500g (Domellöf et al., 2002). By the second decade of life, however, Hb values in females plateau, changing very little after 12y (Table 17.5). In young adults, the Hb concentration for men is on average about 20g/L higher than for women, due to testosterone, which results in both a larger body size and a larger erythrocyte mass. In women, the tendency toward lower Hb values also reflects chronic menstrual blood loss. These sex-related differences, although diminishing gradually with increasing age, persist throughout life (Table 17.5).
Table 17.5. Mean (SD) hemoglobin concentrations (g/L) by age and sex. Data from the National Health and Nutrition Examination Survey III (1988‑1994); values for children 1‑11y are for both sexes combined. Individuals of all races with abnormal or missing values for transferrin saturation, erythrocyte protoporphyrin, serum ferritin, or mean cell volume were excluded. From Looker et al. (1997).

Age (y) | Females | Males
1‑2 | 122.0 (7.34) |
3‑5 | 124.4 (7.57) |
6‑11 | 130.9 (7.92) |
12‑15 | 134.3 (9.27) | 142.4 (10.0)
16‑19 | 133.7 (8.21) | 152.9 (10.03)
20‑49 | 134.8 (9.12) | 153.0 (9.68)
50‑69 | 136.5 (9.82) | 150.1 (10.64)
≥ 70 | 135.6 (10.68) | 145.3 (12.87)

Ethnicity is known to influence Hb concentrations. Individuals of African descent in the United States have Hb values 5‑10g/L lower than Caucasians, irrespective of age, income, or iron deficiency; it is likely that a genetic factor is involved. In contrast, Hb concentrations for other U.S.-based ethnic groups are similar to those of U.S. Caucasians (Nestel, 2002; Cheng et al., 2004). In 2001, WHO / UNICEF / UNU (2001) recommended adjusting Hb cutoffs downward by 10g/L for individuals of African descent, irrespective of age. This recommendation has now been abandoned in view of the paucity of data related to ethnicity; instead, standard Hb cutoffs, irrespective of race, are now recommended (WHO, 2011).

Cigarette smoking is associated with higher concen­trations of Hb (by 3‑7g/L) in adults. This is attributed to the carbon monoxide-induced increase in carboxy­hemo­globin, which has no oxygen-carrying capacity; the resulting reduction in the oxygen-carrying capacity of the blood elicits a compensatory rise in Hb. Adjustments to measured Hb values used to define anemia in smokers have been derived by the U.S. CDC from U.S. NHANES II data (see Table 17.6). These (negative) adjust­ments are added to the measured values, lowering them before comparison with the standard cutoffs. The adjust­ments for smoking and altitude (see below) are additive, so smokers living at higher altitudes require both.

Higher altitudes generate an adaptive response to the lower partial pressure of oxygen and the reduced oxygen satu­ration of blood.
Table 17.6. Altitude and smoking adjust­ments to measured hemo­globin concen­trations. From Nestel (2002) and WHO (2011): Hemo­globin concen­trations for the diagnosis of anemia and assessment of severity. Vitamin and Mineral Nutrition Information System, Geneva.
Altitude (metres above sea level)     Measured Hb adjust­ment (g/L)
< 1000                                0
1000‑1499                             −2
1500‑1999                             −5
2000‑2499                             −8
2500‑2999                             −13
3000‑3499                             −19
3500‑3999                             −27
4000‑4499                             −35
≥ 4500                                −45

Smoking status                        Measured Hb adjust­ment (g/L)
Non-smoker                            0
Smoker (all)                          −3
0.5‑1 packet/d                        −3
1‑2 packets/d                         −5
≥ 2 packets/d                         −7
As a result, red blood cell produc­tion increases to ensure that sufficient oxygen is supplied to the tissues (Hurtado et al., 1945). This adaptive response becomes signif­icant at elevations above 1000m, where the adjust­ments to Hb published by the CDC (CDC, 1989) should be applied before cutoffs are used (Table 17.6). The changes with altitude are curvilinear, the increase in Hb concen­tration becoming more marked as altitude increases. Even individuals living at high altitude whose Hb concen­trations lie above the cutoff indicative of anemia may, when iron deficient, respond to iron supple­mentation with an increase in Hb (Berger et al., 1997).
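For illustration, a minimal sketch (in Python) of how the two adjust­ments might be combined in practice is shown below; the adjust­ment values are transcribed from Table 17.6, while the function and variable names are ours:

```python
# Sketch: adjusting a measured Hb value for altitude and smoking before
# comparison with sea-level cutoffs (values from Table 17.6; the two
# adjustments are additive, as noted in the text).

ALTITUDE_ADJ = [  # (minimum altitude in metres, adjustment in g/L)
    (4500, -45), (4000, -35), (3500, -27), (3000, -19),
    (2500, -13), (2000, -8), (1500, -5), (1000, -2), (0, 0),
]

SMOKING_ADJ = {"non-smoker": 0, "0.5-1 packet/d": -3,
               "1-2 packets/d": -5, ">=2 packets/d": -7}

def adjusted_hb(measured_hb_g_l: float, altitude_m: float, smoking: str) -> float:
    """Return Hb (g/L) adjusted to sea-level, non-smoking conditions."""
    altitude_adj = next(adj for cutoff, adj in ALTITUDE_ADJ if altitude_m >= cutoff)
    return measured_hb_g_l + altitude_adj + SMOKING_ADJ[smoking]

# A woman smoking 1-2 packets/d living at 2600m with a measured Hb of
# 135 g/L has an adjusted Hb of 135 - 13 - 5 = 117 g/L, which falls below
# the 120 g/L cutoff for non-pregnant women.
print(adjusted_hb(135, 2600, "1-2 packets/d"))  # 117
```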

Iron-defi­ciency anemia develops during the third stage of iron defi­ciency. At this stage, iron stores are exhausted, the supply of iron to support the produc­tion of red blood cells is compro­mised, and as a conse­quence the Hb concen­tration decreases. Iron-defi­ciency anemia is charac­ter­ized by micro­cytic, hypo­chromic anemia. Considerable overlap exists in the Hb values of normal non­anemic and iron-deficient individuals, as noted earlier, which further decreases the sensitivity of Hb measure­ments (WHO, 2017).

Preg­nancy results in an expansion of both the plasma volume and the red-cell volume. Near the end of the second trimester of preg­nancy, the increase in plasma volume is not matched by a propor­tionate increase in red cells, so Hb becomes even more diluted. In women who are not taking iron supple­ments, the decrease in Hb reaches a low point at about 28‑36wks (on average, ≈ 20g/L lower than pre-preg­nancy Hb levels) (Figure 17.3). Among iron-sup­ple­mented women, Hb concen­trations are reportedly about 10g/L higher at term than in those unsup­ple­mented during preg­nancy (Fisher and Nemeth, 2017).

Figure 17.3
Figure 17.3. Mean hemo­globin concen­trations (n=60) during normal unsupple­mented preg­nancy. Redrawn from Fisher and Nemeth (2017).

Other micro­nutrient deficiencies besides iron are associated with anemia and, by definition, with low Hb concen­trations. These include deficiencies of vitamin A, B6, and B12, ribo­flavin, folic acid, and copper; details of the role of vitamins in the etiology of anemia are given in Fishman et al. (2000), and for copper in Myint et al. (2018). More recently, zinc, selenium, and vitamin D defi­ciency have also been implicated in anemia (Gibson et al., 2008; Smith & Tangpricha 2015; Houghton et al., 2016). Details of the associ­ation and the proposed mech­anisms linking zinc with Hb concen­trations in both pre­school children and women of repro­ductive age are reported in Greffeuille et al. (2021). For selenium, reductions in the activity of two seleno­enzymes, glutathione peroxidase and thio­redoxin reductase, have been linked with anemia (Houghton et al., 2016). There is emerging evidence that vitamin D may be protective against anemia by supporting erythropoiesis. For more details of proposed mech­anisms, see Smith and Tangpricha (2015).

Blood donations, when regular and repeated, may cause a decrease in body iron stores (Heath et al., 2001), leading to a fall in Hb concen­trations and the devel­opment of iron defi­ciency anemia in some individuals. This is not surprising, as 1mL of blood con­tains about 0.5mg of iron, so a single donation of 450mL of blood represents a loss of about 225mg of iron. There­fore, if blood is donated thrice yearly, nearly 0.7g of iron will be lost annually. Currently, iron supple­mentation for blood donors is often not standard care. Hence the finding that, for regular blood donors in India, for example, body iron stores were inversely proportional to the lifetime number of donations is not unex­pected (Reddy et al., 2020).
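The arithmetic, using the approximate iron content of blood given above, is: \[\small 450\,\mbox{mL} \times 0.5\,\mbox{mg/mL} = 225\,\mbox{mg per donation; } 3 \times 225\,\mbox{mg} \approx 0.7\,\mbox{g iron per year}\]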

Anemia of chronic disease (ACD) is a normo­chromic, normo­cytic anemia (i.e., the average size and Hb content of red cells are within normal limits) in which Hb concen­trations are charac­ter­istically mildly (≈ 90g/L) to moderately (≈ 80g/L) reduced. It can result from a variety of dif­fer­ent conditions. Acute and chronic infections such as malaria, tuberculosis, bacterial infection, and HIV/AIDS, as well as inflam­matory con­ditions such as rheumatoid arthritis or malignancy, can all lead to the anemia of chronic disease. All these con­ditions induce the acute-phase response, whereby pro-inflam­matory cytokines (e.g., IL‑6 and leptin) stimulate the release of hepatic hepcidin into the circu­lation. Hepcidin, in turn, blocks the release of iron into the circu­lation from storage sites (e.g., reticulo-endothelial cells) independent of iron status, resulting in an increase in serum ferritin. At the same time, hepcidin inhibits iron absorp­tion. As a conse­quence, serum iron levels fall, the iron available for erythro­poiesis is reduced, and Hb concen­trations decline (Anderson and Frazer, 2017). In chronic inflam­mation, erythropoietin produc­tion may also be impaired and red-cell survival diminished (Raiten et al., 2015).

Malaria parasites cause low Hb concen­trations, and thus anemia, by: (a) increased hemolysis, with destruction of both infected and non-infected red cells; (b) suppression of red blood cell formation (i.e., suppressed erythro­poiesis) by the action of hemozoin, a product of Hb catab­olism by the parasites; and (c) induction of the acute-phase response and the subsequent up­reg­ulation of hepcidin. The latter inhibits both the release of iron from storage sites into the circu­lation and the absorp­tion of dietary iron, thus also blocking erythro­poiesis (Ghosh and Ghosh, 2007; Drakesmith and Prentice, 2012).

Malaria is the most signif­icant parasitic disease in humans and one of the primary causes of anemia globally. It is caused by infection with parasites of the genus Plasmodium, with P. falciparum causing the most serious conse­quences. Groups at highest risk are pregnant women and children, especially younger children (i.e., < 72mos) with high levels of storage iron (Barffour et al., 2017).

Intestinal helminth infections (e.g., the hookworms Necator americanus and Ancylostoma duodenale) may cause low Hb values and subsequent anemia through chronic gastro­intestinal blood loss. Hookworms attach to the mucosa of the upper small intestine and feed on blood, which is subsequently expelled through the intestinal tract of the hookworm. Secondary blood loss may also occur from bleeding of the damaged intestinal mucosa (Stoltzfus and Dreyfuss, 1998). The magnitude of the blood loss is proportional to the hookworm burden in an individual (WHO, 2017). Hookworm infection is treated easily with an anti­helminthic medication such as albendazole.

Schistosomiasis is caused by blood-dwelling worms of the genus Schistosoma and leads to blood loss and low Hb concen­trations. The precise mech­anism is unclear, although splenic sequestration of erythro­cytes, increased hemolysis, or the anemia of chronic disease may play a role (WHO, 2017). Infection occurs through exposure to contaminated fresh water, primarily in Sub-Saharan Africa, where this devastating parasitic disease is widespread. All mature species of Schistosoma can be treated with the drug praziquantel (Gryseels, 2012).

Helicobacter pylori is a gram-negative bacterium that can be acquired by means of oral-oral or fecal-oral transmission, the latter often via ingestion of contaminated water. The pathogen colonizes the gastric epithelium, surviving the highly acidic lumen. Most individuals with H. pylori remain asymptomatic. How­ever, infection predisposes to the devel­opment of diseases such as peptic ulcers, gastric cancers, and lymphoma, all of which can cause chronic bleeding, resulting in iron defi­ciency anemia (de Brito et al., 2019).

Genetic hemo­globin disorders can cause low Hb concen­trations and anemia. They are classified as thalassemias when there is impaired synthesis of Hb chains. Alternatively, in structural Hb variants (i.e. hemo­globin­opathies), defective Hb chains result from genetic alterations in the physical structure of Hb. The clinical features of genetic Hb disorders vary with the severity of the genetic defect and their effect on the structure and function of Hb. Some present as mild-to-severe anemia, whereas others are associated with multiple clinical complications that may sometimes be severe enough to cause death in utero (Weatherall, 2010; Weatherall and Clegg, 2001). Certain serum bio­markers (e.g., ferritin and trans­ferrin receptor) are affected, limiting their usefulness for detecting iron defi­ciency where these disorders are widespread; see “Genetic Hb disorders” in both Section 17.7.1 and Section 17.9.1 for more details.

In a recent study of the global burden of anemia, cases of anemia in 11.6% of females and 9.9% of males were said to be caused by genetic Hb disorders (Kassebaum et al., 2016), the most common being α‑ and β‑thalassemia, together with sickle cell anemia, Hb C and Hb E. Heterozygous carriers of these diseases may have mild or no abnormalities (Barrera-Reyes and Tejero, 2019). These genetic Hb disorders are highly prevalent in areas of the world where there is a high burden of malaria as even the hetero­zygous con­ditions appear to confer some protection against severe malaria.

Genetic disorders of red-cell enzymes are also associated with the risk of anemia, the most frequent being glucose-6-phos­phate dehydro­genase (G6PD) defi­ciency. This disorder renders red blood cells susceptible to chronic, or acute intermit­tent, hemolytic anemia under con­ditions that induce oxidative stress (e.g., certain medications, infections, or exposure to certain foods). Like some genetic Hb disorders, G6PD defi­ciency is common in malaria-endemic regions, with the highest preva­lence reported in Sub-Saharan Africa (Barrera-Reyes and Tejero, 2019).

Certain disease states such as HIV infection and tuber­culosis can reduce Hb concen­trations. In HIV infection, low Hb values may arise indirectly from oppor­tunistic infections, nutritional deficiencies, and the negative effects of anti­retro­viral therapy. Direct effects of HIV include an effect on hematopoietic progenitor cells and a reduced responsiveness to erythropoietin (WHO, 2017).

In tuberculosis, low Hb values arise from increased blood loss (e.g., blood in sputum), hemolysis, and decreased produc­tion of erythro­cytes. Poor appetite, and thus reduced food intake, is an additional contributing factor, which in turn may result in inadequate intakes of multiple micro­nutrients (WHO, 2017).

Both HIV infection and tuberculosis are also linked to the anemia of chronic disease, as noted above. For discussion of the impact of the anemia of chronic disease on ferritin concentrations, see “Acute and chronic liver disease” in Section 17.7.1.

17.2.2 Interpretive criteria — use of the distribution of hemoglobin reference values

At the popu­lation level, Hb concen­trations can be evaluated by comparison with a distribution of Hb reference values using percentiles. The reference values are obtained from a cross-sectional analysis of a reference sample group, which in most cases consists of an apparently “healthy” popu­lation sampled during a nationally representative survey and assumed to be disease-free. Conse­quently, some of the participants in the sample will have Hb concen­trations lower than normal because they are affected by some of the factors, discussed above, known to influence Hb concen­trations. Nevertheless, such reference distributions are often used as a standard for comparison with the Hb distribution measured in a study popu­lation. In some cases, the distribution of Hb concen­trations has instead been drawn from a “healthy” reference sample from which participants with conditions known to affect Hb concen­trations have been excluded.

17.2.3 Hemo­globin cutoff values

Individuals in a popu­lation can be classified as “at risk” by comparing Hb concen­trations with either statistically pre­deter­mined reference limits drawn from the reference distribution, or clinically or functionally defined “cutoff values”. For Hb, where a single reference limit has been defined, in most cases this has been set at the 5th percentile of a “healthy” reference distribution, or of one in which the popu­lation sampled is assumed to be disease-free, as noted in Section 17.2.2. Technically, a statistically defined reference limit is not a “cutoff value” because it has not been linked with physio­logical or health outcomes, so these two terms should not be used interchangeably. Nevertheless, the term “cutoff values” is invariably applied irrespective of the method used to define them. For more details, see Chapter 15: Evaluation of nutritional bio­markers.
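To make this concrete, the minimal sketch below (Python, with simulated data; the variable names and the single ferritin-based exclusion rule are ours for illustration) shows how a 5th percentile reference limit can be derived from an apparently “healthy” reference sample:

```python
# Sketch: deriving a statistically defined reference limit as the 5th
# percentile of Hb values from an apparently "healthy" reference sample.
# Data are simulated; real analyses (e.g., Addo et al., 2021) also exclude
# inflammation, vitamin A deficiency, and known malaria.
import numpy as np

rng = np.random.default_rng(1)
hb = rng.normal(135, 10, size=5000)            # simulated Hb values (g/L)
ferritin = rng.lognormal(3.5, 0.6, size=5000)  # simulated serum ferritin (µg/L)

# Exclude iron-deficient individuals before estimating the percentile
healthy_hb = hb[ferritin >= 15]                # 15 µg/L cutoff for women

reference_limit = np.percentile(healthy_hb, 5)
print(f"Hb reference limit: {reference_limit:.1f} g/L")
```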

Table 17.7 compares the Hb “cutoff” values (g/L) set in 1968 by WHO (WHO, 1968) and by Looker et al. (1997) for the United States. The WHO cutoffs were based initially on data from small studies of predominantly Caucasian Europeans and Canadians, whereas the U.S. Hb data were from the NHANES III survey (1988‑1991). In both cases, the cutoffs are based on the 5th percentile values of the reference distribution, although the distribution compiled by Looker et al. (1997) comprised a “healthy” reference sample because those with conditions known to affect Hb concen­trations were excluded, as noted in Section 17.2.2. Technically, because these “cutoff values” were statistically derived rather than linked with physio­logical or health outcomes, both should be referred to as “reference limits”, as noted above.

Concern has been raised over the appro­priateness of these statistically derived reference limits for defining anemia among certain age groups and ethnicities (Johnson-Spear and Yip, 1994; Beutler and Waalen, 2006; Jorgensen et al., 2019). As a result, Addo et al. (2021) have used cross-sectional data from a multi­national sample across 27 nutrition surveys from 25 countries to re-examine the appro­priateness of using the 5th percentile WHO (1968) Hb cutoffs across countries for pre­school children and nonpregnant women. Only data from “healthy” individuals were included, so persons with iron defi­ciency (ferritin < 12µg/L for children or < 15µg/L for women), vitamin A defi­ciency (serum retinol-binding protein or retinol < 0.70µmol/L), inflam­mation (C‑reactive protein (CRP) > 5mg/L or α‑1‑acid glyco­protein (AGP) > 1g/L), or known malaria, were excluded.
Table 17.7. Two dif­fer­ent sets of hemo­globin cutoffs used to define anemia. The INACG / WHO / UNICEF cutoffs are for individuals living at sea level and are from Stoltzfus and Dreyfuss (1998).
Hemo­globin cutoff value (g/L)

NHANES III (from Looker et al., 1997)
Age (y)      Males     Females
1‑2          < 110     < 110
3‑5          < 112     < 112
6‑11         < 118     < 118
12‑15        < 126     < 119
16‑19        < 136     < 120
20‑49        < 137     < 120
50‑69        < 133     < 120
≥ 70         < 124     < 118

INACG / WHO / UNICEF
Group                    Cutoff
Children 0.5‑5y          < 110
Children 5‑11y           < 115
Children 12‑13y          < 120
Men                      < 130
Pregnant women           < 110
Nonpregnant women        < 120
The inter­survey variance around the 5th percentile of Hb was low (3.5% for women and 3.6% for children), supporting the approp­riateness of pooling multi­national Hb data to derive a single popu­lation-group-specific 5th percentile cutoff. The pooled 5th percentile estimates for apparently “healthy” individuals were 96.5g/L (95% CI: 92.6‑100.4g/L) for children aged 6‑59mos and 108.1g/L (95% CI: 103.5‑112.7g/L) for nonpregnant women aged 15‑49y, both lower than the WHO (1968) estimates for these age groups shown in Table 17.7. Moreover, when Hb fell below 96.1g/L for the children and below 110.1g/L for the women, there was a compen­satory increase in erythro­poiesis, as indicated by a plot of Hb concen­trations against sTfR (a bio­marker of tissue iron defi­ciency and a physio­logical indicator of erythropoiesis) (Section 17.9). Nevertheless, the invest­igators highlight the need for more studies based on clinical outcomes to further confirm the validity of these revised Hb cutoffs for anemia in these two age groups; see Addo et al. (2021) for more details.

Cutoffs for infants and young children differ across countries and agencies. For iron-replete, breastfed Swedish infants aged 4‑9mos, the cutoffs for anemia defined by Domellöf et al. (2002) are < 105g/L at 4mos, < 105g/L at 6mos, and < 100g/L at 9mos. In contrast, WHO (1968) set a single cutoff of < 110g/L for infants and young children aged 0.5‑5y, higher than the pooled 5th percentile Hb cutoff reported above (96.5g/L) (Addo et al., 2021). Further, Addo et al. report that infants aged 6‑11mos had lower Hb levels than children > 48mos (P < 0.001) after adjust­ment for sex, Hb assessment method, and survey. Several other investigators have questioned the use of the same Hb cutoff for infants aged 6‑11mos and young children (Emond et al., 1996; Sherriff et al., 1999; Wharton, 1999). The American Academy of Pediatrics recom­mends a cutoff of < 110g/L for children aged 1‑3y (Baker et al., 2010), which is within the range set by Looker et al. (1997) for U.S. children (Table 17.7).

Cutoffs for adults are also shown in Table 17.7. For non-pregnant women, cutoffs set by WHO (1968) and Looker et al. (1997) are comparable, although higher than the multi­national cutoff (108.1g/L) proposed for non-pregnant women (15‑49y) by Addo et al. (2021). Sekhar et al. (2017)
Table 17.8. Hemo­globin levels (g/L) to diagnose anemia at sea level. Source: Hemo­globin concen­trations for the diagnosis of anemia and assessment of severity. Geneva: World Health Organization (2011).

Popu­lation                  No anaemia   Mild       Moderate   Severe
Children, 6‑59mos            ≥ 110        100‑109    70‑99      < 70
Children, 5‑11y              ≥ 115        110‑114    80‑109     < 80
Children, 12‑14y             ≥ 120        110‑119    80‑109     < 80
Non-pregnant women, ≥ 15y    ≥ 120        110‑119    80‑109     < 80
Pregnant women               ≥ 110        100‑109    70‑99      < 70
Men, ≥ 15y                   ≥ 130        110‑129    80‑109     < 80
have also revised the cutoff values for detecting iron defi­ciency anemia among non-pregnant women of reproductive age (120g/L) shown in Table 17.7. They used receiver operating characteristic (ROC) curves based on U.S. NHANES data (2003‑2010) to define Hb cutoffs for iron defi­ciency anemia with enhanced sensitivity, in an effort to improve the detection of iron defi­ciency in this population. These were: < 128g/L for women of reproductive age; < 127g/L for older women; < 125g/L for black women; and < 130g/L for non-black women. Iron defi­ciency was defined by body iron calculated using serum ferritin and trans­ferrin receptor (Section 17.10.1). Note that the selection criteria applied by Sekhar et al. (2017) to define a healthy U.S. popu­lation were not as rigorous as those used by Addo et al. (2021) in their multi­national study.

Note that the Hb cutoff defined by Sekhar et al. (2017), based on U.S. NHANES data for non-pregnant women aged 12‑49y, is higher (128g/L) than the pooled 5th percentile estimate based on non-pregnant women aged 15‑49y in the multi­national study (108.1g/L).
Table 17.9. Classification of anemia as a problem of public health significance. Source: Hemo­globin concen­trations for the diagnosis of anemia and assessment of severity. Geneva: World Health Organization (2011).

Trigger levels for the
preva­lence of anemia (%)   Category of public health significance
≤ 4.9                       No public health problem
5.0‑19.9                    Mild public health problem
20.0‑39.9                   Moderate public health problem
≥ 40                        Severe public health problem
Differences in the exclusion criteria applied, study settings, age range, and the statistical methods used to derive the cutoff values may account for some of this discrepancy, empha­sizing the importance of careful selection and justification for any Hb cutoff applied.

WHO have also adopted “thresholds” to classify anemia as mild, moderate, or severe (Table 17.8). These thresholds can be used in conjunction with the “trigger” levels based on the preva­lence of anemia in each category to determine the public health significance of anemia in a popu­lation, as shown in Table 17.9.
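As an illustration only, the sketch below (Python) applies the thresholds of Table 17.8 (for non-pregnant women at sea level) and the trigger levels of Table 17.9; the function names and survey values are hypothetical:

```python
# Sketch: classifying individual severity (Table 17.8, non-pregnant women)
# and the public health significance of anemia in a population (Table 17.9).

def anemia_severity(hb_g_l: float) -> str:
    """Severity class for a non-pregnant woman (Hb in g/L, at sea level)."""
    if hb_g_l >= 120: return "no anaemia"
    if hb_g_l >= 110: return "mild"
    if hb_g_l >= 80:  return "moderate"
    return "severe"

def public_health_significance(prevalence_pct: float) -> str:
    """Category of public health significance (Table 17.9)."""
    if prevalence_pct <= 4.9:  return "no public health problem"
    if prevalence_pct <= 19.9: return "mild public health problem"
    if prevalence_pct <= 39.9: return "moderate public health problem"
    return "severe public health problem"

hb_survey = [132, 118, 104, 96, 125, 78, 141, 116]   # hypothetical Hb values (g/L)
n_anemic = sum(anemia_severity(hb) != "no anaemia" for hb in hb_survey)
prevalence = 100 * n_anemic / len(hb_survey)         # 62.5%
print(public_health_significance(prevalence))        # "severe public health problem"
```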

Cutoffs for the elderly are also included in Table 17.7. Note that no specific Hb cutoff has been set by WHO for the elderly, even though there is some support for such an approach (Röhrig, 2016). As noted above, Sekhar et al. (2017) recom­mended a Hb cutoff of < 127g/L for older U.S. women based on ROC curves.

Cutoffs during preg­nancy that differ according to trimester should now be used. WHO has adopted the CDC trimester-specific cutoffs to identify anemia in pregnant women (WHO, 2016): 1st trimester, 110g/L; 2nd trimester, 105g/L; 3rd trimester, 110g/L. They were derived from gesta­tional month-specific 5th percentile values for pooled data from four European studies with very small sample sizes.

Table 17.10. Thresholds for individual women (for clinical use) according to deviations from the new normative trajectories for hemo­globin (Hb) in preg­nancy. Data from Ohuma et al. (2020).

Gestational age-specific cutoff
(normative percentile)            Probable diagnosis
< 3rd percentile                  Low Hb concentration
3rd‑4.99th percentile             At high risk of low Hb concentration
5th‑9.99th percentile             At moderate risk of low Hb concentration
≥ 10th percentile                 Normal Hb
More recently, Ohuma et al. (2020) have developed gesta­tional age-specific cutoffs derived from the percentiles presented in Appendix 17.2. Cutoffs to determine whether the Hb concentration of a pregnant woman deviates from the gesta­tional age-specific normal Hb concentration (defined as Hb ≥ 10th percentile) are shown in Table 17.10. Figure 17.4 shows the summary estimates of the associ­ation between maternal Hb concentration (g/L) measured at any point during preg­nancy and low birth­weight, by Hb cutoffs ranging from ≤ 70 to ≥ 140g/L. In this meta-analysis, low maternal Hb (at any time during preg­nancy, by cutoff) was associated with increased odds of low birth­weight (OR: 1.42; 95% CI: 1.31‑1.55). For high maternal Hb, the trend was similar (OR: 1.80; 95% CI: 0.86‑3.77) but not signif­icant.

In some preg­nancy studies, a fixed high Hb cutoff > 130g/L has been applied (Oaks et al., 2019), despite dif­fer­ent gesta­tional ages at the time of blood sampling. Uncertainties remain about the use of a fixed high Hb cutoff and the reason for the higher Hb concen­trations; whether they are associated with failure to expand plasma volume, which in turn is associated with increased blood viscosity and decreased placental perfusion, is not clear (Fisher and Nemeth, 2017).
Figure 17.4
Figure 17.4. Meta-analysis summary of estimates of the associ­ation of maternal Hb concentration (g/L) measured at any point during preg­nancy and low birth­weight by Hb concentration cutoffs. Redrawn from Young et al. (2019).

17.2.4 Using hemo­globin distribution to assess popu­lation iron status

The distribution of Hb can be used in large-scale field studies to assess the iron status of a popu­lation (Yip et al., 1996). This simplified approach is designed to assess the preva­lence and etiology of anemia, based solely on Hb, and is especially useful when it is not feasible to use multiple bio­chem­ical tests for iron status because of cost or operational constraints. It is also useful in developing countries where factors other than inadequate intakes of dietary iron, such as parasitic infections and genetic Hb disorders, often affect red cell produc­tion and thus interfere with the inter­pretation of iron status measures.

The Hb distribution approach involves comparing the Hb distribution curves for men, women, and children of the study popu­lation, with optimal Hb distributions derived from a “healthy” reference sample.
Figure 17.5
Figure 17.5. Distribution of hemo­globin in children aged 1‑5y and in women and men aged 18‑44y. Data from NHANES II after exclusion of subjects with abnormal values for indicators of iron status. Redrawn from Yip et al. (1996).
These have been compiled from U.S. NHANES II (Pilch and Senti, 1984) and NHANES III (Looker et al., 1997) popu­lations by excluding subjects with bio­chem­ical evidence of iron defi­ciency, as noted earlier. Figure 17.5 depicts the Hb distributions for the NHANES II reference samples. These distributions can be used as a standard for comparison with the Hb distributions from other surveys.

If anemia is prevalent in the target popu­lation then the Hb distribution will be shifted to the left relative to the reference. The distribution approach can also indicate when inadequate dietary intake of iron is the main factor causing iron defi­ciency in a popu­lation. If this is the case, then the Hb distributions for children and women are both affected. Both subgroups will have signif­icantly lower median Hb values when compared with their respective reference distributions, whereas the median of the distribution for adult men in the target popu­lation is virtually unaffected. For example, comparison of the Hb distributions for school-aged children from Zanzibar with the corres­ponding U.S. reference sample for African Americans shows a marked shift of Hb concen­trations (Figure 17.6). In Zanzibar, the decreased Hb levels were related to gastro­intestinal blood loss as a result of hookworm infection rather than inadequate dietary iron intake.
Figure 17.6
Figure 17.6. Distribution of Hb in a reference popu­lation of African American children with no bio­chem­ical signs of iron defi­ciency and school-age children from Zanzibar. Redrawn from Yip et al. (1996).

17.2.5 Measure­ment of hemo­globin

Hemo­globin can be determined in fasted or non-fasted blood samples and is best measured in venous blood anti­coagulated with EDTA. Alternatively, capillary blood from the heel, ear, or finger, collected in heparinized capillary tubes, can be used, although such measure­ments are less precise than those on venous blood, primarily because interstitial fluid may dilute capillary samples (Burger and Pierre-Louis, 2003).

The cyanmethemo­globin method, recom­mended by the International Committee for Standardization in Hematology (ICSH, 1987), is the most reliable, provided that the blood specimen has been accurately diluted. Incorrect dilution of the sample is one of the main sources of error in this method (Pilch and Senti, 1984). The method involves converting all of the usually encountered forms of Hb (oxyhemo­globin, methemo­globin, and carboxy­hemo­globin) to cyanmethemo­globin. The analysis is then normally performed with a spectro­photometer. The reagents for this method are light-sensitive and poisonous. The coefficients of both analytical and biological variation for Hb by the cyan­methemo­globin method using venous blood are often less than 4% (Worwood, 1996). A WHO inter­national standard is available to assess the accuracy of the assay (ICSH, 1987). Hemo­globin can also be determined in this way from blood spots collected in the field on filter paper discs; levels remain unchanged during storage for 1mo (Feraudi and Mejia, 1987).

Alternatively, Hb can be measured with an automated cell counter (e.g., a Coulter Counter) using EDTA-anti­coagulated blood; most automated cell-counter analyzers use the cyanmet­hemo­globin method. A portable hemo­globin photometer can be used for popu­lation assessment in remote field settings. The “HemoCue” is battery operated and uses a dry reagent (sodium azide) in a microcuvette for direct blood collection and measure­ment. In a newer model (Hb‑301), cuvettes can be stored over an extended temperature range (10‑40°C) compared with earlier models (15‑30°C), and the use of a control cuvette is no longer necessary. The accuracy and precision of Hb values based on the HemoCue are comparable to those obtained using the cyanmet­hemo­globin method, provided standardized procedures for sample collection and analysis are followed (von Schenck et al., 1986). Details of standardization procedures to enhance the accuracy and reliability of Hb concen­trations measured by the HemoCue are given in Burger and Pierre-Louis (2003) and in the U.S. CDC Nutrition Survey Toolkit.

17.3 Hematocrit or packed cell volume

The hematocrit is defined as the volume percen­tage or volume fraction of packed red blood cells (RBC) in blood, and is sometimes termed the “packed cell volume” (PCV). Variation with age and sex is shown in Figure 17.7,
Figure 17.7
Figure 17.7. Median values for hematocrit by age and sex. Redrawn from Yip et al. (1984).
with higher values in males than in females after puberty, as observed for Hb concen­tration (Zierk et al., 2011). As described for Hb, hematocrit values decline steadily during preg­nancy as a result of a greater increase in plasma volume relative to the increase in red blood cell mass, reaching a nadir at about 28‑36wks in women not taking iron supple­ments. How­ever, the hematocrit is only useful if Hb values are not available.

In iron defi­ciency, the hematocrit falls after Hb formation has become impaired. As a result, in early cases of moderate iron defi­ciency, a marginally low Hb value may be associated with a nearly normal hematocrit (Graitcer et al., 1981). Only in more severe iron-defi­ciency anemia are both Hb and hematocrit reduced. Limitations of the hematocrit determination are shown in Box 17.2.
Box 17.2: Limitations of the hematocrit determination
Hematocrit, like Hb, is usually determined on EDTA‑anti­coagulated blood from a venipuncture or capillary blood. The measure­ment is relatively easy, rapid, and often used in screening for iron-defi­ciency anemia, although the technical errors for the measure­ment of hematocrit are greater than for Hb. Measure­ment errors may result from:

17.3.1 Interpretive criteria

Table 17.11. Hematocrit cutoffs used to define anemia. From Stoltzfus and Dreyfuss (1998).
Age (y) or group Hematocrit cutoff
(volume fraction)
0.5‑5 < 0.33
5‑11 < 0.34
12‑13 < 0.36
Men < 0.39
Nonpregnant women < 0.36
Pregnant women < 0.33
Results from two of the earlier U.K. Diet and Nutrition Surveys (Gregory et al., 1990; Gregory et al., 2000) present the mean, median, and lower and upper 2.5th or 5th percentiles for hematocrit for adults and older children, respectively. Information on U.K. National Survey years 9‑11 (2016‑2017 and 2018‑2019) includes methodological details. A condensed (“zipped”) compre­hen­sive set of data tables based on information from U.K. National Survey years 1‑9 includes more recent data on Hb but not hematocrit.

The hematocrit cutoff values at sea level used to define anemia and compiled by INACG / WHO / UNICEF, are presented in Table 17.11.

17.3.2 Measure­ment of hematocrit

Hematocrit can be measured manually by centrifuging a small amount of blood in a heparin­ized capillary tube until the red cells have been reduced to a constant packed cell volume. The hematocrit is calculated by comparing the height of the column of packed red cells with the height of the entire column of red cells and plasma, and expressed as the volume percen­tage or volume fraction of packed red blood cells. How­ever, values measured in this way frequently have poor reproducibility, especially when the power supply is unstable and the equipment not well standardized (WHO, 2007).

Alternatively, hematocrit can be derived using automated electronic cell counters. These measure mean cell volume (MCV) and the number of red blood cells from which hematocrit is calculated as MCV × RBC concen­tration. Hematocrit values generated in this way tend to be about 1% higher than those generated manually (Looker et al., 1995).
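As a worked example with illustrative values (MCV = 90fL; RBC = 5.0 × 10¹²/L): \[\small \mbox{Hematocrit} = 90\,\mbox{fL} \times 5.0 \times 10^{12}/\mbox{L} = 4.5 \times 10^{14}\,\mbox{fL/L} = 0.45\] i.e., a volume fraction of 0.45, or 45%.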

17.4 Red-cell indices

Red-cell indices are derived from measure­ments of Hb, hematocrit, and the red blood cell count. They can indicate cell size and the concen­tration of Hb within the cell, allowing dif­fer­ent types of anemia to be diagnosed. Fresh samples of whole anti­coagulated blood are required. The accuracy, precision, and use of red-cell indices have increased with the growing availability of automated electronic cell-counting instruments.

Red-cell indices lack specificity. If subnormal values are noted in the absence of thalassemia trait, anemia of chronic disease, and other known causal con­ditions, additional measures of iron status are recom­mended to confirm the diagnosis of iron defi­ciency. The confirmatory tests commonly used are serum iron and total iron-binding capacity, serum ferritin, erythro­cyte proto­porphyrin, and more recently, soluble trans­ferrin receptor (sTfR). These tests are discussed in Sections 17.6 to 17.9.

Table 17.12. Expected levels of the red-cell indices during iron-defi­ciency anemia, macrocytic anemia, and anemia of chronic disease. MCV, mean cell volume; MCH, mean cell Hb; MCHC, mean cell Hb concen­tration. From Wintrobe et al. (1981).

Red-cell   Iron-defi­ciency      Macrocytic      Anemia of chronic
index      anemia (microcytic    anemia          disease (normocytic
           hypo­chromic)         (macrocytic)    normochromic)
MCV        Low                   High            Normal
MCH        Low                   High            Normal
MCHC       Low                   Normal          Normal

Table 17.12 summarizes the expected levels of the more impor­tant red-cell indices during iron-defi­ciency anemia, macrocytic anemia resulting from vitamin B12 or folic acid defi­ciency, and the anemia of chronic disease. The latter arises from inflam­mation due to infectious diseases (as noted earlier in “Acute and chronic inflam­mation” in Section 17.2.1), when pro-inflam­matory cytokines stimulate the produc­tion of hepcidin. This in turn reduces both the absorp­tion of dietary iron and the release of iron from body stores and from macrophages (which recycle iron from senescent erythro­cytes). If the inflam­matory con­dition persists, the supply of iron available for erythro­poiesis is reduced, and anemia may develop (Anderson and Frazer, 2017). In chronic inflam­mation, erythropoietin produc­tion may also be impaired and red-cell survival diminished (Raiten et al., 2015). The anemia of chronic disease is generally mild, and can be normo­cytic and normo­chromic (i.e., the average size and Hb content of the red blood cells are within normal limits) or micro­cytic and slightly hypo­chromic (i.e., red blood cells are small and con­tain less Hb than normal) (Andrews, 1999).

Changes in red-cell indices during vitamin B12 and folic acid defi­ciency are described in more detail in Chapter 22. In iron-defi­ciency anemia, the fall in Hb is followed by a fall in mean cell volume (MCV), then mean cell Hb (MCH), and finally mean cell Hb concen­tration (MCHC). With the widespread use of automated hematology analyzers, MCV has become a key red-cell parameter used to classify anemia (Table 17.12), although it is seldom used in the diagnosis of iron defi­ciency today.

Reference ranges for MCV, MCH, MCHC, and red-cell distribution width (RDW), derived from U.S. NHANES III and stratified by nine age groups and by sex and racial categories, are available. These are based on Coulter Counter data, with rigorous criteria applied to exclude individuals with evidence of acute or chronic disease. Representative plots of each red-cell index by sex for non-Hispanic whites, non-Hispanic blacks, and Mexican Americans depict the 2.5th, 5th, 50th, 80th, 95th, and 97.5th percentiles (Cheng et al., 2004).

17.4.1 Mean cell volume

Mean cell volume or mean corpuscular volume (MCV) is a measure of the average size of the red blood cells, expressed in femtoliters (fL). Cells may be abnormally large (macro­cytosis), as in vitamin B12 or folic acid defi­ciency, or abnormally small (micro­cytosis), as in iron and vitamin B6 defi­ciency. Mean cell volume is best determined directly with a Coulter Counter, as results obtained in this way are highly reproducible. Reported within-subject biological variation for MCV is low, varying between 0.6 and 1.3% (Buttarello, 2016). If a Coulter Counter is not available, MCV can be calculated from the hematocrit and the red blood cell count determined manually (Wintrobe et al., 1981): \[\small \mbox{MCV (fL) = }\frac{\mbox{hematocrit (volume fraction)} \times 1000}{\mbox{red blood cell count }(10^{12}/\mbox{L})}\] Mean cell volume is less affected by sampling errors in skin-puncture capillary blood samples than Hb, because red-cell size is unaffected if the sample is diluted by interstitial fluid.
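For example, with an illustrative hematocrit of 0.45 and a red blood cell count of 5.0 × 10¹²/L: \[\small \mbox{MCV} = \frac{0.45 \times 1000}{5.0} = 90\,\mbox{fL}\]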

A low MCV value only occurs when iron defi­ciency is severe. It is a relatively specific index for iron-defi­ciency anemia, provided that the anemia of chronic disease, certain genetic Hb disorders (α‑ and β‑thalassemias including hetero­zygotes), and lead poisoning are excluded. In macrocytic anemias associated with vitamin B12 or folate defi­ciency, MCV values are high (Table 17.12). In preg­nancy, MCV decreases slightly between 26 and 38wks gestation, likely attributed to the transfer of placental iron being most intense at this time, thus decreasing the iron availability for maternal erythro­poiesis (Fisher and Nemeth, 2017). Spuriously high values for MCV may be apparent as a conse­quence of hyper­glycemia or hyper­natremia (high level of sodium in the blood).

The MCV increases progressively from 6mos to early adulthood (Yip et al., 1984). Differences according to sex are small; MCV is slightly higher in young adult females than in males. Individuals of African descent have lower MCV values than do Caucasians. Such differences are said to have a genetic basis, as noted for Hb under "Ethnicity" (Section 17.2.1).

Percentile distributions for MCV levels by race, sex, and age, for all persons 1‑74y in the U.S. NHANES II popu­lation, are available, and for the U.S. NHANES II “healthy” reference popu­lation for all races as described for Hb in Section 17.2.2 (Pilch and Senti, 1984). Pregnant women and those with a higher risk of iron defi­ciency based on an abnormal value or missing value for 3 of the 4 tests (free erythro­cyte proto­porphrin, trans­ferrin satu­ration, serum ferritin, and MCV) were excluded.

Cheng et al. (2004) report reference interval diagrams (2.5th, 50th, and 97.5th percentiles) based on the U.S. NHANES III database stratified by nine age groupings, sex, and race (i.e., Mexican American, Non-Hispanic Black, and Non-Hispanic White). Stringent exclusion criteria were applied to generate a “healthy” reference sample popu­lation (10‑75y); details are available in Cheng et al. (2004).

Age- and sex-specific data for MCV are also available from the older U.K. Diet and Nutrition surveys: Gregory et al. (1990) and Gregory et al. (2000) present the mean, median, and lower and upper 2.5th or 5th percentiles for MCV for adults and older children, respectively. Potentially iron-deficient individuals were not specifically excluded. Information on U.K. National Survey years 9‑11 (2016‑2017 and 2018‑2019) includes methodological details. A condensed (“zipped”) compre­hen­sive set of data tables based on information from the U.K. National Survey (years 1‑9) includes more recent data on Hb and ferritin, but not MCV.

Table 17.13. Cutoffs for defining abnormal values of mean cell volume (MCV), trans­ferrin satu­ration (TS), and serum ferritin (SF) that were considered indicative of iron defi­ciency. Data from the third National Health and Nutrition Examination Survey (NHANES III). Subjects with high blood lead values were excluded. Iron defi­ciency was defined as having abnormal values for ≥ 2 of 3 tests (MCV, TS, and SF). From Mei et al. (2003).
Abnormal values

Age      Mean cell       Trans­ferrin        Serum
         volume (fL)     satu­ration (%)     ferritin (µg/L)
1‑2y     < 74            < 10                < 10
3‑4y     < 76            < 12                < 10
5y       < 77            < 12                < 10
15‑49y   < 81            < 15                < 12

Cutoff values for MCV used to define abnormal values for the U.S. NHANES III survey were determined from values in the U.S. Healthy People 2000 Final Review; these are shown in Table 17.13 and differ slightly from those used earlier to identify abnormal values in the U.S. NHANES III phase 1 (1988‑1991) (Dallman et al., 1996). Values for individuals with high blood lead values were excluded when defining these new cutoffs for three age groups for children plus those 15‑49y. Values greater than 98fL indicate macrocytosis.

17.4.2 Mean cell hemo­globin

Mean cell hemo­globin (MCH) is the mean Hb content of individual red blood cells, and provides information that is comparable to that of MCV. It is derived from the ratio of Hb to the red blood cell count:
\[\small \mbox{MCH (pg) = }\frac{\mbox{hemo­globin (g/L)}}{\mbox{red blood cell count }(10^{12}/\mbox{L})}\]

Percentile distributions by age and sex for MCH are available from the older U.K. Diet and Nutrition surveys: Gregory et al. (1990) and Gregory et al. (2000) present the mean, median, and lower and upper 2.5th or 5th percentiles for MCH for adults and older children, respectively. Potentially iron-deficient individuals were not specifically excluded. Information on U.K. National Survey years 9‑11 (2016‑2017 and 2018‑2019) includes methodological details. A condensed (“zipped”) compre­hen­sive set of data tables based on information from the U.K. National Survey (years 1‑9) includes more recent data on Hb and ferritin, but not MCH.
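As a worked example of the MCH calculation above, with an illustrative Hb of 150g/L and a red blood cell count of 5.0 × 10¹²/L: \[\small \mbox{MCH} = \frac{150}{5.0} = 30\,\mbox{pg}\] a value within the adult range of 27‑32pg noted below.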

In contrast, stringent precautions were applied by Cheng and co-workers (2004) to generate a health-associated sample popu­lation based on the U.S. NHANES III survey from which composite reference intervals for MCH according to age and race, classified by sex were obtained; race-related trends in MCH were apparent. The MCH changed progressively from infancy to adulthood, when values ranged from 27‑32pg. The MCH changes in iron-defi­ciency anemia are similar to those for MCV; MCH is low in iron-defi­ciency anemia but high in the macrocytic anemias of both vitamin B12 and folate defi­ciency (Table 17.12). In the latter, the red blood cells are laden with Hb but are reduced in number. In severe iron defi­ciency, the relative fall in MCH is greater than the corres­ponding fall in MCV (Dallman, 1977).

17.4.3 Mean cell hemo­globin concen­tration

If both the Hb concen­tration and the hematocrit are known, the concen­tration of Hb in the red blood cells can be determined. This is known as the mean cell hemo­globin concen­tration (MCHC) and is calculated as: \[\small \mbox {MCHC (g/L)} = \frac {\mbox {hemo­globin (g/L)} }{\mbox {hematocrit (vol. fraction) }}\] The MCHC is normally determined with a Coulter Counter, although manual determinations are possible. After the first few months of life, MCHC is less affected by age than any other red-cell index (Matoth et al., 1971). Never­the­less, this index is the least useful of the red-cell indices because it is the last to fall during iron defi­ciency.
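For example, with an illustrative Hb of 150g/L and a hematocrit of 0.45: \[\small \mbox{MCHC} = \frac{150}{0.45} \approx 333\,\mbox{g/L}\] a value within the normal adult range of 320‑360g/L noted below.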

Percentile distributions by age and sex for MCHC are available from the older U.K. Diet and Nutrition surveys: Gregory et al. (1990) and Gregory et al. (2000) present the mean, median, and lower and upper 2.5th or 5th percentiles for MCHC for adults and older children, respectively. Potentially iron-deficient individuals were not specifically excluded. Information on U.K. National Survey years 9‑11 (2016‑2017 and 2018‑2019) includes methodological details. A condensed (“zipped”) compre­hen­sive set of data tables based on information from the U.K. National Survey (years 1‑9) includes more recent data on Hb and ferritin, but not MCHC.

In the U.S. NHANES III sample of Cheng et al. (2004), stringent exclusion criteria allowed reference intervals for MCHC based on a “healthy” reference sample to be generated.

Mean cell hemo­globin concen­trations are low in iron-defi­ciency anemia but normal in the macrocytic anemia of vitamin B12 and folic acid defi­ciency, and in the anemia of chronic disease (Table 17.12). Values in normal adults range from 320‑360g/L. Values of < 300g/L indicate hypochromia and are associated with advanced iron defi­ciency.

17.5 Red-cell distribution width

RDW is a measure of the variation in red-cell size (i.e., aniso­cytosis). It is normally expressed as the percen­tage coefficient of variation of the mean cell volume: \[\small \mbox {RDW (%)} = \frac{\mbox {SD of MCV (fL) × 100%}}{\mbox {MCV(fL)}}\] The RDW can be determined routinely as part of a complete blood count on many Coulter Counters. Some counters output only the standard deviation as the measure of RDW.
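For example, with an illustrative SD of the MCV of 11.7fL and an MCV of 90fL: \[\small \mbox{RDW} = \frac{11.7 \times 100\%}{90} = 13\%\]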

RDW increases in iron-defi­ciency anemia and was included as a marker of iron status in the U.S. NHANES III survey (Looker et al., 1995), when a single cutoff for RDW of 14% was used. Based on their results on adults and children, Looker et al. (1995), how­ever, suggested that dif­fer­ent cutoffs are needed for children and adults. In the neonate, RDW is elevated because the relatively larger red blood cells con­taining fetal Hb are replaced by smaller cells con­taining Hb A (Lynch et al., 2018).

In the early 1980s, Bessman et al. (1983) classified anemic disorders based on a combination of a low, normal, or high MCV and two categories of RDW, normal (< 15.1%) and high (indicating anisocytosis). How­ever, in view of the numerous reported exceptions to this classification, together with differences in the reference intervals for the RDW depending on the analyzer manufacturer, this classification is seldom used today (Buttarello, 2016).

Clearly, a high RDW is not specific for iron defi­ciency. High values also occur in folate or vitamin B12 defi­ciency, and when iron and folate, or iron and vitamin B12, deficiencies coexist. In some genetic Hb disorders (e.g., Sβ‑thalassemia, sickle cell disease (SS), and sickle cell trait (SC)), RDW is also elevated, whereas in the anemia of chronic disease, RDW is normal. The inter­pretation of RDW is further complicated by the wide distribution of RDW values within a specific disease, and by the discrepancy in cutoffs among Coulter Counters from dif­fer­ent manufacturers (Buttarello, 2016).

Percentile distributions for RDW are available from the U.K. and New Zealand national surveys. For the U.K., the mean, median, and lower and upper 2.5th percentiles by age and sex are presented (Gregory et al., 1990; Gregory et al., 2000), whereas in the New Zealand Children's Nutrition Survey, data on RDW by age, sex, race or ethnicity, and the preva­lence of elevated values (defined as > 14%) were compiled (Parnell et al., 2003).

Cheng et al. (2004) have compiled reference interval plots (2.5th, 50th, and 97.5th percentiles) based on data from a “healthy” reference sample of the U.S. NHANES III data. These data for persons 10‑75y were stratified into nine groups by age, sex and race. A tendency for RDW to increase with age was noted, suggesting that age-specific cutoffs may be required.

17.6 Serum iron, TIBC, trans­ferrin, and trans­ferrin satu­ration

Figure 17.8
Figure 17.8. Laboratory measure­ments of iron indicators needed to calculate trans­ferrin satu­ration. TIBC: total iron-binding capacity; UIBC: unsaturated iron-binding capacity. Modified from Pfeiffer and Looker (2017).
Three interrelated variables, serum iron, total iron-binding capacity (TIBC), and trans­ferrin satu­ration have been used in the past for dif­fer­entiating between nutritional iron defi­ciency and anemia arising from chronic infections, inflam­mation, or chronic neoplastic diseases; their inter-relationship is shown in Figure 17.8.

How­ever, in view of the high biological variation of serum iron and trans­ferrin satu­ration, they are used mainly in clinical settings to assess iron status. Today, in popu­lation surveys, serum iron and trans­ferrin satu­ration have been replaced by serum ferritin and serum soluble transferrin receptor (sTfR).

Nutritional iron defi­ciency is charac­ter­ized by an elevated serum trans­ferrin and TIBC (not shown in Table 17.14), coupled with a low serum iron, and thus low trans­ferrin satu­ration, whereas in chronic infections, inflam­mation, or chronic neoplastic diseases, concen­trations of serum trans­ferrin, serum iron, and TIBC are all low, so trans­ferrin satu­ration is reduced (Table 17.14; Pfeiffer and Looker, 2017).
Table 17.14. Response of iron status indicators to a depletion of body iron compartments with and without concomitant inflam­mation and to an overload of body iron compartments. Symbols: ↓ Decreased; ~ Normal; ↑ Increased; ACD, anemia of chronic disease; EP, erythro­cyte proto­porphyrin; IDA, iron defi­ciency anemia; IDA + ACD, combined iron defi­ciency anemia and anemia of chronic disease; NA, not applicable; S.Ferritin, serum ferritin; S.Iron, serum iron; sTfR, soluble trans­ferrin receptor; TSAT, trans­ferrin satu­ration. Modified from Pfeiffer and Looker (2017).

Compartment        Indicator     IDA     ACD       IDA + ACD   Overload
Stored iron        S.Ferritin    ↓       ~ to ↑    ↓ to ~      ↑
Trans­port iron    S.Iron        ↓       ↓         ↓           ↑
                   Trans­ferrin  ↑       ↓         ↓           ↓
                   TSAT          ↓       ↓         ↓           ↑
                   EP            ↑       ↑         ↑           ~
                   sTfR          ↑       ~         ~ to ↑      ~
Functional iron    Hemo­globin   ↓       ↓         ↓           ~
Inflam­matory
response           NA            ~       ↑         ↑           NA

Serum iron represents the fraction of iron that circulates bound to the iron trans­port protein, trans­ferrin. The iron in the serum is derived mostly from iron recycled from catabolized red blood cells in the reticulo-endothelial system, with a concen­tration that normally ranges from 8.9‑21.5µmol/L (50‑120µg/dL). Almost all the iron in the serum is bound to trans­ferrin; non-trans­ferrin bound iron (NTBI) usually comprises < 1% of the total iron pool in the serum (see Section 17.11.3 for details of NTBI).

In the fasting state, serum iron levels reflect the fraction of iron that circulates bound principally to trans­ferrin and which is in transit from the reticulo-endo­thelial system to the bone marrow. After a meal, concen­trations of serum iron rise due to the release of iron absorbed by entero­cytes into the plasma pool. Hence, at this time, the measure­ment of serum iron will be repre­sentative of iron absorp­tion. Serum iron is reduced by infection and inflam­mation.

Serum trans­ferrin is the trans­port protein for serum iron. Theoretically, each mole­cule of trans­ferrin (MW ≈ 80kDa) is capable of binding two atoms of iron (atomic weight ≈ 55.8Da). When iron stores are exhausted and serum iron concen­trations are < 40‑60µg/dL, serum trans­ferrin increases in response to an increase in iron absorp­tion. Hence, serum trans­ferrin does not identify iron defi­ciency during the first stage of iron defi­ciency (i.e., iron depletion); for more details see Elsayed et al. (2016).

Total iron-binding capacity (TIBC) measures the total number of binding sites for iron atoms on trans­ferrin. Hence, serum TIBC is closely related to, and often used as a proxy measure of, trans­ferrin, rising once iron stores are exhausted owing to an increase in trans­ferrin synthesis in response to increased iron absorp­tion. Unlike serum iron, TIBC is not subject to rapid changes in concen­tration, either within a day (i.e., circadian variation) or from day to day.

Trans­ferrin satu­ration (as percent) is an estimate of the percen­tage of the two iron-binding sites on all trans­ferrin mole­cules that are occupied with iron. Trans­ferrin satu­ration measures the iron supply to the erythroid bone marrow: as the iron supply decreases, the serum iron concen­tration falls and the satu­ration of trans­ferrin decreases. Serum iron and TIBC are usually determined at the same time, so trans­ferrin satu­ration is calculated as the ratio of serum iron to total iron-binding capacity, as shown below: \[\small \mbox {Trans­ferrin Sat. (%)} = \frac {\mbox {Serum iron (µmol/L)}}{\mbox {TIBC (µmol/L)}}\mbox { × 100%}\] Trans­ferrin satu­ration can also be determined if unsaturated iron-binding capacity (UIBC) and serum iron are measured. In this case, TIBC is determined from the sum of serum iron and UIBC: \[\small \mbox {TIBC (µmol/L) = UIBC (µmol/L) + Serum iron (µmol/L)}\] and trans­ferrin satu­ration is then calculated as above. Alternatively, trans­ferrin satu­ration can be calculated from the ratio of serum iron to serum trans­ferrin (Ritchie et al., 1999).
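As a worked example with illustrative values (serum iron 9µmol/L; UIBC 51µmol/L): \[\small \mbox{TIBC} = 9 + 51 = 60\,\mbox{µmol/L; Trans­ferrin Sat.} = \frac{9}{60} \times 100\% = 15\%\] a satu­ration at the 15% level below which the delivery of iron to the developing erythro­cytes becomes insufficient, as discussed below.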

Trans­ferrin satu­ration was one of the multiple iron bio­markers in the “ferritin model” employed in the U.S. NHANES II and U.S. NHANES III surveys (see Section 17.10). In healthy individuals, trans­ferrin satu­ration (as percen­tage) usually ranges from 20‑45%. Levels < 15% satu­ration indicate that the rate of delivery of iron to the developing erythro­cytes is insufficient to maintain normal Hb synthesis, and hence are an early indicator of suboptimal iron supply. A prolonged period of time with trans­ferrin satu­ration below 15%, results in iron deficient erythro­poiesis which leads to changes in the number and shape of newly released reticulo­cytes and erythro­cytes (WHO 2007). Never­the­less, low trans­ferrin satu­ration levels are not specific for iron defi­ciency, as discussed below.

17.6.1 Factors affecting serum iron, TIBC, serum trans­ferrin, and trans­ferrin satu­ration

Biological variation for serum iron may be quite large, resulting in large daily variation in trans­ferrin satu­ration: coefficients of variation among individuals for serum or plasma iron may exceed 30% — mainly as a result of variation in the release of iron from the reticulo­endothelial system to the plasma. As a result, more than one sample is needed for a clinical evaluation. Table 17.4 compares the overall variability of serum iron with Hb and serum ferritin as indicated by within-subject, day-to-day coefficient of variation (as %) for healthy subjects. In contrast, serum trans­ferrin concen­trations are steadier.

In general, values for serum iron and thus trans­ferrin satu­ration tend to be elevated in the morning (about 35%, ranging from 20‑55%) but decrease in the afternoon and evening. As a result, measure­ments should be determined on fasting morning blood samples. In this way, the effects of both recent dietary intake and diurnal variation are minimized. Note that among night-shift workers, the pattern of diurnal variation may be reversed (Lynch et al., 2018).

TIBC is less subject to biological variation than is serum iron, as noted earlier, but it is more susceptible to analytical errors. Day-to-day variation is about 8‑12%.
Figure 17.9
Figure 17.9. Median trans­ferrin satu­ration (as %) by time of blood collection, Caucasian reference sample 20‑49y, U.S. NHANES III phase 1. From Looker et al. (1995).
The large biological variation for serum iron has obvious implications for its use at both the popu­lation and individual levels. Further­more, these limitations highlight the importance of using multiple indices of iron status to provide a more valid assessment of iron status than any single measure­ment (Section 17.10).

In U.S. NHANES III, the large diurnal variation for serum iron is reflected in the results for trans­ferrin satu­ration shown in Figure 17.9. This measure was included in U.S. NHANES II and III in the earlier MCV and ferritin models used to assess iron status, but has now been abandoned.

Age-related changes in serum iron and serum trans­ferrin are marked. Serum iron and trans­ferrin levels rise during childhood until about the second decade of life (Figure 17.10). Values decline steadily with advancing age (Yip et al., 1984).

Sex has only a small effect on serum iron and trans­ferrin satu­ration until the teenage years, after which levels in males increase whereas those for females remain relatively constant (Figure 17.10). In U.S. NHANES II, there was a tendency for serum iron levels to be higher in males than females after the middle of the second decade of life (Pilch and Senti, 1984). In U.S. NHANES III, the median trans­ferrin satu­ration was 26‑30% for men and 21‑24% for women (Appendix 17.3). This same trend of higher values in adult males than females 20‑70y, was also seen in the median trans­ferrin satu­ration values compiled by Ritchie et al. (2002).
Figure 17.10. Median values for serum iron/total iron binding capacity (TIBC) (%), serum TIBC (µmol/L), and serum iron (µmol/L), by age and sex. Redrawn from Yip et al. (1984).

Iron deficiency uncomplicated by the inflammatory response leads first to a gradual reduction in iron stores; once these are exhausted, serum iron levels fall. The low serum iron (and serum ferritin) levels down-regulate hepcidin synthesis, thus promoting iron absorption and greater iron release from storage sites into the circulation. As a consequence, serum transferrin increases to enhance the delivery of iron to the bone marrow to maintain effective erythropoiesis, so TIBC rises, resulting in a low transferrin saturation (Elsayed et al., 2016; Table 17.14).

Because these changes appear during the second stage of iron deficiency — termed iron-deficient erythropoiesis — when iron stores are exhausted, transferrin saturation is a more sensitive index of iron status than the red-cell indices discussed earlier in Section 17.4, but less sensitive than serum ferritin (Section 17.7). Transferrin saturation is more consistently useful for diagnosing iron deficiency than either serum iron or TIBC alone: a low transferrin saturation in association with an elevated TIBC is specific for iron deficiency. Nevertheless, because of the diurnal variation in levels and the confounding effect of inflammation (see below), percentage transferrin saturation is not a suitable biomarker for population surveys.

Oral contraceptive agents and estrogen replacement therapy (i.e., con­ditions of estrogen excess) may lead to an increase in trans­ferrin synthesis that is independent of body iron status. Levels of TIBC are also elevated, similar to those observed in iron defi­ciency (Pilch and Senti, 1984).

Preg­nancy leads to a steady decline in serum iron as the plasma volume expands, whereas serum trans­ferrin concen­tration increases, resulting in a decrease in trans­ferrin satu­ration. Because of these rapid changes during preg­nancy, serum iron and trans­ferrin satu­ration are rarely used for diagnosing iron defi­ciency in preg­nancy (Fisher and Nemeth, 2017). The magnitude of these changes is less in women sup­ple­mented with iron during preg­nancy.

Iron overload results in an elevated body iron store and, thus, a high serum iron (and ferritin) level, because of excessive absorption of iron and because there is no regulatory mechanism for controlling iron excretion. As a consequence, serum transferrin is reduced, TIBC falls, and transferrin saturation is elevated (Table 17.14; Pfeiffer and Looker, 2017). In the earlier U.S. NHANES II and III surveys, a transferrin saturation > 70% was the criterion for detecting iron overload in adults. However, to detect iron overload today, WHO advises the inclusion of transferrin saturation and serum ferritin, plus details on known genetic and clinical risk factors (Garcia-Casal et al., 2018).

The most common genetic risk factor for iron overload is hereditary hemochromatosis, with a worldwide prevalence in Caucasian populations of 3.5‑4.5 per 1000; it is more common in males. Clinical risk factors for other types of iron overload include hemolytic anemia, chronic liver disease, long-term transfusion therapy for genetic Hb disorders, and African iron overload (Elsayed et al., 2016).

Anemia of chronic disease is associated with infectious diseases (see Section 17.2.1) that induce the acute-phase response, the local reaction of which is inflam­mation. Pro-inflammatory cytokines promote the release of hepatic hepcidin which in turn blocks the release of iron from the body stores into the circu­lation (independent of iron status) and inhibits absorp­tion of dietary iron. There­fore, iron is no longer available for trans­port to the bone marrow for erythro­poiesis. Conse­quently, concen­trations of both serum iron and serum trans­ferrin fall, so levels of TIBC are low. Hence, percen­tage trans­ferrin satu­ration is decreased, a response primarily reflecting the decreased serum iron levels. There­fore, serum trans­ferrin acts as a negative acute phase protein, decreasing in the anemia of chronic disease, whereas in iron defi­ciency anemia, serum trans­ferrin concen­trations are increased (Table 17.14).

The rate at which serum iron concentrations return to normal after the acute-phase response varies among individuals, sometimes normalizing within 24‑48hrs of an acute infection. In contrast, during chronic inflammatory states such as arthritis, serum iron levels may remain low for a prolonged period, decreasing the availability of iron to cells and leading to the anemia of chronic disease. Such variability in the response causes uncertainty in the interpretation of serum iron in inflammation (WHO, 2007).

Obesity is associated with lower serum iron and transferrin concentrations, leading to iron deficiency unrelated to dietary intakes; Hb concentrations are not reduced (Menzie et al., 2008).
Table 17.15. Hematologic characteristics, iron status, and inflam­matory bio­markers in normal weight (NW) and overweight/obese (OW/OB) Swiss women (n=62). Difference between NW and OW/OB was assessed with independent samples t-test. P-values < 0.05 were considered signif­icant. NW (BMI 18.5‑24.9kg/m2), OW/OB (BMI 25‑39.9kg/m2), Hb Hemo­globin, TIBC total iron binding capacity, sTfR soluble trans­ferrin receptor, IQR interquartile range. Values are: a mean (± s.d.), and b geo­metric mean (95% confidence interval), c signif­icantly dif­fer­ent from OW/OB. Data from Cepeda-Lopez et al. (2019).
Measure    NW (n=24)    OW/OB (n=38)
Blood volume (mL/kg)a 74.9 ± 5.8c 61.8 ± 6.7
Blood volume (L)a 4.59 ± 0.46c 5.21 ± 0.82
Plasma volume (mL/kg)a 46.6 ± 4.8c 38.5 ± 4.7
Plasma volume (L)a 2.86 ± 0.30c 3.24 ± 0.54
RBC (mL/kg)a 28.2 ± 2.1c 23.4 ± 2.8
RBC volume (L)a 1.74 ± 0.22c 1.97 ± 0.33
Hb (g/dL)a 13.5 ± 0.9 13.7 ± 1.1
Hb mass (g/kg)a 9.2 ± 0.7c 7.6 ± 0.9
Total Hb (g)a 566 ± 71c 641 ± 107
Serum iron (µg/mL)a 1.06 ± 0.44 0.89 ± 0.33
Total serum iron (mg)a 3.02 ± 1.28 2.82 ± 1.03
TIBC (µg/mL)a 3.68 ± 0.66 3.39 ± 0.69
Total TIBC (mg)a 10.5 ± 2.28 10.9 ± 2.66
Trans­ferrin satu­ration (%) 29.8 ± 13.4 26.8 ± 10.6
sTfR (mg/L)b 6.61 (5.86, 7.45) 6.84 (6.16, 7.59)
Total sTfR (mg)b 18.8 (15.0, 21.5)c 21.9 (16.6, 28.7)
SF (µg/L)b 50.6 (40.2, 63.4) 61.35 (48.5, 77.6)
Total SF (mg)b 0.14 (0.10, 0.18)c 0.20 (0.13, 0.29)
Body iron (mg/kg)a 5.64 ± 2.70 5.70 ± 3.17
Serum hepcidin (ng/mL)a 9.20 ± 6.44 11.92 ± 6.3
Total hepcidin (µg)a 25.5 ± 17.6 38.3 ± 21.2
IL6 (pg/mL)b 0.69 (0.53, 0.91)c 0.97 (0.81, 1.17)
Total IL6 (ng)b 1.95 (1.36, 2.63)c 3.1 (2.19, 4.42)
CRP (mg/dL)b 1.05 (0.60, 1.85)c 2.68 (1.78, 4.01)
Total CRP (mg)b 29.87 (15.4, 53.5)c 85.6 (48.1, 151.6)
AGP (g/L)a 0.79 ± 0.2c 0.95 ± 0.3
Total AGP (g)a 2.24 ± 0.5c 3.10 ± 0.82
The mech­anisms involved may include hemo­dilution arising from an increased plasma volume in over­weight and obese individuals and/or the presence of adiposity-related inflam­mation. Support for these two mech­anisms has been provided by an elegant study in which plasma volume and inflam­mation were measured (Cepeda-Lopez et al., 2019).

Data in Table 17.15 confirm both a higher plasma volume and the presence of adiposity-related inflammation in the overweight/obese (OW/OB) women (BMI 25‑39.9kg/m2) compared to their normal-weight peers (BMI 18.5‑24.9kg/m2). Note the elevated serum concentration and total mass of IL‑6 in the OW/OB women. These higher levels stimulate an increase in circulating hepcidin which, in turn, is likely to impair iron absorption and induce the hypoferremia (i.e., low serum iron levels) observed in the OW/OB women compared to their normal-weight peers (Cepeda-Lopez et al., 2019). There is also an increase in Hb mass (i.e., total Hb in g) in OW/OB women, thus increasing their iron requirements for erythropoiesis and contributing to their increased risk for iron deficiency.

Decreased erythropoiesis reduces the demand for iron for Hb synthesis, so levels of serum iron remain normal or slightly above normal, and TIBC is normal or low. Consequently, transferrin saturation may be high. This condition occurs with chronic renal failure and aplastic anemia, as well as with the action of certain drugs and toxins.

Increased erythropoiesis occurs in conditions in which there is an increased demand for iron for Hb synthesis. Consequently, serum iron concentrations fall below normal limits, whereas TIBC is often high-normal or even high, and levels of circulating serum transferrin increase. These same trends occur in iron deficiency, so additional measures of iron status (e.g., serum ferritin) should be used to distinguish between increased erythropoiesis and iron deficiency. Increased erythropoiesis occurs during stages of the life cycle when needs for iron are increased, such as growth and pregnancy; it also occurs after recovery from bone marrow depression, in response to vitamin B12 and folate therapy, in hypoxia, and sometimes in polycythemia.

Hemolysis leads to falsely elevated serum iron concen­trations because the concen­tration of iron in red blood cells is higher than that of serum. There­fore, hemolyzed serum samples should not be used for the measure­ment of serum iron.

17.6.2 Interpretive criteria — use of the distribution of reference values for serum iron, serum trans­ferrin, TIBC, and trans­ferrin satu­ration (as %)

Data shown below for the U.S., U.K., and New Zealand are based on an apparently “healthy” popu­lation sampled during a national nutrition survey. Also included is an additional set of U.S. data from a “healthy” reference sample group.

17.6.3 Cutoffs for serum iron, trans­ferrin and trans­ferrin satu­ration (as %)

Cutoffs for infants are not clearly defined because age-related changes in serum iron and trans­ferrin are marked as shown in Figure 17.10. In newborns, serum iron concen­trations vary markedly, prohibiting the use of percen­tage trans­ferrin satu­ration as an indicator of iron status in neonates (Lynch et al., 2018).

Table 17.16. Cutoff points used for identifying abnormal values of iron status measures in the analysis of the NHANES III data. a 80µg/dL red blood cells (RBCs). b 70µg/dL RBCs. From Looker et al. (1997).
Age group (y)    Transferrin saturation (%)    Serum ferritin (µg/L)    Erythrocyte protoporphyrin (µmol/L RBCs)
1‑2 < 10 < 10 > 1.42a
3‑5 < 12 < 10 > 1.24b
6‑11 < 14 < 12 > 1.24
12‑15 < 14 < 12 > 1.24
≥ 16 < 15 < 12 > 1.24

Cutoffs for children for transferrin saturation (as a percentage) are given in Table 17.16, where they are classified into four age groups with values ranging from < 10% to < 14%. These cutoffs approximate the 12th percentile and are slightly lower than those used in U.S. NHANES II, because of a change in the serum iron method in U.S. NHANES III (see below) and the fact that more blood samples were drawn late in the day, when serum iron values are normally lower (Looker et al., 1997).

Cutoffs for adults were also compiled by Looker et al. (1997), with a transferrin saturation below 15% being associated with iron-deficient erythropoiesis, provided subjects with infection and inflammation are excluded (Lynch et al., 2018).

Cutoffs for iron overload in adult males and females are based on trans­ferrin satu­ration values > 70%. This value was used in both the NHANES II and III surveys, in combination with elevated serum ferritin levels (Section 17.7.1). In some studies, a lower cutoff value for trans­ferrin satu­ration for screening for iron overload in women (i.e., > 50%) has been used (Hallberg, 1995).
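For readers applying these criteria programmatically, the minimal Python sketch below encodes the age-specific cutoffs of Table 17.16 (Looker et al., 1997) together with the > 70% overload criterion described above. The function and variable names are illustrative only, and the sketch assumes that subjects with infection or inflammation have already been excluded.

```python
# Age-specific transferrin saturation (TS) cutoffs from Table 17.16
# (Looker et al., 1997), plus the > 70% overload criterion used in
# U.S. NHANES II and III. Cutoffs are undefined for infants (< 1y).
TS_CUTOFFS = [             # (upper age bound in years, TS cutoff in %)
    (2, 10.0),             # 1-2y
    (5, 12.0),             # 3-5y
    (15, 14.0),            # 6-11y and 12-15y share the same cutoff
    (float("inf"), 15.0),  # >= 16y
]

def classify_ts(ts_percent, age_y):
    """Flag a transferrin saturation value against the NHANES III cutoffs,
    assuming subjects with infection/inflammation were excluded."""
    if age_y < 1:
        raise ValueError("cutoffs are not defined for infants < 1y")
    if ts_percent > 70.0:
        return "possible iron overload: confirm with serum ferritin and risk factors"
    cutoff = next(c for bound, c in TS_CUTOFFS if age_y <= bound)
    if ts_percent < cutoff:
        return "below cutoff: consistent with iron-deficient erythropoiesis"
    return "within reference range"

print(classify_ts(12.0, 30))   # below cutoff (< 15% for adults)
```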

17.6.4 Measure­ment of serum iron, TIBC, and serum trans­ferrin

Serum iron can be assayed by a variety of methods, some of which can be performed using a clinical chemistry autoanalyzer. Colorimetric procedures frequently use ferrozine as the chromogen, which reacts with Fe2+ to form a violet complex (Giovanniello et al., 1968); other chromogens include bathophenanthroline and its sulfonated derivative (Elsayed et al., 2016). Alternatively, serum iron can be assayed directly by atomic absorption spectrophotometry. Often there are large discrepancies between serum iron concentrations assayed by different methods (Tietz et al., 1996), particularly at the lower end of the range where the diagnosis of iron deficiency is important (Worwood, 1996). Plasma iron concentrations normally range from 8.9‑21.5µmol/L (50‑120µg/dL), with those indicative of iron deficiency from 8.9‑10.7µmol/L (50‑60µg/dL). Factors that must be addressed when assaying serum iron to ensure reliable results are shown in Box 17.3.
Box 17.3: Procedures to ensure reliable serum iron results. Modified from WHO (2007).

Total iron binding capacity can be determined using a direct assay method that involves saturating the transferrin iron-binding capacity with excess ferric (Fe3+) iron, and then removing the excess unbound iron with an adsorbent such as magnesium carbonate, charcoal, or an anion-exchange resin. The iron that fully saturates transferrin (i.e., transferrin-bound iron) in solution is then measured (ICSH, 1978).

Uncertainties in this assay include variations in the type and amount of magnesium carbonate used to remove unbound iron, and in the concen­tration of the saturating-iron solution (Pilch and Senti, 1984). In 1990, the ICSH published revised recom­mendations for this assay (ICSH, 1990).

An indirect method can also be used to calculate TIBC. In this method, serum is incubated at near-neutral pH with a predetermined amount of Fe3+ that saturates all the available free binding sites on serum transferrin. A chromogen is then added to complex with the free unbound iron and, as the total amount of iron added is known, the unsaturated iron-binding capacity (UIBC) can be measured. TIBC is then calculated as the sum of UIBC and serum iron, as shown in Section 17.6.
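The arithmetic linking serum iron, UIBC, TIBC, and transferrin saturation can be made explicit with a minimal Python sketch, given below; the function names and example values are illustrative only, and consistent units (µmol/L) are assumed.

```python
def tibc(serum_iron_umol_l, uibc_umol_l):
    """Indirect TIBC: the sum of serum iron and the unsaturated
    iron-binding capacity (UIBC), both in µmol/L."""
    return serum_iron_umol_l + uibc_umol_l

def transferrin_saturation(serum_iron_umol_l, tibc_umol_l):
    """Transferrin saturation (%) = serum iron / TIBC x 100."""
    return 100.0 * serum_iron_umol_l / tibc_umol_l

# Illustrative values only: serum iron 12 µmol/L, UIBC 48 µmol/L.
total = tibc(12.0, 48.0)                    # 60 µmol/L
print(transferrin_saturation(12.0, total))  # 20.0 (%)
```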

Potential sources of error in the measurement of both serum iron and TIBC include contamination by exogenous iron, interference by lipids and bilirubin in the spectrometric methods, and a copper interference with colorimetric assays (Tietz et al., 1996). To overcome the copper interference, the assay employing ferrozine, used for serum iron and TIBC in U.S. NHANES III, was improved by the addition of 1% thiourea (Looker et al., 1997). This led to lower serum iron values (by about 1.6µmol/L) in this survey than those for U.S. NHANES II, with a much larger discrepancy at lower concentrations than at higher ones (Looker et al., 1995). Quality-control sera, with certified values for serum iron and TIBC and covering a wide range, should be included with each assay. A reference material for serum iron is available from the National Institute of Standards and Technology (NIST SRM 937), although no reference preparation exists for TIBC. Iron-free water should always be used.

Serum transferrin can be assayed using several immunological methods, including immunochemical turbidimetry, nephelometry, radial immunodiffusion, and enzyme-linked immunosorbent assays (ELISA). Gambino et al. (1997) recommend using an automated nephelometric method to assay serum transferrin and not the manual radial immunodiffusion method. Variations between these different methods have been reported, some arising from failure of the methods to take into account other iron-binding proteins (Elsayed et al., 2016). A certified reference material (CRM 470, the reference preparation for proteins in human serum, RPPHS) is now available for serum transferrin.

Measure­ments of iron trans­port are now often omitted from nutrition surveys because of the large volume of serum sample required, cumbersome analytical methods, and poor sensitivity and specificity for iron defi­ciency; their greatest value is in distin­guishing between the anemia of chronic disease and that of true iron defi­ciency (Table 17.14) (Cook, 1999).

17.7 Serum ferritin

Ferritin was first identified in human serum by Addison et al. (1972), where it represents only a small fraction of the body's ferritin pool. Its function in the serum is unknown. Ferritin appears to enter the serum by secretion from hepatocytes or specialized macrophages (Cook and Skikne, 1982). The ferritin molecule consists of an intracellular hollow protein shell composed of 24 subunits that surrounds an iron core containing 4000‑4500 iron atoms (WHO, 2020).

In most healthy individuals with no evidence of concurrent infection or inflammation, the concentration of serum or plasma ferritin parallels the total amount of storage iron (Cook et al., 1974). Once iron stores become exhausted, however, serum ferritin concentrations no longer reflect the severity of the iron-deficiency state. Serum ferritin is the only iron status measure that can reflect deficient, normal, and excess iron stores. Evidence for the quantitative relationship between serum ferritin and storage iron is shown in Box 17.4.
Box 17.4: Evidence for the quantitative relationship between serum ferritin and storage iron

Translation of serum ferritin values into the size of the iron stores should be made cautiously, in view of the limited knowledge of this relationship and the confounding factors that influence serum ferritin concen­trations. There is also considerable between-subject variation in the relationship between serum ferritin and tissue iron. In otherwise healthy individuals, a serum ferritin concen­tration of 1µg/L is said to be equivalent to approx­imately 8‑10mg of storage iron in an adult (Cook, 1999). During iron overload, how­ever, the proportion of iron stored as insoluble hemo­siderin increases. Besides providing a body store of iron, the uptake of iron into ferritin limits the capacity of iron to generate free radicals and prevents oxidative damage to tissues (Ghio et al., 2006).
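As a worked illustration of this relationship, the minimal Python sketch below converts a serum ferritin concentration into an approximate range of storage iron for an otherwise healthy adult, assuming the 8‑10mg per µg/L equivalence of Cook (1999); it should not be applied during inflammation or iron overload, and the function name is illustrative only.

```python
def approx_storage_iron_mg(ferritin_ug_l, low=8.0, high=10.0):
    """Approximate storage iron (mg) in an otherwise healthy adult,
    assuming 1 µg/L serum ferritin ~ 8-10 mg storage iron (Cook, 1999).
    Not valid during inflammation or iron overload."""
    return ferritin_ug_l * low, ferritin_ug_l * high

# A serum ferritin of 30 µg/L suggests roughly 240-300 mg of storage iron.
print(approx_storage_iron_mg(30.0))
```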

17.7.1 Factors affecting serum ferritin

Biological variation in serum ferritin levels is less marked than for serum iron. The overall within-subject, day-to-day coefficient of variation for serum ferritin in healthy subjects over a period of weeks is about 15% (see Table 17.4), compared with approximately 30% for serum iron; diurnal variation appears to be minimal. Analytical variation across different automated methods for serum ferritin corresponds to a CV of 10‑15%, whereas when the same analytical method is used the CV is < 10% (Blackmore et al., 2008).

Table 17.17. Changes in serum ferritin concen­trations with age. The U.S. NHANES III results are abstracted from IOM (2001) and are the median values and representative of all races. Breast-feeding infants and children were excluded.
Age group (y) Males (µg/L) Females (µg/L)
1‑3 22.9
4‑8 28.7
9‑13 32.9 30.6
14‑18 46.7 27.0
19‑30 112.5 36.0
31‑50 155.9 39.9
51‑70 154.5 89.4
≥ 71 137.1 95.3
Pregnant 28.0
Lactating 32.9
Age-related changes in serum ferritin concen­trations are relatively marked and differ according to sex. At birth, concen­trations are high, as iron stores in the liver are abundant. During the first 2mo, serum ferritin levels rise further, as iron is released from the fetal red cells and the rate of erythro­poiesis is slow. Serum ferritin concen­trations then fall throughout later infancy (Domellöf et al., 2002a). At about 1y, ferritin values increase again, and continue to rise into adulthood (WHO, 2020, Table 17.17).

Sex differences in serum ferritin concentrations exist. Among infants, males have lower values than females at 4, 6, and 9mo, differences that are not responsive to iron supplementation (Domellöf et al., 2002a). As a result, there may be a need to develop sex-specific cutoffs for serum ferritin in infancy. During adolescence, females have lower values than males, a trend that persists into late adulthood, reflecting their lower iron stores due to menstruation and childbirth. Levels in males reach a maximum between 30 and 39y, after which they remain constant until about 70y. In females, serum ferritin levels remain relatively low until menopause, after which they rise steeply (Table 17.17) (Zacharski et al., 2000; WHO, 2020).

Race is known to influence serum ferritin concen­trations. In U.S. NHANES III, adult male African Americans had higher serum ferritin values than Caucasians and Hispanics of comparable age for each decade of life (Figure 17.12).
Figure 17.12. Mean serum ferritin concentration by age and racial group for American men and women, U.S. NHANES III. Redrawn from Zacharski et al. (2000).
In adult females, serum ferritin levels were higher in African Americans only after menopause. Overall, serum ferritin values (as geometric means) were approximately 7‑8% greater for the African Americans than for the Caucasians throughout the second half of life. Such differences are unlikely to be associated with increased intakes of dietary iron but, instead, are likely due to genetic factors (Zacharski et al., 2000).

Iron deficiency leads to lower iron stores: serum ferritin values fall progressively as the stores decline, and before the characteristic changes in serum iron and TIBC. The decline in serum ferritin arises because hepcidin levels are suppressed in iron deficiency, thus facilitating the greater mobilization of iron from body stores. However, once body iron stores are depleted (i.e., serum ferritin concentrations < 15µg/L), serum ferritin concentrations are not indicative of the severity of iron deficiency (Pfeiffer and Looker, 2017). In frank iron-deficiency anemia, with its classic microcytic, hypochromic picture, serum ferritin levels are very low or zero. A low concentration of serum ferritin is characteristic only of iron deficiency (Dallman et al., 1980).

The decline in iron stores in uncomplicated iron deficiency has been confirmed in a systematic review by Garcia-Casal et al. (2018), in which serum ferritin and stained bone marrow aspirates were measured simultaneously and compared in studies of otherwise healthy individuals without inflammation and with both normal iron stores and iron deficiency (Table 17.18).
Table 17.18. Mean serum ferritin concentrations (µg/L) in adults classified as iron deficient or iron sufficient by bone marrow iron content in a systematic review. From Garcia-Casal et al. (2018).
                                   Apparently healthy adults    Non-healthy adults
Bone marrow Fe content = 0µg/L     15.10                        82.43
Number of studies included         9 (n=390)                    38 (n=1023)
Bone marrow Fe content ≥ 1µg/L     70.37                        381.63
Number of studies included         3 (n=151)                    38 (n=1549)
The mean serum ferritin concentration for those with iron deficiency (based on the gold-standard test of a bone marrow Fe content of 0µg/L) was lower (15.10µg/L; range 12‑18µg/L) than that when iron was present in the bone marrow (i.e., bone marrow Fe content ≥ 1µg/L; 70.37µg/L). The mean difference in serum ferritin concentration between iron-deplete and iron-replete healthy adults was approximately 55µg/L, as shown in Table 17.18 (Garcia-Casal et al., 2018).

Acute and chronic inflammation increase levels of pro-inflammatory cytokines in the systemic circulation, which in turn elevate serum ferritin, an acute-phase protein. At the same time, hepatic synthesis of hepcidin, also an acute-phase protein, is increased, thereby inhibiting both the release of iron from body stores and the absorption of dietary iron from the gastrointestinal tract. Reductions in food intake, and hence iron intake, during inflammation may also impair iron status. Hence, the assessment of iron status in settings where concomitant infection and inflammation occur is challenging and may result in an underestimate of the prevalence of iron deficiency in a population (Suchdev et al., 2017). This is illustrated by the high mean serum ferritin concentration (i.e., 82.43µg/L; n=1023) in Table 17.18 for non-healthy individuals, even though iron depletion had been confirmed by a bone marrow iron content of 0µg/L in these individuals (Garcia-Casal et al., 2018). For more details of the impact of inflammation on iron status indicators, the reader is referred to Raiten et al. (2015) and Lynch et al. (2018).

Because serum ferritin is an acute-phase protein, concentrations are misleadingly high during acute and chronic inflammation. The WHO (2020) therefore recommends the concurrent measurement of serum C‑reactive protein (CRP) and α‑1‑acid glycoprotein (AGP), which provide measures of the severity and duration of acute and chronic inflammation, respectively. The assay of CRP and AGP clearly aids the interpretation of serum ferritin concentrations in regions where infection or inflammation are widespread; see Section 17.7.3 for details of their use.

Preg­nancy results in a gradual decrease in serum ferritin concen­trations, reaching the lowest concen­tration in the third trimester, a decline partly attributed to hemo­dilution together with progressive depletion of iron stores in unsup­ple­mented women. At this stage, if iron stores are exhausted, serum ferritin will no longer reflect the severity of the iron deficient state. In women sup­ple­mented with iron, how­ever, the decline in serum ferritin in the third trimester is reduced (Figure 17.13).
Figure 17.13. Geometric mean serum ferritin concentrations during pregnancy in 63 women with iron supplementation and 57 women without iron supplementation. Redrawn from Fisher and Nemeth (2017).
In addition, because preg­nancy is an inflam­matory state, the potential effect of inflam­mation on serum ferritin concen­trations must be taken into account in their inter­pretation. As a conse­quence, including concurrent measure­ment of inflam­matory markers such as CRP and AGP with serum ferritin during preg­nancy is essential.

In view of these uncertainties, the use of low serum ferritin concentrations alone to define iron deficiency during pregnancy is not recommended. Additional iron biomarkers such as soluble transferrin receptor (sTfR) should also be included (see Section 17.9.1), as was done in the U.S. NHANES 1999‑2006 survey (Mei et al., 2011).

Concerns over high serum ferritin concentrations during the second or third trimester have been raised in view of their association with adverse pregnancy outcomes such as the increased risk of preterm delivery (Scholl, 1998). Whether these high ferritin concentrations reflect excessively high maternal iron stores, the presence of inflammation in complicated pregnancies, or the failure of the plasma volume to expand, is unclear (Fisher and Nemeth, 2017).

Oral contraceptive agents have been associated with higher serum ferritin concen­trations (Miller, 2014). Mechanisms proposed include lighter menstrual blood loss and possibly a direct relationship between estrogen and iron homeo­stasis. High levels of estrogen are said to inhibit the expres­sion of hepcidin, thus increasing uptake of iron into the body (Yang et al., 2012).

Iron overload is associated with a high serum ferritin concen­tration arising from excessive absorp­tion of iron. It is usually caused by autosomal recessive genetic conditions such as hereditary hemochromatosis, as well as other conditions shown in Table 17.19.
Table 17.19. Mean serum ferritin concentrations (µg/L) (n = number of individuals) in pathologies of different origins, classified as iron overloaded or non-iron overloaded according to liver iron content determined by biopsy, for studies included in the diagnostic test accuracy systematic review. From Garcia-Casal et al. (2018).
Disease/Condition    No iron overload (liver iron concentration < 3.2mg/g dry liver wt. or as defined by trialist)    Iron overload present (liver iron concentration > 3.2mg/g dry liver wt. or as defined by trialist)
Hemochromatosis    401.16 (n=78)    1043.82 (n=247)
Alcoholism, liver cirrhosis, liver disease    755.2 (n=154)    1322.63 (n=131)
Chronic hepatitis C    135.9 (n=526)    350.3 (n=213)
Non-alcoholic fatty liver disease, chronic viral hepatitis    1356.2 (n=6)    1425.5 (n=4)
All diseases/conditions    496.3 (n=764)    1052.0 (n=595)
Over time, the excess iron accumulates in these conditions, mainly as ferritin with a small amount as insoluble hemosiderin, most notably in the liver, eventually resulting in hepatic or cardiac complications unless treated. How­ever, in view of the high variability in the mean serum ferritin concen­trations noted in the cases with and without iron overload in Table 17.19, serum ferritin should not be used alone to detect iron overload. Instead, serum ferritin should be used alongside other iron bio­markers (e.g., trans­ferrin satu­ration) and known genetic and clinical risk factors (Garcia-Casal et al., 2018).

Obesity may be associated with normal to increased concentrations of serum ferritin as a result of adiposity-related chronic inflammation. The cytokines, notably interleukin‑6 (IL‑6) and leptin, released from the enlarged adipocytes stimulate an increase in circulating hepcidin levels that blocks the release of iron from body stores into the circulation (independent of iron status) and downregulates intestinal iron absorption (Gartner et al., 2013). Overweight and obese women also have a higher Hb mass than normal-weight women (Table 17.15), and thus a higher iron requirement (Cepeda-Lopez et al., 2019). Therefore, the impairment in iron absorption combined with an increased iron requirement may both contribute to the increased risk of iron deficiency in overweight and obese women, despite an apparently normal to increased concentration of serum ferritin (McClung and Karl, 2009; Cepeda-Lopez et al., 2019).

Certainly, in a study of Greek schoolchildren (9‑13y; n=1493), lower values for serum iron and transferrin saturation, and elevated serum sTfR concentrations indicative of iron-deficient erythropoiesis, were observed in boys in the highest quartile of percentage body fat (determined via bioelectrical impedance). In contrast, the corresponding concentrations of ferritin were higher than those in the lowest quartile for body fat, as shown in Table 17.20.
Table 17.20. Biochemical and dietary indices of iron status across quartiles of percentage body fat mass in prepubertal children (mean values and standard deviations). TIBC, total Fe-binding capacity; TS, transferrin saturation. *Derived from ANOVA. †Mean values were significantly different between lower and higher quartiles after post-hoc multiple comparisons (p < 0·05, Bonferroni rule). ‡Mean values were significantly different between middle and higher quartiles after post-hoc multiple comparisons (p < 0·05, Bonferroni rule). §Mean values were significantly different between lower and middle quartiles after post-hoc multiple comparisons (p < 0·05, Bonferroni rule). From Moschonis et al. (2012).
Percentage body fat mass    Lower quartile, Boys (n=183)    Middle quartiles, Boys (n=374)    Higher quartile, Boys (n=183)    p*
Bio­chem­ical serum indices
Fe (µg/L) 935 (339) 883 (341) 782†‡ (299) 0·001
TIBC (µg/L) 3388 (578) 3323 (497) 3421 (493) 0·144
TS (%) 27·8 (9·6) 26·8 (9·9) 23·2†‡ (9·1) 0·001
Ferritin (ng/ml) 27·3§† (16·0) 33·6§ (21·8) 36·3 (23·9) 0·001
A similar trend was observed for girls (Moschonis et al., 2012). Clearly, the increased levels of serum ferritin in the obese children could have contributed to an underestimation of iron deficiency in this cohort had the additional iron biomarkers not been measured.

Decreased erythropoiesis is associated with a decreased utilization of iron for Hb synthesis, so serum ferritin levels may be normal or slightly above normal. Decreased erythropoiesis may be associated with disease conditions such as chronic renal disease and aplastic anemia.

Increased erythro­poiesis leads to a decline in storage iron, reflected by a fall in serum ferritin levels owing to the increased demand for iron for Hb synthesis. Conditions in which increased erythro­poiesis occurs include the stages of the life cycle when iron needs are increased, such as during growth and preg­nancy, after recovery from bone-marrow depression, hypoxia, and in response to vitamin B12 and folate therapy.

Genetic Hb disorders such as the thalassemia traits and Hb EE may elevate serum ferritin (and sTfR) concentrations because of ineffective erythropoiesis and the short life span of red blood cells. As a consequence of these disturbances, levels of hepcidin are inappropriately low, resulting in an excessive absorption of iron that leads to iron overload and thus elevated serum ferritin levels (Manolova et al., 2019). As an example, in non-pregnant Cambodian women aged 18‑45y (n=420), the geometric mean serum ferritin (and sTfR) concentration was significantly higher in women classified with Hb genotype EE than in women with the normal Hb AA genotype (Karakochuk et al., 2015). Even during pregnancy, high serum ferritin concentrations have been reported among women with genetic Hb disorders such as sickle cell disease (Hb SS and Hb SC), some of whom had negligible or no iron in the bone marrow (Oluboyede, 1980). These findings highlight the difficulty of using serum ferritin (and sTfR) to assess the prevalence of iron deficiency in populations with a high prevalence of genetic Hb disorders.

Exercise, when moderate, has little effect on serum ferritin concentrations but, when severe, leads to an increase due to muscle damage and inflammatory reactions (WHO, 2007).

Malaria is associated with a high serum ferritin concen­tration irrespective of the parasite load. Several mech­anisms are involved, including the destruction of red blood cells, the acute phase response, suppressed erythro­poiesis, and the release of ferritin from damaged liver or spleen cells (WHO, 2007); see also “Malaria” in Section 17.2.1.

Acute and chronic liver disease may lead to abnormally high serum ferritin concen­trations (Table 17.19), probably arising from the release of ferritin from damaged liver cells. Liver damage may also interfere with the clearance of ferritin from the circu­lation (Worwood, 1997). These elevated serum ferritin levels do not necessarily reflect a high intracellular concen­tration of ferritin, as indicated by the high mean serum ferritin in the absence of iron overload with liver disease in Table 17.19.

Leukemia and Hodgkin's disease also result in raised serum ferritin concen­trations. In leukemia, these may be associated with: (a) increased deposition of iron in cells of the reticulo-endothelial system; (b) circu­lating leukemic cells con­taining high levels of ferritin; or (c) increased release of ferritin from damaged cells, as in liver disease. In Hodgkin's disease, the increased ferritin in the serum possibly comes from the lymphocytes (Worwood, 1979).

Other disease states such as alcoholism, liver cirrhosis, chronic hepatitis C, and rheumatoid arthritis are also associated with elevated serum ferritin concen­trations (Table 17.19) (Garcia-Casal et al., 2018; Schutte et al., 2019). In some of these disease states, elevated serum ferritin values are observed even in cases of iron depletion as indicated by a bone marrow iron content of zero.

Pre-diabetes and Type 2 diabetes mellitus have been associated with elevated serum ferritin concen­trations in several epidemiological studies (Cheung et al., 2013; Kunutsor et al., 2013; Yeap et al., 2015) although the underlying mech­anism is poorly under­stood.

17.7.2 Interpretive criteria — use of the distribution of serum ferritin reference values

Reference value distributions for serum ferritin concentrations currently available have been drawn from a reference sample of unselected, apparently “healthy” individuals from the general population sampled during a nationally representative survey. Therefore, some of the persons in the sample will have almost no storage iron without being anemic, and a small proportion may even be anemic. As a result, even the “reference range” will include ferritin concentrations of persons who are iron deficient. This approach has been used in the examples described below.

17.7.3 Cutoffs for serum ferritin

Clearly, the assessment of the preva­lence of iron defi­ciency based on a low serum ferritin concen­tration is challenging in settings where infection or inflam­mation are widespread. Many approaches have been proposed for adjusting for the effect of inflam­mation on ferritin and other iron bio­markers (Thurnham, 2014; Suchdev et al., 2017). Recently, a new regression modeling approach has been developed by the BRINDA group (Bio­markers Reflecting Inflam­mation and Nutritional Determinants of Anemia). In this approach, the inflam­matory bio­markers (CRP and AGP) are treated as continuous variables because no clear cutoff was found for CRP or AGP at which there was a change in the relationship between inflam­mation and ferritin. By using this new approach, the full range and severity of inflam­mation can be accounted for, with greater corrections applied when the inflam­matory bio­markers indicate severe inflam­mation; for more details see Raiten et al. (2015) and Suchdev et al. (2017).
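The regression correction at the heart of the BRINDA approach can be sketched in a few lines. Because the slopes and reference values are estimated from each survey's own data, there are no universal constants; the Python sketch below is therefore a minimal illustration only, assuming that the slopes (beta_crp, beta_agp) have been obtained beforehand by regressing ln(ferritin) on ln(CRP) and ln(AGP), and that the reference values are the survey's lowest-decile ln(CRP) and ln(AGP). All names and arguments are hypothetical rather than a published interface; see Namaste et al. (2017) and Suchdev et al. (2017) for the full method.

```python
import math

def brinda_adjust_ferritin(ferritin, crp, agp,
                           beta_crp, beta_agp, ln_crp_ref, ln_agp_ref):
    """Illustrative BRINDA-style correction: remove the estimated
    inflammation contribution from ln(ferritin). beta_crp and beta_agp
    are survey-specific regression slopes; ln_crp_ref and ln_agp_ref are
    reference (e.g., lowest-decile) values of ln(CRP) and ln(AGP).
    No correction is applied below the reference values."""
    ln_ferritin = math.log(ferritin)
    ln_ferritin -= beta_crp * max(math.log(crp) - ln_crp_ref, 0.0)
    ln_ferritin -= beta_agp * max(math.log(agp) - ln_agp_ref, 0.0)
    return math.exp(ln_ferritin)
```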

The BRINDA approach differs from the earlier correction factor method developed by Thurnham (2014). In this earlier method, the stages of inflammation (incubation, early convalescence, late convalescence) were first classified, after which specific cutoffs for elevated levels of serum CRP (i.e., > 5mg/L) and AGP (i.e., > 1.0g/L) were applied; see Thurnham (2014) for more details.
Table 17.21. Impact of inflammation on serum ferritin concentrations of Indonesian infants at age 12mos. *Ferritin cutoff indicative of iron deficiency < 12µg/L. From Diana et al. (2017).
Biomarker in serum    Geometric mean (95% CI)    Proportion at risk (%)
Ferritin*: No adjustment    14.5µg/L (13.6‑17.5)    44.9
Ferritin: BRINDA adjustment    8.8µg/L (8.0‑9.8)    64.9
Table 17.21 compares the geometric means (95% CI) and the proportion with low serum ferritin (i.e., < 12µg/L), with and without adjustment for inflammation, in a study of Indonesian infants aged 12mos. Note that the adjusted geometric mean value for serum ferritin was lower, and the proportion of infants at risk for iron deficiency (serum ferritin < 12µg/L) higher, after applying the new BRINDA regression adjustment method (Diana et al., 2017). These findings are consistent with those reported elsewhere in preschool children and women of child-bearing age (Engle-Stone et al., 2017; Namaste et al., 2017). Furthermore, they emphasize the importance of measuring the two inflammatory biomarkers (CRP and AGP) recommended by WHO, and applying the BRINDA adjustment method, prior to interpreting serum ferritin concentrations in settings where infection and inflammation co-exist with iron deficiency.

Cutoffs for infants aged 4, 6, and 9 mos indicative of depleted iron stores and compiled by Domellöf et al. (2002a) are shown in Table 17.22.
Table 17.22. Suggested serum ferritin thresholds by age to classify individuals as iron deficient during epidemiological studies. aData derived from studies of iron-replete, breast-fed infants. bData from NHANES III.
Age    Serum ferritin concentration (µg/L)    Reference
4mos    < 20    Domellöf et al. (2002a)a
6mos    < 9
9mos    < 5
1‑2y    < 10    Looker et al. (1997)b
3‑5y    < 10
6‑11y    < 12
12‑15y    < 12
≥ 16y    < 12
0‑23mos    < 12    WHO (2020)
24‑59mos    < 12
5‑<10y    < 15
10‑<20y    < 15
20‑59y    < 15
≥ 60y    < 15
Pregnant women, first trimester    < 15
They are based on serum ferritin concen­trations set at −2SD for iron-replete Swedish breastfed infants. How­ever, the rapid changes in iron status in the first year of life as fetal Hb is replaced by Hb A complicate the inter­pretation of serum ferritin during infancy.

Cutoffs for young children and adolescents ranging in age from 1y to ≥ 16y differ slightly according to their source (Daru et al., 2017). Table 17.22 summarizes the cutoffs developed for five age groups by Looker et al. (1997) for use in the NHANES III surveys, together with those recommended by WHO (2020) for infants, preschool children, school-aged children, and adolescents.
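For screening purposes, the WHO (2020) rows of Table 17.22 reduce to two age bands, as in the hedged Python sketch below; the sketch deliberately omits the infant-specific thresholds of Domellöf et al. (2002a), assumes no concurrent infection or inflammation, and uses an illustrative function name.

```python
def ferritin_depleted(ferritin_ug_l, age_y):
    """True if serum ferritin falls below the WHO (2020) cutoff from
    Table 17.22; assumes no concurrent infection or inflammation."""
    cutoff = 12.0 if age_y < 5 else 15.0   # < 5y: 12 µg/L; >= 5y: 15 µg/L
    return ferritin_ug_l < cutoff

print(ferritin_depleted(10.0, 3))    # True: below the 12 µg/L cutoff
print(ferritin_depleted(14.0, 25))   # True: below the 15 µg/L cutoff
```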

Concerns have been raised about the appropriateness of the cutoffs for preschool children shown in Table 17.22, in view of the limited data available to support them (Daru et al., 2017). There is only one study (in apparently healthy Malawian children aged 6‑66mos) in which serum ferritin concentrations have been compared against the gold standard for iron deficiency (i.e., the absence of iron in bone marrow aspirates) (Jonker et al., 2014). The results suggested that the currently recommended cutoff for children (i.e., < 12µg/L), although specific (correctly identifying those who are iron replete), is not sensitive, having a poor ability to identify those with iron deficiency.

More recently, Mei et al. (2021) used a population-based, physiological approach to define serum ferritin cutoffs in children aged 12‑59mos using two independent biomarkers of iron-deficient erythropoiesis — Hb and sTfR. Their study was based on U.S. NHANES data collected over a 15y period, and they suggest a higher cutoff (< 20µg/L) for children aged 12‑59mos to determine the prevalence and distribution of iron deficiency in a population. A higher serum ferritin cutoff (i.e., < 17.9µg/L) was also identified by Abdullah et al. (2017) in an earlier study of Canadian children (12‑36mos) applying a similar physiological approach, although using only Hb and not sTfR.

Mei et al. (2021) emphasize that cutoffs defined using their physio­logical approach reflect the onset of iron-deficient erythropoiesis, whereas the values for preschool children and non-pregnant women shown in Table 17.22 are intended to characterize a more advanced stage of iron deficient erythropoiesis when bone marrow iron stores are exhausted. Consequently, the higher serum ferritin cutoffs of Mei et al. (2021), when applied, will reflect points where responses to iron depletion can be detected (Braat and Pasricha, 2021).

Cutoffs for adults shown in Table 17.22 characterize an advanced stage of iron-deficient erythropoiesis when bone marrow iron stores are absent, and are based on limited evidence (Daru et al., 2017).

Mei et al. (2021) also applied their population-based, physio­logical approach to define ferritin cutoffs in non-pregnant women (15‑49y) from the U.S. NHANES data. Again, they suggest a higher cutoff for serum ferritin for non-pregnant women (< 25µg/L) which reflects the onset of iron-deficient erythropoiesis, and which may be used to determine the prevalence and distribution of iron defi­ciency in a population.

Clearly, in view of the long-term adverse health outcomes that may be related to iron defi­ciency in children, and its impact on pregnancy outcome, more research to validate serum ferritin cutoffs using this new physio­logical approach in non‑U.S. populations is urgently required.

Cutoffs for the elderly defined by WHO (2020) are shown in Table 17.22. There is some concern that they may need to be higher when used to diagnose iron depletion in this age group, although so far no consensus has been reached (Wawer et al., 2018).

Cutoffs for pregnancy are difficult to define in view of the numerous physio­logical changes that occur, including hemodilution and inflammation. WHO, based on a supporting Cochrane system­atic review, failed to establish a cutoff for iron depletion during late pregnancy, because few pregnancy studies applied the gold standard bone marrow biopsy test (Garcia-Casal et al., 2018). As a consequence, WHO (2020) set a cutoff for serum ferritin to define iron defi­ciency only for the first trimester of pregnancy (i.e., < 15µg/L) as shown in Table 17.22.

Cutoffs for iron overload disorders based on elevated serum ferritin values are shown in Table 17.23.
Table 17.23. Cutoff values for serum ferritin (µg/L) indic­ative of iron overload in adults. From Pilch and Senti (1984).
Age group (y) Males Females
20‑44 > 200 > 150
45‑64 > 300 > 200
65‑74 > 400 > 300
They were applied in the U.S. NHANES II and III in conjunction with a trans­ferrin satu­ration > 70%.

WHO (2020), how­ever, have specified cutoffs for serum ferritin for all apparently healthy individuals age ≥ 5y at risk of iron overload as > 200µg/L for males and > 150µg/L for females. They recom­mend using serum ferritin along with other bio­markers (e.g., trans­ferrin satu­ration) and known genetic and clinical risk factors to diagnose iron overload. In this way, an incorrect diagnosis of iron overload when using serum ferritin alone will be avoided (McKinnon et al., 2013; Garcia-Casal et al., 2018).

WHO (2020) have also compiled cutoff values to define risk of iron overload in non-healthy individuals. For school-age children (5‑12y), adolescents (13‑19y), and adults of all ages, the suggested serum ferritin cutoff is > 500µg/L. How­ever, WHO caution that a serum ferritin concentration > 500µg/L may indicate risk of iron overload or other diseases, and highlight the need for further clinical and laboratory evaluation to establish the diagnosis and underlying causes of the elevated ferritin levels.

Cutoffs for serum ferritin indicative of iron deficiency in individuals with infection or inflammation have been compiled by WHO (2020). For infants (0‑23mos) and preschool children (24‑59mos), the recommended cutoff is a serum ferritin < 30µg/L. For school-age children (5‑12y), adolescents (13‑19y), adults (20‑59y), and older persons (≥ 60y), a serum ferritin < 70µg/L is recommended. However, many factors influence the effect of inflammation on serum ferritin, including the degree and duration of inflammation and immunity, so it is not unexpected that iron deficiency may exist in cases of inflammation even when serum ferritin concentrations are above these cutoffs (Garcia-Casal et al., 2018).

As an example, in the national survey in Cameroon, applying a serum ferritin cutoff of < 30µg/L underestimated the prevalence of iron deficiency when compared to that based on inflammation-adjusted body iron stores < 0mg/kg (Engle-Stone et al., 2013). For more details on the assessment of body iron stores, see Section 17.10.1. Consequently, the preferred approach is to use serum CRP and AGP with the BRINDA method to adjust serum ferritin concentrations for the presence of inflammation.

WHO (2020) classify iron deficiency, based on ferritin concentrations below their recommended cutoffs, as a severe, moderate, or mild public health problem, or not a public health problem, when the prevalence of iron deficiency is ≥ 40.0%, 20.0‑39.9%, 5.0‑19.9%, or ≤ 4.9%, respectively.
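Applied programmatically, this classification reduces to a few threshold comparisons, as in the minimal Python sketch below; the thresholds are those quoted above from WHO (2020), and the function name is illustrative.

```python
def iron_deficiency_severity(prevalence_pct):
    """WHO (2020) public health classification from the prevalence (%)
    of serum ferritin below the recommended cutoff."""
    if prevalence_pct >= 40.0:
        return "severe public health problem"
    if prevalence_pct >= 20.0:
        return "moderate public health problem"
    if prevalence_pct >= 5.0:
        return "mild public health problem"
    return "not a public health problem"

print(iron_deficiency_severity(44.9))   # "severe public health problem"
```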

17.7.4 Measure­ment of serum ferritin

Venipuncture or capillary blood samples, fasted or nonfasted, can be used for serum or plasma ferritin, although within- and between-sample variation tends to be larger with capillary than with venous specimens. Blood must be refrigerated immediately after collection and processed within a few days. For ferritin assays, serum is stable for > 1wk at 4°C and for > 1y at < −20°C, and can be subjected to three freeze-thaw cycles without any negative effect on the concentrations. For optimal long-term storage, serum must be kept at < −40°C; when stored at −70°C, serum ferritin is stable for > 10y (Pfeiffer and Looker, 2017). Dried serum spot (DSS) samples can also be used (Ahluwalia et al., 2002), a major advantage for field studies. Before analysis, the DSS must be digested for 6h with cellulase from Trichoderma reesei.

In the past, serum ferritin was often determined with a single-incubation, two-site immunoradiometric assay (IRMA), a procedure used in the earlier U.S. NHANES and the U.K. Diet and Nutrition Surveys. In later U.S. NHANES (2004‑2006), the Roche Tina-quant ferritin immunoturbidimetric assay was used on a clinical analyzer. This method, although automated, requires a large sample (> 150µL), which is a disadvantage for capillary or pediatric specimens. Today, commercial kits based on enzyme-linked immunosorbent assays (ELISA) are also available; these compare relatively well across manufacturers, with CVs ranging from 10‑15% (Lynch et al., 2018). For a review of the performance and comparability of laboratory methods for measuring ferritin concentrations in serum or plasma, the reader is referred to Garcia-Casal et al. (2018). A ferritin certified reference material is available from WHO through the U.K. National Institute for Biological Standards and Control (NIBSC, RM 94/572). This was used in NHANES III, and the use of this reference material as a calibration check is strongly encouraged.

17.8 Erythrocyte protoporphyrin

When the supply of iron is adequate, heme is synthesized by the incorporation of iron into protoporphyrin IX by the enzyme ferrochelatase. However, when there is insufficient iron to allow this reaction, zinc is substituted for iron, forming zinc protoporphyrin (ZPP). The ZPP then accumulates within the iron-deprived circulating erythrocytes and can be measured directly in samples of whole blood using a portable hematofluorometer. An alternative chemical method can also be used, whereby the porphyrins are extracted into a solvent and then measured fluorometrically; with this method, ZPP is converted to free erythrocyte protoporphyrin (FEP) (WHO, 2007). The extraction method now yields measurements almost identical to direct measurements of ZPP performed using a portable hematofluorometer, provided the instrument is properly calibrated and maintained and a standardized procedure is followed.

The zinc-for-iron substitution occurs predominantly within the bone marrow so a rise in the concen­tration of ZPP is one of the first indicators of insufficient iron in the bone marrow. Zinc protoporphyrin persists in the iron-deprived circu­lating erythrocytes for the duration of their lifespan (i.e., ≈ 120d). There­fore, an increased ZPP concen­tration in the blood indicates that the majority of the erythrocytes matured at a time when the iron supply was suboptimal (i.e., during the preceding 3‑4mos), and thus provides a measure of uncom­plicated iron-defi­ciency. The more pronounced the iron defi­ciency, the higher the ZPP values, with the highest values seen in severe iron defi­ciency anemia.

As ZPP and FEP measurements are now interchangeable, the term erythrocyte protoporphyrin (EP) is recommended irrespective of the measurement method used, and is the term applied herein. A normal healthy person will have an EP concentration of less than 40‑50µg EP/dL of red blood cells (RBCs), but when iron stores are exhausted (i.e., iron-deficient erythropoiesis), the concentration may increase to more than 70‑100µg/dL RBCs, reaching as high as 200µg/dL in iron deficiency anemia (WHO, 2007).

When a hematofluorometer is used to measure EP, Hb concentrations are also measured, allowing the EP to be expressed directly as a ratio of EP to Hb. Use of the ratio minimizes the effect of dilution that results from the expansion in plasma volume that occurs in pregnancy: EP, RBCs, Hb, and heme are diluted equally. In this way, a misinterpretation of laboratory results is avoided that may arise, for example, from plasma-volume changes in pregnancy. However, EP is expressed in several other ways in the literature, including relative to whole blood (i.e., µg EP/dL whole blood) or to RBCs (i.e., µg EP/dL RBCs). When the hematocrit is known, µg EP per dL whole blood is converted to µg EP per dL RBCs as follows: \[\small \mbox{µg EP per dL red blood cells} = \frac{\mbox{µg EP per dL whole blood}}{\mbox{hematocrit (as a decimal fraction)}} \]
Table 17.24. Comparison of the areas under the receiver operating characteristic (ROC) curves (Mean ± SE) of erythrocyte protoporphyrin (EP), hemo­globin (Hb), and mean cell volume (MCV) in detecting iron defi­ciency in children (three to five years), non-pregnant women (15‑49 years), and pregnant women from the National Health and Nutrition Examination Surveys (NHANES, 2003‑2006 for children and non-pregnant women, 1999‑2006 for pregnant women), after excluding subjects who had signs of infection, as indicated by abnormal white blood cell counts (> 10.0 × 109/L) or elevated C‑reactive protein (> 5mg/L) or of possible liver disease, as defined by at least one of two abnormal elevations (more than two times the upper limit of normal value) on alanine aminotransferase (> 70U/L) and aspartate aminotransferase (> 70U/L).
To generate the ROC curves, EP was converted to the preferred unit of µmol EP/mol heme by multiplying by 50. Values with dif­fer­ent superscript letters (a or b) are signif­icantly dif­fer­ent (p < 0.01) in the areas under the curves. From Mei et al. (2017).
Children (n=762)    Non-Pregnant Women (n=2746)    Pregnant Women (n=684)
EP    0.785 ± 0.049a    0.864 ± 0.011a    0.771 ± 0.022a
Hb    0.596 ± 0.054b    0.841 ± 0.011a    0.743 ± 0.022a
MCV    0.703 ± 0.052a    0.802 ± 0.014b    0.698 ± 0.026b
From µg EP per dL RBCs to µg EP per g Hb: multiply by 0.037. From µg EP per dL RBCs to µmol EP per mol heme: multiply by 0.87. These factors are based on the assumption that the MCHC is normal, although when MCHC is measured in individual samples, then an appro­priate factor can be calculated. For more details on the differing EP reporting units and their inter-conversion, see Beard in WHO (2007).
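These conversions are easily mechanized. The Python sketch below implements the hematocrit correction and the two factors quoted above, under the same assumption of a normal MCHC; function names and the example values are illustrative only.

```python
def ep_per_dl_rbc(ep_ug_dl_whole_blood, hematocrit_pct):
    """µg EP/dL whole blood -> µg EP/dL RBCs, dividing by the
    hematocrit expressed as a fraction."""
    return ep_ug_dl_whole_blood / (hematocrit_pct / 100.0)

def ep_per_g_hb(ep_ug_dl_rbc):
    """µg EP/dL RBCs -> µg EP/g Hb (factor 0.037; assumes normal MCHC)."""
    return ep_ug_dl_rbc * 0.037

def ep_umol_per_mol_heme(ep_ug_dl_rbc):
    """µg EP/dL RBCs -> µmol EP/mol heme (factor 0.87; assumes normal MCHC)."""
    return ep_ug_dl_rbc * 0.87

# e.g., 35 µg EP/dL whole blood at a hematocrit of 42%:
rbc = ep_per_dl_rbc(35.0, 42.0)       # ~83 µg/dL RBCs
print(ep_umol_per_mol_heme(rbc))      # ~73 µmol/mol heme
```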

Table 17.24 compares the areas under the receiver operating characteristic (ROC) curves used to characterize the sensitivity and specificity of EP, Hb, and MCV measurements in screening for iron deficiency, based on U.S. NHANES data (Mei et al., 2017). For children aged 3‑5y, EP (measured by the modified acid extraction method) was a better screening tool for iron deficiency than Hb, but not statistically different from MCV. For non-pregnant and pregnant women, the performance of EP and Hb was similar, with both being superior to MCV. Iron deficiency was defined as total body iron stores (TBI) < 0mg/kg body weight; for details on TBI, see “Body iron” in Section 17.10.1.

Nevertheless, the results presented in Table 17.24 apply to U.S. popu­lation groups for whom iron defi­ciency is an impor­tant cause of anemia and may not necessarily apply to popu­lations with a high preva­lence of infectious diseases or for persons with other non-nutritional causes of iron deficient erythro­poiesis.

Table 17.25. Measured ratios of EP:Hb, their inter­pretation, and recom­mendations.
EP, erythrocyte protoporphyrin; Hb, hemo­globin; CBC, complete blood count; ACD, anemia of chronic disease; APP, acute phase protein; TSAT, trans­ferrin satu­ration. Abstracted from WHO (2007).
Measurements    Interpretation    Recommendation
Low EP/Hb ratio (< 60µmol/mol)    Adequate systemic iron supply    Iron stores can be estimated by the ferritin concentration; if the EP/Hb ratio is < 40µmol/mol, consider tests for iron overload (TSAT)
Mid-range EP/Hb ratio (60‑80µmol/mol)    Possible non-replete iron status; consider inadequate diet, ACD, or other causes    CBC may support case for iron depletion; ferritin can be used to differentiate low iron stores from inflammatory blockade; could then use ferritin concentration to verify inflammation
High EP/Hb ratio (> 80µmol/mol) and low ferritin concentration (< 20µg/L)    Iron-deficient erythropoiesis attributable to low marrow iron supply; may be related to depleted iron stores    Iron supplementation; monitor therapy with decrease in the EP/Hb ratio and/or increase in reticulocyte count
High EP/Hb ratio (> 80µmol/mol) and high ferritin concentration (> 200µg/L)    Severe inflammatory blockade, ACD, other causes of impaired iron utilization    Correct the cause(s) of impaired iron utilization; consider chronic lead poisoning or ineffective supplementation
Table 17.25 provides guidelines, compiled by Labbé and Dewanji (2004) and reproduced by Beard (WHO, 2007), on the interpretation of EP/Hb ratios in the diagnosis of nutritional and non-nutritional causes of iron-deficient erythropoiesis.
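The decision bands of Table 17.25 can also be expressed as a simple classifier, as in the minimal Python sketch below; the returned strings paraphrase the table's interpretations and are illustrative only, not a substitute for the clinical recommendations above.

```python
def interpret_ep_hb(ep_hb_umol_mol, ferritin_ug_l=None):
    """Paraphrase of the decision bands in Table 17.25 (WHO, 2007)."""
    if ep_hb_umol_mol < 60.0:
        return "adequate iron supply; if < 40, consider tests for iron overload"
    if ep_hb_umol_mol <= 80.0:
        return "possible non-replete iron status (diet, ACD, or other causes)"
    # > 80 µmol/mol: ferritin separates iron deficiency from blockade.
    if ferritin_ug_l is not None and ferritin_ug_l < 20.0:
        return "iron-deficient erythropoiesis; consider iron supplementation"
    if ferritin_ug_l is not None and ferritin_ug_l > 200.0:
        return "inflammatory blockade / impaired iron utilization (ACD)"
    return "high EP/Hb ratio; ferritin concentration needed to interpret"
```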

17.8.1 Factors affecting erythrocyte protoporphyrin

Biological variation for EP is ≈ 6.5%. This includes both day-to-day variation and diurnal variation within a single day; levels between 18.00hrs and midnight are higher, for reasons that are unknown (Soldin et al., 2003).

Age-related trends suggest that levels of EP are highest from age 0‑12mos, probably reflecting the rapid erythro­poiesis in this age group. After infancy, they decrease until adolescence (10‑17y), when they appear to increase, declining slightly thereafter (Soldin et al., 2003; Table 17.26).

Sex-related differences in EP have been described, with girls having significantly higher levels than boys at 10‑17y. Likewise, adult women have slightly higher levels than men (Soldin et al., 2003).

Iron deficient erythro­poiesis, charac­ter­ized by the exhaustion of iron stores, leads to a slow but progressive increase in EP in the peripheral blood which is apparent before any decline in Hb and MCV. The increase in EP concen­tration occurs slowly because only the newly produced erythrocytes con­taining ZPP appear in the peripheral blood to replace those erythrocytes at the end of their life span (i.e., ≈ 120 days) (Mei et al., 2017).

Infection / Inflammation is associated with elevated levels of EP. This is due to the increase in pro-inflammatory cytokines in the systemic circulation, which stimulate an increase in hepcidin concentrations. Elevated levels of circulating hepcidin reduce both iron absorption and the release of iron from body stores, so that iron is no longer available for transport to the bone marrow for erythropoiesis. When iron is no longer available, divalent zinc is inserted into protoporphyrin, so concentrations of EP are elevated, irrespective of the EP assay method used. As a result, during inflammation, assay of EP alone cannot be used to distinguish between the anemia of iron deficiency and the anemia of chronic disease.

Preg­nancy is believed not to result in a decline in EP when expressed as a molar ratio to Hb, as the ratio is theoretically independent of hemo­dilution. Conse­quently, EP expressed as a molar ratio to Hb has been proposed as a preferred marker of iron status in preg­nancy (Schifman et al., 1987).

How­ever, in pregnant women in Kenya without evidence of inflam­mation or infection with Plasmodium or HIV, EP reportedly had a limited ability to discriminate between women with and without iron defi­ciency even when combined with Hb, and irrespective of whether EP was measured in washed erythrocytes or whole blood using a hemato­fluor­ometer or by solvent extraction. Moreover, the preva­lence of iron defi­ciency was grossly over-estimated when applying the conventional EP cutoff for whole blood (> 70µmol/mol Hb) in comparison with that based on plasma ferritin concen­trations < 15µg/L, especially when the preva­lence of iron defi­ciency was low (e.g., 10%) (Mwangi et al., 2014). Based on these results, the usefulness of EP:Hb molar ratio to assess iron status during preg­nancy, particularly in disadvantaged settings, remains uncertain.

Lead toxicity can result in elevated concentrations of EP due to disturbances in the heme synthesis pathway; lead inhibits porphobilinogen synthase, which results in increased excretion of aminolaevulinic acid (ALA) in the urine. Lead may also have a direct effect on the enzyme ferrochelatase (WHO, 2007). As a consequence, ZPP (now termed EP) is formed instead. In U.S. NHANES II, EP values were elevated in children 1‑4y with high blood lead values, although they were lower in U.S. NHANES III following the introduction of lead-free petrol (Looker et al., 1997).

Genetic Hb disorders such as sickle cell anemia, thalassemia, and Hb E increase EP concen­trations, most likely due to ineffective erythro­poiesis and red blood cells with a short life span (Graham et al., 1996; Mwangi et al., 2014; Pfeiffer and Looker, 2017). The relative iron defi­ciency that occurs in these disorders has also been proposed as a possible factor, although seemingly unlikely because of the increase in EP reported in individuals with these conditions, even in the absence of nutritional iron defi­ciency (Graham et al., 1996).

Malaria from Plasmodium infection is associated with elevated EP concen­trations in whole blood or erythro­cytes, as a conse­quence of impaired erythro­poiesis and the induction of the acute phase response, and subsequent up­reg­ulation of hepcidin; see also “Malaria” in Section 17.2.1 for more details.

17.8.2 Interpretive criteria — use of the distribution of erythrocyte protoporphyrin reference values

Details of the distribution of reference values from four sources are described below, of which only one was based on a “healthy” reference sample.
Table 17.26. Pediatric reference ranges of ZPP blood concentrations (2.5th‑97.5th percentiles, µg/dL) for females and males aged 0‑17 years. Multiply µg/dL by 1.83 to convert to µmol/mol heme. Data from Soldin et al. (2003).

Age group (n)               Males (µg/dL)   Females (µg/dL)   Males (µmol/mol heme)   Females (µmol/mol heme)
0‑12mos (145M; 203F)        8.5‑34.5        9‑40              15.6‑63.5               16.6‑73.6
13‑24mos (605M; 725F)       10‑34           11‑32             18.4‑62.6               20.2‑58.9
2‑5y (1,926M; 1,822F)       5‑35            10‑31             9.2‑64.4                18.4‑57.0
6‑9y (522M; 408F)           6‑31            9‑30              11.0‑57.0               16.6‑55.2
10‑17y (61M; 61F)           5.5‑31.5        5‑33.5            10.1‑58.0               9.2‑61.6
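The conversion factor quoted in the caption of Table 17.26 lends itself to a one-line helper; a minimal sketch (the function name is ours):

```python
# ZPP in whole blood: µg/dL × 1.83 ≈ µmol/mol heme (caption of Table 17.26)
def zpp_ugdl_to_umol_mol_heme(zpp_ug_dl):
    return zpp_ug_dl * 1.83

# e.g., the 2.5th percentile for males 0-12mos: 8.5 µg/dL
print(zpp_ugdl_to_umol_mol_heme(8.5))  # 15.555, matching the tabulated 15.6
```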

17.8.3 Cutoffs for erythrocyte proto­porphyrin

Cutoffs for infants aged 4, 6, and 9mos, expressed as µmol/mol heme and based on iron-replete breastfed infants, are presented in Table 17.16 (Domellöf et al., 2002). WHO (2017) present a single cutoff for infants 0‑12mos of > 80µg/dL RBC (i.e., > 1.42µmol/L RBC; > 2.7µg/g Hb; > 70µmol/mol heme) (Table 17.27).

Cutoffs for children ages 1‑2y, 2‑9y and adolescents (girls) age 10‑17y are also shown in Table 17.27. These cutoffs are based on the CDC recom­mended demar­cation between iron deficient erythro­poiesis and iron sufficient erythro­poiesis, expressed as µmol/L RBCs or as µg/dL RBCs. Table 17.27 also includes the 97.5th percentiles (expressed as µg/dL RBCs) adapted from Soldin et al. (2003) for comparison.
Table 17.27. The threshold concentrations of erythrocyte protoporphyrin, by age group, at which iron-deficient erythropoiesis occurs according to the Centers for Disease Control and Prevention (CDC) recommendations, and the 97.5th percentile of the distribution of values in healthy subjects. a CDC recommendation as the demarcation between iron-deficient erythropoiesis and iron-sufficient erythropoiesis. Abstracted from WHO (2007).

Age              EP (µmol/L RBC)   Suggested threshold a (µg/dL red cells)   97.5th percentile (µg/dL red cells)
0‑12mos          > 1.42            > 80                                      40
1‑2y             > 1.24            > 80                                      32
2‑9y             > 1.24            > 70                                      30
10‑17y (girls)   > 1.24            > 70                                      34
Note the large discrepancy between the CDC cutoffs and the statistically derived upper reference limit based on the 97.5th percentile of Soldin et al. (2003), empha­sizing that the derivation of cutoff values is very method dependent.

WHO have provided cutoffs for the detection of iron deficiency of > 61µmol/mol heme (i.e., > 70µg/dL RBC) for children < 5y and > 70µmol/mol heme (i.e., > 80µg/dL RBC) for those aged 5y or more. These are based on the sensitivity and specificity for detecting the absence of storage iron (WHO, 2001).

A cutoff for adults for EP in whole blood of > 70µmol/mol heme (i.e., > 80µg/dL RBC; > 2.7µg/g Hb) was developed by Looker et al. (1997). This cutoff approximated the 95th percentile of the U.S. NHANES III “healthy” reference sample. WHO (2001) suggest that this cutoff for adults be expressed in one of the following three ways: > 70µmol/mol heme; > 80µg/dL RBC; >≈ 3.0µg/g Hb.

The cutoff for pregnancy is the same as that used for EP in non-pregnant adults, i.e., in whole blood: > 70µmol/mol heme (> 80µg/dL RBC; >≈ 3.0µg/g Hb), as compiled by Looker et al. (1997). The cutoff for non-pregnant adult women was selected because EP, when expressed as the ratio of EP/Hb, was believed to remain stable during pregnancy, as noted earlier. However, the use of this conventional cutoff for pregnancy has been questioned. Mwangi et al. (2014) reported that applying this cutoff (> 70µmol/mol heme) led to a gross overestimate of the prevalence of iron deficiency in pregnancy in their study of Kenyan women.
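A hedged sketch of how the WHO (2001) whole-blood cutoffs quoted above might be applied in practice follows; the function and its name are illustrative only, and, as Table 17.27 emphasizes, EP cutoffs are strongly method dependent:

```python
# WHO (2001) whole-blood EP cutoffs: > 61 µmol/mol heme for children < 5y;
# > 70 µmol/mol heme for ages 5y and over (the adult cutoff is also the one
# conventionally applied in pregnancy, with the caveats discussed above).
def ep_above_who_cutoff(ep_umol_mol_heme, age_years):
    cutoff = 61 if age_years < 5 else 70
    return ep_umol_mol_heme > cutoff

print(ep_above_who_cutoff(65, age_years=3))   # True  (cutoff 61)
print(ep_above_who_cutoff(65, age_years=30))  # False (cutoff 70)
```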

17.8.4 Measure­ment of erythrocyte proto­porphyrin

Two methods can be used to measure erythrocyte protoporphyrin: a direct and an indirect method. The direct method uses a portable hematofluorometer to measure the fraction of blood protoporphyrin that exists as the zinc chelate (i.e., ZPP). The manufacturer of the hematofluorometer assumes that ZPP comprises 95% of all the erythrocyte protoporphyrin in blood, thus providing the basis for the estimates of EP concentrations (WHO, 2017). Only a drop of fresh, whole, unprocessed blood (< 20µL) obtained by capillary sampling is required, and the measurement is available in ≈ 1min. Calibration standards are normally provided to check the stability of the hematofluorometer (Labbé et al., 1999), although for recalibration, the equipment must be shipped back to the manufacturer. An NCCLS guideline for erythrocyte protoporphyrin testing is available.

Several issues may affect the direct measurement of EP using a hematofluorometer. These include the requirement to oxygenate the blood sample if the time between blood sampling and EP measurement is > 30min, and the need to wash the red blood cells with saline. Incomplete oxygenation produces falsely low values because of a shift in Hb absorption. To ensure complete oxygenation and dissolution of red-cell aggregates, the whole blood must be stirred gently using a disposable plastic pipet tip. Wooden applicator sticks should not be used for mixing as they may contain fluorescent by-products. Successive observations should be within 10%. Use of red blood cells washed with saline, rather than whole blood, is recommended to remove interference by non-porphyrin fluorescent compounds such as bilirubin, certain drugs, and high riboflavin concentrations that may be present in the plasma (Hastka et al., 1992), but this is rarely performed. Application of these strategies is said to reduce the number of falsely high results and improve the reproducibility and precision of the EP measurements; they are discussed in detail in Labbé et al. (1999).

The indirect method involves a variation of the acid extraction method of Chisolm and Brown (1975) during which ZPP is converted to FEP. This method is more time con­sum­ing and less widely used, although used by U.S. NHANES II and III (Mei et al., 2017). It involves the extraction of total proto­porphyrin from anti­coagulated whole blood into a mixture of ethyl acetate-acetic acid, then back-extraction into dilute hydrochloric acid, followed by measure­ment of proto­porphyrin fluorometrically. Measure­ments of EP using this method can be made on stored blood samples or from blood collected on filter paper in the field. Results for EP using this indirect method are similar to those using the direct method, as noted earlier, provided the hemato­fluor­ometer used is properly calibrated and maintained and a standardized procedure is followed (Mei et al., 2003).

A novel non-invasive method has been developed to measure EP in the microcirculation of the lower lip, using an optical fiber probe to illuminate the lip and acquire fluorescence emission spectra in ≈ 1min. The method is fast and provides specific testing for iron deficiency and iron deficiency anemia at the point of care without the need for blood drawing; details are given in Hennig et al. (2016). Method reliability was assessed by comparing measurements of EP made by both HPLC and fluorescence spectroscopy, and feasibility was tested in young children. Measurements of EP by this novel method to identify iron deficiency and iron deficiency anemia in children 9mos‑5y compared well with those based on ferritin, serum soluble transferrin receptor, and Hb (Homann et al., 2019).

17.9 Serum soluble trans­ferrin receptor

Transferrin receptor (TfR) is an iron-related protein that regulates the uptake of transferrin, carrying iron into body cells. The surfaces of all cells express TfR in proportion to their requirement for iron, although the largest numbers are in the erythron, placenta, and liver (Beguin, 2003). A soluble form of TfR circulates in human plasma bound to transferrin and is usually called soluble transferrin receptor (sTfR). The predominant source of sTfR in the plasma pool is the cells of the developing red-cell mass — the erythroblasts and reticulocytes. Hence, the concentration of sTfR reflects erythropoietic activity in the bone marrow (Table 17.28).
Table 17.28. Serum trans­ferrin receptor concen­tration changes in human disease. Based on Worwood (2002).
sTfR increased or not affected:
 Increased erythroid proliferation or ineffective erythropoiesis:
  — Autoimmune hemolytic anemia
  — Hereditary spherocytosis
  — Beta thalassemia/HbE
  — Hb H disease
  — Sickle cell anemia
  — Polycythemia vera
  — Vitamin B12 deficiency
  — Folic acid deficiency
 Decreased tissue iron stores:
  — Iron-deficiency anemia

sTfR normal to increased:
 — Idiopathic myelofibrosis
 — Myelodysplastic syndrome
 — Chronic lymphocytic leukemia

sTfR normal:
 — Hemochromatosis (see text)
 — Acute and chronic myeloid leukemia
 — Most lymphoid malignancies
 — Solid tumors

sTfR decreased:
 — Chronic renal failure
 — Aplastic anemia
 — Post bone-marrow transplantation
 — Hypertransfusion

In the first stage of the devel­opment of uncom­plicated iron defi­ciency anemia, termed “iron depletion” (Section 17.1.9), the supply of iron to the functional compartment is not compro­mised, so concen­trations of sTfR in serum or plasma are unaffected. How­ever, in the second stage, termed “iron-deficient erythro­poiesis” when body iron stores are exhausted and the availability of iron to tissues is compro­mised, trans­ferrin produc­tion is upregulated to increase iron trans­port, and cell surface TfR1 expres­sion increases. This allows the cells to compete more effectively for trans­ferrin-bound iron, and results in an increase in sTfR concen­trations in plasma or serum. In contrast, when there is sufficient body iron, TfR1 is down regulated so concen­trations of sTfR in the plasma or serum fall. As a result, sTfR levels in plasma or serum reflect the functional iron deficit (Gimferrer et al., 1997). For example, in elderly women with uncom­plicated iron defi­ciency, but not necessarily anemia, sTfR levels in plasma were 1.3 times the levels in iron-sufficient controls (Ahluwalia et al., 1995), whereas in those with microcytic, hypo­chromic anemia, sTfR levels in serum were 5.8 times that of the controls (Gimferrer et al., 1997). Concen­trations of sTfR in serum have also been shown to corre­late with iron status based on stainable bone marrow iron in several studies (Chang et al., 2007; Koulaouzidis et al., 2009).

One of the main advantages of using serum sTfR is that concen­trations are less strongly affected by inflam­mation than serum ferritin. Nevertheless, because any change in the rate of erythro­poiesis will in turn affect serum sTfR concen­trations, sTfR can only be used as an indicator of iron status when iron stores are exhausted and there are no other causes of abnormal erythro­poiesis (WHO, 2007). Table 17.28 depicts some of the disease conditions that are known to alter serum sTfR concen­trations. More details of factors affecting serum sTfR concen­trations are outlined below.

17.9.1 Factors affecting serum soluble transferrin receptor

Biological variation for soluble transferrin receptor concentration is low, with reasonably stable concentrations within an individual, so that a single blood sample is sufficient to measure sTfR. Cooper and Zlotkin (1996) determined the total day-to-day within-subject coefficient of variation (CV) for a group of healthy men (n=10) and women (n=11) aged 19‑46y, based on measurements on 10 nonconsecutive days over a 4wk period. Biological variation for serum sTfR in this study was 12.5%, compared with 8.9% (Lammi-Keefe et al., 1993) and 8.1% (Belza et al., 2005) in other population groups.
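One way to compute such a day-to-day within-subject CV from repeated measurements is sketched below. The data are invented, and the pooling shown (averaging per-subject CVs) is one of several conventions, not necessarily the one used by Cooper and Zlotkin:

```python
# Within-subject CV from repeated-day measurements (illustrative sketch).
import numpy as np

def within_subject_cv(measurements):
    """measurements: rows = subjects, cols = repeated days."""
    m = np.asarray(measurements, dtype=float)
    cvs = m.std(axis=1, ddof=1) / m.mean(axis=1)   # per-subject CVs
    return 100 * cvs.mean()                         # pooled, as a percentage

stfr = [[4.1, 4.6, 3.9, 4.4],   # subject 1: sTfR (mg/L) on 4 days
        [5.2, 4.8, 5.6, 5.0]]   # subject 2
print(f"within-subject CV = {within_subject_cv(stfr):.1f}%")
```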

Age-related changes in serum sTfR occur, in part as a result of the increase in erythro­poiesis during growth, although other mech­anisms may be involved. Levels increase from birth until ≈ 7.5mos in breast-fed infants, after which they level off, remaining relatively stable for the rest of the first year of life (Ziegler et al., 2014).

During childhood and adolescence, data to define age related trends in serum sTfR levels are more limited. In the U.S. NHANES 2003‑2010 surveys, serum sTfR concen­trations were higher in pre­school children age 1‑2y than those aged 3‑5y (Figure 17.14).
Figure 17.14. Distributions of serum soluble trans­ferrin receptor (sTfR) concen­trations in U.S. pre­school children and non-pregnant women of childbearing age partic­ipating in the U.S. NHANES 2003‑2010. Redrawn from Mei et al. (2012).
In adolescent girls (15‑19y), concentrations were higher than those for adult women aged 20‑34y, although not higher than those for adults aged 35‑49y (Mei et al., 2012). These age-related trends have also been reported in other studies (Choi et al., 1999; Virtanen et al., 1999).

Sex-related differences in serum sTfR are small (Worwood, 2002). During early infancy, values may be higher for males than for females, perhaps as a result of sex differences in fetal iron accretion (Domellöf et al., 2002a). These differences have not been reported consistently in older infants (9‑15mos) (Yeung and Zlotkin, 1997). In the U.S. NHANES 2003‑2010 surveys, however, sTfR concentrations were higher in preschool-aged boys than girls after adjusting for socio-demographic variables (Mei et al., 2012).

Race may influence serum sTfR concen­trations. Concen­trations for African American pre-school children and non-pregnant women were 9% higher than those for Caucasians, Asians, and Hispanics in the U.S. NHANES 2003‑2010 surveys (Mei et al., 2012). Such race-related differences may be attributed to the well-established but as yet unexplained lower Hb concen­tration known to occur among persons of African descent (Allen et al., 1998).

Tissue iron defi­ciency leads to increases in serum sTfR, as noted earlier. Even after iron stores are totally exhausted, serum sTfR concen­trations continue to increase in proportion to the increasing tissue iron deficit, as Hb concen­trations fall (Skikne et al., 1990). This trend has been observed in neonates (Rusia et al., 1996), infants and young children (Olivares et al., 2000), pregnant women (Carriaga et al., 1991), and adults (Skikne et al., 1990). When the iron supply is not limiting, then concen­trations of sTfR in plasma or serum can be used to monitor bone marrow erythro­poiesis (Lynch et al., 2018).

Iron overload in patients with hemochromatosis and in those with African iron overload has been associated with serum sTfR concentrations within the normal range (Baynes et al., 1994), so measurement of sTfR is not useful for diagnosing iron overload.

Pregnancy is accompanied by marked physiological changes that may confound the interpretation of serum sTfR levels. Although reported to provide a sensitive biomarker of iron deficiency during pregnancy (Carriaga et al., 1991), sTfR levels are also influenced by changes in the rate of erythropoiesis during this time. For example, during early pregnancy the rate of erythropoiesis decreases, which may mask the development of iron deficiency (Akesson et al., 1998), whereas in later pregnancy, when increases in serum sTfR are observed, it is uncertain whether these are related to the increase in erythropoiesis rather than to iron depletion (Worwood et al., 2017). The elevated concentrations of serum sTfR that are evident in later pregnancy return to non-pregnant values by 12wks after delivery (Choi et al., 2000).

In a study of pregnant women in Malawi in which iron biomarkers were compared against stained bone marrow aspirates, sTfR did not enhance the sensitivity and specificity for detecting iron deficiency anemia in pregnancy (Van Den Broek et al., 1998). Lynch (2011) has proposed the use of the ratio of sTfR to serum ferritin to detect iron deficiency during trimesters 1 and 2, but made no recommendation for the third trimester of pregnancy. However, total body iron (Section 17.10.1) was used to assess iron status in women in the 1st, 2nd, and 3rd trimesters of pregnancy in the cross-sectional U.S. NHANES 1999‑2006 study (Mei et al., 2011).

Table 17.29 presents the changes in several iron biomarkers, including serum sTfR, body iron (Section 17.10.1) and serum hepcidin (Section 17.11.1), in a longitudinal study of pregnant women in the Gambia at 14wk, 20wk, and 30wk gestation, supplemented with iron from enrolment to delivery (Bah et al., 2017).
Table 17.29. Iron and hematologic indexes in pregnant Gambian women. Values are the arithmetic (hemoglobin, MCV, sTfR, CRP, and total body iron) or geometric (ferritin, sTfR-F index, and hepcidin) means (ranges). P values are for 2‑sided paired t‑tests comparing analytes between 14 and 20wk and between 20 and 30wk of gestation. CRP, C‑reactive protein; MCV, mean cell volume; sTfR, soluble transferrin receptor; sTfR‑F index, sTfR/log10 (ferritin). From Bah et al. (2017).

Index                    14wk (n=395)            20wk (n=375)            P         30wk (n=367)            P
Gestational age, wk      14.1 (8.0, 21.3)        20.4 (15.0, 26.9)                 30.5 (25.4, 34.4)
Hemoglobin, g/dL         11.55 (7.2, 17.9)       11.00 (7.4, 14.5)       < 0.001   10.77 (6.2, 14.4)       < 0.0001
MCV, fL                  81.9 (60.9, 98.6)       83.8 (62.9, 104.0)      < 0.001   83.6 (61.8, 99)         NS
Serum ferritin, µg/L     20.69 (0.1, 237.2)      19.20 (0.1, 273.8)      NS        14.29 (0.1, 315.5)      < 0.001
Serum sTfR, mg/L         4.41 (0.58, 17.97)      4.25 (1.12, 15.42)      NS        4.80 (1.49, 17.81)      < 0.001
Serum sTfR-F index       3.29 (−14.80, 111.63)   3.26 (−5.63, 56.43)     NS        3.86 (−87.19, 76.26)    NS
Serum CRP, mg/L          1.70 (0.02, 43.48)      2.41 (0.00, 59.32)      0.001     2.29 (0.01, 126.68)     NS
Total body iron, mg/kg   2.70 (−18.38, 14.40)    2.50 (−17.57, 14.06)    NS        1.18 (−17.84, 11.02)    < 0.001
Serum hepcidin, ng/L     1.59 (0.03, 49.79)      1.23 (0.02, 45.70)      0.006     1.09 (0.04, 135.69)     NS
Changes in these iron biomarkers were marked. Note that the Hb declined from 14wk through 30wk gestation, hepcidin declined from enrolment through 20wk gestation, whereas the changes for serum ferritin, serum transferrin receptor, and thus total body iron were only significant here between 20wk and 30wk gestation. Extensive investigations of associations between the variables revealed that hepcidin was superior to sTfR (and Hb) as an indicator of iron deficiency; see Section 17.11.1 for more discussion of hepcidin.

Infection / Inflammation is now known to increase serum sTfR concentrations, although less strongly than serum ferritin (Rohner et al., 2017). In preschool children and women of reproductive age, the prevalence of elevated sTfR concentrations (i.e., > 8.3mg/L) indicative of iron-deficient erythropoiesis decreased incrementally as serum CRP and AGP deciles decreased, based on a large dataset from the BRINDA project (Engle-Stone et al., 2017). The decrease observed was more pronounced for AGP, a measure of longer-term exposure to inflammation than CRP, as shown in Figure 17.15.

Figure 17.15. Prevalence of elevated sTfR concentrations in preschool children by CRP and AGP deciles: BRINDA project. Analysis was restricted to surveys (Bangladesh, Cameroon, Côte d'Ivoire, Kenya 2007, Kenya 2010, Laos, Liberia, Philippines, and Papua New Guinea) that measured both CRP and AGP for comparability between CRP and AGP relations with biomarkers (n=9326). AGP, α‑1‑acid glycoprotein; BRINDA, Biomarkers Reflecting Inflammation and Nutrition Determinants of Anemia; CRP, C‑reactive protein; sTfR, soluble transferrin receptor. Redrawn from Rohner et al. (2017).
As a result, measure­ment of AGP has been recom­mended in any procedure used to adjust serum sTfR for the presence of inflam­mation. Details of the BRINDA regression approach to adjust serum sTfR values for inflammation are available in Rohner et al. (2017). How­ever, this approach needs to be evaluated in dif­fer­ent settings and in pregnant women (Gupta et al., 2017).
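A minimal sketch of the general form of such a regression correction follows, assuming the BRINDA-style approach of adjusting ln-transformed sTfR by its regression slope on ln(AGP) relative to a low-inflammation reference value. The coefficient and reference value must be estimated from the survey itself; see Rohner et al. (2017) for the authoritative procedure:

```python
# Sketch of a BRINDA-style inflammation adjustment of sTfR (our illustration).
import numpy as np

def adjust_stfr_for_inflammation(stfr, agp, beta_agp, ln_agp_ref):
    """stfr (mg/L) and agp (g/L) are arrays; beta_agp is the survey-estimated
    slope of ln(sTfR) on ln(AGP); ln_agp_ref is the reference value (e.g., the
    maximum ln(AGP) of the lowest decile). Observations at or below the
    reference are left unadjusted."""
    ln_stfr, ln_agp = np.log(stfr), np.log(agp)
    excess = np.maximum(ln_agp - ln_agp_ref, 0.0)   # inflammation above reference
    return np.exp(ln_stfr - beta_agp * excess)       # back-transformed, adjusted

# Hypothetical values for illustration only
print(adjust_stfr_for_inflammation(
    stfr=np.array([9.1, 6.2]), agp=np.array([1.8, 0.5]),
    beta_agp=0.2, ln_agp_ref=np.log(0.6)))
```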

Other disease conditions also affect sTfR concen­trations (see Table 17.28), which must be considered when using sTfR as an indicator for nutritional iron defi­ciency (Rohner et al., 2017).

Malaria has the potential to influence serum sTfR concentrations in two opposing ways. A decrease in serum sTfR concentrations may result from a reduction in erythropoiesis associated with the marked inflammatory response during the acute phase of malarial infections. Erythropoiesis may also be suppressed in part by a direct inhibitory effect of hemozoin, a by-product of the parasite's degradation of Hb (Lamikanra et al., 2007).

Alternatively, serum sTfR concentrations can be elevated in malaria after the initial acute infection. This trend is thought to be due to an increase in erythropoietic activity to compensate for the increased red-cell destruction. Hence, in areas where both iron deficiency and malaria occur, the reliability of serum sTfR as an indicator of iron status is uncertain. Rohner et al. (2017) reported a decrease in the prevalence of iron-deficient erythropoiesis based on elevated serum sTfR concentrations after applying the regression approach based on AGP to adjust for the presence of malaria. Nevertheless, these investigators stressed that additional adjustments for malaria, unrelated to the acute-phase response, may also be needed.

Genetic Hb disorders such as thalassemia traits and Hb EE may elevate serum sTfR due to ineffective erythro­poiesis (i.e., an increase in the proportion of immature red cells destroyed within the bone marrow) and red blood cells with a short half‑life (Khatami et al., 2013; Manolova et al., 2019). Women and children in Cambodia, for example, identified with Hb EE had serum sTfR concen­trations that were signif­icantly higher compared to those with either a normal Hb or Hb AE (Karakochuk et al., 2015; George et al., 2013). Clearly, it is difficult to use serum sTfR to detect tissue iron defi­ciency in popu­lations where the preva­lence of these disorders is high.

Increased or ineffective erythropoiesis in the disease conditions itemized in Table 17.28 is associated with elevated serum sTfR concentrations, as reported earlier. Note that both folic acid and vitamin B12 deficiency are included because both are associated with defective but increased erythropoiesis. In such cases, use of serum sTfR measurements alone would yield a false-positive diagnosis of iron deficiency.

Decreased erythro­poiesis associated with several disease conditions listed in Table 17.28 leads to a fall in serum sTfR concen­trations (Beard et al., 1996; Ahluwalia, 1998). Fortunately, these conditions can usually be identified by measuring serum ferritin, which is normal in hematologic disorders with decreased erythro­poiesis (Section 17.7.1).

High altitudes may result in higher serum sTfR concen­trations, as a conse­quence of the increase in the total body erythroid mass in response to systemic hypoxia. Hypoxia increases the expres­sion of erythro­poietin, and the erythro­poietin drives the produc­tion of new erythro­cytes (Tanno and Miller, 2010).

Lifestyle factors such as smoking have been linked to a signif­icant decrease in serum sTfR concen­trations (Raya et al., 2001; Pynaert et al., 2008). In contrast, sTfR levels appear to be unaffected by alcohol consumption, despite the liver damage and change in iron metabolism that occurs (Speeckaert et al., 2010). High levels of physical activity are associated with elevated levels of sTfR. The etiology is uncertain but possibly linked with a lower iron status (Woolf et al., 2009).

Obesity has been linked to elevated serum sTfR concentrations. This trend has been related to elevated levels of hepcidin in response to obesity-related inflammation and an increase in Hb mass (i.e., total Hb in g) in overweight and obese women (see Table 17.15 and Section 17.6.1). Together, these two factors contribute to an increased risk of iron deficiency in obesity (Cepeda-Lopez et al., 2019).

Oral contraceptive agents and hormone replacement therapy do not affect serum sTfR concentrations (Raya et al., 2001).

17.9.2 Interpretive criteria — use of the distribution of soluble transferrin receptor reference values

Defining reference values for sTfR in healthy individuals is difficult because serum sTfR concen­trations and the units of expres­sion vary with the assay; even dif­fer­ent kits based on the ELISA technique are associated with dif­fer­ent “reference” ranges. An international standard is now available (Thorpe et al., 2010), so distributions of reference values and internationally applicable cutoff values for sTfR should be available in the future.

Table 17.30. Geometric mean, 2.5th and 97.5th percentiles (95% confidence interval) for serum soluble transferrin receptor (sTfR, mg/L) concentrations in a “healthy” reference population of U.S. children aged 1‑5y and non-pregnant women aged 15‑49y participating in U.S. NHANES 2003‑2010. The following exclusions apply: children with anemia, iron deficiency, and infection; non-pregnant women with anemia, iron deficiency, infection, inflammation, and liver disease. Within a group, values with different superscript letters (a,b) have significantly different means, p < 0.05 (2-tailed t-test); a Bonferroni adjustment was used to correct the p values for multiple comparisons across each demographic characteristic. Data from Mei et al. (2012).
n Geometric mean 2.5th percentile 97.5th percentile
Children
1‑2y M 386 4.29 (4.16, 4.42)a 2.53 (2.22, 2.94) 6.46 (6.12, 6.86)
1‑2y F 354 4.19 (4.07, 4.31)a 2.76 (2.66, 2.96) 6.02 (5.59, 6.62)
3‑5y M 735 3.91 (3.84, 3.99)b 2.55 (2.40, 2.64) 5.97 (5.67, 6.23)
3‑5y F 678 3.83 (3.76, 3.90)b 2.57 (2.38, 2.71) 5.67 (5.38, 5.90)
Non-pregnant women
15‑19y 1237 3.14 (3.06, 3.23)a 1.86 (1.77, 1.97) 5.25 (4.99, 5.70)
20‑49y 3423 3.05 (3.00, 3.09)b 1.65 (1.59, 1.72) 5.40 (5.18, 5.66)
Details of the distribution of reference values from three sources are described below, each drawn from a “healthy” reference sample in which participants with conditions known to affect iron status were excluded. For each distribution of reference values, a dif­fer­ent assay method was used.
Table 17.31. Serum transferrin receptor (sTfR) concentrations in various apparently “healthy” nonanemic populations. Data for neonates from cord blood. 95% confidence intervals (CI) calculated using non-parametric methods. Data from Choi et al. (1999).

Age                           Mean ± SD (mg/L)   95% CI (mg/L)
Neonates 0‑5min 4.95 ± 1.24 2.21‑6.74
Children
 — Infants 4‑24 mo 4.51 ± 1.12 2.15‑6.31
 — Young children 3‑7 y 3.02 ± 0.76 1.47‑4.24
Adolescents
 — Middle school 14‑16 y 2.86 ± 0.74 1.35‑4.19
 — High school 17‑19 y 2.09 ± 0.55 1.18‑3.23
Adults 23‑62 y 2.13 ± 0.51 1.22‑3.31

17.9.3 Cutoff values indicative of risk of tissue iron deficiency

There is currently no universally agreed cutoff value for serum sTfR concen­tration because of the large assay differences in the measure­ment of sTfR, as noted earlier. These discrepancies reflect the differences in preparations of trans­ferrin receptor used to raise antibodies and as a standard in the various assays. As a conse­quence, assay-dependent cutoffs must be applied. There is an urgent need to establish internationally applicable cutoff values for serum sTfR concen­trations.

Cutoffs for breastfed infants at age 4, 6, and 9 mos were developed by Domellöf et al. (2002a); these are shown in Table 17.32. No details of the assay used are provided.
Table 17.32. Suggested −2SD cutoff values for iron status variables for infants at 4, 6, and 9mos, based on iron-replete, breastfed infants. MCV, mean cell volume (data from Swedish infants); ZPP, zinc protoporphyrin; TfR, transferrin receptor. From Domellöf et al. (2002a).

Iron status variable    4mo     6mo     9mo
Hb (g/L)                < 105   < 105   < 100
MCV (fL)                < 73    < 71    < 71
ZPP (µmol/mol heme)     > 75    > 75    > 90
Ferritin (µg/L)         < 20    < 9     < 5
sTfR (mg/L)             > 11    > 11    > 11
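Encoded as a lookup, the cutoffs of Table 17.32 can be applied directly; a minimal sketch (the data structure and function name are ours):

```python
# Table 17.32 as a lookup: (variable, age in months) -> (direction, cutoff).
CUTOFFS = {
    ("Hb g/L", 4): ("<", 105), ("Hb g/L", 6): ("<", 105), ("Hb g/L", 9): ("<", 100),
    ("MCV fL", 4): ("<", 73),  ("MCV fL", 6): ("<", 71),  ("MCV fL", 9): ("<", 71),
    ("ZPP umol/mol heme", 4): (">", 75), ("ZPP umol/mol heme", 6): (">", 75),
    ("ZPP umol/mol heme", 9): (">", 90),
    ("Ferritin ug/L", 4): ("<", 20), ("Ferritin ug/L", 6): ("<", 9),
    ("Ferritin ug/L", 9): ("<", 5),
    ("sTfR mg/L", 4): (">", 11), ("sTfR mg/L", 6): (">", 11), ("sTfR mg/L", 9): (">", 11),
}

def abnormal(variable, age_mo, value):
    direction, cut = CUTOFFS[(variable, age_mo)]
    return value < cut if direction == "<" else value > cut

print(abnormal("Ferritin ug/L", 6, 7))  # True: below the 6-month cutoff of 9
```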

Cutoff for pre­school children 1‑5y based on the 97.5th percentile in a U.S. “healthy” reference popu­lation was 6.0mg/L (Table 17.30; Mei et al., 2012). The ROCHE Tina-quant sTfR assay was used.

Cutoff for non-pregnant women 15‑49y based on the 97.5th percentile in a U.S. “healthy” reference popu­lation was 5.3mg/L (Table 17.30) (Mei et al., 2012). The ROCHE Tina-quant sTfR assay was used.

A cutoff for pregnancy of > 4.4mg/L, indicative of iron deficiency, was used both for U.S. pregnant women from U.S. NHANES 1999‑2006 (Mei et al., 2011) and for pregnant women in the Gambia (Bah et al., 2017). This cutoff was derived by the manufacturer of the Roche Tina-quant sTfR immunoturbidimetric assay (Kolbe-Busch et al., 2002). It is of interest that use of this sTfR cutoff yielded a prevalence estimate for iron deficiency that was comparable to that based on total body iron < 0mg/kg; for more details on total body iron (TBI), see “Body iron” in Section 17.10.1. Several other studies have applied a cutoff of sTfR > 8.5mg/L, said to be indicative of tissue iron deficiency as determined by quantitative phlebotomy (Skikne et al., 1990), as well as specific for iron deficiency during pregnancy (Carriaga et al., 1991; Akesson et al., 1999; Rusia et al., 1999). These studies all employed the RAMCO enzyme immunoassay, which yields values that are on average 30% higher than those from the Roche Tina-quant automated sTfR immunoturbidimetric assay (Pfeiffer et al., 2007).

17.9.4 Measure­ment of serum soluble trans­ferrin receptor

Serum is the preferred matrix, although some assays can use EDTA, heparin, or citrate plasma. Both venous and capillary blood samples can be used but must be refrigerated immediately after collection. Serum samples can be stored for no longer than 2d at room temperature, 7d at 4°C, and 1y at < −20°C; Pfeiffer and Looker (2017) report that serum sTfR was stable for > 10y when stored at −70°C. Repeated freezing and thawing is not recommended, although fewer than three freeze-thaw cycles do not negatively affect serum sTfR concentrations. Moderate hemolysis does not confound the results. Assays on whole-blood spots dried on filter paper are now possible, although not widely used (Cook et al., 1998; McDade et al., 2002).

In the past, two-site immunoradiometric assays and ELISA were used to measure serum sTfR. Today, measurement of serum sTfR in clinical laboratories is carried out via immunoassays using commercial kits on automatic clinical analyzers (e.g., Tina-quant sTfR, Roche Diagnostics) (Kolbe-Busch et al., 2002). However, clinical analyzers are expensive, require a relatively large sample volume (typically > 150µL), and use reagents that are relatively costly, with poor comparability across assays. In low-income countries, manual ELISA assays available from a few manufacturers are still used; these require a smaller sample volume (< 50µL) and a relatively inexpensive microplate reader, although their precision is only moderate (Lynch et al., 2018). The RAMCO enzyme immunoassay yields values that are on average 30% higher than those from the Roche Tina-quant sTfR assay (Pfeiffer et al., 2007).

An international sTfR certified reference material is now available from WHO through the United Kingdom National Institute of Biological Standards and Control [sTfR (NIBSC RR 07/202)]. Devel­opment of this certified reference material will enable dif­fer­ent laboratories to calibrate their own assays, facilitating comparisons across countries (Thorpe et al., 2010).

17.10 Multiple bio­markers

Table 17.33. Relationships between biomarkers and iron status. From Lynch et al. (2018).

Iron status                     Stainable bone        Serum      Transferrin   Erythrocyte        Serum transferrin   Hemoglobin
                                marrow iron           ferritin   saturation    protoporphyrin     receptor
Iron-deficiency anemia          Absent                Low        Low           High               High                Low
Iron-deficient erythropoiesis   Absent                Low        Low           High               High                Normal
Iron depletion                  Absent                Low        Normal        Normal             Normal              Normal
Normal iron status              Normal                Normal     Normal        Normal             Normal              Normal
Iron overload                   Normal or increased   High       High          Normal             Normal              Normal
The use of several concurrent bio­markers of iron status provides a more valid assessment of iron status than measure­ment of any single bio­marker. Such an approach minimizes the misclassification that can occur due to the overlapping of normal and abnormal values when a single measure is used.

Moreover, use of multiple biomarkers enables the three stages in the development of iron-deficiency anemia (iron depletion, iron-deficient erythropoiesis, and iron-deficiency anemia) to be characterized. In this way, the severity of iron deficiency can be differentiated more readily. In addition, normal iron status and iron overload can also be identified. The relationship between the biomarkers commonly used (serum ferritin, transferrin saturation, erythrocyte protoporphyrin, serum sTfR, Hb) and the gold standard test — stainable bone-marrow iron — is shown in Table 17.33.
Figure 17.16. Prevalence of impaired iron status in subjects of varying ages estimated using the ferritin model and the MCV model: U.S. NHANES II, 1976‑1980. Redrawn from Pilch and Senti (1984).

In the past, a combination of three bio­markers of iron status has been used, with abnormal values for at least two of the three bio­markers indicating iron defi­ciency. If iron defi­ciency was concurrent with a low Hb concen­tration, then individuals were considered to have iron defi­ciency anemia (Looker et al., 1997). In U.S. NHANES II and U.S. NHANES III (Pilch and Senti, 1984; Dallman et al., 1984; Looker et al., 1997), two models were used to assess the relative preva­lence of impaired iron status and anemia in selected groups of the U.S. popu­lation. The first (ferritin model) was based on serum ferritin, trans­ferrin satu­ration, and FEP; whereas the second (MCV model) used MCV rather than ferritin, together with trans­ferrin satu­ration, and FEP. Figure 17.16 compares the preva­lence of impaired iron status by age group in individuals partic­ipating in the U.S. NHANES II, 1976‑1980 based on the ferritin and MCV models. In most age groups, as expected, the preva­lence of impaired iron status was higher using the ferritin model than the MCV model, most notably for the younger age groups.
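The logic of the ferritin model (at least two of three abnormal biomarkers indicating iron deficiency, plus a low Hb for iron deficiency anemia) can be sketched as follows; the cutoff values shown are placeholders, since the true cutoffs are age- and sex-specific (Table 17.16):

```python
# Sketch of the NHANES "ferritin model" decision rule described above.
def ferritin_model(ferritin, tsat, ep, hb, cutoffs):
    """Iron deficiency when >= 2 of 3 biomarkers are abnormal;
    IDA when iron deficiency coincides with a low Hb."""
    n_abnormal = sum([
        ferritin < cutoffs["ferritin_low"],   # µg/L
        tsat < cutoffs["tsat_low"],           # %
        ep > cutoffs["ep_high"],              # µmol/mol heme
    ])
    iron_deficient = n_abnormal >= 2
    ida = iron_deficient and hb < cutoffs["hb_low"]   # g/dL
    return {"iron_deficient": iron_deficient, "ida": ida}

# Illustrative adult-female-style cutoffs (placeholders, not the NHANES values)
cuts = {"ferritin_low": 15, "tsat_low": 16, "ep_high": 70, "hb_low": 12.0}
print(ferritin_model(ferritin=10, tsat=12, ep=85, hb=11.0, cutoffs=cuts))
```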

The cutoffs used by NHANES to identify abnormal values of iron status based on the three biomarkers in the ferritin model, and included in the analysis of selected NHANES data, are shown in Table 17.16 (Looker et al., 1997). These cutoffs to define abnormal values were validated whenever possible in clinically diagnosed adult patients, and therefore are “true cutoffs”, with the exception of those for children and adolescents. Subsequently, these same cutoffs were recommended by WHO / UNICEF / UNU (2001), together with cutoffs for iron overload. Note the differences in the mode of expression for erythrocyte protoporphyrin between Table 17.16 and Table 17.34.
Table 17.34. Cutoff values for erythrocyte protoporphyrin, serum ferritin, and transferrin saturation by stage of iron status and by population group. From Lynch et al. (2018).

                                   < 5y of age          ≥ 5y of age
Erythrocyte protoporphyrin
 Iron overload                     Normal               Normal
 Normal iron status                Normal               Normal
 Iron depletion                    Normal               Normal
 Iron deficiency with              > 70µg/dL RBC        > 80µg/dL RBC
 or without anemia                 > 2.6µg/g Hb         > 3.0µg/g Hb
                                   > 61µmol/mol heme    > 70µmol/mol heme
Serum ferritin, µg/L
 Severe risk of iron overload      No cutoff            > 200 (adult males); > 150 (adult females)
 Depleted iron stores              < 12                 < 15
Transferrin saturation
 Iron overload                     > 60‑70%
 Iron deficiency anemia            < 16%

Increasingly, the inclusion of serum sTfR as one of the laboratory criteria to assess iron status is recom­mended because it is a relatively sensitive and reliable guide to the degree of functional iron defi­ciency (i.e., iron-deficient erythro­poiesis) (Skikne et al., 1990), and less affected by inflam­mation than serum ferritin (Rohner et al., 2017).

The U.S. NHANES 2003‑2006 survey was the first U.S. NHANES survey to include a combination of biomarkers comprising serum sTfR, serum ferritin, and Hb to describe the iron status of a population. The rationale for this combination was that although serum ferritin reflects the decline of body iron stores, it does not reflect the severity of the depletion once iron stores are exhausted. In contrast, serum sTfR concentrations increase after iron stores are exhausted and continue to rise with increasing functional iron deficiency (i.e., in iron-deficient erythropoiesis).

Further­more, such a combination provides diagnostic information over a wide spectrum of iron status, ranging from replete iron stores to overt iron defi­ciency anemia. Cutoff values for serum sTfR are not included in Table 17.34 because assay-dependent cutoffs must be applied.

Table 17.35. Characteristics of the serum ferritin and total body iron models. Data from Pfeiffer and Looker (2017).

Ferritin model — characteristics:
 — Does not indicate degree of severity of iron deficiency
 — May assess a slightly later stage of iron deficiency than the presence or absence of bone marrow iron stores (i.e., lower prevalence than serum ferritin alone)
 — May detect iron deficient erythropoiesis in addition to iron store depletion
 — Produces a categorical yes/no estimate only

Total body iron model — characteristics:
 — Assesses the entire range of iron status
 — Derived from direct calculation of body iron from serial phlebotomy, a “gold standard” method to assess body iron
 — Better predicts the absence of bone marrow iron than serum ferritin alone
 — May provide a better estimate of the impact of an iron intervention (amount of iron absorbed)
 — Total body iron can be analyzed as a continuous variable
Following the validation of the Total Body Iron (TBI) model (Cook et al., 2003), together with the development of fully automated assays for both serum ferritin and sTfR, U.S. NHANES replaced the ferritin model with the TBI model to assess the prevalence of iron deficiency; the characteristics of the two models are compared in Table 17.35. See “Body iron” in Section 17.10.1 for more details on the derivation of TBI.

In 2007 WHO developed guidelines on the use of serum sTfR together with ferritin to assess the iron status of popu­lations, and presented a classification of iron defi­ciency based on these two bio­markers which is shown in Table 17.36.

Table 17.36. The interpretation of low serum ferritin and high transferrin receptor concentrations in population surveys: this classification is based on experience of measuring ferritin and transferrin receptor in research studies and requires validation in population surveys (WHO, 2020).
a Apply thresholds by age group given in WHO / UNICEF / UNU (2001).
b Apply thresholds recommended by the manufacturer of the assay until an international reference standard is available.
c < 10% for pregnant women.
d ≥ 30% for pregnant women.

Percentage of serum ferritin   Percentage of transferrin receptor
values below threshold a       values above threshold b             Interpretation
< 20% c                        < 10%                                Iron deficiency is not prevalent.
< 20% c                        ≥ 10%                                Iron deficiency is prevalent; inflammation is prevalent.
≥ 20% d                        ≥ 10%                                Iron deficiency is prevalent.
≥ 20% d                        < 10%                                Iron depletion is prevalent.
Note that the calculation of TBI was not recommended by WHO (2007). They advised that for prevalence estimates, a single number based on the prevalence from serum ferritin should be used, except in settings where inflammation is prevalent, in which case the prevalence based on sTfR is more appropriate. Where possible, when determining prevalence estimates, biomarkers of inflammation (CRP and AGP) should also be assayed, so that both serum ferritin and sTfR concentrations can be adjusted for inflammation, preferably using the BRINDA correction method described in Section 17.7.3.
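The population-level interpretation of Table 17.36 reduces to a small decision rule; a minimal sketch using the general (non-pregnant) thresholds, noting that footnotes c and d substitute 10% and 30% for pregnant women:

```python
# Sketch of the Table 17.36 classification for population surveys.
def classify_population(pct_low_ferritin, pct_high_stfr):
    """Inputs: % of ferritin values below threshold, % of sTfR values above
    threshold (non-pregnant thresholds of 20% and 10%)."""
    if pct_low_ferritin < 20:
        return ("Iron deficiency is not prevalent" if pct_high_stfr < 10
                else "Iron deficiency is prevalent; inflammation is prevalent")
    return ("Iron deficiency is prevalent" if pct_high_stfr >= 10
            else "Iron depletion is prevalent")

print(classify_population(25, 5))   # "Iron depletion is prevalent"
```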

WHO (2007) have also made recommendations on the use of the most appropriate iron biomarkers to evaluate the impact of interventions to control iron deficiency, and to predict a change in Hb concentrations in response to iron interventions in populations (e.g., supplementation or fortification). Based on data from nine double-blind, randomized controlled trials in which iron was provided as supplements or as food fortified with iron for periods of 4‑18mos, serum ferritin and Hb were concluded to be the most efficient indicators of a population's response to iron interventions. Serum ferritin or serum sTfR, although less successful, were considered the best for predicting a change in Hb concentration in response to iron interventions. Furthermore, WHO advised that if both serum ferritin and serum sTfR were measured, body iron stores could be estimated to predict the change in Hb, provided the correct algorithm was applied; for more details see Mei et al. (2005) and WHO (2007).

It is noteworthy that in settings where iron deficiency is the major cause of anemia, the Hb concentration is likely to improve more rapidly than serum ferritin following an iron intervention. However, when factors other than iron also contribute to the anemia, an improvement in serum ferritin rather than Hb is more likely. Again, biomarkers of inflammation (CRP and AGP) should be assayed where possible, so that serum ferritin and serum sTfR concentrations can be adjusted for inflammation where necessary.

17.10.1 Use of ratios of sTfR to serum ferritin as indicators of iron status

Increasingly, the ratio of sTfR to serum ferritin is being used as an indicator of the extent of iron deficiency. Serum ferritin, as noted in Section 17.7, is a sensitive indicator until body iron stores (mainly in hepatocytes) are depleted, but once stores are totally exhausted (i.e., with serum ferritin < 12µg/L), serum ferritin does not provide information on the severity of the iron deficiency. In contrast, serum sTfR is a sensitive indicator after body iron stores are exhausted, with concentrations progressively increasing as iron deficiency advances. Hence, in combination, these two biomarkers assess the full range of iron status from severe deficiency to overload. Moreover, when expressed as a ratio, they correct for any differences in plasma volume expansion during pregnancy, thus providing a more accurate reflection of iron status at this time. Nevertheless, the practical utility of the sTfR/ferritin ratios in serum for detecting iron deficiency in the presence of inflammation is still uncertain (Lynch et al., 2018).

Several approaches exist to express and to interpret this ratio, each with dif­fer­ent cutoff values to interpret the ratio; these approaches and cutoff values are shown in Table 17.37, and discussed below.
Table 17.37. Approaches to express and interpret the ratio of sTfR to SF. ACD, anemia of chronic disease; ELISA, enzyme-linked immunosorbent assay; IDA, iron deficiency anemia; IDA + ACD, combined iron deficiency anemia and anemia of chronic disease; SF, serum ferritin; sTfR, soluble transferrin receptor. 2 Units of both sTfR and SF in the ratio are µg/L. Data from Pfeiffer and Looker (2017).

Ratio: Total body iron,2 mg/kg
 Calculation: −[log10(sTfR/SF) − 2.8229]/0.1207
 sTfR assay used to establish ratio: In-house ELISA that is equivalent to the Ramco assay and has a known relation to the Roche assay
 Cutoff value and definition: ≤ 0: iron deficit; > 0: iron surplus

Ratio: Simple ratio2
 Calculation: sTfR/SF
 sTfR assay used to establish ratio: In-house ELISA that is equivalent to the Ramco assay and has a known relation to the Roche assay
 Cutoff value and definition: ≤ 500: ample iron stores; > 500: depleted iron stores

Ratio: sTfR index, mg/L
 Calculation: sTfR/log10 SF
 sTfR assay used to establish ratio: First ELISA from R&D Systems; then adopted on the Access immunoassay system (Beckman Coulter)
 Cutoff value and definition: < 1: ACD; > 2: IDA or IDA + ACD
The simple ratio of sTfR to ferritin (both expressed in µg/L) was calculated in the initial phlebotomy study of Skikne et al. (1990) and is easier to calculate than body iron. Their results confirmed that, because of the reciprocal relationship between serum sTfR and ferritin measurements, the ratio of sTfR:SF represented iron status over the entire range in their study. When plotted logarithmically, the ratio increased from < 100 in those with ample iron stores to > 2,000 in those with significant functional iron deficiency. A rise above 500 occurred when stores were fully depleted (i.e., iron stores of 0mg/kg). A major disadvantage of this simple ratio is that its interpretation is assay dependent. The cutoff developed for the simple ratio (i.e., ≤ 500 indicative of ample iron stores; > 500 of depleted iron stores) was established with the Ramco assay, and should only be used with data generated with either the Ramco assay (van den Broek et al., 1998) or an equivalent assay (Grant et al., 2012).

The sTfR index (mg/L) is calculated as the ratio of sTfR to the base-10 logarithm of serum ferritin (Table 17.37) and was developed initially to identify individuals with depleted iron stores, defined by a complete absence of stainable iron in the bone marrow (Punnonen et al., 1997). Use of the log of serum ferritin in this ratio decreases the influence of the acute phase response on the ferritin component of the ratio (WHO, 2007). Cutoff values have been developed (expressed in mg/L) to distinguish between the anemia of chronic disease (< 1mg/L) and iron deficiency anemia, or a combination of both conditions (> 2mg/L) (Weiss and Goodnough, 2005) (Table 17.37). The sTfR index was later adopted for the simultaneous assay of sTfR and ferritin on the Access immunoassay systems (Beckman Coulter), for which a cutoff of > 1.03mg/L was proposed for iron deficiency anemia or a combination of iron deficiency anemia and ACD. However, the interpretation of the sTfR index is assay dependent, so this proposed cutoff should not be applied when alternative assays that are not comparable to the Beckman Coulter assay are used (Pfeiffer and Looker, 2017). Hence, standardization of the various sTfR assays is needed before this ratio can be widely used. To convert sTfR measurements expressed as nmol/L to mg/L, multiply nmol/L by a factor of 0.0738 (Skikne et al., 2011).
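The ratio-based measures of Table 17.37, and the nmol/L-to-mg/L conversion above, are simple to compute; a sketch follows (the function names are ours, and the cutoffs quoted in the comments are assay dependent, as stressed above):

```python
import math

def simple_ratio(stfr_ug_l, ferritin_ug_l):
    # > 500 suggested depleted stores, Ramco-equivalent assays only
    return stfr_ug_l / ferritin_ug_l

def stfr_index(stfr_mg_l, ferritin_ug_l):
    # < 1: ACD; > 2: IDA or IDA + ACD (cutoffs are assay dependent)
    return stfr_mg_l / math.log10(ferritin_ug_l)

def stfr_nmol_to_mg(stfr_nmol_l):
    # Skikne et al. (2011) conversion factor
    return stfr_nmol_l * 0.0738

print(simple_ratio(stfr_ug_l=5000, ferritin_ug_l=8))          # 625.0
print(round(stfr_index(stfr_mg_l=5.0, ferritin_ug_l=8), 2))   # 5.54
```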

Body iron was the original term used by Skikne et al. (1990), although others have used body iron stores or total body iron (TBI). Measurements of body iron were initially based on a study in which serial measurements of serum ferritin and sTfR were obtained during repeated phlebotomies in 14 healthy Caucasian adults (6M; 8F, 24‑46y) over 6‑22wks; see Skikne et al. (1990) and Cook et al. (2003) for further details. The investigators demonstrated a close linear relationship between the logarithm of the ratio of sTfR/serum ferritin and total body iron stores expressed as mg per kg body weight. The formula for this relationship (with both sTfR and serum ferritin (SF) expressed as µg/L) is the following: \[ \small \mbox{Body iron (mg/kg)} = -\,\frac{\mbox{log}_{10}\mbox{(sTfR/SF)} - 2.8229}{0.1207}\]
Box 17.5: Advantages of using total body iron (TBI) (as mg/kg) as a measure of iron status. Modified from Gupta et al. (2017).
This formula was validated by Cook et al. (2003) using data from three published studies. Note this model is based on serum sTfR concentrations measured using an in-house ELISA assay (Flowers et al., 1986), shown later by Pfeiffer et al. (2007) to be equivalent to the Ramco sTfR assay. The Ramco assay was later reported to measure ≈ 30% higher than the fully automated sTfR immunoturbidimetric assay developed by Roche Diagnostics and used in the U.S. NHANES surveys to measure body iron (Cogswell et al., 2009). Consequently, Pfeiffer et al. (2007) developed a regression equation to convert serum sTfR concentrations assayed using the fully automated Roche assay to Flowers sTfR concentrations: \[\small \mbox{Flowers sTfR} = 1.5 \times \mbox{Roche sTfR} + 0.35 \mbox{ (mg/L)} \] Note this body iron equation cannot be used directly when other sTfR assays are used. Factors that led to the decision to apply the body iron model to NHANES data are itemized in Box 17.5.
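Putting the body iron formula and the Roche-to-Flowers conversion together gives the following minimal sketch. It follows the Table 17.37 footnote that both terms of the ratio are expressed in µg/L, hence the factor of 1000 applied to sTfR in mg/L:

```python
import math

def roche_to_flowers(stfr_roche_mg_l):
    # Pfeiffer et al. (2007): Flowers sTfR = 1.5 x Roche sTfR + 0.35 (mg/L)
    return 1.5 * stfr_roche_mg_l + 0.35

def total_body_iron(stfr_mg_l, ferritin_ug_l):
    """TBI in mg/kg (Cook et al., 2003); sTfR is converted from mg/L to µg/L
    so that both terms of the ratio share the same units."""
    ratio = (stfr_mg_l * 1000) / ferritin_ug_l
    return -(math.log10(ratio) - 2.8229) / 0.1207

# Example: a Roche-assayed sTfR of 4.4 mg/L with ferritin 12 µg/L
stfr_flowers = roche_to_flowers(4.4)                  # 6.95 mg/L
print(round(total_body_iron(stfr_flowers, 12), 2))    # ~0.50 mg/kg
```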

Figure 17.17 depicts TBI, expressed as the prevalence of low iron stores, for the first, second, and third trimesters in U.S. pregnant women from the U.S. NHANES 1999‑2006 survey. In this study, the sTfR concentrations for each trimester were adjusted to be comparable with values produced by the Flowers sTfR assay used in the development of the TBI model, using the regression equation shown above. Results confirmed that iron deficiency based on TBI < 0mg/kg was more prevalent in pregnant women in the second or third trimester, and in non-Hispanic black pregnant women (data not shown). Moreover, the prevalence and pattern of iron deficiency based on sTfR and TBI, by trimester and by race / ethnic group (data not shown), were similar.
Figure 17.17. Distribution of total body iron (calculated from serum ferritin and sTfR concentrations) in U.S. pregnant women at the 1st, 2nd, and 3rd trimester. Data from the U.S. NHANES 1999‑2006. Redrawn from Mei et al. (2011).

Note that these results did not take inflam­mation and infection into account, even though preg­nancy is considered an inflam­matory state. How­ever, after adjusting the preva­lence of iron defi­ciency based on TBI for inflam­mation by excluding those with elevated CRP concen­trations (i.e., > 5mg/L), the preva­lence of iron defi­ciency after exclusion was not substantially dif­fer­ent, suggesting that the effect of inflam­mation on the preva­lence of iron defi­ciency in these U.S. women during preg­nancy was small (Table 17.38).
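The exclusion approach described above amounts to recomputing the prevalence on a subset; a minimal sketch with invented values:

```python
# Prevalence of iron deficiency (TBI < 0 mg/kg) before and after excluding
# subjects with elevated CRP (> 5 mg/L). Arrays are illustrative only.
import numpy as np

tbi = np.array([1.2, -0.5, 3.3, -1.8, 0.4, -0.2])   # mg/kg
crp = np.array([1.0,  7.2, 0.5,  2.1, 9.4,  3.3])   # mg/L

prev_all = np.mean(tbi < 0)
prev_excl = np.mean(tbi[crp <= 5] < 0)
print(f"all: {prev_all:.0%}, excluding CRP > 5 mg/L: {prev_excl:.0%}")
```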

Never­the­less, several uncertainties remain when using TBI to define iron defi­ciency. As an example, whether the equation for estimating total body iron is appro­priate for young children and pregnant women is unclear. The equation was validated only in non-pregnant adults (Cook et al., 2003), and the relationship between TBI and serum ferritin and sTfR concen­trations may differ in these life-stage groups. Further­more, the changes in serum ferritin and sTfR that occur during more gradual reduction in body iron stores may not resemble those observed during the relatively rapid reduction in body iron stores with phlebotomy (Mei et al., 2017).
Table 17.38. Unadjusted prevalence of iron deficiency (ID percentage; 95% CI) on the basis of high serum soluble transferrin receptor (sTfR) concentrations (> 4.4mg/L) and total body iron (TBI) (< 0mg/kg) in U.S. pregnant women in the National Health and Nutrition Examination Survey (NHANES), 1999‑2006. Data from Mei et al. (2011).

Trimester   n     High sTfR (> 4.4mg/L)   TBI (< 0mg/kg)
First       189   5.8 (2.4, 9.2)          6.9 (2.6, 11.2)
Second      416   10.6 (6.3, 14.9)        14.3 (10.2, 18.5)
Third       384   27.9 (20.6, 35.3)       29.7 (24.4, 34.9)

There is also some confusion over the term TBI because it is not a measure of the quantity of iron in the body of an individual. Instead, TBI provides a quantitative estimate of the size of the body iron store when iron is present in the store (i.e., values > 0mg/kg). In the case of iron deficiency (i.e., values < 0mg/kg), TBI provides an estimate of the quantity of iron that would be needed to replace the functional deficit (Pfeiffer and Looker, 2017). In light of the confusion, an alternative term — body iron index — has been proposed.

In addition, there is some concern over the assumption that a body iron index value of “0” is in fact the criterion for defining iron defi­ciency in view of inconsistencies in the reported preva­lence of iron defi­ciency estimated by serum / plasma ferritin, sTfR, and TBI (Engle-Stone et al., 2013; Cogswell et al., 2009). This has prompted the suggestion that a higher cutoff such as 2mg/kg may be more appro­priate than TBI of < 0mg/kg to indicate iron defi­ciency; for more discussion of these concerns, see Lynch et al. (2018).

In the future, the devel­opment of low cost ELISA assays capable of measuring several iron and inflam­mation status bio­markers simul­tane­ously from a small volume of serum / plasma will assist in improving the reliability of the diagnosis of iron defi­ciency. This is especially needed in settings where a high burden of infectious diseases, genetic Hb disorders, and micro­nutrient deficiencies have often confounded the inter­pretation of the data.

17.11 Emerging iron status indicators

Emerging indicators, currently mainly used in a research setting, include hepcidin, non-trans­ferrin-bound iron, and some reticulocyte indices. More work is required to understand the diagnostic value of these emerging indicators before they become more widely available for public health or clinical use.

17.11.1 Hepcidin

Hepcidin is a protein encoded by the HAMP gene. It plays a major role in controlling physiological iron homeostasis, as noted earlier. Hepcidin binds and subsequently causes degradation of the iron-exporter ferroportin, thereby inhibiting both absorption of dietary iron from the gastrointestinal tract and the release of iron from body stores (Section 17.1.2). In conditions of iron depletion or increased iron demand, hepatic synthesis of hepcidin is reduced, thus facilitating iron absorption and mobilization from body stores via active ferroportin. In contrast, when iron stores are replete (or in renal impairment, when excretion of hepcidin is suppressed), hepcidin levels are increased, blocking the release of iron into the circulation. Instead, iron is lost with the cell when it is shed from the villus. Hepcidin production by the liver is also regulated by inflammation and erythropoietic activity.

Hepcidin may be useful in the clinical diagnosis of iron-refractory anemia (i.e. the type of iron defi­ciency anemia that typically does not improve with oral iron treatment), in dif­fer­entiating between iron defi­ciency anemia and anemia of chronic disease and in patients with iron overload syndromes. As hepcidin is a determinant of dietary iron absorp­tion, hepcidin may also provide guidance on safe iron sup­ple­mentation in countries with a high infection burden (van der Vorm et al., 2016; Pfeiffer and Looker, 2017). Studies have also investigated the diagnostic potential of serum hepcidin as an index of iron defi­ciency in preg­nancy (Bah et al., 2017). Never­the­less, whether hepcidin assays provide an advantage over serum ferritin and the other more common methods for assessing nutritional iron status, particularly when iron status is replete, remains uncertain (Pfeiffer and Looker, 2017; Lynch et al., 2018).

Inflammation raises the levels of pro-inflammatory cytokines in the systemic circulation, which in turn increase serum hepcidin; like serum ferritin, hepcidin is an acute-phase protein. Elevated hepcidin levels reduce both intestinal iron absorption and the release of iron from body stores, thus decreasing the amount of circulating serum iron and thereby limiting erythropoiesis. Hence, the risk of iron deficiency increases even though body iron stores are normal.

Obesity is associated with low-grade inflammation and thus elevated hepcidin levels. Results presented in Table 17.39 compare inflammatory and iron status biomarkers in a cross-sectional study of normal weight and obese healthy U.S. women. The data highlight the higher mean serum concentrations of CRP, interleukin‑6, and hepcidin, but the lower levels of MCV and serum iron (though not ferritin), in the obese women compared with their normal weight counterparts (Aguree and Reddy, 2021).

Table 17.39. Inflammatory markers and iron biomarkers in normal weight healthy young women (n=22) and obese women (n=25). Abstracted from Aguree and Reddy (2021).

Variable | Normal Weight Women (BMI: 18.5‑24.9 kg/m²), Mean ± SD | Obese Women (BMI: > 29.9 kg/m²), Mean ± SD | p Value
Inflammatory markers
WBC (×10³/µL) | 5.6 ± 1.6 | 7.7 ± 2.0 | < 0.001
IL‑6 (pg/mL)¹* | 1.46 [1.13, 1.89] | 2.16 [1.86, 2.51] | 0.003
CRP (mg/L)¹* | 8.2 [3.1, 21.8] | 69.9 [41.1, 118.9] | < 0.001
Iron biomarkers
Hb (g/dL) | 13.6 ± 1.1 | 13.3 ± 1.1 | 0.166
MCV (fL) | 86.3 ± 4.5 | 82.4 ± 8.9 | 0.030
Serum iron (µg/dL)² | 112.0 ± 41.4 | 92.4 ± 33.9 | 0.044
TIBC (µg/dL)² | 394.4 ± 61.5 | 371.3 ± 50.4 | 0.086
TSAT (%)² | 29.7 ± 13.3 | 25.2 ± 9.1 | 0.094
Ferritin (ng/mL)¹* | 34.0 [21.0, 55.1] | 37.7 [26.9, 52.9] | 0.355
Hepcidin (ng/mL)¹* | 6.21 [4.39, 8.77] | 11.21 [7.04, 17.83] | 0.024
WBC, white blood cell count; IL‑6, interleukin‑6; CRP, C‑reactive protein; MCV, mean corpuscular volume (mean cell volume); TIBC, total iron-binding capacity; TSAT, transferrin saturation. ¹Data available for 40 women (18 normal weight vs. 22 obese); ²data available for 42 women (20 normal weight vs. 22 obese); *geometric mean [95% CI].
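For readers reproducing summary statistics of this kind, the geometric means and 95% CIs (asterisked in Table 17.39) are computed on log-transformed values; a brief sketch follows, using hypothetical values rather than the study data.

```python
import math
import statistics

def geometric_mean_ci(values: list[float], z: float = 1.96) -> tuple[float, float, float]:
    """Geometric mean with an approximate 95% CI via log-transformation."""
    logs = [math.log(v) for v in values]
    mean = statistics.mean(logs)
    se = statistics.stdev(logs) / math.sqrt(len(logs))
    return (math.exp(mean), math.exp(mean - z * se), math.exp(mean + z * se))

# Hypothetical serum hepcidin values (ng/mL), not the study data.
gm, lo, hi = geometric_mean_ci([4.2, 5.9, 6.8, 7.4, 9.1, 5.5])
print(f"geometric mean {gm:.2f} ng/mL (95% CI {lo:.2f}, {hi:.2f})")
```

Group comparisons of such right-skewed biomarkers are typically performed on the log-transformed values as well.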

Enhanced erythropoiesis suppresses hepcidin because of the increased iron demand. Consequently, iron absorption is increased and iron is mobilized from body stores. Numerous conditions are associated with an increase in erythroid proliferation; some examples are given in Table 17.28. Of concern are individuals with severe transfusion-dependent HbE β‑thalassemia and β‑thalassemia carriers (but not HbE carriers), because the increased uptake of iron resulting from severely and mildly suppressed hepcidin, respectively, may increase the risk of iron overload. As an example, Table 17.40 highlights the suppression of hepcidin in Sri Lankan patients with HbE β‑thalassemia classified as severe or moderate clinical phenotypes. Patients in both groups had lower concentrations of serum hepcidin and Hb and higher serum ferritin concentrations when compared with the values for 25 iron-replete, non‑thalassemic Sri Lankan male controls.

Table 17.40. Demographic, iron, erythropoietic, and inflammatory indices for patients with HbE β‑thalassemia stratified by severe and moderate phenotype. NTBI, non-transferrin-bound iron. Values are *mean (range) or **geometric mean (range). The P column compares the severe and moderate groups; in the source table, significant differences (P < 0.05) were also flagged between controls and each patient group. Data from Jones et al. (2015).

Parameter | Severe (n=28; 17 female) | Moderate (n=41; 25 female) | P | Local controls (n=25)
Hb (g/dL)* | 5.8 (4.4‑7.6) | 6.4 (4.9‑8.3) | 0.1397 | 15.0 (14.5, 15.5)
Ferritin (µg/L)** | 1356 (328, 9790) | 732 (143, 3260) | 0.0021 | 48.2 (38, 61.2)
Liver iron (mg/g dwt)** | 11.3 (1, 54.2) | 5.30 (0.7, 29.0) | 0.0006 |
NTBI (µmol/L)* | 5.16 (−1.67, 10.27) | 4.36 (−2.67, 23.75) | 0.4456 |
Hepcidin (ng/mL)** | 2.24 (0.1, 51.8) | 1.95 (0.1, 36.3) | 0.7441 | 28.23 (18.83, 42.3)
sTfR (nmol/L)** | 6.51 (1, 35.3) | 8.87 (1.4, 35.3) | 0.1472 |
CRP (mg/L)** | 1.66 (0.23, 34.23) | 1.58 (0.20, 13.01) | 0.8433 |
Labile plasma iron (µmol/L)* | 3.60 (0.1, 7.7) | 4.29 (−1.07, 13.61) | 0.9320 |
Transferrin saturation (%)* | 97.3 (48.3, 100.0) | 86.6 (31.9, 100.0) | 0.0319 |

Even among Sri Lankan schoolchildren, hepcidin was mildly suppressed in those with the increased erythropoiesis associated with β‑thalassemia trait (but not with HbE trait) compared with controls, suggesting an enhanced propensity to accumulate iron (Jones et al., 2015). Hence, in Asian countries where β‑thalassemia syndromes and thalassemia carriers are prevalent, the risk of causing an inadvertent burden of iron overload should be considered when planning public health iron interventions to improve the iron status of high-risk population groups (Jones et al., 2015), as noted earlier.

Malaria induces an increase in hepcidin concentrations, likely as a result of suppressed erythropoietic activity and inflammation, and perhaps additionally through a direct stimulatory effect of malaria parasites and their products (de Mast et al., 2009). Hepcidin concentrations are elevated even when the malaria is asymptomatic (i.e., parasitemia in the absence of fever or malaria-related symptoms) and in the absence of a marked acute-phase response, as shown in Figure 17.18. Although an effect of mild inflammation cannot be excluded, other mechanisms, such as an IL‑6-independent pathway in malaria, might also contribute to the elevated hepcidin concentrations.
Figure 17.18. Serum concen­trations of hepcidin in children with asymptomatic P. falciparum parasitemia (n=73), asymptomatic P. vivax parasitemia (n=18) and controls (n=17). Data depicted are before and 4 weeks after the start of antimalarial treatment; the line represents the mean. P values were determined using the Student's t-test and paired t-test. Redrawn from de Mast et al. (2009).
The results shown in Figure 17.18 are based on a group of Indonesian children (5‑15y; n=1197) screened by microscopy for parasitemia, with Hb, serum hepcidin, indices of iron status (ferritin, sTfR, mean cell volume), and inflammation (CRP) measured at baseline and 4 weeks after antimalarial treatment.

It is of interest that even mildly elevated hepcidin concentrations appear to induce functional iron deficiency in these Indonesian children with asymptomatic parasitemia, as indicated by their lower values for Hb, MCV, serum iron, and transferrin saturation. These findings suggest that in the presence of asymptomatic malaria, iron therapy for the treatment of malarial anemia may be less effective because absorption of iron is compromised by the high hepcidin concentrations. Moreover, the iron therapy may even be hazardous, increasing malaria-associated morbidity and mortality (de Mast et al., 2010).

Preg­nancy suppresses hepcidin by a mech­anism that is unclear (Koenig et al., 2014). During the first trimester, serum hepcidin concen­trations are within the reference range for nonpregnant women, but decrease in the second trimester to very low levels, where they remain in the third trimester, increasing again immediately after delivery and thereafter (van Santen et al., 2013; Bah et al., 2017; Fisher and Nemeth, 2017) (Figure 17.19).
Figure 17.19. Median (IQR) serum hepcidin concen­trations in 31 women during preg­nancy and postpartum. ***Compared with first-trimester values, P < 0.0001. Modified from Fisher and Nemeth (2017).
The fall in hepcidin appears to occur in advance of the onset of low iron stores in pregnancy, as shown by the data for Gambian women presented in Table 17.29, in which hepcidin, indices of iron stores, erythropoiesis, and inflammation were all measured at 14, 20, and 30wks gestation (Bah et al., 2017). This suggests that maternal iron deficiency is not solely responsible for the sharp decline. Even mothers with replete iron stores have low hepcidin concentrations at delivery, suggesting that maternal hepcidin may be actively suppressed during pregnancy (Fisher and Nemeth, 2017).

However, in pregnancies complicated by infection or inflammation (including obesity), maternal hepcidin concentrations are elevated compared with those of healthy controls. Hence, in these conditions, maternal and fetal iron bioavailability could be compromised, limiting the amount of iron presented for uptake by the placenta and for transfer to the fetus (Koenig et al., 2014).

Interpretive criteria for hepcidin concentrations are difficult to define because, as for serum sTfR, concentrations vary widely across assays (Kroot et al., 2009). As a consequence, no usable international reference ranges or reference limits for hepcidin for specified life-stage groups exist at the present time.

Measurement of hepcidin with good reproducibility is possible in both plasma and urine using mass spectrometry or immunoassays, although inter-assay variability is large and no certified reference material is currently widely available (Pfeiffer and Looker, 2017). Recently, an international calibrator for hepcidin with long-term stability has been developed (available at HepcidinAnalysis.com), which should greatly improve the equivalence between hepcidin measurement procedures in the future; see van der Vorm et al. (2016), Diepeveen et al. (2019), and Aune et al. (2020) for further details.

Measurement of hepcidin in urine, where concentrations are elevated in anemic children with febrile malaria, may provide a less invasive screen for iron status and warrants further investigation.

17.11.2 Novel red blood cell indices

Novel red blood cell indices are generated by specific models of automated hematology cell counters and include, among others, reticulocyte mean Hb content (RHcc) and mean reticulocyte volume (MRV). However, their interpretation is complicated by the different techniques used by the manufacturers and by the lack of systematic studies determining their usefulness for evaluating nutritional anemias.

Reticulocyte Hb content (RHcc) directly reflects the synthesis of Hb in bone marrow precursors and is a measure of the adequacy of the iron supply for erythropoiesis. It can be used to differentiate iron deficiency from other causes of anemia. Reticulocyte Hb content is sometimes used in clinical settings to establish iron status in young children with anemia when other common iron biomarker assays are not available; low values provide an early marker of iron-deficient erythropoiesis. The measurement of RHcc is not useful in individuals with α‑ and β‑thalassemias (including individuals who are heterozygous), in whom the reticulocyte Hb content is reduced independently of iron stores.

Reticulocyte Hb content can also be used to monitor the early response to intravenous iron therapy because it increases significantly after only 48‑72h (Buttarello, 2016). The sensitivity and specificity of RHcc for diagnosing iron deficiency are only moderate, and there are no recommended cutoffs to define iron deficiency. However, RHcc is less affected by inflammation than serum iron, transferrin saturation, and ferritin, although it is influenced by any condition that causes iron-restricted erythropoiesis (Gelaw et al., 2019).
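As a sketch of how this early-response check might be automated, the example below compares baseline and 48‑72h RHcc values; the 1.0pg minimum rise is an assumed placeholder, since no cutoffs are recommended.

```python
def iv_iron_early_response(rhcc_baseline_pg: float, rhcc_48_72h_pg: float,
                           min_rise_pg: float = 1.0) -> bool:
    """Flag an early erythropoietic response to intravenous iron.

    RHcc rises within 48-72h once iron reaches the marrow; the
    1.0 pg minimum rise is an assumed, illustrative threshold.
    """
    return (rhcc_48_72h_pg - rhcc_baseline_pg) >= min_rise_pg

# Hypothetical monitoring values (pg per cell).
print(iv_iron_early_response(24.5, 27.0))  # True: RHcc rose by 2.5 pg
print(iv_iron_early_response(24.5, 24.8))  # False: negligible change
```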

Reticulocyte volume is another index that can be measured by some newer hematology analyzers. Values decrease rapidly with the development of iron-deficient erythropoiesis and, after therapy with iron or with vitamin B12 and/or folic acid, increase rapidly in individuals with depleted iron stores or nutritional macrocytosis, respectively. However, numeric results depend on the analyzer manufacturer, making comparisons difficult (Buttarello, 2016).

17.11.3 Non-trans­ferrin bound iron

Iron in plasma is tightly bound to transferrin, the iron transport protein that delivers iron throughout the body, as noted earlier (Section 17.6). Cells take up transferrin-bound iron in proportion to the number of cell-surface transferrin receptors (TfR1). When cellular iron levels are sufficient, the uptake of transferrin decreases to limit further iron assimilation and prevent excessive iron accumulation; usually only 20‑30% of transferrin is saturated with iron. However, in iron overload conditions, such as hereditary hemochromatosis and thalassemia major, transferrin is fully saturated and non-transferrin-bound iron (NTBI) appears in serum. Note the high concentrations of NTBI in the patients with HbE β‑thalassemia in Table 17.40, which were inversely associated with hepcidin. This trend may reflect the increased release of iron into the plasma that occurs when suppressed hepcidin levels permit enhanced iron absorption and mobilization. The NTBI also correlated with ferritin and liver iron concentrations, suggesting that a high level of NTBI is associated with hepatic iron accumulation (Jones et al., 2015).
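Since transferrin saturation is the ratio that signals when binding capacity is being approached, a minimal sketch of its calculation may be helpful; the saturation threshold at which NTBI is flagged, and the input values, are illustrative assumptions rather than clinical cutoffs.

```python
def transferrin_saturation(serum_iron_ug_dl: float, tibc_ug_dl: float) -> float:
    """TSAT (%) = serum iron / total iron-binding capacity x 100."""
    return 100.0 * serum_iron_ug_dl / tibc_ug_dl

# Hypothetical values: a typical sample vs. an iron-overload pattern.
for serum_iron, tibc in [(100.0, 360.0), (310.0, 330.0)]:
    tsat = transferrin_saturation(serum_iron, tibc)
    if tsat > 75:  # illustrative threshold, not a validated cutoff
        note = "transferrin nearly saturated; NTBI may appear"
    else:
        note = "within or near the usual 20-30% range"
    print(f"TSAT = {tsat:.0f}% ({note})")
```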

There is no known regulatory mechanism for NTBI uptake, so NTBI can enter cells readily, where it generates free radicals, resulting in cellular and organ damage. Circulating NTBI may also increase the risk of bacterial and parasitic infections, because the free iron can be utilized by the pathogen. Concern has been raised that iron supplements given in non-physiological amounts are absorbed so rapidly that they may exceed the capacity of transferrin to bind the circulating iron, resulting in the formation of NTBI (Prentice et al., 2016).

The measurement of serum NTBI is fraught with problems, with as much as a 40-fold difference between methods, in part because of the heterogeneous nature of the chemical forms of NTBI as well as large analytical variation. A new automated NTBI assay has been developed that looks promising in terms of reproducibility and comparability with HPLC, one of the most reliable methods for quantifying NTBI. Nevertheless, more research is required to define the most relevant forms of NTBI and to understand their clinical importance (Pfeiffer and Looker, 2017).

Acknowledgments

RSG would like to thank Dr. Anne-Louise Heath, who kindly reviewed this chapter and suggested some helpful improvements, as well as her collaborators, particularly her former graduate students. RSG is grateful to Michael Jory for the HTML design and his tireless work in directing the transition to this HTML version.