Book

Gibson RS    Principles of
Nutritional Assessment:
Bioavailability of Nutrients

3rd Edition.    May, 2025

Abstract

Nutrient intakes calculated from food composition data or determined by direct chemical analysis represent the maximum available to the body and not the amount actually absorbed and utilized. Algorithms have been developed for those nutrients for which the factors affecting their bioavailability are well characterized. Use of these algorithms allows a more reliable assessment of the nutrient adequacy of diets. The earlier algorithms for iron were developed from results of single-meal absorption studies now recognized to exaggerate the effects of dietary modifiers on iron absorption. In addition, only the effects of selected dietary modifiers on nonheme iron absorption were considered, with a value of 25% assumed for heme iron absorption. Most recently, an interactive tool based on the probability-based approach has been developed whereby total iron absorption from mixed diets of adults at any level of iron status can be estimated. The newest quantitative zinc algorithms use the trivariate saturation response model to estimate total absorbable zinc for adults, provided intakes of zinc and phytate (a major inhibitor of zinc absorption) are available. Whether these new models are appropriate for young children is uncertain. New terms have been introduced for protein (digestible indispensable amino acid score), folate (dietary folate equivalent), and vitamin A (retinol activity equivalent) to take account of the differences in the bioavailability of all sources of these nutrients. However, some uncertainty exists on the bioavailability of the different forms of vitamin D and vitamin E. Currently, both vitamin D2 (ergocalciferol) and vitamin D3 (cholecalciferol) are considered bioequivalent, whereas of the 8 naturally occurring vitamin E analogues, only α-tocopherol is considered by some authoritative groups when measuring vitamin E intake and setting requirements.

CITE AS: Gibson RS, Principles of Nutritional Assessment:
Bioavailability of Nutrients
https://nutritionalassessment.org/bioavailability/

Email: Rosalind.Gibson@Otago.AC.NZ
Licensed under CC-BY-4.0

4b.0      Assessment of bioavailable nutrient intakes

Nutrient intakes calculated from food composition data or determined by direct chemical analysis represent the maximum amount of nutrient available to the body. For most nutrients, however, the amount actually absorbed and used by the body is lower than the total intake. The term “bioavailability” is defined as the proportion of the ingested nutrient that is absorbed from the diet and utilized for normal body functions (Chapter 17, Section 17.1.7; Lynch et al., 2018). Nutrients that are readily taken up following ingestion are termed “bioavailable.”

With the increasing use of food fortification / enrichment and dietary supplements worldwide, care must be taken to ensure that the bioavail­ability (as well as the amount) of any potential fortificant and supplement in the food supply of a country is also considered, so their contributions can be taken into account when estimating the nutrient composition and bioavail­ability of the total diet. However, because the level of fortification/enrichment is country-specific due to different food standards in each country, caution must be used if food composition values are borrowed from other countries.

Depending on the nutrient, bioavail­ability may be influenced by diet and host-related factors independently and in combination. In general, diet-related factors have less influence on the bioavail­ability of the macronutrients than micronutrients. Examples of dietary factors include the chemical form of the nutrient and nature of the dietary matrix, interactions among nutrients themselves or with other organic components in the diet, and pretreatment of food from processing and/or preparation practices. At present, the dietary factors influencing the bioavail­ability of many nutrients are not well established (Gibson, 2007).

Information on the host-related factors is often limited, so their effects on nutrient bioavailability are often ignored. Moreover, the extent to which host-related factors influence absorption and/or utilization varies with the nutrient, life-stage group, and environment. Host-related factors are often classified as intestinal or systemic factors. Intestinal factors influence the efficiency of luminal and mucosal digestion and absorption through reductions in the secretion of gastric hydrochloric acid and intrinsic factor, as well as from alterations in the permeability of the intestinal mucosa. Of these, atrophic gastritis is considered one of the most significant luminal factors influencing nutrient bioavailability, mainly through its association with hypochlorhydria, a progressive decrease in the capacity of the parietal cells of the stomach to secrete hydrochloric acid. In children in low-income countries, hypochlorhydria often arises from infection with the bacterium Helicobacter pylori. As a result of the hypochlorhydria, absorption of folate, iron, calcium, and zinc, and the bioconversion of β‑carotene to vitamin A may be impaired (Gibson, 2007).

Bacterial overgrowth in the proximal small intestine also occurs with atrophic gastritis. This condition often exacerbates vitamin B12 deficiency because the bacteria may take up any freed vitamin B12 from food. Small intestinal bacterial overgrowth is also a manifestation of environmental enteropathy (EE), also known as environmental enteric dysfunction (EED). EE/EED is an acquired subclinical condition predominantly affecting the gut of children in low-income countries. EE/EED is hypothesized to be caused by continuous exposure to fecally contaminated food and water and is characterized by multiple abnormalities. These may include loss of intestinal barrier function and low-grade intestinal inflammation, which together result in atrophy of the villi of the small intestine. Increasingly, linear growth faltering, poor neurocognitive development, and low oral vaccine efficacy are being associated with EE/EED in childhood (Watanabe & Petri, 2016; Fasano, 2024).

Of the systemic factors that can influence absorption and utilization of nutrients, age, physiological status (e.g., pregnancy or lactation), and nutrient status of the host are those most frequently considered. Coexisting infectious illnesses also impact the bioavailability of nutrients through several mechanisms, including decreased absorption, increased losses, and sequestration of nutrients in the liver or other sites. There is also increasing evidence that the absorption of certain nutrients may be affected by ethnicity, lifestyle factors (e.g., smoking, oral contraceptive use), genotype, environmental pollution (e.g., lead pollution), and chronic disease (e.g., asthma, diabetes) (Kang et al., 2021). For more discussion of these diet- and host-related factors, see Gibson (2007).

For some nutrients, mathematical models termed algorithms have been developed to predict nutrient bioavail­ability. These mathematical models attempt to predict bioavail­ability by considering the amount and form of the nutrient (where applicable), the presence of dietary enhancers and inhibitors, and in some cases, the nutrient and health status of the individual. The models then apply certain general principles to the complex whole-diet matrix. Unfortunately, for many nutrients, algorithms have not been developed to predict bioavail­ability due to the paucity of data. Notable exceptions are the algorithms available for iron, zinc, protein, folate, vitamin A, and vitamin E. Clearly, many factors can limit the accuracy of the algorithms (Hunt, 2010). Hence, as new research findings emerge, algorithms must be modified on an ongoing basis. Currently, the effects of only some of the dietary modifiers and host-related factors are considered in the bioavail­ability algorithms available.

4b.1      Bioavail­ability of iron

Two forms of iron are present in foods: heme iron and nonheme iron. Heme iron is bound in a porphyrin ring and is derived mainly from hemoglobin and myoglobin in meat, poultry, and fish. The proportion of heme iron in lean meat, poultry and fish is variable and ranges from about 30% in white meats to about 70% in red meats. In individuals consuming omnivorous diets, heme iron contributes only about 10‑15% of the total iron consumed, but because of its efficient absorption, it can provide over 40% of the total iron absorbed (Hurrell & Egli, 2010). Nonheme iron is found primarily as iron salts in a variety of foods of both plant and animal origin, and possibly as contaminant iron introduced by food processing and the soil (Harvey et al., 2000; Gibson et al., 2015). Each form of iron is absorbed by a separate pathway; heme iron is absorbed as the intact iron porphyrin, whereas nonheme iron is absorbed from the common pool within the gastrointestinal tract. However, once inside the mucosal cells of the small intestine, all iron enters a common pool (Hurrell & Egli, 2010; Lynch et al., 2018).

Of the two dietary iron forms, heme iron is much more readily absorbed than nonheme iron, with absorption ranging from 10‑40%, depending on the iron status of the individual, compared to 2‑20% for nonheme iron. Absorption of heme iron is little affected by dietary factors; absorption is regulated primarily in response to total body iron. As demand for body iron increases there is a corresponding upregulation in the uptake of heme iron. For example, in iron-deficient individuals, absorption of heme iron can increase to 40%, whereas in those who are iron-sufficient, only 10% of heme iron may be absorbed (Hallberg et al., 1998).

The absorption of nonheme iron, in contrast to heme iron, is modified by several dietary components when they are consumed in the same meal, as well as by the level of iron stores of the individual. Dietary components that inhibit nonheme iron absorption include phytate, polyphenols, and peptides from partially digested proteins (e.g., soybean protein). Phytic acid (myo-inositol hexakisphosphoric acid) and its associated magnesium, calcium, or potassium salts, collectively termed phytate, are present in high concentrations in unrefined cereals, legumes, and oleaginous seeds. In unrefined cereals, phytate is typically concentrated in the outer aleurone layer, except for maize, where it is mainly in the germ. In legumes and most oilseeds, phytate is uniformly distributed within the protein bodies of the endosperm or kernel (Gibson et al., 2018).

Phytic acid (myo-inositol hexakisphosphoric acid) is made up of an inositol ring with six phosphate ester groups (i.e., InsP6) and is the most abundant form of myo-inositol phosphate found in mature, unprocessed, plant-based foods. Phytic acid is the main inhibitor of nonheme iron absorption, forming insoluble complexes with iron (and other minerals) in the upper gastrointestinal tract that cannot be digested or absorbed by humans because of the absence of intestinal phytase enzymes (Iqbal et al., 1994). Several food preparation and processing methods such as soaking, germination, fermentation, and milling can lead to reductions in phytic acid, either by loss of water-soluble phytic acid or by hydrolysis of phytic acid by phytase enzymes to lower myo-inositol phosphate forms that no longer inhibit nonheme iron absorption (i.e., InsP2, InsP1). The negative effect of phytate on iron absorption is dose dependent: phytate-to-iron molar ratios must be reduced below at least 1:1, and preferably below 0.4:1, before nonheme iron absorption improves appreciably (Gibson et al., 2018).
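
The dose-response threshold above can be checked for any food or diet by converting the phytate and iron intakes to moles. Below is a minimal Python sketch with a hypothetical worked example; the molecular weight of 660 for phytic acid matches the value used for the phytate:zinc ratio later in this chapter, and 55.85 g/mol is the standard atomic weight of iron.

```python
def phytate_iron_molar_ratio(phytate_mg: float, iron_mg: float) -> float:
    """Molar ratio of phytate to iron in a food or whole diet.

    Molecular weights: phytic acid ~660 g/mol, iron 55.85 g/mol.
    """
    return (phytate_mg / 660.0) / (iron_mg / 55.85)

# Hypothetical meal providing 800 mg phytate and 10 mg iron
ratio = phytate_iron_molar_ratio(800, 10)
print(f"phytate:iron molar ratio = {ratio:.1f}")   # ~6.8
# Ratios well above 1:1 (let alone 0.4:1) are expected to depress
# nonheme iron absorption (Gibson et al., 2018).
```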

Polyphenol compounds from beverages such as tea, coffee, cocoa, and red wine, vegetables (spinach, aubergine), legumes (colored beans), and cereals such as red sorghum also inhibit nonheme iron absorption. Their inhibitory effect is dose-dependent, the strength depending on the structure of the phenolic compounds, with the gallate-containing tea polyphenols having the largest inhibitory effect (Brune et al., 1989; Hurrell et al., 1999). Calcium inhibits both nonheme and heme iron absorption, although the inhibitory effect is weak and short-term, especially in complex meals containing absorption enhancers (Abioye et al., 2021). The mechanism is not well understood (Lynch et al., 2018). Peptides from partially digested proteins such as casein, whey, and egg white have also been shown to inhibit nonheme iron absorption (Hurrell et al., 1988). Reductions in iron absorption associated with soybean protein have also been reported, an effect which appears to be independent of its phytate content (Hurrell et al., 1992).

Both ascorbic acid and muscle tissue from meat, poultry, fish, and liver enhance nonheme iron absorption when eaten in meals containing inhibitors (Lynch et al., 2018). Of these, ascorbic acid is the most potent enhancer of nonheme iron absorption through its ability to convert ferric (Fe3+) iron to ferrous (Fe2+) iron at low pH as well as its chelating properties (Conrad & Schade, 1968). The enhancing effect of ascorbic acid is dose-dependent and most marked when consumed with meals that contain high levels of inhibitors, including phytate and polyphenols. Ascorbic acid can also enhance absorption of many iron fortification compounds (Hurrell et al., 2004), except NaFeEDTA (Troesch et al., 2009). The enhancing effect of ascorbic acid on iron absorption, however, is removed by cooking, industrial processing, and storage, all of which degrade ascorbic acid (Hurrell & Egli, 2010). The mechanism whereby muscle tissue enhances nonheme iron absorption is not clear. It may be linked to cysteine-containing peptides released during digestion, which have the potential to reduce ferric (Fe3+) iron to ferrous (Fe2+) iron and chelate iron, as described for ascorbic acid (Taylor et al., 1986).

Host-related factors influencing absorption and utilization of iron include the iron status of the individual and physiological status. For example, there is a well-established inverse relationship between the size of the body iron store and absorption of iron in healthy individuals, so that when iron stores are sufficient, less iron is absorbed, as noted earlier (Cook & Skikne, 1982; Hallberg & Hulthén, 2002). During pregnancy, fractional absorption of iron from the diet is enhanced by the increased demand for iron and the suppression of hepcidin, especially during the third trimester (Barrett et al., 1994). Ethnic differences in iron homeostasis have also been reported, with higher concentrations of iron status indicators (serum ferritin, transferrin saturation, and hemoglobin) in East Asians than Europeans, African Americans, or South Asians, although the genetic determinants of these differences are unknown (Kang et al., 2021). Inflammation and obesity are also known to play an important role in iron bioavailability, although their effects have only been considered in the more recent algorithms (Collings et al., 2013; Fairweather-Tait et al., 2020). During inflammation and obesity, there is an increase in levels of pro-inflammatory cytokines in the systemic circulation, which in turn increase hepatic synthesis of hepcidin. Elevated serum hepcidin levels reduce both intestinal absorption of iron and the release of iron from body stores (Hurrell & Egli, 2010; Collings et al., 2013).

The earlier algorithms estimating iron bioavailability were developed from results of isotopically labeled single-meal absorption studies, now recognized to exaggerate the effect of dietary enhancers and inhibitors and other factors on iron absorption (Armah et al., 2013). As well, individuals may adapt to the effect of dietary modifiers on iron absorption after prolonged exposure (Hunt & Roughead, 2000). Moreover, initially only the effects of the enhancers of nonheme iron absorption (i.e., ascorbic acid and animal flesh) were considered, with absorption of heme iron assumed to be 25% and to account for 40% of the iron in meat, poultry, or fish (Monsen & Balintfy, 1982). Later, algorithms became more complex and included differing numbers of both dietary enhancers and inhibitors of nonheme iron absorption (Murphy et al., 1992; Tseng et al., 1997; Hallberg & Hulthén, 2000; Reddy, 2005; Rickard et al., 2009). Of these, the model of Hallberg and Hulthén (2000) is the most detailed, considering the effects of all known modifiers of nonheme iron absorption and their interactions, as well as adjusting for the iron status of the individual. High iron stores are known to increase the expression of hepcidin, which reduces iron absorption, whereas with low iron stores hepcidin expression is reduced so iron absorption increases (Franchini et al., 2010). However, the model of Hallberg and Hulthén (2000) is difficult to apply, in part because of incomplete food composition data for the absorption modifiers. For a detailed review of these early algorithms, see Reddy (2005).

More recently, several algorithms have been developed to predict the bioavailability of iron in whole diets because of the known exaggerated effects of the dietary modifiers on bioavailability when estimates are based on single test meals (Cook & Reddy, 2001). In some of these newer algorithms the form of the iron (i.e., heme or nonheme iron), the content of several known dietary enhancers and inhibitors in the diets, and the iron status of the individual are considered. None, however, attempt to estimate the simultaneous effects of enhancers and inhibitors on iron absorption. Some of the models only predict nonheme iron absorption, so assumptions about the absorption of heme iron must be made; 25% is the value most frequently assumed (Murphy et al., 1992; Armah et al., 2013). A further limitation is that phytate values, when included, generally represent the sum of all the myo-inositol phosphate forms in foods rather than only those forms known to inhibit nonheme iron absorption (i.e., InsP6 to InsP3) (Tseng et al., 1997; Armah et al., 2013). Using the values for total inositol phosphates is likely to inflate the apparent negative impact of phytate in the models, thus compromising the ability of the models to accurately predict iron absorption. However, a new global food composition database that includes values for InsP6 and lower myo-inositol phosphate forms (and iron, zinc, and calcium) of plant-based foods, together with details of the processing and analytical methods, is now available, enabling users to select myo-inositol phosphate values based on the most appropriate processing and analytical method (FAO/INFOODS/IZiNCG, 2018).

Nevertheless, all algorithms have limitations, often underestimating bioavailability when compared with measured absorption. Reports have shown a 3-fold variation in bioavailability estimated by applying different algorithms (Beard et al., 2007; Atkins et al., 2024). Ultimately, the choice of algorithm depends on the population under study and the type of dietary and biochemical data available. Some examples of algorithms applicable to the diets of children and adults are described briefly below.

4b.1.1   FAO/WHO semi-quantitative model

This semi-quantitative classification was based on results from isotopically labeled iron absorption studies on typical single meals in Asia, India, Latin America, and Western countries. FAO/WHO estimated dietary iron bioavail­ability for three different categories of diet that relate to the amount of meat and dietary iron absorption modifiers, as shown in Box 4b.1, although they do not quantify the exact amounts.

Box 4b.1: FAO/WHO semi-quantitative model to estimate iron bioavailability

Low-bioavailability diets (about 5% absorption): simple, monotonous diets based mainly on cereals and legumes, with negligible amounts of meat, fish, or ascorbic-acid-rich foods and high levels of iron absorption inhibitors such as phytate and polyphenols.
Intermediate-bioavailability diets (about 10% absorption): mixed diets containing some meat, fish, or ascorbic-acid-rich foods.
High-bioavailability diets (about 15% absorption): diversified diets containing generous amounts of meat, poultry, fish, and/or foods rich in ascorbic acid.

Modified from FAO/WHO (2002)
Note the estimates of absorption given in Box 4b.1 refer to non-anemic individuals with normal iron transport (i.e., with normal hemoglobin concentration) but no iron stores. When individuals have iron deficiency anemia (i.e., low hemoglobin and no iron stores), absorption may be increased by 50 percent (i.e., to 7.5%, 15%, and 22.5% for the low-, intermediate-, and high-bioavailability diets, respectively) (FAO/WHO, 2002). Note that for Western-type diets, FAO and WHO now propose two categories of bioavailability — 12% and 15% — depending on the meat content of the diet. There are no data available validating the results of this model in relation to the iron status of populations (Reddy, 2005).

4b.1.2   Model of Murphy et al. (1992)

This model was adapted from the algorithm of Monsen & Balintfy (1982) to estimate iron bioavail­ability in diets of children in low-income countries. In this algorithm, heme iron absorption is assumed to be 25%, and to account for 40% of the iron in meat, poultry, and fish. The absorption of nonheme iron is assumed to be lower and to vary according to the amount of meat, poultry, and fish, ascorbic acid, and polyphenols, as well as the level of iron stores of the individual. The inhibitory effect of phytate on iron absorption is not considered in this model.

To use this algorithm, quantitative data on the intake of heme iron, nonheme iron, and two enhancers--ascorbic acid and protein from meat, poultry, and fish--are required. The cut-offs applied for these two enhancers are those derived from Monsen but are expressed per 4.18 MJ (1000 kcal), so that the same algorithms can be used for males and females across all age groups. The percentage levels for the bioavail­ability of nonheme iron given for each class and shown in Table 4b.1 approximate those of the typical meals of the low, medium, high bioavail­ability categories of the FAO/WHO algorithm.

Table 4b.1: Estimated percentage bioavailability of nonheme iron for iron-deficient, nonanemic persons with differing intakes of meat + fish + poultry protein and ascorbic acid. From Murphy et al. (1992)

Ascorbic acid      Meat + fish + poultry protein (g/4.18 MJ)
(mg/4.18 MJ)        <9       9–27      >27
<35                  5        10        15
35–105              10        15        15
>105                15        15        15

For individuals with iron deficiency anemia, the estimated absorption is increased by 50%, i.e., to 7.5%, 15%, and 22.5% absorption for low-, intermediate-, and high-bioavailability diets, respectively (FAO/WHO, 2002). The algorithm can be corrected for the inhibitory effect of polyphenols from tea on nonheme iron absorption. The correction factor applied depends on the average number of cups of tea or coffee per day. For tea, factors range from 1 if no tea is consumed to 0.40 for at least 600mL, because a 200 to 250mL cup of normal-strength tea will reduce nonheme iron absorption at a meal by approximately 60 percent. Corresponding correction factors for coffee range from 1 to 0.60. The final algorithm is:

\[\small \mbox {Available iron = heme iron × 0.25 + (nonheme iron × availability factor × tea factor)}\]
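
As an illustration, this algorithm can be coded directly from Table 4b.1 and the equation above. The Python sketch below assumes intakes have already been expressed per 4.18 MJ (1000 kcal); the tea factor is supplied by the user because the text specifies only its end points (1.0 with no tea, 0.40 at 600 mL/d or more), and the worked numbers at the end are hypothetical.

```python
def availability_factor(vit_c_mg: float, mfp_protein_g: float) -> float:
    """Nonheme iron availability (fraction) from Table 4b.1 for
    iron-deficient, nonanemic persons; inputs are per 4.18 MJ (1000 kcal)."""
    if mfp_protein_g < 9:
        col = 0
    elif mfp_protein_g <= 27:
        col = 1
    else:
        col = 2
    if vit_c_mg < 35:
        row = (0.05, 0.10, 0.15)
    elif vit_c_mg <= 105:
        row = (0.10, 0.15, 0.15)
    else:
        row = (0.15, 0.15, 0.15)
    return row[col]

def available_iron_mg(heme_mg, nonheme_mg, vit_c_mg, mfp_protein_g, tea_factor=1.0):
    """Available iron = heme iron x 0.25 + nonheme iron x availability factor x tea factor."""
    return heme_mg * 0.25 + nonheme_mg * availability_factor(vit_c_mg, mfp_protein_g) * tea_factor

# Hypothetical day: 0.5 mg heme iron, 9 mg nonheme iron,
# 40 mg vitamin C and 12 g MFP protein per 1000 kcal, no tea
print(available_iron_mg(0.5, 9.0, 40, 12))   # 0.125 + 9 x 0.15 = 1.475 mg
```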

This model can also be used to calculate the bioavail­ability of iron on a daily rather than a meal basis, if the data for food intake by meal are not available. Estimates of available iron derived from day and meal-based survey results using the same model have proved comparable. This model has been used to estimate available iron intakes of toddlers in rural Egypt, Kenya, and Mexico (Murphy et al., 1992) and children in rural Malawi (Yeudall et al., 2005).

4b.1.3   Algorithm of Tseng et al. (1997)

This was the first algorithm to adjust iron availability for the inhibitory effect of phytate. The model was adapted from that of Murphy et al. (1992) and included an adjustment for the enhancing effects of meat, poultry, and fish and vitamin C on nonheme iron absorption, followed by a further separate adjustment for the inhibiting effects of both tea and phytate. Heme iron availability was assumed to be 23% in this model, lower than that assumed for Murphy’s model (i.e., 25%). To use this algorithm, both the nonheme and heme iron content of each food consumed on each eating occasion by each individual in the study sample must be included in the nutrient composition database. To derive these estimates, nonheme iron is assumed to be 60% of the total iron in meats and 100% of the total iron in non-meat foods, and the remaining iron (if any) is assumed to be heme. This algorithm has had limited use. It was used to estimate the intake of bioavailable iron in both rural and urban children and women in Russia (Tseng et al., 1997), and in preschoolers in Australia (Atkins et al., 2024).

Note neither this algorithm nor the one developed by Murphy et al. (1992) attempted to quantify interactions among the dietary factors or adjust for the inclusion of calcium or the proportion of iron derived from fortification.

4b.1.4   Algorithm of Armah and co-workers (2013)

This algorithm was used to estimate iron absorption from the US whole-day diet and considered the mean intake of inhibitors (phytate, polyphenols, calcium) and enhancers (ascorbic acid) and the proportion of heme and nonheme iron. The datasets used were from four diet studies designed to measure the effects of ascorbic acid, meat, tea, and calcium on nonheme iron absorption from 5-d complete diets of US male and female adults aged 19‑38y (n=53), measured using an extrinsic radiolabeling technique. Each participant in the studies was assessed for three one-week periods during which they consumed diets that were typical, high, or low in meat, tea, calcium, or vitamin C. Iron status of the adults was measured using serum ferritin. Absorption of heme iron was assumed to be 25% in this model. Multiple linear regression was used to quantify the effect of different factors on nonheme iron absorption. The investigators claim the algorithm can be used to predict nonheme iron absorption from the diets of different populations.

The final fitted regression model for nonheme iron absorption in adults is:

\[\small \mbox {Ln Absorption (%) = 6.294 − 0.709 ln(SF) + 0.119 ln(C) + 0.006 ln(MFP + 0.1)}\] \[\small \mbox {− 0.055 ln(T + 0.1) − 0.247 ln(P) − 0.137 ln(Ca) − 0.083 ln(NH)}\]

Where SF is serum ferritin (µg/L); C is vitamin C (mg); MFP is meat, fish, poultry (g); T is tea (number of cups); P is phytate (mg); Ca is calcium (mg); NH is nonheme iron (mg)
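
A direct transcription of this regression in Python is given below. It assumes, as in the published model, that "Ln" denotes the natural logarithm; intakes are daily means in the units listed above, and zero intakes of vitamin C, phytate, or calcium would need handling separately because ln(0) is undefined.

```python
import math

def armah_nonheme_absorption_pct(sf_ug_l, vit_c_mg, mfp_g, tea_cups,
                                 phytate_mg, calcium_mg, nonheme_fe_mg):
    """Predicted nonheme iron absorption (%) in adults (Armah et al., 2013)."""
    ln_abs = (6.294
              - 0.709 * math.log(sf_ug_l)        # serum ferritin, ug/L
              + 0.119 * math.log(vit_c_mg)       # vitamin C, mg/d
              + 0.006 * math.log(mfp_g + 0.1)    # meat, fish, poultry, g/d
              - 0.055 * math.log(tea_cups + 0.1) # tea, cups/d
              - 0.247 * math.log(phytate_mg)     # phytate, mg/d
              - 0.137 * math.log(calcium_mg)     # calcium, mg/d
              - 0.083 * math.log(nonheme_fe_mg)) # nonheme iron, mg/d
    return math.exp(ln_abs)

# Hypothetical adult: ferritin 30 ug/L, 75 mg vitamin C, 90 g MFP, 1 cup tea,
# 800 mg phytate, 900 mg calcium, 12 mg nonheme iron per day
print(round(armah_nonheme_absorption_pct(30, 75, 90, 1, 800, 900, 12), 1))  # about 5%
```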

The investigators validated this algorithm with single-meal and complete-diet data and reported the r2 values as 0.57 (p < 0.001) for the single meals and 0.84 (p < 0.0001) for the complete diet data. When applying this algorithm, the mean daily intake of nutrients based on at least 3 days of diet records should be determined, and serum ferritin concentrations should be measured in each participant.

Of the factors explaining differences in nonheme iron absorption, serum ferritin was the most important; the effect of the dietary modifiers was small. Between-person variation explained a large proportion of the differences in nonheme iron absorption, possibly attributed to other host-related factors besides iron status, such as inflammation and obesity. Further studies are needed to identify other unknown non-dietary factors affecting nonheme iron absorption.

Armah et al. (2015) applied this algorithm to estimate both nonheme iron absorption and the total iron bioavailability from the US diet using information on dietary intakes and iron biomarkers from the NHANES 2001‑2002 survey. Iron biomarkers included serum ferritin, hemoglobin, and C-reactive protein. For each individual, daily intakes of total iron, vitamin C, meat, poultry, fish, calcium, phytate, and polyphenols (as black tea equivalents) were obtained. Of the total iron intake, 90% was estimated as nonheme iron and 10% as heme iron, with a value of 25% for heme iron absorption.

In this study, nonheme iron absorption was estimated at the individual level and geometric means estimated by age and gender, and by gender and ethnicity. The unadjusted geometric mean (95% CI) of nonheme iron absorption for all individuals was 3.7% (3.6%, 3.8%), and was higher in females than males. Two approaches were used to estimate total iron absorption. In the first approach, all the participants of the survey were included but the geometric mean of nonheme iron absorption was adjusted to 15µg ferritin/L serum to correspond to values of individuals with no iron stores. In the second approach, absorption of nonheme iron was estimated only for nonanemic individuals with no iron stores. Using the two approaches, after correcting individual nonheme iron absorption values and adding fractional absorption from heme iron (i.e., 25%), the calculated percentages of total dietary iron absorption for US diets were 15.5% and 15.1%, compared with the current estimate of 18% (IOM, 2001). See Armah et al. (2015) for details of the equations used for these calculations.

The authors cautioned that estimating total iron absorption for all individuals without adjusting for serum ferritin is not advisable because the value obtained underestimates iron absorption for individuals with low iron stores while overestimating absorption for those with very high iron stores. Instead, by adjusting nonheme iron absorption to 15µg ferritin/L serum as carried out in their first approach, absorption is only overestimated in those individuals with adequate iron stores.

4b.1.5   Algorithm of Collings and co-workers (2013)

These investigators conducted a systematic review and calculated nonheme iron absorption from pooled data on iron status (serum or plasma ferritin) and dietary modifiers based on five isotope studies on whole diets of 58 U.S. adults aged > 18y. Studies that included participants with unspecified illness or disease were excluded, except those with iron deficiency or iron deficiency anemia. From the individual data, a regression equation for nonheme iron absorption was derived based on a simplified scoring system. The latter was used to classify diets in view of the limited effect of the dietary factors on nonheme iron absorption in healthy iron-replete individuals. Age and sex were shown to have no effect and hence were not included in the regression equation shown below:

\[\small \mbox {Log[nonheme-iron absorption, %] = −0.73 log[ferritin, µg/L] + 0.11[modifier] + 1.82}\]

Where [modifier] is 0 for standard diets, -1 for diets that include an inhibitor, and +1 for diets that include an enhancer.
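
A small Python sketch of the equation is given below, assuming base-10 logarithms; with that assumption it reproduces the predicted values quoted in the next paragraph (e.g., 10.8% at a serum ferritin of 12 µg/L on a standard diet).

```python
import math

def collings_nonheme_absorption_pct(ferritin_ug_l: float, modifier: int = 0) -> float:
    """Predicted nonheme iron absorption (%); modifier is 0 for a standard diet,
    -1 for a diet with an inhibitor, +1 for a diet with an enhancer."""
    log_abs = -0.73 * math.log10(ferritin_ug_l) + 0.11 * modifier + 1.82
    return 10 ** log_abs

for m in (-1, 0, +1):
    print(m, round(collings_nonheme_absorption_pct(12, m), 1))   # 8.4, 10.8, 13.9
```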

Note this equation was used to predict the effect of dietary modifiers on percentage absorption of nonheme iron in individuals with low to high serum ferritin concentrations (6‑80µg/L). Within this range of serum ferritin concentrations, predicted absorption values for nonheme iron ranged from 2.1 to 23.0%, depending on iron status and type of diet. For individuals with low iron stores (serum ferritin of 12µg/L) consuming a standard diet, nonheme iron absorption was predicted to be 10.8%, increasing to 13.9% when a higher-bioavailability diet (low calcium, high vitamin C, high meat) was consumed, and decreasing to 8.4% with a lower-bioavailability diet (high calcium, low vitamin C, no meat).

This algorithm only requires information on the iron status (serum or plasma ferritin) of the population coupled with assumptions about the type of whole diet based on the presence of enhancers and inhibitors. The authors emphasize that many of the studies included, although based on whole diets, were of very short duration and did not reflect the diets consumed over time. Such short-term measurements of absorption may overestimate differences in iron bioavailability between diets, as noted earlier (Hunt & Roughead, 2000). In addition, because the individual iron absorption data pooled to develop the regression equation were from US studies of young adults, the regression equation should not be extrapolated to predict absorption of nonheme iron from diets in low-income countries or for men and women of different ages.

4b.1.6   Probability-based approach of Dainty and co-workers (2014)

To estimate total absorption of dietary iron using the algorithms of Armah et al. (2013) and Collings et al. (2013), an estimate of the quantity of absorbed heme iron using an assumed absorption value must be added to the value for predicted nonheme iron absorption. In contrast, the probability-based approach of Dainty et al. (2014) predicts total dietary iron absorption (both heme and nonheme iron) at the population level and can provide estimates of total iron absorption from a mixed Western-style diet at any level of iron status. For this approach, measurements of serum ferritin, intakes of total iron (heme and nonheme iron), and factorial calculations of iron requirements for the population are required. However, unlike most of the earlier algorithms, no data on dietary iron absorption modifiers or heme iron are required. Furthermore, estimates of dietary iron bioavailability can be made not just for adult individuals with low iron stores (defined by a serum ferritin of 15µg/L), but for any selected mean serum ferritin level. Nevertheless, the method cannot be used for children, pregnant women, and individuals immediately after menopause because of their changing iron requirements. In addition, care must be taken to ensure the serum ferritin concentrations are not confounded by inflammation/infection or obesity, by employing the BRINDA correction method (Namaste et al., 2017).

This novel approach was developed using adult data on dietary intakes from a mixed Western-style diet (via 7d diet diary), hematological measures (hemoglobin and serum ferritin) from a representative group of men (n=495; mean age 42.7±12.1y) and pre-menopausal women (n=378; mean age 35.7±8.2y) from the 2000‑2001 adult cohort of the UK National Diet and Nutrition Survey (NDNS). Individuals with elevated serum ferritin concentrations due to inflammation (i.e., elevated α‑1‑antichymotrypsin) or those taking iron supplements were excluded.

The first step involved estimating the distribution of dietary iron requirements for the NDNS study sample. This was achieved by employing the factorial modeling developed by the US Institute of Medicine (IOM, 2000) that calculated the distribution of estimated iron requirements needed to meet body functions with a minimal iron store for several age and sex groups. The IOM values for pre-menopausal women (mixed population of oral contraceptive users and non-users) and men were used. The derived dietary iron requirements for the UK NDNS study sample were converted to requirements for absorbed iron for individuals with low levels of storage iron who were not anemic by multiplying by 0.18 (the U.S. IOM values assume 18% iron absorption from a US mixed adult diet). In addition, a full probability approach was also applied to the dietary iron intake data from the UK NDNS to predict the prevalence of a dietary iron intake that would be sufficient to maintain iron balance for the men and women based on their estimated daily iron intake and a series of percentage iron absorption values from 1‑40%. An average dietary iron absorption in the population sample was then calculated for selected serum ferritin concentrations by matching the observed prevalence of inadequacy (i.e., from the prevalence of serum ferritin below the designated level) with the prevalence predicted for the range of absorption estimates (from 1‑40%).

Using this approach, the mean calculated dietary iron absorption was 8% in men (50th percentile for serum ferritin 85µg/L) and 17% in women (50th percentile for serum ferritin 38µg/L). With a serum ferritin level of 45µg/L, the estimated absorption of iron was similar for men (14%) and women (13%) (Dainty et al., 2014).

The European Food Safety Authority (2015) applied this approach to predict total iron absorption in European diets and selected a target value of 30µg/L for serum ferritin concentrations for men and premenopausal women because this reflects iron stores of approximately 120mg. At this level of ferritin, the predicted iron absorption values for European men and premenopausal women were 16% and 18%, respectively (EFSA, 2015).

An interactive tool based on the probability approach has been developed for estimating total dietary iron absorption in adult populations with a selected target serum ferritin concentration; for more details see Fairweather-Tait et al. (2017). To download the correct table, see Dainty et al. (2024). Note that because both the iron requirements and dietary iron must be in a steady state for at least a year when applying this approach, it is not suitable for certain groups, as noted earlier. Care must be taken to ensure the appropriate correction for the impact of inflammation on serum ferritin concentrations is performed, when necessary (see “Acute and chronic inflammation” in Chapter 17, Section 17.7.1 for more details).

Subsequently, additional data from a nationally representative survey in Ireland and data collected in older people in the UK have been included in the model (Fairweather-Tait et al., 2017). This predictive model has also been used in women of child-bearing age from Benin to estimate country-specific percentage iron absorption at different levels of iron status (Fairweather-Tait et al., 2020). For diets in Benin in which 29% of iron was from cereals and 30% from animal products, predicted iron absorption was 6% at a serum ferritin value of 30µg/L compared with 18% for the Western diet reported earlier (EFSA, 2015). Moreover, iron absorption remained low until the serum ferritin fell below 25µg/L. More work is required to adapt the existing model for adults from other low and middle-income countries in which intakes of iron absorption inhibitors may be higher and heme iron intakes lower than those in whole diets in Benin.

4b.2      Bioavail­ability of zinc

The absorption of dietary zinc, unlike iron, does not change in response to alterations in whole-body zinc homeostasis or status. Instead, zinc absorption is influenced by current zinc intake, not past or long-term zinc intakes or status. With increasing zinc intakes, the total amount of absorbed zinc over the whole day increases while the percent absorbed (i.e., fractional absorption) declines (Figure 4b.1). Fractional zinc absorption is determined by dividing the amount of zinc absorbed by the amount ingested.

Figure 4b.1 Effect of dietary zinc on fractional zinc absorption (FZA) and total zinc absorption (TAZ). Redrawn from King (2010).

These adjustments in the efficiency of zinc absorption with changes in zinc intake are controlled by the up-regulation and down-regulation of zinc transporters (primarily Zip4) and possibly other proteins involved in the transport of zinc (King & Cousins, 2014; King et al., 2015).

Absorption and/or utilization of zinc may also be influenced by other host related factors such as physiological state and possibly age. Studies have shown an increase in zinc absorption in late pregnancy and lactation, most notably when dietary zinc intake is low (Fung et al., 1997; Donangelo et al., 2005; Donangelo & King, 2012; Hambidge et al., 2017) whereas during aging, zinc absorption may decline (Turnlund et al., 1986).

Not surprisingly, given the dominant role of the gastrointestinal tract in zinc homeostasis, malabsorptive disorders that alter the integrity of the mucosal cells, also reduce absorption of zinc, as has been noted for iron. Such malabsorptive disorders include celiac disease (Crofton et al., 1990; Tran et al., 2011) and environmental enteric dysfunction (EED) (Syed et al., 2016), although the magnitude of their effects on zinc absorption has not been quantified (Manary et al., 2010; Lindenmayer et al., 2014).

In addition to the amount of dietary zinc ingested, food sources of zinc may also affect zinc absorption. The dietary factor with the greatest effect on zinc absorption is phytic acid (myo-inositol hexakisphosphoric acid, InsP6) and its associated magnesium, potassium, and calcium salts, collectively termed phytate. Phytate binds zinc in the intestinal lumen and forms an insoluble complex that cannot be digested or absorbed by humans, as noted earlier. This inhibitory effect on zinc absorption follows a dose-dependent response and can be substantial. If the habitual diet is rich in phytate, adults cannot adapt by increasing zinc absorption (Hunt et al., 2008) or enhancing reabsorption of endogenous zinc (Hambidge et al., 2010). Whether phytate has an inhibitory effect on zinc absorption in young children is uncertain. Miller et al. (2015) failed to detect a negative effect of phytate on zinc absorption in their isotope studies of infants and young children.

As noted earlier, InsP6 may be dephosphorylated during certain food processing, preparation, and storage practices to lower myo-inositol phosphate forms, of which only penta-inositol phosphate (InsP5) significantly inhibits the bioavail­ability of zinc; the lower inositol phosphates have no inhibitory effect (Gibson et al., 2018).

Molar ratios of phytate-to-zinc of individual foods or whole diets are often used to estimate the likely proportion of zinc absorbed, in view of the dose-dependent inhibiting effect of phytate on zinc absorption. Consequently, molar ratios of phytate-to-zinc of individual foods or whole diets are a critical component of any algorithm used to estimate zinc bioavailability. Where possible, local food composition data for zinc, and the global food composition database for phytate, should be used to calculate phytate:zinc molar ratios, as both the zinc and phytate content of plant-based foods can vary with local soil conditions and with food preparation and processing practices (Gibson et al., 2018). Below is the equation to calculate the phytate:zinc molar ratio for a whole diet. \[\small \mbox{phytate:zinc molar ratio = }\frac{\mbox{mg phytate per day / 660}}{ \mbox{mg zinc per day / 65.4}}\] For example, if the phytate intake is 883 mg/d and the zinc intake is 7 mg/d, then the phytate:zinc molar ratio is 12.5.
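
The same calculation expressed as a short Python sketch, reproducing the worked example above:

```python
def phytate_zinc_molar_ratio(phytate_mg_per_day: float, zinc_mg_per_day: float) -> float:
    """Phytate:zinc molar ratio of a whole diet (molecular weights: phytate ~660, zinc 65.4)."""
    return (phytate_mg_per_day / 660.0) / (zinc_mg_per_day / 65.4)

print(round(phytate_zinc_molar_ratio(883, 7), 1))   # ~12.5
```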

In the initial semi-quantitative model developed by WHO in 1996, and then adopted by both FAO and WHO in 2002 (WHO/FAO, 2004), to estimate the bioavailability of zinc, the protein from meat, fish, or poultry was considered an enhancer of zinc absorption, while both high levels of calcium, particularly calcium salts, and the proportion of phytate to zinc in the whole diet were considered inhibitors of zinc absorption. This led to the use of millimolar ratios of phytate × calcium : zinc to predict zinc absorption (Fordyce et al., 1987). However, the use of this ratio has been discontinued. Recent research has reported conflicting effects of calcium on zinc absorption, with some studies showing a positive, although modest, effect (Miller et al., 2013), while others have shown no impact irrespective of whether intakes of dietary phytate were high or low (Hunt & Beiseigel, 2009). Whether calcium has an adverse effect in phytate-containing diets low in zinc is uncertain (King et al., 2015). Likewise, because of the limited effect of dietary protein on enhancing zinc absorption, dietary protein is no longer included in zinc bioavailability algorithms (IZiNCG, 2004; Miller et al., 2013).

4b.2.1   WHO semi-quantitative classification for bioavailable zinc

This was developed in 1996 by WHO and adopted by FAO and WHO in 2002 and in 2004. WHO applied zinc absorption data from published isotope studies on adults that focused mainly on single meals, individual foods, and some whole-day diets. To determine the proportion of dietary zinc that was absorbed by the intestine (i.e., fractional zinc absorption), the mean amount of absorbed zinc was regressed against total zinc intakes from diet types that differed in their composition.

The WHO system takes into account three dietary variables, considered important predictors of the bioavail­ability of zinc at that time. These include the amount and source of dietary protein, the proportion of phytic acid to zinc in the whole diet, and high levels of calcium, particularly calcium salts, as noted earlier. In general, however, the calcium content of most plant-based diets is probably too low to have any detrimental effect. No allowance for an adaptive response to meet the additional physiological demands for zinc during pregnancy or lactation was considered by WHO in view of the paucity of data at that time.

The FAO/WHO (2004) system of classifying diets into three broad categories of low, moderate, and high zinc bioavail­ability is described in Box 4b.2.
Box 4b.2: FAO/WHO semi-quantitative model for estimating the bioavail­ability of zinc

Low-bioavailability diets: zinc absorption of about 15%; typically unrefined, cereal- and legume-based diets high in phytate.
Moderate-bioavailability diets: mostly mixed diets, with a zinc bioavailability of about 30%.
High-bioavailability diets: mostly diets with an adequate protein content mainly from non-vegetable sources, with a zinc bioavailability of about 50%.
The detailed dietary criteria for each category are given in FAO/WHO (2004).

As noted earlier, because of the conflicting effects of calcium and the limited effect of dietary protein on zinc absorption, neither calcium nor protein is included in more recent zinc bioavailability algorithms.

4b.2.2   IZiNCG (2019) qualitative estimate for the intake of total absorbable zinc

Since 1996, new research has led to the development of revised qualitative and quantitative models for estimating zinc absorption. IZiNCG (2019) has provided a qualitative estimate for the intake of total absorbable zinc, which is presented in a technical brief entitled: Determining the risk of zinc deficiency. To determine this qualitative estimate, the first step is to calculate the total intake of zinc and phytate from the dietary intake data, based preferably on three non-consecutive days. Local food composition data for zinc should be used, as these can vary with local soil conditions (Alloway, 2004). Data on the zinc content of local foods can be obtained from regional and national centers of the FAO Network of Food Data Systems (INFOODS). These food composition tables can be downloaded free of charge from the FAO/INFOODS website.

The values for the phytate content of raw and processed plant-based staples are available in the FAO/INFOODS/IZiNCG Global Food Composition Database for phytate. Care must also be taken to ensure that the myo-inositol phosphate values selected are appropriate for both the food processing applied and the analytical method used.

Next, the phytate:zinc molar ratios of the whole diets must be calculated, as described in Section 4b.2 above, to provide an estimate of likely zinc absorption. Diets with phytate:zinc molar ratios greater than 18 are considered unrefined cereal-based diets and classified as having low zinc bioavailability (i.e., 18% absorption for males and 25% absorption for females), whereas diets with ratios of 4‑18 are considered mixed or refined vegetarian diets and classified as having average zinc bioavailability (i.e., 26% absorption for males and 34% absorption for females). Note that if the phytate content of the diets cannot be calculated, then the zinc bioavailability of the diets should be categorized based on whether they are based on unrefined cereals and/or legumes (i.e., low bioavailability) or are mixed or refined vegetarian diets (i.e., average zinc bioavailability). These classification rules are illustrated in the sketch below.
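
A minimal Python sketch of these cut-offs (diets with ratios below 4 are not assigned a category in the technical brief, so the function simply flags them):

```python
def izincg_zinc_absorption(phytate_zinc_ratio: float, sex: str) -> float:
    """Estimated fractional zinc absorption from the IZiNCG (2019) qualitative
    categories: ratio > 18 = low bioavailability; 4-18 = average bioavailability."""
    if phytate_zinc_ratio > 18:                 # unrefined cereal-based diets
        return 0.18 if sex == "male" else 0.25
    if 4 <= phytate_zinc_ratio <= 18:           # mixed or refined vegetarian diets
        return 0.26 if sex == "male" else 0.34
    raise ValueError("ratios below 4 are not covered by these categories")

print(izincg_zinc_absorption(12.5, "female"))   # 0.34
```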

4b.2.3   European Food Safety Authority (EFSA) quantitative trivariate model

This model provides estimates for adults of total absorbed zinc from Western diets. It is based on a refinement of the trivariate saturation response model of total absorbed zinc developed initially by Miller et al. (2007). The model examines the relationship between total absorbed zinc, total dietary zinc, and phytate, as shown in Figure 4b.2.

Figure 4b.2 Three-dimensional representation of the trivariate model describing the relationship between total absorbed zinc (TAZ), dietary phytate, and dietary zinc. Adapted from European Food Safety Authority (2014).

For their refined model EFSA extracted individual values for true zinc absorption from 10 whole-day isotope studies in healthy adults based on representative Western diets. For each study, intakes of total dietary zinc (TDZ) and total dietary phytate were reported. A total of 72 data points, reflecting 650 individual measurements, were used to generate predictions of total absorbed zinc as a function of total dietary zinc at 6 different levels of dietary phytate, ranging from 0 to 3000mg/d as shown in Figure 4b.3. The modifying effects of calcium, protein, and iron were found to be insignificant and hence were discounted in the final model, which was said to account for more than 80% of the variance in total absorbed zinc.

Figure 4b.3 Saturation response model predictions of total absorbed zinc (TAZ) for selected levels of dietary phytate. Adapted from European Food Safety Authority (2014).

Based on the final model, EFSA generated values for dietary zinc absorption for males and females aged > 18y at four levels of phytate (and zinc) intake chosen to cover the usual mean/median phytate intake in different European countries (see Table 4b.2).

Table 4b.2: Dietary zinc absorption based on differing levels of phytate intake. Adapted from European Food Safety Authority (2014).

Phytate level    Men,                 Women,
(mg/d)           Zn absorption (%)    Zn absorption (%)
300              42                   46
600              33                   38
900              28                   32
1200             24                   28

Note that when Miller et al. (2015) applied their trivariate saturation response model to data from infants and children with a mean age of 24 months, no significant inhibitory effect of phytate was reported. The only significant predictor of zinc absorption in their model, apart from the dietary zinc content, was child age. It is possible, however, that the lack of a significant effect of phytate was due in part to insufficient statistical power, as few studies included data on phytate intakes, as noted earlier. More recent zinc tracer studies have reported marked increases in zinc absorption following the addition of exogenous phytase to high-phytate test meals in young children (Brnic et al., 2017; Zyba et al., 2019).

4b.2.4   IZiNCG quantitative estimate of intake of total absorbable zinc for non-pregnant, non-lactating adults aged >19y

For non-pregnant, non-lactating adults >19y, the total intakes of zinc and phytate for each individual can be used to estimate total absorbable zinc using the updated trivariate saturation response model refined by EFSA (2014) and outlined above. Details are described in IZiNCG Technical Brief No.3, 2019 2nd Edition, and summarized by the following equation, where TAZ is the total absorbable zinc, TDZ is the total dietary zinc, and TDP is the total dietary phytate.

\[\small \mbox{TAZ} = 0.5\left[A_{MAX} + \mbox{TDZ} + K_{R}\left(1+\frac{\mbox{TDP}}{K_{P}}\right) - \sqrt{\left(A_{MAX} + \mbox{TDZ} + K_{R}\left(1+\frac{\mbox{TDP}}{K_{P}}\right)\right)^{2} - 4\,A_{MAX}\,\mbox{TDZ}}\;\right]\]

where TAZ, TDZ, and TDP are expressed in mmol/d, and the fitted parameters are AMAX = 0.091, KR = 0.033, and KP = 0.68 mmol/d (the values of the EFSA (2014) refinement, which reproduce the worked example in Box 4b.3).

An example of how to apply the equation to calculate total absorbed zinc from a national Cameroon diet is presented in Box 4b.3.

Box 4b.3: How to estimate total absorbable zinc in a national diet

Assuming a mean intake of total dietary zinc (TDZ) of 8.5mg/d and a mean intake of total dietary phytate (TDP) of 990mg/d from the Cameroon national food consumption survey, then
  1. Convert TDZ and TDP to mmol/d as follows:
    Mean TDZ in mmol/d = 8.5 mg/d / 65.4 = 0.13 mmol/d
    Mean TDP in mmol/d = 990 mg/d / 660 = 1.5 mmol/d
  2. Substitute TDZ = 0.13 mmol/d and TDP = 1.5 mmol/d into the equation above:
    TAZ = 0.5 × [0.091 + 0.13 + 0.033 × (1 + 1.5/0.68) − √((0.091 + 0.13 + 0.033 × (1 + 1.5/0.68))² − 4 × 0.091 × 0.13)] = 0.0414 mmol/d
  3. Convert TAZ from mmol/d back to mg/d
    TAZ = 0.0414 × 65.4 = 2.7 mg/d
  4. From this, fractional absorption of zinc (FAZ) (i.e. bioavail­ability) in the Cameroon national diet can be calculated as follows:
    FAZ = TAZ/TDZ (× 100%) = 2.7/8.5 (× 100%) = 32 percent
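
The steps in Box 4b.3 can be wrapped in a short Python sketch. The saturation-response parameters below (AMAX = 0.091, KR = 0.033, KP = 0.68, all in mmol/d) are those of the EFSA (2014) refinement used above and reproduce the Cameroon example; they should nevertheless be checked against the IZiNCG Technical Brief before use.

```python
import math

A_MAX, K_R, K_P = 0.091, 0.033, 0.68   # mmol/d; EFSA (2014) trivariate model parameters

def total_absorbed_zinc_mg(tdz_mg_per_day: float, tdp_mg_per_day: float) -> float:
    """Total absorbed zinc (TAZ, mg/d) from total dietary zinc (TDZ) and
    total dietary phytate (TDP) via the trivariate saturation response model."""
    tdz = tdz_mg_per_day / 65.4     # mg -> mmol
    tdp = tdp_mg_per_day / 660.0
    b = A_MAX + tdz + K_R * (1 + tdp / K_P)
    taz_mmol = 0.5 * (b - math.sqrt(b ** 2 - 4 * A_MAX * tdz))
    return taz_mmol * 65.4          # mmol -> mg

taz = total_absorbed_zinc_mg(8.5, 990)            # Cameroon example from Box 4b.3
print(round(taz, 1), round(100 * taz / 8.5))      # 2.7 mg/d, 32 %
```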

4b.3       Bioavail­ability of other nutrients

Research on factors affecting the bioavailability of other nutrients in contemporary diets is urgently required. For several nutrients, the amount available for absorption can only be estimated. Such estimates are generally made by the expert committees formulating the nutrient recommendations, and they vary primarily according to the characteristics of the habitual diet of a country. More work is required to develop algorithms for predicting the bioavailability of these micronutrients in diets of differing compositions. So far, steps have been taken to develop algorithms to quantify the bioavailability of protein, folate, vitamin A, vitamin E, and vitamin D in human diets; these are considered below. Niacin is discussed in Section 4a.2.4.

4b.3.1   Protein Digestibility-Corrected Amino Acid Score (PDCAAS)

This score considers the content of amino acids and the digestibility of the ingested protein, attributes of protein quality which have frequently been neglected in many studies investigating the adequacy of dietary protein (Arsenault & Brown, 2017). The method was recommended by FAO / WHO in 1993 and by WHO / FAO / UNU in 2007. It is based on a comparison of the first limiting amino acid in the test protein (i.e., the indispensable amino acid present in the lowest concentration relative to the reference pattern) with the concentration of that amino acid in a reference scoring pattern. Initially, the amino acid requirements of preschool-age children (i.e., 1‑2y) were used, as this group was considered the most demanding in terms of requirements for the indispensable amino acids. However, with the recognition that requirements for the indispensable amino acids differ among age groups, age-specific amino acid scoring patterns (mg/g protein) for calculating protein quality were recommended by FAO in 2013. The three age-related reference patterns recommended are shown in Table 4b.3.

Table 4b.3:Amino-acid scoring patterns (mg/g protein requirement) for children for protein quality.
HIS: histidine; ILE: isoleucine; LEU: leucine; LYS: lysine; SAA: sulfur amino acids; AAA: Aromatic amino acids;THR: threonine; TRP: tryptophan; VAL: valine. From FAO / WHO / UNU (2013).
Age (y)   HIS   ILE   LEU   LYS   SAA   AAA   THR   TRP   VAL
0.5        20    32    66    57    27    52    31    8.5   43
1–2        18    31    63    52    25    46    27    7.4   42
3–10       16    31    61    48    23    41    25    6.6   40

This comparison generates a limiting amino acid score for the test protein, which is then multiplied by digestibility. For the PDCAAS method, true fecal nitrogen digestibility (i.e., digestibility over the whole intestine) is applied. The calculation is shown below: \[\small \mbox {PDCAAS =} \frac{\mbox {mg of limiting amino acid in 1g test protein}} {\mbox {mg of the same amino acid in 1g of the reference protein}} \times \mbox { true fecal digestibility}\]

Note initially the highest PDCAAS score any protein could achieve was 1.0. This indicated that the protein provided all the indispensable amino acids in the amounts required by the body. Any scores exceeding 1.00 were truncated to a maximum of 1.0, reflecting the view that amino acids supplied above the requirements do not have additional physiological value and will be catabolized. Truncation was also said to simplify the comparison of protein quality among different foods.

Table 4b.4: Calculation of Protein Digestibility-Corrected Amino Acid Score (PDCAAS) for single foods

                            Rice     Wheat    Sorghum   Maize    Lentils   Milk
USDA code¹                  20450    20630    20648     20320    16069     01077
Protein, g/100g             6.6      9.7      8.4       8.1      24.6      24.6

Amino acids, mg/g protein of food
Lysine                      36       27       21        28       70        84
Sulfur-AA                   44       41       37        39       22        32
Threonine                   36       27       37        38       36        43
Tryptophan                  12       11.6     12.6      7.0      9.0       12.7

Ratio of amino acid/g protein in food to reference protein scoring pattern²
Lysine                      0.63     0.47     0.36      0.49     1.23      1.47
Sulfur-AA                   1.62     1.50     1.37      1.44     0.80      1.20
Threonine                   1.15     0.88     1.20      1.21     1.16      1.37
Tryptophan                  1.37     1.37     1.48      0.83     1.06      1.40

Amino acid score³           0.63     0.47     0.36      0.49     0.80      1.00
Digestibility factor⁴       0.88     0.96     0.74      0.85     0.78      0.95
PDCAAS (lowest ratio
× digestibility factor)     0.56     0.45     0.27      0.42     0.62      0.95

¹ Source of information on protein and amino acid composition of foods is USDA Standard Reference Version 28.
² Reference amino acid pattern for young children (0.5y to 3y): 57mg lysine/g protein, 27mg sulfur amino acids/g protein, 31mg threonine/g protein, 8.5mg tryptophan/g protein. See Table 4b.3.
³ Truncation of the amino acid score is recommended by WHO / FAO / UNU (2007).
⁴ Digestibility factors from FAO (1991).
In 2007, WHO/FAO/UNU questioned the practice of PDCAAS truncation, as well as which value to truncate (i.e., the PDCAAS or the amino acid score) prior to multiplying by the digestibility factor. The decision reached was to calculate PDCAAS values from a truncated amino acid score value on the basis that digestibility is first limiting. Examples of how to calculate the PDCAAS for both single foods and a mixture of foods are shown in Tables 4b.4 and 4b.5, respectively. Table 4b.4 is modified from Arsenault & Brown (2017), whereas Table 4b.5 is from WHO / FAO / UNU (2007). Note when calculating PDCAAS values it is usually only necessary to use a pattern based on the four indispensable amino acids likely to be limiting in dietary protein. These are lysine (mostly cereal proteins), the sulfur amino acids (legume proteins), tryptophan (some cereals such as maize), or threonine (some cereals). Hence, only these four amino acids have been adjusted for digestibility in the examples given in Tables 4b.4 and 4b.5. Further, note the amino acid score for the food mixture in Table 4b.5 is calculated from the weighted average digestible amino acid content. In both examples, fecal digestibility factors are applied. For more details of the calculation of PDCAAS scores, see Marinangeli & House (2017). For PDCAAS values for selected foods, see Boye et al. (2012) and Arsenault & Brown (2017).
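
A short Python sketch of the single-food calculation using the rice values from Table 4b.4 and the reference pattern for young children; truncation of the amino acid score at 1.0 follows WHO/FAO/UNU (2007).

```python
REFERENCE = {"lys": 57, "saa": 27, "thr": 31, "trp": 8.5}    # mg/g protein, 0.5-3 y pattern

def pdcaas(aa_mg_per_g_protein: dict, fecal_digestibility: float) -> float:
    """PDCAAS = truncated limiting amino acid score x true fecal digestibility."""
    score = min(aa_mg_per_g_protein[aa] / REFERENCE[aa] for aa in REFERENCE)
    score = min(score, 1.0)          # truncate scores above 1.0
    return score * fecal_digestibility

rice = {"lys": 36, "saa": 44, "thr": 36, "trp": 12}          # mg/g protein (Table 4b.4)
print(round(pdcaas(rice, 0.88), 2))                          # lysine limiting: 0.63 x 0.88 = 0.56
```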

Note that in situations where there may be an additional need for specific amino acids (i.e., pregnancy, lactation, old age, certain clinical conditions), the scoring pattern recommended for adults may be of limited validity.

Recently, the PDCAAS has been applied to estimate the adequacy of protein intakes of young children in several low-income countries (Arsenault & Brown, 2017). Table 4b.5 provides an example of how to apply the PDCAAS to determine the protein adequacy of a mixed diet based on milk, maize, and rice; a code sketch of the same calculation follows the table.

Table 4b.5: Example calculation of Protein Digestibility-Corrected Amino Acid Score (PDCAAS) for a mixed diet. Adapted from Table 6 in WHO / FAO / UNU (2007).
² Source of information on protein and amino acid (AA) composition of foods is USDA Standard Reference Version 28 (codes 01077 milk, 20320 maize, and 20450 rice).
³ Digestibility factors from FAO (1991).

              A          B        Amino acid content of       G           H
          Consumed,  Protein,     food, mg/g protein²      Protein    Digestibility
Food item     g      g/100g²      C     D     E     F     consumed, g    factor³
                                 Lys   SAA   Thr   Trp
Milk         150       3.2        84    32    43    13       4.8         0.95
Maize         50       8.1        28    39    38     7       4.1         0.85
Rice         100       6.6        36    44    36    12       6.6         0.88
Weighted average digestibility (sum of digestible protein / total protein)   0.89

             Digestible              Digestible amino acid, mg
Food item    protein, g        Lys         SAA         Thr         Trp
              (G × H)      (C × G × H) (D × G × H) (E × G × H) (F × G × H)
Milk             4.6           383         146         196          59
Maize            3.4            97         134         130          24
Rice             5.8           210         255         208          68
Total           13.8           690         535         533         151

                                                  Lys    SAA    Thr    Trp
Amino acid, mg/g protein (total of each
  digestible AA / total digestible protein)        50     39     39     11
Reference pattern, mg/g protein                    57     27     31    8.5
Amino acid score (AA mg/g protein divided
  by mg/g reference protein)                     0.88   1.44   1.25   1.29

PDCAAS (lowest AA score × weighted digestibility factor) = 0.78
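
The arithmetic of Table 4b.5 can be expressed compactly in code. The sketch below uses the amounts consumed, protein contents, amino acid profiles, and digestibility factors shown in the table; the variable names are illustrative only.

```python
# Mixed-diet PDCAAS for milk, maize, and rice, as in Table 4b.5 (a sketch).

REFERENCE_PATTERN = {"lysine": 57, "sulfur_aa": 27, "threonine": 31, "tryptophan": 8.5}

foods = [
    # (name, amount consumed g, protein g/100g, AA mg/g protein, fecal digestibility)
    ("milk",  150, 3.2, {"lysine": 84, "sulfur_aa": 32, "threonine": 43, "tryptophan": 13}, 0.95),
    ("maize",  50, 8.1, {"lysine": 28, "sulfur_aa": 39, "threonine": 38, "tryptophan": 7},  0.85),
    ("rice",  100, 6.6, {"lysine": 36, "sulfur_aa": 44, "threonine": 36, "tryptophan": 12}, 0.88),
]

total_protein = sum(amount * protein / 100 for _, amount, protein, _, _ in foods)
digestible_protein = sum(amount * protein / 100 * dig for _, amount, protein, _, dig in foods)
weighted_digestibility = digestible_protein / total_protein           # 0.89 in Table 4b.5

# Digestible amino acid supplied by each food, summed over the mixture (mg)
digestible_aa = {aa: sum(amount * protein / 100 * dig * aa_profile[aa]
                         for _, amount, protein, aa_profile, dig in foods)
                 for aa in REFERENCE_PATTERN}

# Amino acid content per g of digestible protein, compared with the reference pattern
scores = {aa: (digestible_aa[aa] / digestible_protein) / REFERENCE_PATTERN[aa]
          for aa in REFERENCE_PATTERN}
pdcaas = min(min(scores.values()), 1.0) * weighted_digestibility
print(round(pdcaas, 2))                                               # 0.78, as in Table 4b.5
```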

4b.3.2   Digestible Indispensable Amino Acid Score (DIAAS)

There is increasing recognition that the PDCAAS has several limitations. These include: true fecal nitrogen digestibility does not account for indispensable amino acids that escape absorption in the ileum but disappear in the colon, where they are no longer available to the host; failure to credit additional nutritional value to high biological value proteins; overestimation of the quality of protein foods that contain antinutritional factors; and overestimation of the quality of protein foods of lower digestibility when supplemented with limiting amino acids (Boye et al., 2012). As a result, a newer method, the Digestible Indispensable Amino Acid Score (DIAAS), has been proposed. DIAAS relies on measures of the true ileal digestibility of individual amino acids and of lysine bioavailability because these better reflect the true quantity of amino acids digested and absorbed. It also avoids truncation of the score obtained. The most limiting digestible indispensable amino acid (DIAA) content defines the DIAAS value of a protein. For DIAAS, the three age-related amino acid reference scoring patterns shown in Table 4b.3 are used.

To calculate the DIAAS of a single protein source, data on the complete indispensable amino acid (IAA) composition of the protein, the crude protein content (CP), and the standardized ileal digestibility (SID) of each IAA are required. For a given amino acid IAAy (expressed as mg/g CP), the digestible indispensable amino acid (DIAA) ratio is calculated as: \[\small \mbox{DIAA ratio}_y = \frac{\mbox{mg of digestible IAA}_y \mbox{ in 1g of the dietary protein}}{\mbox{mg of the same IAA in 1g of the reference protein}}\] where the digestible IAAy content is the IAAy content (mg/g CP) multiplied by its SID. Then: \[\small \mbox{DIAAS} = 100 \times \mbox{lowest DIAA ratio among the IAAs}\] The lowest DIAA ratio thus defines the DIAAS value of the protein.
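As a minimal sketch of this definition, the code below computes a DIAAS from IAA contents (mg/g CP) and SID coefficients. The numerical values are hypothetical placeholders for illustration, not measured data, and the function name is not part of any published tool.

```python
# Single-source DIAAS following the definition above (a sketch; example values are hypothetical).

# Reference scoring pattern for young children (0.5-3y), mg/g protein (Table 4b.3)
REFERENCE_PATTERN = {"lysine": 57, "sulfur_aa": 27, "threonine": 31, "tryptophan": 8.5}

def diaas(iaa_mg_per_g_cp: dict, sid: dict, reference: dict = REFERENCE_PATTERN) -> float:
    """DIAAS (%) = 100 x lowest ratio of digestible IAA content to the reference pattern."""
    ratios = {aa: (iaa_mg_per_g_cp[aa] * sid[aa]) / reference[aa] for aa in reference}
    return 100 * min(ratios.values())        # no truncation, unlike PDCAAS

# Hypothetical IAA contents (mg/g CP) and standardized ileal digestibility coefficients
iaa = {"lysine": 80, "sulfur_aa": 40, "threonine": 45, "tryptophan": 12}
sid = {"lysine": 0.95, "sulfur_aa": 0.93, "threonine": 0.92, "tryptophan": 0.94}
print(round(diaas(iaa, sid)))                 # 133; a value above 100 means no IAA is limiting
```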

Values for DIAAS can also be calculated for protein mixtures. To accomplish this, weighted average values of IAAy and SIDy across the single proteins in the mixture must be obtained. See Herreman et al. (2020) for details of the calculation. The maximum DIAAS calculated across all possible mixing ratios identifies the optimal protein mixture. Care must be taken to ensure, where possible, that the values for IAAy and SIDy selected reflect any processing and cooking conditions. A protein source or protein mixture with a DIAAS of 100 or above indicates that none of its indispensable amino acids is limiting and that this source of protein has the potential to meet physiological requirements.

This method was first introduced by FAO (2013), but at that time human ileal digestibility data with which to apply DIAAS were lacking. However, at a later joint FAO-IAEA meeting in 2022, valid in vitro models of ileal amino acid digestibility were presented and a database of the ileal digestibility of proteins and individual amino acids in foods was created (Lee et al., 2016). The need for data on foods consumed by individuals in developing countries, where environmental enteropathy has the potential to affect digestion and absorption, together with data on the effects of processing and storage on protein quality, was emphasized (Lee et al., 2016). Access to such a database in the future will facilitate calculation of the protein quality of individual foods and mixtures of foods (Tome et al., 2024). See Bandyopadhyay et al. (2022) for a review of current approaches to measuring ileal amino acid digestibility in humans and of the currently available data.

4b.3.3   Bioavail­ability of the polyglutamate and monoglutamate forms of folate

The polyglutamate form of folate occurs naturally in foods, whereas the monoglutamate form comprises synthetic folic acid commonly used for fortifying foods and dietary supplements. The polyglutamate form has a lower bioavail­ability than synthetic folic acid. As a result, the IOM (1998) introduced a new term — dietary folate equivalents (DFE) — to consider the differences in the bioavail­ability of all sources of ingested folate.

Total dietary folate equivalent (DFE) is defined as the quantity of natural food folate plus 1.7 times the quantity of folic acid in the diet. \[\small \mbox {µg DFE = µg natural food folate + (1.7 × µg synthetic folic acid)}\] This equation assumes that the bioavail­ability of food folate is about 50%, whereas that of synthetic folic acid taken with food is 85% (either as a food fortificant or as a supplement) or 100% when a supplement is taken on an empty stomach with water. Hence, folic acid taken with food is 85/50 = 1.7 times more available than folate naturally occurring in food (Caudill, 2010).
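The DFE conversion defined above is straightforward to apply in code. The following minimal sketch uses only the 1.7 factor given in the equation; the function name and example amounts are illustrative.

```python
# Dietary folate equivalents (DFE) from natural food folate and synthetic folic acid (a sketch).

def dietary_folate_equivalents(natural_food_folate_ug: float,
                               synthetic_folic_acid_ug: float) -> float:
    """µg DFE = µg natural food folate + 1.7 x µg synthetic folic acid taken with food."""
    return natural_food_folate_ug + 1.7 * synthetic_folic_acid_ug

# e.g., 200 µg natural folate plus 100 µg folic acid from fortified bread (hypothetical amounts)
print(round(dietary_folate_equivalents(200, 100)))   # 370 µg DFE
```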

Many countries now fortify foods such as breads and grains with the synthetic monoglutamate form of folate (i.e., folic acid). In the USA, Canada, Australia, Chile and, more recently, New Zealand, fortification on a mandatory (population-wide) basis has been introduced and has proved highly effective, not only in increasing folate status and reducing folate deficiency (Yang et al., 2010) but also in reducing neural tube defects in these countries (Honein et al., 2001; Lopez-Camelo et al., 2005; De Wals et al., 2007; Sayed et al., 2008). Mandatory fortification is now in place in 85 countries worldwide, including both high-income and low- and middle-income countries. For a recent review of concerns about potential adverse effects of excess folic acid intake and/or elevated folate status, see Maruvada et al. (2020).

Some food composition tables fail to distinguish between folate found naturally in foods and any synthetic folic acid added to foods as a fortificant. Work is underway to provide values for the total dietary folate equivalent (DFE) content of foods in food composition tables. Examples providing total dietary folate equivalents include the USDA Nutrient Database for Standard Reference and the New Zealand FOODfiles (2021). Caution must be used when borrowing dietary folate values from other food composition tables as they are not universally applicable and depend on the fortification regulations of the country. The following examples in Box 4b.4 provide guidance on how to convert folate values in a food composition database to µg DFE:

Box 4b.4: Conversion of folate values to µg dietary folate equivalents

Bailey et al. (2010) calculated folate intakes, expressed as DFEs, for the U.S. population from the National Health and Nutrition Examination Survey 2003–2006. In contrast to earlier U.S. surveys, this was the first study to include estimates of folic acid from dietary supplements as well as dietary folate intakes. Hence, the data generated reflect intakes that account for differences in the bioavailability of all sources of folate, as noted above. In this survey, 34.5% of the U.S. population used dietary supplements that contained folic acid. These data emphasize that in a population such as the U.S., where more than one-half of the population reports the use of dietary supplements, monitoring folate status without including such an important contributor of folic acid will yield inaccurate data.

Recently, the use of a single bioavailability value to accurately reflect food folate, especially when applied to whole diets, has been questioned. Recent research has highlighted the influence of many post-absorptive factors shown to modify the bioavailability of folate, both naturally occurring folate and folic acid. Examples of such factors include genetics, ethnicity/race, and sex. Clearly, in the current era of fortification and dietary supplement use, more large-scale studies are needed to refine folate bioequivalency values for use in whole diets.

4b.3.4   Bioavail­ability of preformed vitamin A and provitamin A carotenoids

In the diets of most developed countries, vitamin A occurs mainly as preformed vitamin A derived only from animal products: fish-liver oils, liver, butterfat, and egg yolk are the major dietary sources. Muscle meats are poor sources of preformed vitamin A. In contrast, in most low-income countries the main food sources are the provitamin A carotenoids from yellow and orange fruits (West, 2000) and dark-green leafy vegetables. Red palm oil and certain indigenous plants, such as palm fruits (buriti) in Latin America and the fruit termed "gac" in Vietnam, are unusually rich sources of provitamin A carotenoids (FAO/WHO, 2002).

Provitamin A carotenoids, when derived from ripe yellow- and orange-colored fruits and cooked yellow tubers (e.g., sweet potatoes), appear to be more efficiently converted to retinol than when derived from dark green leafy vegetables (IOM, 2001; West et al., 2002). Processing methods, the food matrix, the fat content of a meal, and, as reported more recently, genetic variations affect the bioavailability of provitamin A carotenoids (Torronen et al., 1996; van het Hof et al., 1988; Borel & Desmarchelier, 2017). Of the common dietary carotenoids (β-carotene, β-cryptoxanthin, α-carotene, lutein, lycopene, and zeaxanthin), only β-carotene, α-carotene, and β-cryptoxanthin have provitamin A activity, with β-carotene and β-cryptoxanthin the most important in the diet.

There is a lack of consensus about the bioavailability of ingested provitamin A carotenoids from food and the efficiency with which these absorbed carotenoids are subsequently converted to retinol (i.e., their bioconversion). Currently, two sets of conversion factors are used for calculating the vitamin A activity of provitamin A carotenoids in foods, and the values applied differ across agencies. FAO/WHO (2002) still maintain that 1µg retinol equals 6µg of β-carotene and 12µg of other provitamin A carotenoids (mainly α-carotene and β-cryptoxanthin). These same carotenoid:retinol equivalency ratios have also been adopted by the European Food Safety Authority (EFSA, 2017) and by Australia and New Zealand (2021). Furthermore, these agencies express the substances with vitamin A activity as retinol equivalents (RE), whether they are preformed vitamin A (mainly retinol and retinyl esters) in foods of animal origin or provitamin A carotenoids.

The U.S. Food and Nutrition Board, however, concluded that for healthy individuals the bioconversion of provitamin A carotenoids from plant sources is 12µg β‑carotene to 1µg retinol and 24µg of other provitamin A carotenoids to 1µg retinol. For a detailed justification of these conversion factors, see IOM (2001). The U.S. has also adopted the term retinol activity equivalents (RAE) for use when calculating the total amount of vitamin A in mixed dishes or diets.
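The two sets of equivalency ratios described above can be compared with a short calculation. In this minimal sketch the function names and the example meal are illustrative; inputs are µg of preformed retinol, β-carotene, and other provitamin A carotenoids (α-carotene, β-cryptoxanthin).

```python
# Retinol equivalents (RE) versus retinol activity equivalents (RAE) — a sketch.

def retinol_equivalents(retinol_ug, beta_carotene_ug, other_provit_a_ug):
    """FAO/WHO (2002), EFSA (2017): 1 µg RE = 1 µg retinol = 6 µg β-carotene
    = 12 µg other provitamin A carotenoids."""
    return retinol_ug + beta_carotene_ug / 6 + other_provit_a_ug / 12

def retinol_activity_equivalents(retinol_ug, beta_carotene_ug, other_provit_a_ug):
    """IOM (2001): 1 µg RAE = 1 µg retinol = 12 µg β-carotene
    = 24 µg other provitamin A carotenoids."""
    return retinol_ug + beta_carotene_ug / 12 + other_provit_a_ug / 24

# e.g., a meal providing 100 µg retinol, 1200 µg β-carotene, 240 µg β-cryptoxanthin (hypothetical)
print(retinol_equivalents(100, 1200, 240))            # 320.0 µg RE
print(retinol_activity_equivalents(100, 1200, 240))   # 210.0 µg RAE
```

The same carotenoid intake thus yields a considerably higher vitamin A value when the FAO/WHO and EFSA ratios are applied, which is the source of the comparability problems discussed next.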

Such inconsistencies in the carotenoid:retinol equivalency ratios applied exacerbate problems when comparing vitamin A values among food composition databases and, in turn, vitamin A intakes across countries. For example, vitamin A intakes calculated using the FAO/WHO and EFSA equivalency ratios (6:1 and 12:1), which assume a more efficient bioconversion of provitamin A carotenoids, will be higher than intakes calculated using the ratios adopted by the United States (12:1 and 24:1). Some older food composition tables continue to express vitamin A in terms of international units (IU). Use of these older units is no longer appropriate for assessing the dietary adequacy of vitamin A and should be discontinued (FAO/WHO, 2002).

Increasingly, synthetic sources of retinol and provitamin A compounds (mainly β‑carotene) are being added to foods or used as dietary supplements worldwide. In developed countries, foods fortified with preformed vitamin A may include ready-to-eat cereals, snack foods, beverages, margarine, and processed dairy products, whereas in low‑ and middle-income countries sugar, cereal flours, edible oils, margarine, and noodles are sometimes fortified with preformed vitamin A. Commercial dietary supplements may contain both preformed vitamin A and provitamin A (predominantly as β‑carotene) (Tanumihardjo et al., 2016). As a result, when estimating the bioavailability (and the quantity) of vitamin A, both fortified foods and dietary supplements must also be considered. For more discussion of the confusion that may arise when assessing dietary vitamin A intakes, see Melse-Boonstra et al. (2017).

4b.3.5   Bioavail­ability of α‑toco­pherol

Vitamin E, a fat-soluble vitamin, is present in the diet as eight naturally occurring vitamin E analogues: four tocopherols (α, β, γ, and δ) and four tocotrienols, which have varying levels of biological activity. Major food sources of natural vitamin E are vegetable seed oils (wheat germ, sunflower, corn, soyabean, safflower, palm) and olive oil. Synthetic forms of α‑tocopherol, called all‑racemic‑α‑tocopherol, are widely used as antioxidants and in dietary supplements and fortified products (e.g., ready‑to‑eat breakfast cereals) (Ranard & Erdman, 2018; Borel et al., 2013).

The United States (IOM, 2000) and European Food Safety Authority (EFSA, 2015) consider the natural sources of α‑toco­pherol (i.e. RRR‑α‑toco­pherol) and the synthetic forms of α‑toco­pherol (i.e., all‑racemic) as the only biologically active forms of vitamin E. The other naturally occurring forms of vitamin E in food (β‑, γ‑, and δ‑toco­pherols and the tocotrienols), although absorbed, do not appear to be converted to α‑toco­pherol by humans, and are poorly recognized by the α‑toco­pherol transfer protein (α‑TTP) in the liver. Hence the β‑, γ‑, and δ‑toco­pherols and the tocotrienols are not considered by the IOM or EFSA when assessing vitamin E intakes or setting requirements. In contrast, WHO/FAO (2004) express dietary vitamin E activity as α‑toco­pherol equivalents (α‑TEs) to account for the combined biological activity assumed for the naturally occurring (i.e. RRR‑α‑toco­pherol) and synthetic (all‑racemic‑α‑toco­pherol) forms of vitamin E. However, WHO/FAO chose not to set any requirements for vitamin E due to the lack of information on the role of vitamin E in biological processes other than its involvement in antioxidant function at that time.

In food composition tables and databases in the past, the vitamin E content of foods has been expressed as α‑toco­pherol equivalents (α‑TEs). Factors have been used to convert the toco­pherols and tocotrienols in food to α‑toco­pherol equivalents (α‑TEs). For example, α‑TEs are defined by WHO/FAO (2004) as α‑toco­pherol, mg × 1.0; β‑toco­pherol, mg × 0.5; γ‑toco­pherol, mg × 0.1; α‑tocotrienol, mg × 0.3. The biological activities of δ‑toco­pherol and γ‑ and δ‑tocotrienol were below detection. All the synthetic all‑racemic‑α‑toco­pherols when present in a food are multiplied by 0.74 (WHO/FAO, 2004).
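Applying these factors to a food's tocopherol and tocotrienol contents gives its α-TE value. The sketch below uses only the WHO/FAO (2004) factors quoted above; the function name and the example composition are illustrative, and forms with activity reported as below detection are treated as zero.

```python
# Conversion of tocopherol/tocotrienol contents (mg/100 g food) to α-tocopherol equivalents (a sketch).

# WHO/FAO (2004) activity factors relative to RRR-α-tocopherol
ALPHA_TE_FACTORS = {
    "alpha_tocopherol": 1.0,
    "beta_tocopherol": 0.5,
    "gamma_tocopherol": 0.1,
    "alpha_tocotrienol": 0.3,
    "all_rac_alpha_tocopherol": 0.74,   # synthetic form added as a fortificant
}

def alpha_tocopherol_equivalents(mg_per_100g: dict) -> float:
    """Return mg α-TE per 100 g food; forms with negligible activity count as zero."""
    return sum(ALPHA_TE_FACTORS.get(form, 0.0) * mg for form, mg in mg_per_100g.items())

# e.g., a vegetable-oil spread (hypothetical composition)
spread = {"alpha_tocopherol": 10.0, "gamma_tocopherol": 20.0, "all_rac_alpha_tocopherol": 5.0}
print(round(alpha_tocopherol_equivalents(spread), 1))   # 10 + 2 + 3.7 = 15.7 mg α-TE/100 g
```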

Today, in some food composition tables and databases (e.g., recent releases from the USDA, Finland, and Sweden), the vitamin E content of foods is expressed only as their α‑tocopherol content (mg/100g) (EFSA, 2015). The other naturally occurring forms of tocopherol, which have less biological activity, are ignored. Hence, the vitamin E content of foods expressed as α‑tocopherol (mg/100g) alone will be lower than when expressed as α‑tocopherol equivalents.

The activity of synthetic sources (i.e., all‑racemic‑α‑toco­pherol) relative to the natural sources of α‑toco­pherol (i.e, RRR‑α‑toco­pherol) is considered to be 50% by the US IOM (2000) and EFSA (2015) who define 1mg all‑racemic‑α‑toco­pherol as equal to 0.5mg RRR‑α‑toco­pherol. This is the ratio now used for food‑labelling purposes in the United States as shown below:

1mg α‑tocopherol (α‑T) = 1mg of RRR‑α‑tocopherol

1mg α‑tocopherol (α‑T) = 2mg of all‑racemic‑α‑tocopherol
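
A minimal sketch of this labelling conversion is shown below, assuming only the 2:1 relationship stated above (natural RRR-α-tocopherol counted at full weight, synthetic all-racemic-α-tocopherol at half weight); the function name and example amounts are illustrative.

```python
# U.S. food-labelling conversion to mg α-tocopherol (α-T) — a sketch.

def label_alpha_tocopherol_mg(rrr_mg: float, all_rac_mg: float) -> float:
    """mg α-T for labelling = mg RRR-α-tocopherol + 0.5 x mg all-rac-α-tocopherol."""
    return rrr_mg + 0.5 * all_rac_mg

# e.g., a supplement containing 30 mg all-rac-α-tocopherol and no natural form (hypothetical)
print(label_alpha_tocopherol_mg(0, 30))   # 15.0 mg α-T
```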

Nevertheless, there remains considerable uncertainty regarding the bioavail­ability and biopotency of the natural sources of α‑toco­pherol (RRR‑α‑toco­pherol) compared with the synthetic sources (all‑racemic‑α‑toco­pherol). In part, this has arisen because the 2:1 biopotency ratio shown above was based on bioavail­ability data and not on the measurement of a clinical endpoint. More research is urgently required that compares the relevant dose levels of RRR and all‑racemic‑α‑toco­pherol in relation to human diseases to resolve this uncertainty (Hoppe & Krennrich, 2000; Ranard & Erdman, 2018).

Limited data exist on the factors affecting the bioavailability of α‑tocopherol; both dietary and some host-related factors may be implicated (Borel et al., 2013). For a usual mixed diet, EFSA (2015) considered 75% to be the average absorption of α‑tocopherol. Absorption of α‑tocopherol increases in the presence of dietary fat, so in conditions associated with malabsorption of fat, intestinal absorption of vitamin E is impaired. The food matrix and interactions, including competition with other fat-soluble vitamins and carotenoids, might also influence the absorption of vitamin E, although the existing evidence is limited (Borel et al., 2013). Among the host-related factors, absorption tends to be reduced in adults with metabolic syndrome, in the elderly, and in those who smoke cigarettes (Borel et al., 2013; Mah et al., 2015). Genetic variants may also play a role, at least in part, given the wide interindividual variability in the absorption efficiency of vitamin E (Borel et al., 2013; Mah et al., 2015).

4b.3.6   Bioavail­ability of vitamin D3 (cholecalciferol) and vitamin D2 (ergocalciferol)

Vitamin D3 (cholecalciferol) is the main dietary form of vitamin D and is derived from foods of animal origin, especially fatty fish and fish liver oils, with small amounts in beef liver, dairy products, and egg yolk. Some UV-exposed mushrooms provide vitamin D2 (ergocalciferol). Diets can also contain 25-hydroxyvitamin D (25(OH)D3), the metabolite produced in the liver, and trace amounts of 1,25-dihydroxycholecalciferol (1,25(OH)2D3). Vitamin D3 can also be synthesized by irradiation of 7‑dehydrocholesterol in the skin by ultraviolet B (UVB) light.

The bioavailability of vitamin D in food varies widely, with reports ranging from 55% to 99% (mean 78%) in healthy persons (Silva & Furlanetto, 2018). Research on the factors affecting the bioavailability of vitamin D is limited. Among diet-related factors, the food matrix appears to have no effect on vitamin D bioavailability, although whether the supplement matrix also has no effect is less clear. Bioavailability is improved when vitamin D is given with fat-containing food, as expected, although it is unaffected by the amount of fat ingested (Borel et al., 2015).

The effect of several host-related factors on vitamin D bioavail­ability has also been investigated. Of these, neither ageing nor obesity modify vitamin D bioavail­ability, whereas in several diseases associated with fat malabsorption vitamin D absorption is impaired. More data are required to establish whether genetic factors, or the vitamin D status of the host, affect vitamin D bioavail­ability.

Studies consistently report the bioavailability of 25‑hydroxyvitamin D (25(OH)D3) to be higher (about three times) than that of vitamin D2 (ergocalciferol) or vitamin D3 (cholecalciferol). Uncertainty exists as to whether there are differences in the bioavailability of vitamin D2 (ergocalciferol) and vitamin D3 (cholecalciferol). Both forms appear to exhibit identical responses in the body and hence are considered bioequivalent (IOM, 2011), so they are expressed in µg regardless of whether they are from conventional foods, fortified foods, or supplements. For details on the mechanism of vitamin D absorption, see Borel et al. (2015).

Both vitamin D2 (ergocalciferol) and vitamin D3 (cholecalciferol) may be present in fortified foods and dietary supplements. In developed countries, some foods are fortified, generally with a dry stabilized form of vitamin D3 which contains an antioxidant. Examples of foods fortified with vitamin D3 include cow’s milk, dried milk powder, evaporated milk, cereals, bread, and margarine (Cashman & O'Neill, 2024). Consequently, care must be taken to use country-specific food composition databases for values on the vitamin D content of these food items.

Acknowledgments

The author is very grateful to the late Michael Jory who after initiating the HTML design worked tirelessly to direct the transition to this HTML version from MS-Word drafts. James Spyker’s ongoing HTML support is much appreciated.