The use of reference values to assess the adequacy of the nutrient intakes of population groups
Ideally, this is accomplished by determining the distribution of intakes of a nutrient in the population group (e.g., by carrying out a dietary survey) and comparing these intakes with the distribution of requirements for that nutrient within the same population. In practice, reliable data with which to plot the second of these distributions have rarely been collected. What must be used instead is an estimate of the average requirement together with an estimate of its variance, i.e., the standard deviation (based on whatever scientific evidence is available); these are used to plot the population distribution of requirements, as shown in Figure 7.1.
Figure 7.1 Frequency distribution of individual requirements for a nutrient. (a) The mean minus a notional 2 standard deviations (SDs); intakes below this will be inadequate for nearly all of the population. (b) The mean; the midpoint of the population’s requirement. (c) The mean plus a notional 2 SDs; the intake that is adequate for nearly all of the population. Note that, in practice, because insufficient data exist to establish reliable means and SDs for many nutrient requirements, the reference intakes describing the points a and c on the curve are generally set, in the case of a, at the level that is judged to prevent the appearance of signs of deficiency (biochemical or clinical), and, in the case of c, at the level above which all individuals appear to be adequately supplied. Thus, it is unlikely that even 2.5% of the population would not achieve adequacy at intake level c.
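To make the notional relationships in Figure 7.1 concrete, the following sketch derives points a, b, and c from an assumed average requirement and SD, taking the requirement distribution to be approximately normal. All numerical values are hypothetical and chosen only for illustration.

```python
import math

ar_mg = 8.0      # hypothetical average requirement (point b), mg/day
sd_mg = 1.5      # hypothetical SD of the requirement distribution

point_a = ar_mg - 2 * sd_mg   # point a: below this, intake is inadequate for nearly everyone
point_c = ar_mg + 2 * sd_mg   # point c: above this, intake is adequate for nearly everyone

def norm_cdf(x, mu, sigma):
    """Cumulative probability of a normal(mu, sigma) distribution at x."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

print(f"a (AR - 2 SD): {point_a:.1f} mg/day")
print(f"b (AR):        {ar_mg:.1f} mg/day")
print(f"c (AR + 2 SD): {point_c:.1f} mg/day")
# Fraction of individuals whose requirement exceeds point c (about 2.5%):
print(f"P(requirement > c) = {1.0 - norm_cdf(point_c, ar_mg, sd_mg):.3f}")
```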
When considering how to assess the adequacy of the nutrient intakes of populations, it is important to compare the intakes with the most appropriate level of requirement as defined in dietary recommendations. It is not useful to compare usual intakes with the RDA (PRI, RNI, i.e., the average requirement plus a notional 2 SDs) at the population level, since this approach leads to overestimates of the prevalence of inadequacy. (It may, however, be justified to compare an individual’s intake with the RDA.) Furthermore, this approach might be seen to encourage the consumption of higher intakes, which could be toxic in the case of certain nutrients.
Comparison of the population intake with the average requirement [AR; estimated average requirement (EAR)] is now considered to be the best estimation of dietary adequacy; if the average intake is less than the average requirement, then it is clear that there could be a problem in that population. Accordingly, using the average requirement as a cut-off point, the proportion of individuals in the group whose usual intakes do not meet their requirements can be calculated, allowing the problem to be quantified. However, this approach cannot be used in the case of energy, since energy intakes and requirements are highly correlated (the effects of an imbalance being quickly obvious to the individual).
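The sketch below illustrates the cut-point idea with simulated usual intakes (all figures hypothetical): the estimated prevalence of inadequacy is simply the fraction of intakes falling below the AR, and the same simulation shows how comparing intakes with the RDA instead would inflate the apparent shortfall, as noted above.

```python
# Minimal sketch (hypothetical values): the AR cut-point estimates the
# prevalence of inadequacy as the fraction of usual intakes below the AR;
# comparing intakes with the RDA instead flags far more people.
import numpy as np

rng = np.random.default_rng(seed=1)

ar_mg = 8.0     # hypothetical average requirement, mg/day
rda_mg = 11.0   # hypothetical RDA = AR + 2 SD (SD = 1.5 mg/day)
usual_intakes = rng.normal(loc=9.5, scale=2.5, size=10_000)  # simulated survey intakes

print(f"Cut-point estimate (below AR):  {np.mean(usual_intakes < ar_mg):.1%}")   # roughly 27%
print(f"Apparent shortfall (below RDA): {np.mean(usual_intakes < rda_mg):.1%}")  # roughly 73%
```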
The lowest defined intake level [lowest threshold intake (LTI), lower reference nutrient intake (LRNI), i.e., the average requirement minus a notional 2 SDs] is not regarded as useful in the context of assessing the adequacy of population nutrient intakes. This is because it would identify only those individuals who were almost certainly not meeting their requirement and, by the same token, would fail to include many in the population who would be at appreciable risk of nutrient inadequacy (in other words, those whose intakes fell below the average requirement).
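A companion sketch (using the same hypothetical figures as above) shows why a cut-off at the LRNI understates the problem: it captures only intakes that are almost certainly inadequate, while missing many that fall below the average requirement.

```python
# Minimal sketch (hypothetical values): an LRNI cut-off (AR - 2 SD) identifies
# only near-certain cases of inadequacy and misses many whose usual intake
# falls below the average requirement.
import numpy as np

rng = np.random.default_rng(seed=3)

ar_mg, sd_mg = 8.0, 1.5              # hypothetical AR and requirement SD, mg/day
lrni_mg = ar_mg - 2 * sd_mg          # LRNI = AR - 2 SD = 5.0 mg/day
usual_intakes = rng.normal(loc=9.5, scale=2.5, size=10_000)

print(f"Below LRNI: {np.mean(usual_intakes < lrni_mg):.1%}")  # roughly 4%
print(f"Below AR:   {np.mean(usual_intakes < ar_mg):.1%}")    # roughly 27%
```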
Finally, the tolerable upper levels of intake defined for
certain nutrients can also be used as cut-off points to identify those
individuals at risk of consuming toxic levels of a nutrient.
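In the same spirit, a cut-off at the tolerable upper intake level can be applied to the upper tail of the intake distribution. The sketch below uses a hypothetical UL and a simulated, right-skewed intake distribution; all values are illustrative only.

```python
# Minimal sketch: using a tolerable upper intake level (UL) as a cut-off to
# estimate the proportion of a group at risk of excessive intake.
# The UL and the simulated intakes are hypothetical illustrative values.
import numpy as np

rng = np.random.default_rng(seed=2)

ul_mg = 45.0                                                   # hypothetical UL, mg/day
usual_intakes = rng.lognormal(mean=3.0, sigma=0.4, size=10_000)  # simulated intakes, mg/day

print(f"Proportion with intake above the UL: {np.mean(usual_intakes > ul_mg):.1%}")  # roughly 2%
```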