Methods used to determine requirements
The deprivation study is the most direct method: the nutrient is removed from the diet, the symptoms of deficiency are observed, and the nutrient is then added back until the symptoms are cured or prevented. This approach has several difficulties. First, the experiment may need to continue for several years owing to the presence of body stores of the nutrient, and it often requires a very limited and therefore monotonous dietary regimen. Second, unpredicted long-term adverse consequences may result. Third, such experiments are not ethical in vulnerable groups such as children (often the most relevant group for study). In some cases, epidemiological data may be available; for example, the deficiency disease beriberi occurs in populations whose average thiamin intake falls below 0.2 mg/4.2 MJ (1000 kcal).
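The beriberi threshold is a nutrient density, so it can be checked against any diet whose thiamin and energy content are known. A minimal sketch, in which only the 0.2 mg/1000 kcal threshold comes from the text and the example intake figures are hypothetical:

```python
# Thiamin density check against the beriberi threshold cited in the text.
# The threshold (0.2 mg per 1000 kcal, i.e., per 4.2 MJ) is from the text;
# the example intake figures below are hypothetical.
BERIBERI_THRESHOLD = 0.2  # mg thiamin per 1000 kcal

def thiamin_density(thiamin_mg, energy_kcal):
    """Thiamin intake expressed per 1000 kcal of energy intake."""
    return thiamin_mg / (energy_kcal / 1000)

# A hypothetical population averaging 0.4 mg thiamin on 2500 kcal/day
# has a density of 0.16 mg/1000 kcal, below the threshold:
at_risk = thiamin_density(0.4, 2500) < BERIBERI_THRESHOLD
```

Expressing intake per unit of energy, rather than as an absolute amount, is what makes the threshold comparable across populations with different total food intakes.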
The radioactive tracer study makes use of a known amount of the radioactively labeled nutrient, which is assumed to disperse evenly in the body pool, allowing the total pool size to be estimated from the dilution of the isotope in samples of, for instance, plasma or urine (i.e., if the body pool is large, the dilution will be greater than if the body pool is small). The specific activity, that is, the radioactivity per unit weight of the nutrient in the samples, can be used to calculate pool size as long as the total dose administered is known. The rate of loss can then be monitored by taking serial samples, allowing the depletion rate to be calculated. In the case of vitamin C, the average body pool size of a healthy male was found to be 1500 mg, which, on a vitamin C-free diet, was depleted at a rate of approximately 3% (of the body pool) per day. This fractional catabolic rate was independent of body pool size, and symptoms of scurvy appeared when the body pool fell below 300 mg. The estimated replacement intake needed to maintain the body pool above 300 mg was therefore 3% of 300 mg, i.e., 9 mg/day (similar to the 10 mg found to be needed to prevent scurvy in the earlier Sheffield experiment).
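The vitamin C figures can be reproduced in a few lines. A constant fractional catabolic rate implies that the pool decays exponentially on a vitamin C-free diet; the numbers below are those given in the text:

```python
import math

# Vitamin C body-pool arithmetic using the figures given in the text.
POOL_MG = 1500.0        # average body pool of a healthy male
FCR = 0.03              # fractional catabolic rate: 3% of the pool per day
SCURVY_POOL_MG = 300.0  # scurvy symptoms appear below this pool size

def pool_after(days):
    """Pool size after `days` on a vitamin C-free diet; a constant
    fractional loss per day implies exponential decay."""
    return POOL_MG * (1 - FCR) ** days

# Days until the pool falls below the scurvy threshold (about 53 days):
days_to_scurvy = math.log(SCURVY_POOL_MG / POOL_MG) / math.log(1 - FCR)

# Replacement intake that holds the pool at the threshold:
replacement_mg = FCR * SCURVY_POOL_MG  # 3% of 300 mg = 9 mg/day
```

The same decay model also explains why symptoms take weeks to appear in deprivation experiments: a well-filled pool buffers the subject against a vitamin-free diet for a considerable time.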
Balance studies rely on the assumption that, in healthy individuals of stable body weight, the body pool of some nutrients (e.g., nitrogen, calcium, and sodium) remains constant. Compensatory mechanisms equalize the intake and output of the nutrient over a wide range of intakes, thereby maintaining the body pool: day-to-day variations in intake are compensated for by changes in the rate of absorption in the gut (generally for nutrients whose uptake is regulated), in the rate of excretion in the urine (for very soluble nutrients) or feces, or both. However, there comes a point beyond which balance cannot be maintained; the minimum intake at which balance can still be maintained can therefore be taken as the subject's minimum required intake of that nutrient. This approach would, however, need to be extended over time to investigate possible adaptive responses to reduced intakes; for example, absorption could eventually increase. In the case of calcium, the European consensus is that average daily losses in adults are 160 mg/day and that absorption is 30%; thus, around 530 mg would need to be consumed to balance the losses. Adding or subtracting 30% to allow for individual variation (the notional 2 SDs explained above) gives rounded dietary reference values of 400, 550, and 700 mg/day (LTI, AR, and PRI, respectively).
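The calcium arithmetic can be made explicit; all figures are those quoted in the text:

```python
# Calcium balance arithmetic (European consensus figures from the text).
DAILY_LOSSES_MG = 160  # average daily losses in adults
ABSORPTION = 0.30      # assumed fractional absorption from the diet

# Intake at which absorbed calcium just balances the losses:
average_requirement = DAILY_LOSSES_MG / ABSORPTION  # ~533 mg/day

# Notional 2 SD band of +/-30% for individual variation; the published
# dietary reference values are these, rounded:
# 400 (LTI), 550 (AR), and 700 (PRI) mg/day.
lti_unrounded = average_requirement * 0.70  # ~373 mg/day
pri_unrounded = average_requirement * 1.30  # ~693 mg/day
```

Note the division by the absorption fraction: because only 30% of ingested calcium is absorbed, the intake must be roughly three times the daily losses.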
Factorial methods are predictions, rather than measurements, of the requirements of groups or individuals, taking into account a number of measured variables (factors, hence "factorial") and making assumptions where measurements cannot be made. For example, the increased requirements during growth, pregnancy, or lactation are calculated by this method; the approach is necessitated by the lack of experimental data in these physiological situations owing to ethical problems. The idea is that the rate of accumulation of nutrients can be calculated, and hence the amount required in the diet to allow that accumulation can be predicted. In pregnancy, the requirement is estimated as the amount of the nutrient needed to achieve balance when not pregnant plus the amount accumulated daily during the pregnancy, all multiplied by a factor accounting for the efficiency of absorption and assimilation (e.g., 30% for calcium). For lactation, the energy calculation is based on the amount of energy in the milk secreted daily, increased by a factor accounting for the efficiency of conversion from dietary energy to milk energy (reckoned to be 95%), from which is subtracted an allowance for the contribution from the extra fat stores laid down during pregnancy, which it is desirable to reduce in this way. The difficulty with this approach is that the theoretical predictions do not necessarily take account of physiological adaptations (e.g., increased efficiency of absorption in the gut) that may reduce the predicted requirement. This applies particularly in pregnancy, as shown by the ability of women to produce normal babies even in times of food shortage.
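As a sketch, the lactation energy calculation described above might look as follows. Only the 95% conversion efficiency is taken from the text; the milk energy and fat-store figures in the example are hypothetical placeholders:

```python
# Factorial estimate of the extra dietary energy needed for lactation,
# following the description in the text.  The 95% conversion efficiency
# is from the text; the example inputs below are hypothetical.
def lactation_energy_increment(milk_energy_mj, fat_store_allowance_mj,
                               conversion_efficiency=0.95):
    """Milk energy scaled up for the efficiency of converting dietary
    energy to milk energy, minus the allowance for energy drawn from
    the fat stores laid down during pregnancy."""
    return milk_energy_mj / conversion_efficiency - fat_store_allowance_mj

# Hypothetical example: 2.0 MJ/day secreted in milk, with 0.5 MJ/day
# assumed to come from pregnancy fat stores:
extra_mj = lactation_energy_increment(2.0, 0.5)
```

The fat-store subtraction is what builds the desirable postpartum loss of pregnancy fat into the reference value itself.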
Some nutrient requirements can be defined according to the intakes needed to maintain a certain level of the nutrient in blood or tissue. For many water-soluble nutrients, such as vitamin C, blood levels reflect recent dietary intake, and the vitamin is generally not measurable in plasma at intakes below about 40 mg/day. This level of intake has therefore been chosen as the basis for the reference value in some countries, such as the UK. The approach is not, however, suitable for nutrients whose plasma concentration is homeostatically regulated, such as calcium. In the case of the fat-soluble vitamin retinol, the dietary intake required to maintain a liver concentration of 20 μg/g has been used as the basis of the reference intake. To do this, the body pool size had to be estimated; assumptions were made about the proportion of body weight represented by the liver (3%) and the proportion of the body pool of retinol contained in the liver (90%). The fractional catabolic rate has been measured as 0.5% of the body pool per day, so this is the amount that needs to be replaced daily. The efficiency of conversion of dietary vitamin A to stored retinol was taken to be 50% (measured range 40–90%), giving an EAR of around 500 μg/day for a 74 kg man.
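The retinol calculation can be followed step by step using the assumptions listed above:

```python
# Vitamin A (retinol) reference calculation using the text's assumptions.
BODY_WEIGHT_G = 74_000        # 74 kg man
LIVER_FRACTION = 0.03         # liver taken as 3% of body weight
LIVER_CONC_UG_PER_G = 20      # target liver retinol concentration (ug/g)
LIVER_SHARE_OF_POOL = 0.90    # 90% of the body pool is in the liver
FCR = 0.005                   # 0.5% of the body pool lost per day
CONVERSION_EFFICIENCY = 0.50  # dietary vitamin A -> stored retinol

liver_retinol_ug = BODY_WEIGHT_G * LIVER_FRACTION * LIVER_CONC_UG_PER_G
body_pool_ug = liver_retinol_ug / LIVER_SHARE_OF_POOL
daily_loss_ug = body_pool_ug * FCR              # amount to replace daily
ear_ug = daily_loss_ug / CONVERSION_EFFICIENCY  # ~500 ug/day
```

Working through the chain: a 2220 g liver at 20 μg/g holds about 44 mg of retinol, the whole-body pool is about 49 mg, 0.5% of that (about 247 μg) is lost daily, and at 50% conversion efficiency the dietary intake needed is roughly 493 μg, rounded to an EAR of about 500 μg/day.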
In many respects, biochemical markers represent the most satisfactory measure of nutrient adequacy, since they are specific to the nutrient in question, are sensitive enough to identify subclinical deficiencies, and can be measured precisely and accurately. However, such markers are available for only a few nutrients, mostly vitamins, at present. One well-established example of a biochemical marker is the erythrocyte glutathione reductase activation test for riboflavin status. Erythrocytes are useful cells for enzyme assays since they are easily obtainable and have a known life-span in the circulation (average 120 days), which aids the interpretation of results. Glutathione reductase depends on riboflavin and, when its activity is measured both in the presence and in the absence of excess riboflavin, the ratio of the two activities (the erythrocyte glutathione reductase activation coefficient, EGRAC) reflects riboflavin status: with perfect sufficiency the ratio would be 1.0, whereas deficiency gives values greater than 1.0.
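The EGRAC itself is a simple ratio; the activity values in this sketch are hypothetical illustrations, not measured data:

```python
# EGRAC: ratio of glutathione reductase activity measured with excess
# riboflavin cofactor added in vitro to the activity measured without it.
# The activity values below are hypothetical illustrations.
def egrac(activity_with_riboflavin, activity_without):
    return activity_with_riboflavin / activity_without

# Sufficiency: the enzyme is already saturated with cofactor, so adding
# more riboflavin does not raise activity -> ratio of 1.0.
sufficient = egrac(1.00, 1.00)

# Deficiency: the unsaturated enzyme is stimulated by the added
# cofactor -> ratio greater than 1.0.
deficient = egrac(1.60, 1.00)
```

The attraction of an activation coefficient is that it is a within-sample ratio, so it does not depend on absolute enzyme amounts, which vary between individuals.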
Biological markers are measures of some biological function that is directly dependent on the nutrient of interest; again, these are not always easy to find, hence the recent suggestion that some functional indices not necessarily directly dependent on the nutrient be considered. Iron status is assessed using a battery of biological markers, including plasma ferritin (which reflects body iron stores), serum transferrin saturation (the proportion of plasma transferrin carrying iron, which is reduced in deficiency), plasma soluble transferrin receptor (an index of tissue iron status), and more traditional tests such as blood hemoglobin (now considered a rather insensitive and unreliable measure of iron status, since it indicates only frank anemia and also changes as a normal response to altered physiological states such as pregnancy).
Vitamin K status is assessed by measuring the prothrombin time (the time taken by plasma to clot), which is increased when vitamin K levels fall, since the synthesis of prothrombin in the liver depends on vitamin K as a cofactor. This test is clinically useful in patients requiring anticoagulant therapy (e.g., with warfarin, which blocks the action of vitamin K), in whom the drug dosage must be closely monitored.
Animal experiments are of limited use in defining human nutrient requirements because of species differences (e.g., rats can synthesize vitamin C, so it is not a "vitamin" for them), differences in metabolic body size (i.e., the relative proportions of metabolically active tissue, such as muscle, and less active tissue, such as adipose tissue and gut contents), and differences in growth rates (young animals generally grow far more rapidly than humans; cattle, for example, reach adult size in about a year). However, animal studies have provided much of the information on the identification of the essential nutrients and on their physiological and biochemical functions. Furthermore, animals can be used in experiments that would not be possible in humans, such as lifelong modification of nutrient intake; it is merely the setting of human requirements for which they are inappropriate.