Many minerals have been proven in research studies to be essential for optimal growth, physiologic function, and productivity in ruminants. Historically, testing for these minerals has been performed on diets and/or dietary components to ensure “adequate” concentrations of specific minerals in the diet. However, general mineral analysis does not identify the chemical forms of these minerals, which can dramatically alter their bioavailability and utilization.
Although not possible for some minerals, the most specific means of diagnosing a mineral deficiency is testing animals for unique functional deficits or for deficiencies of specific mineral-containing proteins or enzymes. This type of testing is often impractical in the field because of individual test costs or rigorous sample-handling requirements. When possible, however, it eliminates the need to know the specific molecular characteristics of a dietary mineral and the potential for competitive interactions of antagonistic minerals during absorption and utilization. For minerals that lack identified physiologic indices that can be tested, direct quantification in animal tissues or serum may provide a reliable indication of the overall mineral status of the animal or herd.
Mineral deficiencies can be tentatively diagnosed from the development of clinical disease or from postmortem identification of tissue lesions, but proof of a mineral deficiency often requires analytical verification, since few deficiencies produce truly distinctive clinical signs or lesions. In some instances, circumstantial evidence of a deficiency is provided by a positive response to supplementation with the suspected deficient mineral. But a positive response may have nothing to do with the supplementation and may simply reflect time-dependent resolution of some other clinical condition.
An individual mineral may have multiple means of measurement for identification of deficiencies, but most have one that is more specific than the others. For example, dietary concentrations may or may not be reflective of the amount of bioavailable minerals. Or an individual tissue concentration may or may not reflect functionally available mineral concentrations at the target or functional site.
The age of the animal being tested is also important for proper interpretation of mineral status. For example, fetuses accumulate some minerals at different rates during gestation, necessitating accurate aging of the fetus for interpretation. In addition, some minerals for which little is provided in milk accumulate at higher concentrations during gestation in order to provide neonates with adequate body reserves for survival until they begin foraging. This is especially true for copper, iron, selenium, and zinc. Thus, the “normal range” for these minerals in body storage tissues is higher in early neonates than in adult animals.
When individual animals are tested, prior health status must be considered in interpreting tissue mineral concentrations. Disease states can shift minerals from tissues to serum or from serum to tissues. For example, diarrhea can result in significant loss of sodium, potassium, and calcium from the body, and acidosis causes electrolyte shifts between tissues and circulating blood. Infectious disease, stress, fever, endocrine dysfunction, and trauma are known to alter both tissue and circulating serum/blood concentrations of certain minerals and electrolytes. Thus, evaluation of multiple animals is much more reflective of mineral status within a group than testing individual animals that are ill or have died of other disease states.
This paper is directed at the animal testing side of diagnosing mineral deficiencies and provides a summarization of the most commonly utilized tissues and fluids that are used for diagnosing specific mineral deficiencies in animals.
Live Animal Sampling
A variety of samples from live animals can be analyzed for mineral content. The most common samples from live animals are serum and whole blood. These samples are adequate for measurement of several minerals, but it must be recognized that some disease states, as well as feeding times, can result in altered or fluctuating serum concentrations. Other samples from live animals that are occasionally used for analyses include liver biopsies, urine, and milk. But, since milk mineral content can vary through lactation, vary across lactations, and be affected by disease, it is not typically used to evaluate whole animal mineral status. Furthermore, hydration status significantly affects urinary mineral concentrations, rendering it a poor sample for evaluation of mineral status.
Serum should be separated from the red/white blood cell clot within one to two hours of collection. If the serum sits on the clot for long periods of time, minerals that have higher intracellular content than serum can leach into the serum and falsely increase the serum content. Minerals for which this commonly occurs include potassium and zinc. In addition, hemolysis from both natural disease and due to collection technique can result in increased serum concentrations of iron, manganese, potassium, selenium, and zinc.
The best type of collection tube for serum or whole blood is the royal blue-top Vacutainer® tube, as it is trace-metal free. Typical red-top clot tubes will give abnormally increased concentrations of zinc, as a zinc-containing lubricant is commonly used on the rubber stoppers. For minerals other than zinc, serum samples from typical red-top clot tubes are adequate. Similarly, serum separator tubes are typically adequate for mineral analyses, except for zinc. But I also have found tin contamination in serum samples collected into some brands of serum separator tubes.
Samples should be appropriately stored for preservation. Liver biopsies, urine, and serum can be stored frozen long term or refrigerated if analyses are to be completed within a few days. Whole blood and milk should be refrigerated but not frozen, as cell lysis or coagulation of solids, respectively, will result.
Postmortem Animal Sampling
A variety of postmortem samples can be analyzed for mineral content. The most common tissue analyzed is liver, as it is the primary storage organ for many of the essential minerals. In addition, bone, the primary storage site for calcium, phosphorus, and magnesium, is analyzed for those minerals. Other postmortem samples that can be beneficial in diagnosing mineral deficiencies include urine and ocular fluid.
Postmortem samples should be stored frozen until analyzed to prevent tissue degradation. If samples are to be analyzed within one to two days, they can be stored under refrigeration.
Calcium
Analysis for calcium deficiency falls into two distinct classes. The first is metabolic calcium deficiency, often referred to as “milk fever.” The second is true nutritional deficiency, which is associated with long-term dietary calcium deficits.
Analysis for metabolic calcium deficiency is aimed at detection of low systemic or circulating calcium concentration. In live animals, testing is performed on serum to determine circulating calcium concentration. However, in dead animals, testing is more difficult as serum collected postmortem will not accurately reflect true serum calcium concentration prior to death. But circulating serum calcium concentration can be approximated from analysis of ocular fluid, with a vitreous-to-serum ratio of approximately 0.54 (McLaughlin and McLaughlin, 1987). The Utah Veterinary Diagnostic Laboratory has been able to confirm and disprove clinical hypocalcemia in numerous postmortem cases via vitreous fluid analysis.
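The vitreous-to-serum ratio above can be used to back-calculate an approximate antemortem serum calcium from a postmortem ocular fluid measurement. A minimal sketch follows; only the 0.54 ratio comes from the text (McLaughlin and McLaughlin, 1987), while the function name and the example concentrations are hypothetical illustrations, not laboratory values from this paper.

```python
# Sketch: approximate antemortem serum calcium from postmortem vitreous
# (ocular) fluid, using the ~0.54 vitreous-to-serum ratio cited in the text.
# Function name and example values are illustrative assumptions.

VITREOUS_TO_SERUM_CA = 0.54  # from McLaughlin and McLaughlin (1987)

def estimate_serum_calcium(vitreous_ca_mg_dl: float) -> float:
    """Estimate antemortem serum Ca (mg/dL) from vitreous Ca (mg/dL)."""
    return vitreous_ca_mg_dl / VITREOUS_TO_SERUM_CA

# A hypothetical vitreous calcium of 2.7 mg/dL would suggest a serum
# concentration of about 5.0 mg/dL, consistent with clinical hypocalcemia.
print(round(estimate_serum_calcium(2.7), 1))  # -> 5.0
```

Note that the estimate assumes a fresh carcass; vitreous chemistry drifts with time and temperature after death, as the cited reference documents.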
True nutritional calcium deficiency is associated with weak, unthrifty animals with swollen joints, lameness, weak bones, and a propensity for broken bones (Puls, 1994). Analytical verification of calcium deficiency requires analysis of bone, since approximately 98 to 99% of the body calcium content is in bone, and serum concentrations are maintained by both diet and turnover of bone matrix. The bone analysis should be performed on a fat-free, dry-weight basis to remove the age variability of moisture and fat concentrations.
Cobalt
Cobalt deficiency is associated with deficiency of vitamin B12 (cobalamin) in ruminants. Deficiency is associated with decreased feed intake, lowered feed conversion, reduced growth, weight loss, hepatic lipidosis, anemia, immunosuppression, and impaired reproductive function (Graham, 1991; Puls, 1994). Cobalt deficiency can also lead to decreased copper retention in the liver.
Tissue and serum concentrations of cobalt are generally quite low, as vitamin B12 is produced in the rumen by the microflora. Since cobalt concentrations may not truly reflect vitamin B12 concentrations, the most appropriate analysis for cobalt deficiency is direct quantification of serum or liver vitamin B12. But ruminants produce numerous forms of cobalamins with differing bioactivity, making interpretation of analytical results difficult (Mills, 1987). Cobalamin is absorbed into circulation, and small amounts are stored in the liver. Of the tissues available, liver cobalt concentration best reflects the animal’s overall status, but it may not truly reflect vitamin B12 content.
Copper
Copper deficiency is a commonly encountered nutritional problem in ruminants, but copper excess is also commonly encountered, especially in sheep. Clinical signs of deficiency can present as a large array of adverse effects (Graham, 1991; Puls, 1994). Reduced growth rates, decreased feed conversion, abomasal ulcers, lameness, poor immune function, sudden death, achromotrichia, and impaired reproductive function are commonly encountered with copper deficiency.
The best method for diagnosing copper status is analysis of liver tissue, although much testing is performed on serum. Deficiency within a herd will result in some animals with low serum copper concentrations, but serum concentration does not fall until liver copper is significantly depleted. In herds with a high incidence of deficiency on liver testing, it is not uncommon for a high percentage of the animals to have “normal” serum concentrations. At the Utah Veterinary Diagnostic Laboratory, it is commonly recommended that 10% of a herd, or a minimum of 10 to 15 animals, be tested in order to have a reasonable probability of diagnosing a copper deficiency via serum quantification. Even in a deficient herd, low serum copper concentrations may be seen in as few as 20% of the individuals, and herds classified as marginally deficient on liver testing may have predominantly “normal” serum copper concentrations. Thus, serum copper analysis should be viewed as a screening method only. Another factor that can influence diagnosis of copper deficiency in serum is the presence of high serum molybdenum. Because the copper-sulfur-molybdenum complex that forms is not physiologically available for tissue use, a “normal” serum copper concentration in the presence of high serum molybdenum should always be considered suspect. In addition, the form of selenium supplementation can alter the normal range for interpretation of serum copper status, with selenite-supplemented cows having a lowered normal range for serum copper.
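The 10-to-15-animal recommendation can be illustrated with a simple probability sketch. Assuming (my simplification, not a claim from the paper) that low serum copper appears independently in roughly 20% of animals in a deficient herd, the chance that a screening of n animals catches at least one low result is 1 - (1 - p)^n:

```python
# Sketch: probability that serum screening of n animals detects at least
# one low-copper individual, assuming low serum copper occurs
# independently in ~20% of animals in a deficient herd. The independence
# assumption and function name are mine; real herds need not behave this way.

def p_detect_at_least_one(prevalence: float, n_sampled: int) -> float:
    """P(>=1 sampled animal tests low) under a simple binomial model."""
    return 1.0 - (1.0 - prevalence) ** n_sampled

for n in (5, 10, 15):
    print(n, round(p_detect_at_least_one(0.20, n), 2))
```

Under these assumptions, 5 animals give only about a two-thirds chance of catching a low result, while 10 to 15 animals push the probability near or above 90%, which is consistent with the sampling recommendation above.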
Copper deficiency can also be diagnosed via analysis of copper-containing enzymes. The two most commonly utilized are ceruloplasmin and superoxide dismutase (Suttle, 1986; Mills, 1987). Low concentrations or activities of these enzymes in serum and whole blood, respectively, are diagnostic for copper deficiency. But ceruloplasmin concentrations can increase with inflammatory disease states. The higher cost of these enzyme assays relative to liver copper analysis often limits their utilization.
Excessive supplementation of copper in dairy cattle is a relatively common finding at the Utah Veterinary Diagnostic Laboratory. Liver copper concentrations greater than 200 ppm are routinely identified. In comparison, the recommended adequate liver copper concentration range in cattle is 25 to 100 ppm.
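The reference numbers above can be summarized as a small interpretation helper. The 25 to 100 ppm adequate range and the >200 ppm excess figure come from the text; the category labels, the intermediate "elevated" bucket, and the function name are my own illustrative choices:

```python
# Sketch: bucketing a bovine liver copper result against the adequate
# range cited in the text (25-100 ppm) and the >200 ppm excess commonly
# seen at the Utah Veterinary Diagnostic Laboratory. Labels and the
# 100-200 ppm "elevated" bucket are my own assumptions.

def classify_liver_copper(ppm: float) -> str:
    """Interpret a liver copper concentration (ppm) in cattle."""
    if ppm < 25:
        return "deficient"
    if ppm <= 100:
        return "adequate"
    if ppm <= 200:
        return "elevated"
    return "excessive"

print(classify_liver_copper(260))  # -> excessive
```

Any such cutoffs should of course be read against the testing laboratory's own reference ranges, which vary with species, age, and analytical method.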
Iron
As an essential component of proteins involved in the electron transport chain and oxygen transport, iron is required for normal function of all cell types. Iron deficiency is associated with reduced growth, poor immune function, weakness, and anemia (Graham, 1991; Puls, 1994). Although offspring are typically born with hepatic iron reserves, provided the dam had adequate iron status, milk is low in iron; consequently, iron deficiency develops over time in animals fed a milk-only diet, as is the case with veal calves.
Both liver and serum concentrations are commonly utilized to diagnose iron deficiency. When using serum to measure iron concentration, samples that have evidence of hemolysis should not be used, as they will have artificially increased iron concentration from the ruptured red blood cells. In addition, disease states can alter serum and liver iron concentrations as the body both tries to limit availability of iron to growing organisms and increases the availability of iron to the body’s immune cells. Thus, interpretation of iron status should be made with consideration of the overall health of the animal.
Other factors that can be used to assist with diagnosis of iron status include serum iron binding capacity, serum iron binding saturation, red blood cell count, packed cell volume, serum hemoglobin concentration, and ferritin concentration (Smith, 1989). But a variety of clinical conditions can cause these values to vary, including bacterial infections, viral infections, other types of inflammation, hemorrhage, bleeding disorders, and immune mediated disorders.
Magnesium
Similar to calcium, analysis for magnesium deficiency falls into two distinct classes. The first is metabolic magnesium deficiency, often referred to as “grass tetany.” The second is true nutritional deficiency, which is associated with long-term dietary magnesium deficits.
Analysis for metabolic magnesium deficiency is aimed at detecting low systemic or circulating concentration. In live animals, testing is performed on serum to determine circulating magnesium concentration. But it must be noted that ruminants displaying recumbency or tetany may have normal serum magnesium, as the accompanying tissue damage releases magnesium into the serum from soft tissues. Testing dead animals is even more difficult, as serum collected postmortem will not accurately reflect the serum magnesium concentration prior to death. Circulating serum magnesium concentration can, however, be approximated from analysis of ocular fluid, with a vitreous-to-serum ratio of approximately 1.05 (McLaughlin and McLaughlin, 1987). The Utah Veterinary Diagnostic Laboratory has confirmed clinical hypomagnesemia in numerous postmortem cases via vitreous fluid analysis. Urine is another useful postmortem sample: because the kidneys minimize urinary magnesium loss at times of low serum magnesium, a very low urinary magnesium concentration supports the diagnosis.
True nutritional magnesium deficiency is not recognized in ruminants except under experimental conditions. This syndrome is associated with weak, unthrifty animals that have weak bones, low bone ash, and calcification of soft tissues. Analytical verification of true magnesium deficiency would require analysis of bone, since approximately 70% of the body magnesium content is in bone. The bone analysis should be performed on a fat-free, dry-weight basis to remove the age variability of moisture and fat content.
Manganese
Manganese deficiency in ruminants is associated with impaired reproductive function, skeletal abnormalities in calves, and less than optimal productivity (Graham, 1991; Puls, 1994). Cystic ovaries, silent heat, reduced conception rates, and abortions are reported reproductive effects. Calves that are manganese deficient can be weak and small and develop enlarged joints or limb deformities.
Although not often reported, manganese deficiency is routinely identified in dairy cattle when they are tested. Of interest, most testing of beef cattle finds normal manganese concentrations in liver, blood, and serum, yet in these same matrices more than 50%, 75%, and 95%, respectively, of dairy cattle tested fall below recommended normal concentrations (unpublished data). This may be due in part to the high calcium and phosphorus concentrations in dairy rations, which can be antagonistic to the bioavailability of manganese.
Of the samples available, liver is the most indicative of whole body status, followed by whole blood, and then serum. As red blood cells have higher manganese concentration than serum, hemolysis can result in increased serum concentration. Since the normal serum concentration of manganese is quite low, many laboratories do not offer this analysis because of inadequate sensitivity. Overall, response to supplementation has frequently been used as a means of verifying manganese deficiency, but it is critical that a bioavailable form be utilized.
Phosphorus
Phosphorus status is somewhat difficult to measure in animal tissues. Serum and urine phosphorus concentrations can aid in diagnosing deficiency, but because bone phosphorus is mobilized to maintain serum concentration, significant drops in serum and urine may take weeks to develop. Serum phosphorus should be measured as inorganic phosphorus for adequate interpretation. Longer-term phosphorus deficiency can be diagnosed postmortem by measuring bone or bone-ash phosphorus concentrations. Dietary phosphorus and/or response to supplementation are better indicators of deficiency than tissue concentrations unless severe long-term deficiency has occurred.
The predominant effects of low dietary phosphorus stem from diminished appetite. Depressed feed intake, poor growth, and weight loss are common with phosphorus-deficient diets. Longer-term phosphorus deficiency results in impaired reproductive performance, diminished immune function, bone abnormalities, and pica.
Potassium
Tissue concentrations of potassium correlate poorly with dietary status. Of the animal samples available, serum potassium is the best indicator of deficiency, but disease states can cause electrolyte shifts that lower serum potassium when no dietary deficiency exists. Serum that is hemolyzed or left on the clot too long may have falsely increased potassium concentration due to leakage from the red blood cells, and renal disease can also increase serum potassium. Thus, dietary potassium concentrations are a better guide to potassium status.
Dietary potassium deficiency affects intake, productivity, heart function, and muscle function. Common clinical signs of severe potassium deficiency include diminished feed intake, reduced water intake, pica, poor productivity, weakness, and recumbency.
Selenium
Selenium deficiency is commonly identified in ruminants, although infrequently in dairy cattle. It is associated with adverse effects on growth, reproduction, immune system function, offspring, and muscle tissues (Graham, 1991; Puls, 1994). “White muscle disease,” a necrosis and scarring of cardiac and/or skeletal muscle, is linked to severe selenium deficiency, although it can also be caused by vitamin E deficiency. Reduced growth rates, poor immune function, and impaired reproductive performance can be observed with less severe selenium deficiency.
Diagnosis of a deficiency can be made by analysis of liver, whole blood, or serum for selenium concentration, or by analysis of whole blood for the activity of glutathione peroxidase, a selenium-dependent enzyme (Ullrey, 1987). The most specific analysis is whole blood glutathione peroxidase, as it verifies true functional selenium status. Liver is the optimal tissue to analyze for selenium concentration, as it is a primary storage tissue. Of serum and whole blood, serum better reflects recent intake, while whole blood better reflects long-term status: because selenoproteins are incorporated into red blood cells as they are formed, and the cells have a long half-life, whole blood selenium concentration reflects intake over the preceding months.
In order to adequately diagnose selenium deficiency, the dietary form of the selenium consumed by the animals is important. Natural selenium, predominantly in the form of selenomethionine, is metabolized and incorporated into selenium-dependent proteins but can also be incorporated into nonspecific proteins in place of methionine. Inorganic selenium is metabolized and only incorporated into selenium-dependent proteins. Thus, “normal” concentrations in serum and whole blood differ depending on whether the dietary selenium is a natural organic form or an inorganic supplement.
Sodium
Tissue concentrations of sodium correlate poorly with dietary deficiency. Of the animal samples available, serum and urine are the best for measuring sodium deficiency, but disease states can cause electrolyte shifts that result in lowered serum or urinary sodium even when dietary concentrations are adequate. Thus, dietary sodium concentrations are a better guide to diagnosing a deficiency.
Dietary sodium deficiency affects feed intake and productivity. Common clinical signs of severe sodium deficiency include diminished feed intake, reduced water intake, poor productivity, and pica.
Zinc
Zinc is an essential mineral that is required by all cells in animals. Zinc plays a role in numerous enzymatic reactions (Graham, 1991; Puls, 1994). Deficiencies of zinc are associated with reduced growth, poor immune function, diminished reproductive performance, and poor offspring viability, as well as skin lesions in severe cases.
Tissue zinc concentrations do not reflect body status well (Mills, 1987). Of the common samples tested, liver and serum are the best indicators of zinc status, but both can be altered by age, infectious disease, trauma, fever, and stress. It has been suggested that pancreatic zinc concentration is the best means of truly identifying zinc deficiency. Supplementation trials have shown that some animals with low-normal liver or serum zinc can still show improvement in some clinical conditions. Thus, liver and serum verify deficiency only when zinc concentrations are very low.
Conclusions
A variety of samples can be tested for mineral content but may not provide any indication of the overall mineral status of the animal. Appropriate diagnosis of mineral status involves thorough evaluation of groups of animals. The evaluation should include a thorough health history, feeding history, supplementation history, and analysis of several animals for their mineral status.
Dietary mineral evaluation should augment the mineral evaluation of animal groups. If minerals are deemed adequate in the diet but the animals are found to be deficient, antagonistic interactions with other minerals need to be investigated. For example, high dietary sulfur or iron can cause deficiencies of copper and selenium even when dietary concentrations of those minerals are adequate.
Jeffery O. Hall
Utah Veterinary Diagnostic Laboratory
Department of Animal, Dairy, and Veterinary Sciences
Utah State University
References
Graham, T.W. 1991. Trace element deficiencies in cattle. Vet. Clin. N. Am.: Food Anim. Pract. 7:153-215.
McLaughlin, P.S., and B.G. McLaughlin. 1987. Chemical analysis of bovine and porcine vitreous humors: Correlation of normal values with serum chemical values and changes with time and temperature. Am. J. Vet. Res. 48:467-473.
Mills, C.F. 1987. Biochemical and physiologic indicators of mineral status in animals: copper, cobalt, and zinc. J. Anim. Sci. 65:1702-1711.
Puls, R. 1994. Mineral Levels in Animal Health: Diagnostic Data. Second edition. Sherpa International. Clearbrook, B.C.
Smith, J.E. 1989. Iron metabolism and its diseases. In Clinical Biochemistry of Domestic Animals. J.J. Kaneko (ed). Academic Press. San Diego, Calif.
Suttle, N.F. 1986. Problems in the diagnosis and anticipation of trace element deficiencies in grazing livestock. Vet. Rec. 119:148-152.
Ullrey, D.E. 1987. Biochemical and physiologic indicators of selenium status in animals. J. Anim. Sci. 65:1712-1726.