Forage test reports are like short stories. Each is a narrative of old friends, surprising plot twists, and sometimes sly interactions between characters. Some folks, I suppose, look at a forage test and see only a list of numbers. That's kind of like looking at a map of Wyoming and seeing only a grid of roads. My mind sees snow-covered mountains, deep valleys, and the metallic blue of a wide Wyoming sky. So let's turn our attention to forage tests, and I'll share with you some of the things I see.
All forage test reports — and I'll use this term to mean all feed tests — contain two columns of numbers. One column is labeled "As Sampled" or "As Fed" or "Wet Basis" or something like that, and the other is almost always labeled "Dry Matter." Which one should we look at?
Here's a question for you. Let's say that we have two feeds: Feed A contains 58% TDN and 13.4% crude protein. Feed B contains 16% TDN and 3.3% crude protein. Which feed is better? Of course this is a trick question — otherwise why would I ask it? And of course, anyone seeing these numbers would choose Feed A.
Now the punch line: Feed A is early-bloom grass hay. Feed B is cows' milk. Now which feed would you choose? Of course, milk is a better feedstuff, but these numbers don't show it. Why? Because these numbers are "as fed" values — right out of the barn, as they are fed, which includes the water in those feeds. Milk contains 87.6% water, which dilutes the other values, makes them smaller. In this case a lot smaller. The hay contains only 11.0% moisture. Since these feeds contain different amounts of water, comparing their nutritional values on an "as fed" basis is like comparing apples and oranges. Water contains no nutritional energy or protein, so we must eliminate it from the report. All laboratories, therefore, adjust for this water by dividing the "as fed" number by the dry matter coefficient. Then they list the corrected results in the "dry matter" column of the report.
Let's make that adjustment to our two feedstuffs. Our results now show that, on a dry matter basis, Feed A (the grass hay) contains 65% TDN and 15.1% protein, and Feed B (milk) contains 129% TDN and 26.6% crude protein. Now these numbers reflect the real nutritional values of the feeds. Therefore I always look at the dry matter column of a forage test.
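The adjustment itself is simple arithmetic: divide each as-fed value by the dry matter coefficient, which is just one minus the moisture fraction. Here's a minimal sketch in Python using the two feeds above; the function name is my own:

```python
def to_dry_matter(as_fed_pct, moisture_pct):
    """Convert an as-fed nutrient value (%) to a dry-matter basis
    by dividing by the dry matter coefficient."""
    dry_matter_coeff = (100.0 - moisture_pct) / 100.0
    return as_fed_pct / dry_matter_coeff

# Feed A: early-bloom grass hay, 11.0% moisture
print(round(to_dry_matter(58.0, 11.0), 1))   # TDN, about 65%
print(round(to_dry_matter(13.4, 11.0), 1))   # crude protein, 15.1%

# Feed B: cows' milk, 87.6% moisture
print(round(to_dry_matter(16.0, 87.6), 1))   # TDN, about 129%
print(round(to_dry_matter(3.3, 87.6), 1))    # crude protein, 26.6%
```

Run it and the milk leapfrogs the hay, exactly as the corrected report shows.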
When I first receive a forage report, I quickly scan the entire thing, trying to get a full picture of the feed. I look at the general values for crude protein, TDN, horse TDN (HTDN), calcium, phosphorus, and dry matter. Are these numbers close to what I expect for this type of forage? I'm searching for weird numbers, unexpected values. I recently received a forage test that reported 37% protein for a grass pasture. 37%!?! Wow — that number certainly caught my eye. I asked the lab to retest the sample. They did, and the results came back unchanged. Only then did I have real confidence in it.
Then I scrutinize the energy value. I like to use TDN, but many reports also list values for digestible energy, metabolizable energy, and/or a whole raft of numbers for net energy (maintenance, growth, lactation), and also a relative feed value. That's OK. Remember that labs don't actually test for energy; labs only test for fiber (generally ADF = Acid Detergent Fiber) and then plug that fiber value into an equation to derive the energy value of the forage. For example, labs use a formula that looks something like "TDN = A x (ADF) + B", where the values of A and B come from reference tables for different types of feeds. That's why it's so important to label a forage sample accurately. Lab technicians use that label to determine which values of A and B to apply.
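That label-driven calculation can be sketched in a few lines. The coefficients below are placeholders of my own invention, not any particular lab's values; the point is only that the same ADF number yields different TDN estimates depending on which feed-type table the label selects:

```python
# Hypothetical reference table: (A, B) for TDN = A * ADF + B.
# Real labs use published coefficients keyed to the feed type on the label.
FEED_COEFFS = {
    "grass": (-0.78, 89.0),
    "legume": (-1.05, 91.0),
}

def estimate_tdn(adf_pct, feed_type):
    """Derive TDN (% of dry matter) from ADF using a feed-type equation."""
    a, b = FEED_COEFFS[feed_type]
    return a * adf_pct + b

# The same fiber test, run through two different labels:
print(estimate_tdn(35.0, "grass"))    # grass equation
print(estimate_tdn(35.0, "legume"))   # legume equation
```

Same sample, same ADF, two different answers. That's the whole argument for accurate labels in one example.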
A few years ago in Oregon, we sampled some perennial ryegrass hay for analysis. On the submission form, the rancher wrote the words "p. hay" as the label for his perennial ryegrass hay. When we received the report, I noticed some unexpected values for energy and protein, so I called the lab. The technician pulled up the form and we found the problem — the lab thought that the sample was pea hay and used reference values for legumes rather than grasses. H-e-l-l-o? After our conversation, they ran the numbers through the computer again, this time with the proper label, and the nutritional values came out right.
Labeling samples properly is even more critical if the lab analyzes the sample with NIRs (Near Infrared Reflectance Spectroscopy) rather than old-fashioned wet chemistry. NIRs is, literally, an electronic black box technology that uses infrared waves and sophisticated mathematics to deduce nutritional values. With NIRs, all of the numbers, including the values for both types of fiber (NDF and ADF), are derived from reference tables. Samples with inaccurate labels can show wildly fascinating results in the final report — good for laughs but not much else.
Then I look at crude protein and, if it is listed, the supplemental value for something called adjusted crude protein or its counterpart, ADF-N, which is nitrogen bound up in the ADF and not available to the animal. These supplementary numbers can indicate heat damage — which can occur in wet hay stacked in a barn, where some of the forage protein can cook into a gooey, indigestible substance like caramel. This shows up in the forage report as ADF-N. Some labs list ADF-N directly, and some labs translate this nitrogen into crude protein in the usual way (by multiplying by 6.25) and list unavailable crude protein, which is then subtracted from the total crude protein to arrive at a value for adjusted crude protein. Either way, a high number for ADF-N or a low number for adjusted crude protein, relative to the total crude protein, tells me that the forage has suffered some protein losses due to heating.
In practice, nearly all forages have a tiny amount of nitrogen naturally linked to fiber, which reduces the adjusted crude protein value slightly. This is no big deal, and I generally ignore it. But if the adjusted crude protein is significantly lower than the total crude protein, then I balance rations with the adjusted value, because that represents the biologically usable amount of protein. How much is significant? That depends on the level of crude protein, but anything more than one or two percentage units is significant.
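The arithmetic above is easy to put into code. A sketch, assuming ADF-N is reported as percent nitrogen on a dry matter basis; the function names and the 1.5-unit cutoff are my own illustration of the one-to-two-unit rule:

```python
def adjusted_crude_protein(total_cp_pct, adf_n_pct):
    """Subtract heat-damaged (unavailable) protein from total crude protein.
    ADF-N is % nitrogen; N x 6.25 gives unavailable crude protein."""
    unavailable_cp = adf_n_pct * 6.25
    return total_cp_pct - unavailable_cp

def significant_heat_damage(total_cp_pct, adf_n_pct, threshold=1.5):
    """Flag when the adjustment exceeds roughly one to two percentage units."""
    drop = total_cp_pct - adjusted_crude_protein(total_cp_pct, adf_n_pct)
    return drop > threshold

# Example: 16% CP hay with 0.4% ADF-N loses 2.5 units of protein.
print(adjusted_crude_protein(16.0, 0.4))   # 13.5
print(significant_heat_damage(16.0, 0.4))  # True
```

In that example I would balance rations with 13.5% protein, not 16%.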
Let's continue to look down the dry matter column of a forage test report. We've already examined the array of values for energy and protein, so what's next? The minerals. Lots of interesting numbers. Here are some things that I consider when I look at those mineral levels.
I first look at the levels of calcium and phosphorus, and I am particularly interested in two things: (1) the absolute levels of these minerals, and (2) the ratio between them. Grasses typically contain 0.25 to 0.60% calcium (remember that everything is expressed on a dry matter basis). Legumes tend to have higher calcium levels than grasses, often 1.4% of the dry matter or higher. For example, if I see a forage calcium level of 1.2%, I would guess that the forage contained a high percentage of legume. Phosphorus levels for both grasses and legumes tend to be in the range of 0.15–0.50%. Low phosphorus levels tell me that the forage was quite mature. High phosphorus levels, on the other hand, may be due to high levels of phosphorus fertility in the soil. But if I see values outside of these ranges, I look very carefully at them.
For the ratio between calcium and phosphorus (Ca-P ratio) in the total diet, I like to see at least 1.3:1 for most situations and, ideally, 2:1 or slightly higher for young, growing animals. Once the phosphorus requirements are met, then having a calcium level approximately twice that of phosphorus helps ensure that male animals don't suffer from a syndrome called urinary calculi, in which insoluble crystals containing these two minerals form in the urethra and block urination. Older animals generally don't need such high Ca-P ratios, but under some conditions, ratios lower than 1:1 may cause calcium deficiency problems in adults, particularly the milk fever syndrome in mature ewes or dairy cows. Since many feed companies add calcium and/or phosphorus to their supplements and mineral mixtures, knowing the forage levels of calcium and phosphorus helps me understand the total diet and guides my decisions about the need to supplement them.
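Those ratio rules lend themselves to a quick screening function. Here's a sketch using the thresholds just quoted; the function and its wording are my own, and it's a rough screen, not a ration-balancing tool:

```python
def ca_p_ratio_check(ca_pct, p_pct, young_growing=False):
    """Screen the dietary Ca:P ratio against the rules of thumb:
    at least 1.3:1 for most animals, about 2:1 for young, growing stock."""
    ratio = ca_pct / p_pct
    target = 2.0 if young_growing else 1.3
    if ratio < 1.0:
        note = "below 1:1 -- watch for calcium deficiency (e.g. milk fever)"
    elif ratio < target:
        note = "below target -- consider supplemental calcium"
    else:
        note = "ratio looks fine"
    return round(ratio, 2), note

print(ca_p_ratio_check(0.45, 0.30))                      # mature animals
print(ca_p_ratio_check(0.45, 0.30, young_growing=True))  # growing males
```

Note that the same 1.5:1 forage passes for mature animals but flags for young, growing males, which is exactly the urinary calculi concern.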
Then I look at the level of magnesium and an associated mineral, potassium. Low magnesium levels contribute to the spectacular neurological problem of magnesium tetany, which is also called grass staggers, winter tetany, and other names. Whatever we call it, symptoms occur when blood magnesium levels drop below a trigger threshold, which causes the animal to go into seizures. We usually see this problem in the early spring when forage is lush and young. The rule-of-thumb about magnesium tetany is rather simple: low risk for forage magnesium levels above 0.20%; moderate risk for levels between 0.15–0.19%; and high risk for levels below 0.15%. Except that ... (drum roll, please) ... high potassium levels in a forage can reduce magnesium absorption from the intestinal tract. How high? Another rule-of-thumb: forage potassium levels above 3.0% can cause problems, particularly if magnesium levels are low or marginal. Compared to legumes, grasses are particularly greedy about potassium — they will absorb extra potassium from high-potassium soils, even above their own requirements for growth. I've seen grass test higher than 4.0% potassium, and in the early spring before the soil really warms up, the primary forage growth is grass rather than legumes. So knowing the levels of magnesium and potassium helps me evaluate the metabolic risks of magnesium tetany and the option of adding extra magnesium to the mineral mix during the risk period.
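The two rules of thumb combine naturally into one screen. A sketch, assuming both minerals are on a dry matter basis; bumping the risk up one step when potassium exceeds 3.0% is my own simplification of the interaction:

```python
def tetany_risk(mg_pct, k_pct):
    """Rough rule-of-thumb screen for magnesium (grass) tetany risk."""
    if mg_pct > 0.20:
        risk = "low"
    elif mg_pct >= 0.15:
        risk = "moderate"
    else:
        risk = "high"
    # High potassium reduces magnesium absorption: bump the risk one step.
    if k_pct > 3.0:
        if risk == "low":
            risk = "moderate"
        elif risk == "moderate":
            risk = "high"
    return risk

print(tetany_risk(0.22, 2.0))  # low
print(tetany_risk(0.22, 4.2))  # moderate: magnesium OK but potassium high
print(tetany_risk(0.14, 1.5))  # high
```

The middle case is the sneaky one: a forage with comfortable magnesium can still be risky when spring grass loads up on potassium.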
Some folks ignore the level of sodium, but I don't. Most trace mineral mixtures contain white salt (sodium chloride) which also usually acts as the main palatability factor that drives animals to consume the mixture. High sodium levels in forages make me a little nervous, because those high levels may satisfy an animal's desire for salt and thus reduce its intake of the free-choice TM mixture. Forages usually contain less than 0.20% sodium, but I've seen levels higher than 0.40%, especially in forages grown near the ocean or on saline ground, and also in byproduct feedstuffs and other supplemental feeds.
And then there is copper. This is a big bugaboo, especially among sheep producers who rightly worry about chronic copper toxicity. But I'm not interested only in copper; I'm also interested in three other minerals that affect copper absorption — molybdenum, sulfur, and possibly iron.
Sheep, cattle, and goats all have a nutritional requirement for copper, at approximately 8-11 ppm in their total diet when the dietary molybdenum level is low. But sheep are particularly sensitive to chronic copper toxicity, and even slightly higher copper levels over a long period could cause problems. I'm generally happy to see 8-11 ppm copper in a forage test, and this seems to be a common range in forages. But what about forages grown in old orchards, where farmers periodically sprayed trees with Bordeaux mixture (copper sulfate + hydrated lime + water), or in fields where hog manure or chicken litter was applied as fertilizer? What about feeds composed of copper-containing ingredients or feeds mixed incorrectly? I always want to know about elevated levels of copper, and values greater than 15-18 ppm are red flags that I look at very carefully.
But copper absorption is profoundly influenced by molybdenum and sulfur, and to some extent, iron. High dietary levels of these minerals will reduce copper absorption across the gut wall, and if they are high enough, may even cause a copper deficiency. Forage molybdenum levels can range from less than 1 ppm up through 3 or 4 ppm or higher. Sulfur levels are generally 0.10–0.30%. I would consider sulfur above 0.35% to be high.
In an ideal world, I would like to see a ratio of copper to molybdenum (in the total diet) of between 6:1 and 10:1. Higher ratios may increase the risk of copper toxicity, especially if the sulfur levels are also low, and lower ratios suggest the possibility of copper deficiency, especially if sulfur levels are high. Iron can also tie up copper, so I am wary of high iron levels, say above 400 ppm. On the other hand, high iron can also be due to soil contamination of the sample, which is not necessarily a nutritional issue, so I take these iron levels with a grain of salt.
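The copper, molybdenum, and sulfur logic can be summarized in one small screen. This is my own sketch built from the thresholds above; since the article doesn't put a number on "low" sulfur, I've assumed below 0.10% (the bottom of the typical 0.10-0.30% range):

```python
def copper_check(cu_ppm, mo_ppm, s_pct):
    """Screen the total-diet Cu:Mo ratio against the 6:1 to 10:1 window,
    with sulfur modifying the interpretation at either extreme."""
    ratio = cu_ppm / mo_ppm
    if ratio > 10.0:
        note = "high -- watch for chronic copper toxicity"
        if s_pct < 0.10:  # assumed cutoff for "low" sulfur
            note += " (low sulfur increases the risk)"
    elif ratio < 6.0:
        note = "low -- possible copper deficiency"
        if s_pct > 0.35:
            note += " (high sulfur increases the risk)"
    else:
        note = "within the 6:1 to 10:1 window"
    return round(ratio, 1), note

print(copper_check(9.0, 1.0, 0.20))   # comfortable ratio
print(copper_check(10.0, 0.5, 0.08))  # 20:1 with low sulfur -- a sheep worry
```

A screen like this still needs the iron caveat applied by hand: high iron can tie up copper, but it can also be plain soil contamination of the sample.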
Forages contain other required minerals, of course, such as zinc and manganese. I always scan these numbers for uncommonly high or low values, looking for obvious problems. Selenium, iodine, and cobalt values would also be useful, but most laboratories don't test for them.
Someday, however, I would like to see laboratories provide information about some unusual minerals, like radium and uranium. Because if a forage contained high levels of these minerals, I could feed that forage knowing that I could always find my animals at night.