Livestock provide food, income, education, cultural status…and hope.

African farmland

I took the photo above while travelling in South Africa last year. Whenever I’m faced with the inevitable “But we can just grow corn and soy to feed humans!” anti-livestock rhetoric (as seen in The Guardian this past week), I’m reminded of this picture. It shows tiny rural homes on the edge of a major road, upon which the majority of people walk to work, dodging the traffic as they go. The land is rocky, steep and lacks nutrients, the soil only capable of producing fibrous grasses that can’t be eaten by people. Yet, another few hundred yards down the road, we came across a goat.

African goat

For many people in low-income countries, a goat is a lifeline. A source of food that improves the nutrition and health of young children, pregnant women and elderly people. A source of income to allow children to attend school and have a future career, rather than working to support their family before the age of 10. A source of security that allows for improved mental health, female independence and cultural status. Last week I spoke at a Cheltenham Science Festival panel entitled “Should we all become vegan?” It’s easy to suggest that many of us in the developed world could eat less meat. However, the myriad benefits provided by livestock to people in low-income regions should not be forgone on the grounds of foodie ideology espoused by those of us living in developed regions.

I’m pleased to see Prue Leith, Jenny Eclair, Bob Geldof, Jonathan Dimbleby and others lending their support to Send a Cow’s #UnheardVoices campaign. Let’s recognise livestock’s role in giving hope to those who need it most – and make those voices heard.

Cutting Meat? Or Just Cutting Corners?

It is equally interesting, inevitable and lamentable to see that another study has come out claiming that the only way to reduce climate change is to cut meat consumption per person.

Meat consumption appears to be the only human activity subject to continuous “we must cease/reduce this” claims on the basis of environmental impacts. If we compare to other greenhouse gas sources, a considerable proportion comes from transportation. Yet rather than insisting that every car-owner cut their annual mileage by 25%, the focus has been on reducing emissions by producing more fuel-efficient vehicles. Similarly, no one has yet claimed that we should reduce household lighting by four hours per day, but the compact fluorescent lightbulb (CFL) has become the poster child for improving household energy efficiency.

We have demonstrable proof that beef and dairy producers have improved greenhouse gas emissions (as well as land use, water use, energy efficiency, etc) over time through improved efficiency, and can continue to do so into the future. So why are the gains made by livestock producers dismissed, and reduced meat intakes seen as the only solution? I have an absolute hatred of conspiracy theories, but it is difficult not to see a latent agenda in the preponderance of “cut meat consumption” papers. Jumping on the bandwagon? Promoting individual dietary opinions as science? Or simply bowing to NGO/media opinions and looking for easy funding and publicity?

As the global population increases to over 9.5 billion people by 2050, with the majority of this growth occurring in the developing world, the demand for milk, meat and eggs is going to increase by 60%. If we are serious about cutting greenhouse gas emissions, it’s time to examine the impacts of all of our actions and concentrate on further efficiency improvements rather than constraining dietary choice.

Does Nature Really “Do Its Thing” on Organic Farms?

I am lucky. My fridge is full of food: mostly produced in the UK or Europe; all nutritious, safe and affordable; and almost all produced on conventional farms, with a small amount of organic food (in my case, chocolate). Given that you’re reading this, I’ll hazard a guess that you are lucky too. 795 million other people can’t say the same thing – and feeding all the people on a planet where 1 in 9 don’t currently have enough food is, in my view, our biggest challenge.

The fact that we face this challenge makes me really irritated when celebrity chefs who could make a huge difference bow instead to popular rhetoric. In his latest blog post, mockney chef and food pundit Jamie Oliver proclaims that “…organic food is natural food, where nature has been allowed to do its thing, and I’m sure most of us will agree that putting natural ingredients into our bodies is only going to be a positive thing.”

If we ignore the nonsensical claim that natural ingredients produce positive results (Really? Let’s examine puffer fish, solanaceae poisoning, dangerous fungi, absinthe, the many consequences of obesity…), let’s simply look at his claim that organic food is natural. Except, well, it’s not. Agriculture first developed ~12,000 years ago, and ever since then farmers have been doing their best to breed crops and animals that are best suited to their farming system, whether it’s organic or conventional. Want dairy cows that produce high-quality milk from grazing pasture; leaner pork chops; or strawberries that can survive supermarket handling? You’ve got it. All achieved through traditional breeding techniques (otherwise known as “nature doing its thing”): noting that plant or animal A has desirable characteristics and breeding it with plant or animal B to (hopefully) produce better offspring. No scary chemicals, scientists with syringes or genes in test-tubes. Every farm in the world is founded on “nature doing its thing” – not just the organic farms. We can argue whether GMO crops are natural (breeding techniques are simply more refined and specific) or not (scientists playing god…) but that argument becomes redundant in the EU and many other regions, where GMO crops are not approved.

Can organic producers use pesticides? Yes, if they’re compounds approved for organic production (e.g. highly-toxic copper-based fungicides). Can they use antibiotics and wormers? Again yes, if a proven disease problem exists (note that rules differ slightly between the UK and USA). Are organic farmers just merrily sitting back and letting their crops cross-pollinate and reseed, and their bulls run around happily “doing their thing” to whichever cow they come across? No. It’s a beautiful bucolic image to suggest that organic farmers are happily working with Mother Nature whereas conventional farmers have an evil scientist sitting on one shoulder and a big agribusiness corporation on the other, but it’s simply not true.

According to Mr Oliver, “…the simple fact is that often we don’t actually have to interfere with nature.” The idea of a world where we could feed over 7 billion people without having to actually invest any research dollars into improving food production is lovely, but it’s smoke and mirrors. At the most basic level, what happens if we don’t “interfere” by controlling weeds (whether by chemicals, mechanical tillage or human labour)? Crop yields are reduced, food production goes down and we feed and clothe fewer people. What happens if a cow has problems giving birth? In nature, she dies. On a farm (whether organic or conventional) both she and the calf are saved, providing milk and meat for us to eat. According to the World Organisation for Animal Health, 20% of global animal protein losses are due to diseases for which treatments already exist – we simply need to make them available to every farmer worldwide. Just think how many more people we could feed if we interfered with nature in that way.

Huge amounts of research monies are invested each year to find ways to improve food production on both organic and conventional farms worldwide. Some are highly technical, others are simple, but all are contributing to the goal of feeding the world. Unfortunately, when food pundits jump on the “let’s promote system X” bandwagon as Mr Oliver has done with organic production, using persuasive but false arguments, we lose traction in fulfilling the real goal. Rather than arguing about which foods we can/should be buying, we need to accept that there’s a place for all systems; examine the ways in which all systems can improve soil fertility, animal health and environmental impacts; and make faster progress towards feeding the world while still enjoying our food choices.

How Long is Long-Term? Are We in Danger of Sacrificing Food Security to Satisfy GMO Paranoia?

My Twitter feed is being taken over by two things: 1) arguments and 2) comments that are going to cause arguments. Almost every tweet appears to draw a contrary comment – I’m tempted to post “Elephants have four legs and one trunk” just to see how many people reply “No, there’s an elephant in South Africa called Minnie who only has three legs but has two trunks…”

The latest discussions (debates? arguments? long drawn-out 140-character battles?) have related to the safety of GMOs. Without exception, the argument from the nay-sayers comes down to “We don’t know what the long-term effects are, so we should ban them until we can conclude that they’re safe.”

In other words, we’re trying to prove a negative – show me that there are no adverse effects whatsoever and I’ll believe it’s ok. Utterly impossible. Can you be absolutely sure that the screen you’re reading this on isn’t causing constant, minute but irreparable damage to your eyes? Water, that essential nutrient without which humans, animals and plants would die, can kill through drowning or intoxication. Even oxygen, without which brain cells are irretrievably damaged in just 10 minutes, causes seizures and death when inhaled at high pressures. Should we ban these, just in case?

Perhaps we should take a long-term approach to all new technologies. iPhones were only introduced seven years ago, yet many of us spend considerable amounts of time typing on them, or holding them to our ears when they’re not in our pockets – what health-damaging consequences could these shiny new toys confer? What about the now-ubiquitous hand sanitizer? Once only the province of hospitals and germophobes, it’s now sloshed around by the gallon. Touted to kill 99.9% of harmful bacteria – what harm could those chemicals be doing to our fragile physiology?

I’ve yet to meet anybody who, when scheduled for quadruple bypass surgery, demanded that the surgeon only used techniques developed in 1964; or a type I diabetes sufferer who would only use insulin produced from pigs, as it was originally in 1923. When I was treated for breast cancer, I jumped at the chance to be part of a clinical trial involving a new monoclonal antibody treatment, regardless of the very slight risk of heart damage. In medicine, we seem happy to trust that science has the answers – not surprisingly, we prefer to survive today and take our chances with side-effects tomorrow.

With regards to food however, the opposite appears to be the case. The first commercial GMO (the Flavr Savr tomato) was introduced in 1994, GM corn and soy were commercialized in 1996, and not one death or disease has been attributed to any of these crops. Yet the “what are the long-term effects?” concern still persists. So how long-term is long enough? 10 years? 20? 50? Should we keep researching and testing these crops for another 80+ years before allowing them onto the market around the year 2100?

If your answer is yes, just pause for a moment and ask your parents, grandparents or even great-grandparents what life was like during the Great Depression in the USA, or World War II in Europe. Consider what life was like when food was scarce or rationed, when, for example, a British adult was only allowed to buy 4 oz of bacon, 8 oz of ground beef, 2 oz each of butter and cheese, 1 fresh egg and 3 pints of milk per week. Those quantities of meat and cheese would only be enough to make two modern bacon cheeseburgers.

By 2050, the global population is predicted to be over 9 billion people. I don’t relish the idea of explaining to my grandchildren that they live with food scarcity, civil unrest (food shortages are one of the major causes of conflict) and malnutrition because public paranoia regarding GMOs meant that a major tool for helping us to improve food production was removed from use. In the developed world we have the luxury of choosing between conventional, natural, local, organic and many other production systems. However, we’re in danger of forgetting that not everybody has the same economic, physical or political freedom to choose. If you gave a basket of food to a family in sub-Saharan Africa subsisting on the equivalent of $30 per week, would they refuse it on the basis that the quinoa wasn’t from Whole Foods, the meat wasn’t organic and the tofu wasn’t labeled GMO-free?

When we have sufficient food being supplied to everybody in the world to allow them to be healthy and productive, we can then start refining the food system. Until then, the emphasis should be on finding solutions to world hunger, not forcing food system paranoia onto those who don’t have a choice.

Are We Increasing Resource Use and Taking Beef from the Mouths of Hungry Children?

Can we really afford to lose the sustainability advantages that productivity-enhancing tools provide?

Beta agonists have been a hotly debated topic in the media recently, after it was suggested that the use of Zilmax™ might be related to welfare issues in supplemented cattle (see note 1), and Tyson announced that they would not purchase cattle produced using the feed supplement.

As the global population increases and consumer interest in food production sustainability continues to grow, we know that to maintain the continuous improvements in beef sustainability that we’ve seen over the past half-century, we need to ensure that economic viability, environmental responsibility and social acceptability are all in place. All cattle producers obviously have the choice as to what tools and practices are used within their operation, but what are the big picture environmental and economic implications of removing technology use from beef production? Let’s look at two tools – beta agonists and implants (see note 2 below for an explanation of these tools).

Figure 1. Extra cattle needed

In a traditional beef production system using both tools, we’d need 85 million total cattle (see note 3) to maintain the U.S. annual production of 26 billion lbs of beef (see note 4). If we removed beta-agonists from U.S. beef production we’d need an extra 3.5 million total cattle to support beef production; losing access to implants would require an extra 9.9 million cattle; and removing both tools would increase total cattle numbers to 100 million (a 15 million head increase) to maintain the current beef supply (see note 5).
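As a quick sanity check on these figures (herd sizes are from the text; the percentage framing is my own back-of-envelope arithmetic):

```python
# Herd sizes quoted above: 85 million head with both tools in use.
BASELINE_HEAD = 85_000_000

extra_head = {
    "no beta-agonists": 3_500_000,
    "no implants": 9_900_000,
    "neither tool": 15_000_000,
}

# Express each scenario as a percentage increase over the baseline herd.
for scenario, extra in extra_head.items():
    pct = 100 * extra / BASELINE_HEAD
    print(f"{scenario}: +{extra:,} head ({pct:.1f}% more cattle)")

# Removing both tools takes the national herd to 100 million head, as stated:
assert BASELINE_HEAD + extra_head["neither tool"] == 100_000_000
```

In other words, losing both tools means running roughly 18% more cattle just to stand still on beef supply.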

If we need more cattle to maintain beef supply, we use more resources and have a greater carbon footprint.

If we removed beta-agonists, we would need more natural resources to maintain U.S. beef production:

  • More water, equivalent to supplying 1.9 million U.S. households annually (195 billion gallons)
  • More land, equivalent to an area just bigger than Maryland (14.0 thousand sq-miles)
  • More fossil fuels, equivalent to heating 38 thousand U.S. households for a year (3,123 billion BTU)

If we removed implants, we would need more natural resources to maintain U.S. beef production:

  • More water, equivalent to supplying 4.5 million U.S. households annually (457 billion gallons)
  • More land, equivalent to the area of South Carolina (31.6 thousand sq-miles)
  • More fossil fuels, equivalent to heating 45 thousand U.S. households for a year (3,703 billion BTU)

If we removed both beta-agonists and implants, we would need more natural resources to maintain U.S. beef production:

  • More water, equivalent to supplying 7.3 million U.S. households annually (741 billion gallons)
  • More land, equivalent to the area of Louisiana (51.9 thousand sq-miles)
  • More fossil fuels, equivalent to heating 98 thousand U.S. households for a year (8,047 billion BTU)
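One point worth drawing out of the three scenarios above: in every category, the cost of removing both tools exceeds the sum of removing each alone, which suggests the two technologies are complementary rather than simply additive. A minimal check using the bullet-point figures:

```python
# Resource figures from the bullets above: water in billion gallons,
# land in thousand sq-miles, fossil fuel in billion BTU.
no_beta_agonists = {"water": 195, "land": 14.0, "fuel": 3123}
no_implants = {"water": 457, "land": 31.6, "fuel": 3703}
no_both = {"water": 741, "land": 51.9, "fuel": 8047}

# The combined scenario exceeds the sum of the two single-tool scenarios.
for resource in no_both:
    single_tool_sum = no_beta_agonists[resource] + no_implants[resource]
    assert no_both[resource] > single_tool_sum
    print(f"{resource}: combined {no_both[resource]} vs sum {single_tool_sum}")
```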


Beef production costs would also increase if these tools weren’t used. Feed costs would increase by 4.0% without beta-agonists, 8.1% without implants and 11.0% without both tools. These costs would ultimately be passed on through every segment of the beef supply chain (including the retailer or food service segment) and on to the consumer, making beef a less-affordable protein choice.

In a world where one in seven children currently do not have enough food, keeping food affordable is key to improving their health and well-being. If we use productivity-enhancing tools in one single animal, the extra beef produced is sufficient to supply seven schoolchildren with their beef-containing school meals for an entire year. Is that a social sustainability advantage that we can afford to lose?

Although animal welfare is paramount for all beef production stakeholders from the cow-calf operator to the retailer, consumer perception of productivity-enhancing tools may be harmed by negative comments on media articles relating to Zilmax™. There is no doubt that we will need to use technologies within food production in order to feed the growing global population, yet we need consumer acceptance of both the technologies that we use, and the reasons why we use them, in order to continue to secure market access for U.S. beef.

Consumer acceptance therefore needs to be a key component of our mission to continuously improve beef sustainability. That does not mean giving in to the uninformed whims of those who blithely assert that we could feed the world by returning to the production systems of the 1940’s or ’50s, but does offer an opportunity to reach out, listen to and engage in a dialogue with our friends, family, customers and colleagues about the advantages that technology offers. We have a bright future ahead, but only if we keep the torch alight.

To read more conversation about the use of technologies within beef production (including the real-life experiences of feedyard operators who use these tools) and for facts and figures relating to beef production, please check out the following websites: Feedyard Foodie, Ask a Farmer, Facts About Beef, and the U.S. Farmers and Ranchers Alliance.

Footnotes

1) Merck Animal Health have since pledged to conduct a thorough investigation into the issue and have temporarily suspended Zilmax™ sales in the U.S. and Canada.

2) Beta agonists are animal feed ingredients that help cattle maintain their natural muscle-building ability, adding 20-30 pounds of lean muscle rather than fat. Implants (sometimes called growth promotants or growth hormones) are placed into the ear and release hormones slowly, helping cattle maintain natural muscle-building ability while also decreasing the amount of fat gained.

3) Includes beef cows, calves, bulls, replacement animals, stockers and feedlot cattle plus calves and cull cows from the dairy system.

4) Although this is a considerable amount of beef, it’s still not enough to fulfill current demand for beef in the USA and overseas. 

5) This work was presented as a poster at the Joint Annual Meeting of the American Dairy Science Association and American Society of Animal Science in Indianapolis, IN in July 2013. The poster is available for download here.

Are We Producing More Food…and Feeding Fewer People?

I’m ashamed to admit that the picture to the left is of the lunch table that a media colleague and I left last week – after spending an hour lamenting the fact that in the US, 40% of food is wasted (30% globally). Admittedly, that waste isn’t all down to restaurant portions (in our defense, we both had to fly home, so doggie bags weren’t an option) – however, according to FAO data here, consumer waste accounts for anything between 5% (in Subsaharan Africa) and 39% of total waste (North America and Oceania). The difference (anything from 61% to 95%) is made up of losses between production and retailing.

Losses from production to retail comprise by far the biggest contribution to waste in the developing world, which makes absolute sense – if food is your biggest household cost and hunger is a constant and real danger, the concept of wasting purchased food would seem ridiculous. In the developing world, a myriad of factors play into food insecurity including low agricultural yields, lack of producer education (particularly for women, who are often the main agricultural workers), political instability and military conflict (Pinstrup-Andersen 2000). However, possibly the biggest threat to food security is a lack of sanitary and transport infrastructure (Godfray et al. 2010) – building a milk pasteurization plant is a great opportunity to improve shelf-life, but can only be effective if producers have the facilities to refrigerate and transport milk. Improving tomato yields can reap economic dividends, but if they are transported to markets packed into plastic bags on the back of a bicycle, the wastage is huge. I’m not going to pretend I have the solutions to global food wastage, but what can we do in our own households?

Just as our grandparents learned during WWI and WWII – when food is scarce, you make the most of every single drop of milk or ounce of grain. Yet in the modern developed world, we can afford to waste almost 2/5 of our household food through not understanding expiration dates (cheese does not spontaneously combust into a listeria-ridden ooze at midnight on the day of the expiration date); throwing away the “useless” parts of food waste (radish leaves and wilted celery are actually really good in soup); or simply buying more than we need. In a recent study of greenhouse gases associated with US dairy production, the carbon footprint of a gallon of milk was increased by almost 20% simply because of the amount of “old” milk that consumers poured down the sink each day.
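As a rough illustration of how waste inflates that footprint (my own arithmetic, not taken from the study): if a fraction of purchased milk is discarded, the emissions attributable to each gallon actually consumed scale by 1/(1 − waste), so a ~20% footprint increase corresponds to roughly one gallon in six poured down the sink:

```python
def footprint_per_consumed_gallon(footprint_per_produced_gallon, waste_fraction):
    """Emissions carried by each gallon actually drunk, given that a
    fraction of produced milk is wasted by the consumer."""
    return footprint_per_produced_gallon / (1 - waste_fraction)

# A ~20% footprint increase implies a waste fraction of 1 - 1/1.2, about 17%.
implied_waste = 1 - 1 / 1.20
print(f"implied consumer milk waste: {implied_waste:.1%}")
```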

To go back to the picture above, it’s tempting to blame the restaurants – portion sizes tend to be huge, so in this carb-conscious world, it’s not “our fault” if we forgo the last 500 calories by leaving half a plateful of potato chips – they should have just served a smaller portion in the first place, right? Well, maybe. If we’re feeding dairy cows or beef cattle and seeing more than 5-10% feed unconsumed, we’ll reduce the amount fed. I’m sure that exactly the same practice would pay dividends in the restaurant world, and I’d be willing to bet that they could charge exactly the same price.

I spend most of my time myth-busting, showing that the modern beef and dairy industries are far more efficient than the farming systems of 40 or 70 years ago and that we now produce more food using far fewer resources. However, are we really feeding more people if we’re wasting 40% of our food? To suggest that we return to a practice from the WWII era feels almost heretical, but here’s an idea – rather than defining “sustainable” systems as those producing artisan cheeses from heirloom breeds cared for by hemp-wearing liberal arts graduates, why doesn’t every restaurant (or suburb) have a small herd of backyard pigs? Collect the waste food, boil it for 30 min to avoid disease issues, feed to pigs, produce bacon. What could be better? Admittedly, my mother country has banned this practice (I’m beginning to wonder if anything will be permissible in Europe soon), but let’s start the pigswill revolution! Doesn’t “You don’t have to eat that last potato, it’ll make some really good bacon and help us feed those 1 in 7 kids in our local area who don’t have enough food” sound more realistic than “Think of all the starving orphans who would enjoy your PB&J sandwich” (to which the perennial smart-a** answer was “I’ll just mail it to them”)? Let’s do what the livestock industry does best – recycle waste resources to make safe, affordable, nutritious meat!

Can We Please Have Calls for Moderating Meat Consumption… in Moderation?

Do we need to moderate meat consumption in order to feed the world in 2050? Given beef producers’ track record of ingenuity, it’s possible but not probable.

A Twitter follower (Tweep? Twriend? Twquaintance?) asked yesterday whether we could really supply 9+ billion people with 250 lb of meat per capita in 2050. The question stemmed from a recent paper in which Stockholm scientists claimed that we would all have to reduce meat consumption by 75% by 2050 in order to have enough water to supply the population, and a subsequent rejoinder from the American Society of Animal Science in which several scientists noted the flaws in the Swedish paper, the importance of animal-source foods in the diet and the use of marginal land for grazing livestock.

On Twitter, the comment was made that there appear to be two distinct sides to this argument – one side (the environmentalists and anti-animal agriculture groups) warning that we need to drastically cut meat consumption in order to feed everybody, and the other (the meat industry) turning a blind eye and effectively promoting the idea that we can eat all the meat that we like without having any environmental impact.

Globally, we’re nowhere near 250 lb of meat consumption per capita; even US consumers, who are often portrayed as meat-guzzling bacon-o-philes by the Huffington Post et al., have an average annual consumption of 171 lb according to the USDA. As current beef consumption is 58 lb per capita in the USA, that’s a lot of pork and chicken that will presumably make up the difference. There’s no doubt that increases in both population size and per capita income in regions such as China and India will have a significant impact on global meat consumption by 2050. However, I have to admit I find the “blind eye” comment a little hard to swallow, given, for example, the beef industry’s commitment to measuring and mitigating both resource use and carbon emissions through current life cycle analysis research, and involvement with groups such as the Global Roundtable for Sustainable Beef.

There is no doubt that beef production uses considerable amounts of land and water, yet should we expect producers to effectively shoot themselves in the foot and suggest that consumers forgo a cheeseburger in favor of an alfalfa sprout salad? Isn’t improved efficiency a characteristic of every successful industry? The motor industry is a major contributor to environmental concerns, yet automobile manufacturers aren’t saying “we’re going to produce cars in the same way that we did in the ‘50s, you’ll just have to drive less”. Instead, the message is something akin to “we’re making cars more energy-efficient so that you can continue to drive without worrying about your car’s environmental impact”.

That’s exactly what the beef industry has done, is doing and will continue to do into the future. Since 1977, the US beef industry has cut water use by 12%, land use by 33% and the carbon footprint of one lb of beef by 16%. Providing that producers are still able to use management practices and technologies that improve efficiency, further reductions should be seen in future. Yet we have to look beyond the idea that the USA can feed the world by itself. I’m writing this post from Brazil, which has a huge beef industry, yet on average, Brazilian beef cattle first calve at 4 years of age, only 67% of cows have a calf each year and beef animals take 3 years to reach slaughter weight. Comparisons to the equivalent US figures (2 years, 91% and 15 months respectively), show the potential for amazing reductions in resource use from Brazilian beef production, and this, along with other less-efficient systems, is where we have to focus in future. It’s not about forcing US-style production on every producer; it’s about enabling producers to make the best and most efficient use of resources according to their management system and region. Brazil has just approved the use of beta-agonists in beef production, which will allow the production of more beef using fewer resources. This is just one step on the road to improved efficiency.

So do we need to moderate meat consumption in order to feed the world in 2050? I’d love to be able to answer this by citing a published paper that has taken improvements in meat industry productivity over the next 40 years into account rather than assuming a “business as normal” outcome. In the absence of such a paper, I’ll give a Magic 8-Ball type answer: Given beef producers’ track record of ingenuity, it’s possible but not probable. Globally, there are huge opportunities for improved efficiency and concurrent reductions in resource use from all meat production systems – the key is not to reduce meat production but simply to produce it more efficiently.

All Aboard the “Eat Less Meat” Bandwagon

One of the main criteria for publishing scientific research is that it should be novel, yet not a week goes by without yet another paper concluding that we have to reduce meat consumption in order to mitigate climate change. That’s the headline in media coverage relating to the latest paper from a researcher at the Woods Hole Research Center (published in Environmental Research Letters), which examines nitrous oxide emissions (a highly potent greenhouse gas (GHG)) in 2050 under various scenarios.

It’s an interesting paper, not least for some of the assumptions buried within the model. Based on data from the FAO, the authors assume that meat consumption will increase by 14% in the developed world and 32% in the developing world by 2050. Coupled with the predicted global population increase (from the current 7 billion to 8.9 billion in 2050), it’s not surprising that a 50% reduction in meat consumption would be predicted to have a significant effect on total GHG. It’s rather akin to suggesting that each person will own two automobiles in 2050, so we should reduce car manufacture.
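To see why halving meat consumption looks so powerful in the model, it helps to sketch the demand arithmetic those assumptions imply. This is a crude illustration of my own (it deliberately ignores the fact that most of the population growth occurs in the developing world): total demand scales as population times per-capita consumption.

```python
# Assumptions quoted above: population grows from 7.0 to 8.9 billion by 2050;
# per-capita meat consumption rises 14% (developed) and 32% (developing).
pop_growth = 8.9 / 7.0  # ~1.27x

developed_demand = pop_growth * 1.14   # ~1.45x current demand
developing_demand = pop_growth * 1.32  # ~1.68x current demand

print(f"developed-world demand:  ~{(developed_demand - 1) * 100:.0f}% higher")
print(f"developing-world demand: ~{(developing_demand - 1) * 100:.0f}% higher")
```

Against a baseline of demand growing by half to two-thirds, a 50% per-capita cut is bound to dominate any single-factor comparison.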

However, the more striking result is buried in Figure 1, which shows that if the efficiency of manure management and fertilizer application were improved, this would have a more significant effect on GHG emissions than reducing meat consumption. Given the considerable improvements in cropping practices, crop genetics and yields over the past 50 years, there is absolutely no reason why this should not be achieved in the next 40 years.

Alas, a headline suggesting that agriculture needs to continue to improve manure and fertilizer efficiency just isn’t as sexy as the “eat less meat, save the planet” message so often propounded by the mass media. They say that bad news sells – it’s a shame that the lay press are so enamored with messages that denigrate ruminant production, rather than taking a broader look at the options available for mitigating future climate change.

*Thanks to Jesse R. Bussard for bringing this one to the forefront of my “to do” list.

Is Corn the Soylent Green of the Future?

I recently had the pleasure of watching the 1973 movie Soylent Green starring Charlton Heston. I won’t spoil the ending for those who haven’t seen it, but the overarching premise of a big company controlling the food supply to a hungry, overcrowded future population strongly resembles some of the current claims made by the more vehement foodies. The anti-animal-agriculture activists appear to have a similar agenda – if only food production was “sustainable” (a word with a million definitions) without any of those “factory farms (that) pose a serious threat to health” and red meats that “have been linked to obesity, diabetes, cardiovascular disease, and certain types of cancer”, life would be much sweeter.

So what’s the answer? It’s very simple. All that animal feed could simply be fed to humans. According to Pimentel at Cornell University, 800 million people could be fed with the grain that livestock eat in the USA each year. If we ignore the fact that field corn (fed to livestock) is not the same as sweet corn (the corn on the cob that we eat), and assume that field corn could easily be processed into a human foodstuff, Pimentel is right.

Given the average human nutrient requirement (2,000 kCal/day) and the energy yield of an acre of shelled corn (14.5 million kCal), one acre of corn (at 2011 US yields) could supply 20 people with their energy needs (see table below). On a global basis, we currently harvest around 393.5 million acres of corn, therefore we could supply the entire current global population (7.003 billion) using only 90% of the global corn area. Of course that’s assuming zero waste and US crop yields. If we use a more realistic scenario with global corn yields (85 bushels/acre) and 30% food wastage, we can only feed 12 people per acre and would need to increase current corn acreage by 121% to produce enough food to supply the current population. So what happens by the year 2050 when the population is predicted to reach 9.5 billion people? Assuming that we continue to see increases in yield proportional to those over the past 30 years (30% increase in US yields since 1982), that yield increases are exhibited globally, and that we can cut food waste to 10%, we could feed 15 people per acre and we’ll need to increase corn acreage by 79% to provide sufficient corn to feed the global population.
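The first of these calculations can be checked in a few lines. The sketch below uses only the figures quoted above (2,000 kCal/day, 14.5 million kCal/acre, 393.5 million acres, 7.003 billion people); any rounding is mine:

```python
# Back-of-envelope check of the "corn as sole food" arithmetic.
# All input figures are those quoted in the text above.

DAILY_KCAL = 2_000        # average human energy requirement, kCal/day
ACRE_KCAL = 14.5e6        # energy yield of one acre of shelled corn (2011 US yield)
GLOBAL_ACRES = 393.5e6    # current global corn harvest, acres
POPULATION = 7.003e9      # current global population

people_per_acre = ACRE_KCAL / (DAILY_KCAL * 365)     # ~19.9, i.e. about 20 people
acres_needed = POPULATION / people_per_acre
share_of_current_area = acres_needed / GLOBAL_ACRES  # ~0.90, i.e. about 90%

print(round(people_per_acre), f"{share_of_current_area:.0%}")  # → 20 90%
```

The more pessimistic scenarios in the paragraph follow the same arithmetic: a lower yield and a waste factor simply scale `people_per_acre` down, which pushes the required acreage above the current harvest area.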

If our dietary requirements can be met by corn alone, the increase in land use won’t be an issue – land currently devoted to soy, peanuts or wheat can be converted to corn. Yet this simplistic argument for vegetarian/veganism suffers from three major flaws. Firstly, it assumes that the global population would be willing to forgo lower-yielding vegetable crops that add variety to the diet – where are the artichokes, kale or radishes in this dietary utopia?

Secondly, as noted by Simon Fairlie in his excellent book, converting to a vegan monoculture would significantly increase the reliance on chemical fertilizers and fossil fuels due to a lack of animal manures. Given current concern over dwindling natural resources, this is an inherently unsustainable proposition.

Finally, corn is currently vilified by many food pundits. The suggestion that our food supply is controlled by corporations who force monoculture corn upon hapless farmers – who are then faced with the choice of complying with big ag or being forced out of business – is the purview of food pundits (e.g. Michael Pollan and Joel Salatin) and documentaries such as Food Inc. and King Corn. Not a week goes by without another Mommy blogger or journalist publishing an article on the dangers of high-fructose corn syrup, often concluding that if only this ingredient were removed from our food supply, childhood obesity and pediatric type II diabetes would cease to be an issue for little Johnny and Janie, pre-schoolers of the future.

It’s frighteningly easy to visualise the Soylent Green-esque vegan future, whereby food is doled out in pre-measured quantities according to dietary requirements – yet what happens when the whistle-blower of 2050 proclaims “it’s made from CORN!”?

Feed = Food? Do livestock really compete with humans for food?

Can we feed up to 10 billion people in 2100 by improving crop yields, reducing deforestation, and reducing meat and dairy consumption? These solutions are among those suggested by Jonathan Foley at the University of Minnesota’s Institute of the Environment to enable the increase in food production required by the future global population. These are logical suggestions, yet the proposal that meat and dairy consumption should be reduced is likely to be the most debated, particularly as livestock industry stakeholders may regard it as yet another attack on animal agriculture.

The futility of the “Meatless Mondays” campaign has been discussed ad infinitum, yet in contrast to the EWG’s recent report, Foley does not attempt to promote a vegetarian or vegan ideology or to suggest that climate change could be reversed if only we all ate humanely-certified or organic meat. Instead, the report concludes that resources could be saved if we shifted meat consumption towards pork and poultry, as:

…it takes 30 kilos [66 lb] of grain to produce one kilo [2.2lb] of boneless beef… We’re better off producing grass-fed beef or more chicken and pork, which requires far less grain feed

Based on those data, Foley’s conclusion is entirely logical. However, as Carl Sagan said, “Extraordinary claims require extraordinary evidence” – and here the evidence is lacking. A recent review of feed efficiency by Wilkinson reports that monogastric animals require 4.0 kg (swine) or 2.3 kg (poultry) of feed per kg of gain. Monogastrics are indeed considerably more efficient than their ruminant counterparts as beef animals require 8.8 kg feed per kg gain – considerably more than swine or poultry, but far less than Foley’s estimate.

It would be convenient to argue that the errors in Foley’s feed efficiency data (not to mention religious limitations on pork consumption) negate the report’s conclusions. But isn’t it logical to argue that we should eat meat produced in systems that use fewer resources to produce animal protein? Personally, I spend more than half my time traveling to present precisely that message to the animal industry and to encourage livestock producers to improve efficiency. I absolutely believe that we need to improve productivity and efficiency in order to feed the growing population. However, traditional feed efficiency data have a major flaw: they assume that all animal feed could otherwise be used to feed humans.

Wilkinson suggests that the traditional concept of feed efficiency be re-examined to reflect the quantity of human-edible crop inputs that are used to produce a unit of energy or protein from animal products. For example, humans cannot digest pasture, only 20% of the nutritional value of oilseed meals can be utilized for human food, yet 80% of the nutrients within cereals, pulses and soybean meal are human-edible. By partitioning out the human-edible component of animal feed, Wilkinson demonstrates that the human-edible energy feed efficiency ratios for pork and cereal beef are similar (Figure 1*) and that dairy production actually produces twice the amount of human-edible energy that it uses (input:output ratio of 0.5). On a protein basis, cereal beef has a higher human-edible protein feed efficiency ratio (3.0) than pork (2.6), but suckler beef systems where cattle are grazed on pasture again produce more human-edible protein than they consume (input:output ratio of 0.9, Figure 2*). Not only are these revised feed efficiency estimates considerably lower than those quoted by Foley, but they underline the importance of herbivorous grazing animals in utilizing human-inedible forage to produce animal protein.
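Wilkinson’s revised metric can be expressed as a simple ratio: human-edible energy fed in, divided by human-edible energy produced. The sketch below illustrates the calculation; the feed quantities and output energy are illustrative placeholders I have chosen (only the 0% pasture and 80% cereal edible fractions come from the figures quoted above), not Wilkinson’s actual data:

```python
def human_edible_ratio(feed_inputs, edible_output_energy):
    """Human-edible energy in the feed / human-edible energy in the product.

    feed_inputs: list of (energy_fed, human_edible_fraction) tuples.
    A ratio below 1.0 means the system yields MORE human-edible energy
    than it consumes - the case Wilkinson makes for grazing systems.
    """
    edible_input = sum(energy * fraction for energy, fraction in feed_inputs)
    return edible_input / edible_output_energy

# Illustrative only: a system fed mostly pasture (0% human-edible)
# plus a little cereal (80% human-edible). Energy units are arbitrary.
grazing_diet = [(9_000, 0.0), (1_000, 0.8)]
print(human_edible_ratio(grazing_diet, edible_output_energy=1_600))  # → 0.5
```

The point the calculation makes is structural: because pasture contributes zero to the numerator, a grazing animal can post a conventional feed conversion ratio of 10:1 while its human-edible ratio sits below 1.0.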

[Figures 1 and 2: human-edible energy and protein feed efficiency ratios by production system, after Wilkinson]

Numbers have power – it’s always tempting to base a suggestion around a single data point that “proves” the argument. Feed efficiency is a useful metric, but as we face an ever-increasing challenge in balancing food demand, resource availability and consumer expectations, it’s critical that we examine the bigger picture. The ruminant animal has a major evolutionary advantage in its ability to digest forages – we may be better acquainted with the human dietary advantages of probiotic bacteria than our ancestors, but until we are equipped with human rumens (humens?) we cannot hope to effectively make use of all crop resources.

*The importance of acknowledging the human-edible component of feed efficiency was part of my presentation at the Alltech Ruminant Solutions Seminar in Ireland this week – for a PDF copy of my presentation, please click here.