Where’s the Beef? Not in Danish Diets.

For those of us in the UK, mentioning Danish livestock production almost inevitably leads to thoughts of Danish bacon (be still my beating heart) – a considerable proportion of the 90% of domestic pork production that is exported. However, any beef lovers in Denmark may be in trouble, as recent news articles suggest that red meat (beginning with beef) will soon be taxed in order to cut consumption and meet greenhouse gas targets.

Despite the number of voices clamouring for reduced meat consumption, it seems clear that the average consumer isn’t going to forgo meat and dairy simply because a new study is publicised in the lay press. I’m firmly of the opinion that the only way that meat consumption will decline is if it becomes too expensive to include in the weekly shopping basket. Indeed, although meat consumption per capita has declined in the USA over the past 10 years, demand (as measured by the price that the consumer is willing to pay) has increased over recent years.

So will taxing meat lead to a reduction in consumption? For those who routinely order a 16 oz (454 g) steak in a restaurant or think nothing of tucking into a chateaubriand, probably not. High-end cuts of beef are associated with celebrations and luxury dining, and going out for a broccoli pasta bake just doesn’t have that same ring to it.

However, we live in a world where 793 million people (10.7% of the global population) are undernourished – and that isn’t simply confined to people in developing regions. That means that almost 1 in 9 people do not have enough food. To low-income consumers, food availability isn’t simply a function of what is on the shelf in the supermarket; it’s directly related to economic cost and convenience. If red meat is taxed, it will still be eaten, but there will be a disproportionate shift towards consumers with a greater income and away from those who are in most need of affordable high-quality protein, including growing children.

Do beef alternatives exist? Absolutely – protein can be supplied from other meats, fish or vegetable-based foods. Yet here’s where the convenience aspect comes in – most of us can probably think of a quick and easy recipe involving beef, but how many can you think of involving tofu or lentils? That’s not to say that we shouldn’t expand our cooking repertoires, but when time is at a premium, quick easy recipes that will feed a family win every time.

If beef becomes unaffordable, it will have to be replaced by another protein – but this substitution does not occur at zero cost. Can tofu or lentils be produced on low-quality pastureland where we can’t grow other human food or fibre crops? Do pork or poultry make such efficient use of forages, pastures and by-products from human food and fibre production that, far from competing with humans for food, the animals produce more human-edible energy or protein than they consume? No. The only livestock that do this are those pesky greenhouse gas-belching bovines.

Greenhouse gases are important, but they are not the only factor that we should consider when advocating for sustainable dietary choices. In a world where millions of people are food-insecure, removing a protein choice from the table of those with low incomes simply adds to the problem of how to feed the world – sustainably.

Cutting Meat? Or Just Cutting Corners?

It is equally interesting, inevitable and lamentable to see that another study has come out claiming that the only way to reduce climate change is to cut meat consumption per person.

Meat consumption appears to be the only human activity subject to continuous “we must cease/reduce this” claims on the basis of environmental impacts. If we look at other greenhouse gas sources, a considerable proportion of emissions comes from transportation. Yet rather than insisting that every car-owner cut their annual mileage by 25%, the focus has been on reducing emissions by producing more fuel-efficient vehicles. Similarly, no one has yet claimed that we should reduce household lighting by four hours per day, but the compact fluorescent lightbulb (CFL) has become the poster child for improving household energy efficiency.

We have demonstrable proof that beef and dairy producers have cut greenhouse gas emissions (as well as land use, water use and energy use) over time through improved efficiency, and can continue to do so into the future. So why are the gains made by livestock producers dismissed, and reduced meat intakes seen as the only solution? I have an absolute hatred of conspiracy theories, but it is difficult not to see a latent agenda in the preponderance of “Cut meat consumption” papers. Jumping on the bandwagon? Promoting individual dietary opinions as science? Or simply bowing to NGO/media opinions and looking for easy funding and publicity?

As the global population increases to over 9.5 billion people by 2050, with the majority of this growth occurring in the developing world, the demand for milk, meat and eggs is going to increase by 60%. If we are serious about cutting greenhouse gas emissions, it’s time to examine the impacts of all of our actions and concentrate on further efficiency improvements rather than constraining dietary choice.

Does Nature Really “Do Its Thing” on Organic Farms?

I am lucky. My fridge is full of food: mostly produced in the UK or Europe; all nutritious, safe and affordable; and almost all produced on conventional farms, with a small amount of organic food (in my case, chocolate). Given that you’re reading this, I’ll hazard a guess that you are lucky too. 795 million other people can’t say the same thing – and feeding all the people on a planet where 1 in 9 don’t currently have enough food is, in my view, our biggest challenge.

The fact that we face this challenge makes me really irritated when celebrity chefs who could make a huge difference bow instead to popular rhetoric. In his latest blog post, mockney chef and food pundit Jamie Oliver proclaims that “…organic food is natural food, where nature has been allowed to do its thing, and I’m sure most of us will agree that putting natural ingredients into our bodies is only going to be a positive thing.”

If we ignore the nonsensical claim that natural ingredients produce positive results (Really? Let’s examine puffer fish, solanaceae poisoning, dangerous fungi, absinthe, the many consequences of obesity…), let’s simply look at his claim that organic food is natural.  Except, well, it’s not. Agriculture first developed ~12,000 years ago, and ever since then farmers have been doing their best to breed crops and animals that are best suited to their farming system, whether it’s organic or conventional. Want dairy cows that produce high-quality milk from grazing pasture; leaner pork chops; or strawberries that can survive supermarket handling? You’ve got it. All achieved through traditional breeding techniques (otherwise known as “nature doing its thing”): noting that plant or animal A has desirable characteristics and breeding it with plant or animal B to (hopefully) produce better offspring. No scary chemicals, scientists with syringes or genes in test-tubes. Every farm in the world is founded on “nature doing its thing” – not just the organic farms. We can argue whether GMO crops are natural (breeding techniques are simply more refined and specific) or not (scientists playing god…) but that argument becomes redundant in the EU and many other regions, where GMO crops are not approved.

Can organic producers use pesticides? Yes, if they’re compounds approved for organic production (e.g. highly-toxic copper-based fungicides). Can they use antibiotics and wormers? Again yes, if a proven disease problem exists (note that rules differ slightly between the UK and USA). Are organic farmers just merrily sitting back and letting their crops cross-pollinate and reseed, and their bulls run around happily “doing their thing” to whichever cow they come across? No. It’s a beautiful bucolic image to suggest that organic farmers are happily working with Mother Nature whereas conventional farmers have an evil scientist sitting on one shoulder and a big agribusiness corporation on the other, but it’s simply not true.

According to Mr Oliver, “…the simple fact is that often we don’t actually have to interfere with nature.” The idea of a world where we could feed over 7 billion people without having to actually invest any research dollars into improving food production is lovely, but it’s smoke and mirrors. At the most basic level, what happens if we don’t “interfere” by controlling weeds (whether by chemicals, mechanical tillage or human labour)? Crop yields are reduced, food production goes down and we feed and clothe fewer people. What happens if a cow has problems giving birth? In nature, she dies. On a farm (whether organic or conventional) both she and the calf are saved, providing milk and meat for us to eat. According to the World Organisation for Animal Health, 20% of global animal protein losses are due to diseases for which treatments already exist – we simply need to make them available to every farmer worldwide. Just think how many more people we could feed if we interfered with nature in that way?

Huge amounts of research monies are invested each year to find ways to improve food production on both organic and conventional farms worldwide. Some are highly technical, others are simple, but all are contributing to the goal of feeding the world. Unfortunately, when food pundits jump on the “let’s promote system X” bandwagon as Mr Oliver has done with organic production, using persuasive but false arguments, we lose traction in fulfilling the real goal. Rather than arguing about which foods we can/should be buying, we need to accept that there’s a place for all systems; examine the ways in which all systems can improve soil fertility, animal health and environmental impacts; and make faster progress towards feeding the world while still enjoying our food choices.

Who Needs Scientists? Just Let Mother Nature Design Your Greek Yogurt.

“How you get to 100 calories matters. Most companies use artificial sweeteners. We think Mother Nature is sweet enough.” Clever marketing from the Greek yogurt company Chobani, simultaneously disparaging alternative brands, and playing the ultimate caring, sharing, natural card with the mention of “Mother Nature”. However, earlier this week, Chobani’s #howmatters hashtag set the Twitter feeds alight after their new “witty” tagline on the underside of yogurt lids was posted (below).

The wording plays beautifully into what is fast becoming a universal fear of science intruding on our food supply – we want real food; food like our grandparents ate; food from traditional breeds and heirloom varieties – providing it doesn’t take us over 2,000 cal per day or increase our cholesterol levels. Rightly or wrongly, many people blame processed foods with hidden sugars and added chemical preservatives for many health issues in developed countries – the epitome of a #firstworldproblem, given that the corresponding #thirdworldproblem is hunger and malnutrition.

However, this time the Twitter anger wasn’t from rampaging mommy bloggers, or infuriated activists, but scientists. After all, without science, would Chobani have a product? Yogurt was first developed in ancient times, but the modern pasteurized, long-shelf-life, Greek yogurt is rather different to the cultured milk our ancestors would have enjoyed.

I have a 100-calorie Greek yogurt from a rival brand in my fridge, so let’s examine the ingredients (left). Simply pasteurized skimmed milk and live active yogurt cultures (note, no added sweeteners). Louis Pasteur, a 19th-century French scientist, developed pasteurization (in addition to his discoveries relating to vaccines and microbial fermentation); biologists developed methods to identify and classify the bacteria that ferment milk into yogurt; and food scientists experimented with the exact mixture of bacteria to produce the desired flavor, texture and color of yogurt, as well as developing the range of other processes needed to make the yogurt safe, appealing and shelf-stable.

Yes, we could make Greek yogurt without scientists – after all, the original recipe didn’t originate in a corporate experimental kitchen. But without hundreds of years of scientific input, could we make Greek yogurt that, at 100 calories per serving, is desirable to the consumer and is a safe, affordable source of vitamins, minerals and protein? No. To imply that we could does a huge disservice to food scientists.

It appears that being a modern-day scientist is somewhat equivalent to clubbing baby seals to death. Caring little for human suffering and illness, the cold and clinical scientist rubs his hands together with glee as he removes all nutrients from real food, replacing them with chemicals, additives and genetically-modified ingredients. As a side-line, he develops cocktails of toxic elements, pesticides and embalming fluid and markets them as vaccines. Yes, science is the enemy. Just remember that next time you take an aspirin for a hangover from pasteurized, fermented beverages.

How Long is Long-Term? Are We in Danger of Sacrificing Food Security to Satisfy GMO Paranoia?

My Twitter feed is being taken over by two things: 1) arguments and 2) comments that are going to cause arguments. Almost every tweet appears to draw a contrary comment – I’m tempted to post “Elephants have four legs and one trunk” just to see how many people reply “No, there’s an elephant in South Africa called Minnie who only has three legs but has two trunks…”

The latest discussions (debates? arguments? long drawn-out 140-character battles?) have related to the safety of GMOs. Without exception, the argument from the nay-sayers comes down to “We don’t know what the long-term effects are, so we should ban them until we can conclude that they’re safe.”

In other words, we’re trying to prove a negative – show me that there are no adverse effects whatsoever and I’ll believe it’s ok. Utterly impossible. Can you be absolutely sure that the screen you’re reading this on isn’t causing constant, minute but irreparable damage to your eyes? Water, that essential nutrient without which humans, animals and plants would die, can kill through drowning or intoxication. Even oxygen, without which brain cells are irretrievably damaged in just 10 minutes, causes seizures and death when inhaled at high pressures. Should we ban these, just in case?

Perhaps we should take a long-term approach to all new technologies. iPhones were only introduced seven years ago, yet many of us spend considerable amounts of time typing on them, or holding them to our ears when they’re not in our pockets – what health-damaging consequences could these shiny new toys confer? What about the now-ubiquitous hand sanitizer? Once only the province of hospitals and germophobes, it’s now sloshed around by the gallon. Touted to kill 99.9% of harmful bacteria – what harm could those chemicals be doing to our fragile physiology?

I’ve yet to meet anybody who, when scheduled for quadruple bypass surgery, demanded that the surgeon only used techniques developed in 1964; or a type I diabetes sufferer who would only use insulin produced from pigs, as it was originally in 1923. When I was treated for breast cancer, I jumped at the chance to be part of a clinical trial involving a new monoclonal antibody treatment, regardless of the very slight risk of heart damage. In medicine, we seem happy to trust that science has the answers – not surprisingly, we prefer to survive today and take our chances with side-effects tomorrow.

With regard to food, however, the opposite appears to be the case. The first commercial GMO (the Flavr Savr tomato) was introduced in 1994, GM corn and soy were commercialized in 1996, and not one death or disease has been attributed to any of these crops. Yet the “what are the long-term effects?” concern still persists. So how long-term is long enough? 10 years? 20? 50? Should we keep researching and testing these crops for another 80+ years before allowing them onto the market around the year 2100?

If your answer is yes, just pause for a moment and ask your parents, grandparents or even great-grandparents what life was like during the Great Depression in the USA, or World War II in Europe. Consider what life was like when food was scarce or rationed, when, for example, a British adult was only allowed to buy 4 oz of bacon, 8 oz ground beef, 2 oz each of butter and cheese, 1 fresh egg and 3 pints of milk per week. Those quantities of meat and cheese would only be enough to make two modern bacon cheeseburgers.

By 2050, the global population is predicted to be over 9 billion people. I don’t relish the idea of explaining to my grandchildren that they live with food scarcity, civil unrest (food shortages are one of the major causes of conflict) and malnutrition because public paranoia regarding GMOs meant that a major tool for helping us to improve food production was removed from use. In the developed world we have the luxury of choosing between conventional, natural, local, organic and many other production systems. However, we’re in danger of forgetting that not everybody has the same economic, physical or political freedom to choose. If you gave a basket of food to a family in sub-Saharan Africa subsisting on the equivalent of $30 per week, would they refuse it on the basis that the quinoa wasn’t from Whole Foods, the meat wasn’t organic and the tofu wasn’t labeled GMO-free?

When we have sufficient food being supplied to everybody in the world to allow them to be healthy and productive, we can then start refining the food system. Until then, the emphasis should be on finding solutions to world hunger, not forcing food system paranoia onto those who don’t have a choice.

Are We Increasing Resource Use and Taking Beef from the Mouths of Hungry Children?

Can we really afford to lose the sustainability advantages that productivity-enhancing tools provide?

Beta agonists have been a hotly debated topic in the media recently, after it was suggested that the use of Zilmax™ might be related to welfare issues in supplemented cattle (see note 1), and Tyson announced that they would not purchase cattle produced using the feed supplement.

As the global population increases and consumer interest in food production sustainability continues to grow, we know that to maintain the continuous improvements in beef sustainability that we’ve seen over the past half-century, we need to ensure that economic viability, environmental responsibility and social acceptability are all in place. All cattle producers obviously have the choice as to what tools and practices are used within their operation, but what are the big picture environmental and economic implications of removing technology use from beef production? Let’s look at two tools – beta agonists and implants (see note 2 below for an explanation of these tools).

Figure 1. Extra cattle needed

In a traditional beef production system using both tools, we’d need 85 million total cattle (see note 3) to maintain the U.S. annual production of 26 billion lbs of beef (see note 4). If we removed beta-agonists from U.S. beef production we’d need an extra 3.5 million total cattle to support beef production; losing access to implants would require an extra 9.9 million cattle; and removing both tools would increase total cattle numbers to 100 million (a 15 million head increase) to maintain the current beef supply (see note 5).
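To make the herd-size scaling transparent, here is a minimal back-of-the-envelope sketch in Python. The per-head productivity penalties are illustrative round numbers chosen so that the output roughly reproduces the figures quoted above; they are assumptions, not values taken from the underlying study.

```python
# Back-of-the-envelope sketch: total cattle needed to maintain U.S. beef output
# if the beef produced per animal in the herd falls when a tool is removed.
# The productivity penalties are illustrative assumptions chosen to roughly
# reproduce the figures quoted in the text, not numbers from the study itself.

BEEF_DEMAND_LB = 26e9            # annual U.S. beef production to maintain (lb)
BASELINE_HERD = 85e6             # total cattle population with both tools in use
BASELINE_LB_PER_HEAD = BEEF_DEMAND_LB / BASELINE_HERD   # ~306 lb per animal in the herd

# Assumed reduction in beef output per animal in the total herd (illustrative)
scenarios = {
    "no beta-agonists": 0.040,
    "no implants":      0.104,
    "neither tool":     0.150,
}

for name, penalty in scenarios.items():
    lb_per_head = BASELINE_LB_PER_HEAD * (1 - penalty)
    herd_needed = BEEF_DEMAND_LB / lb_per_head
    extra = herd_needed - BASELINE_HERD
    print(f"{name:>16}: {herd_needed / 1e6:5.1f} M head total "
          f"(+{extra / 1e6:4.1f} M vs. baseline)")
```

Run as written, this prints roughly 88.5, 94.9 and 100 million head – consistent with the extra 3.5 million, 9.9 million and 15 million cattle described above.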

If we need more cattle to maintain beef supply, we use more resources and have a greater carbon footprint.

If we removed beta-agonists, we would need more natural resources to maintain U.S. beef production:

  • More water, equivalent to supplying 1.9 million U.S. households annually (195 billion gallons)
  • More land, equivalent to an area just bigger than Maryland (14.0 thousand sq-miles)
  • More fossil fuels, equivalent to heating 38 thousand U.S. households for a year (3,123 billion BTU)

If we removed implants, we would need more natural resources to maintain U.S. beef production:

  • More water, equivalent to supplying 4.5 million U.S. households annually (457 billion gallons)
  • More land, equivalent to the area of South Carolina (31.6 thousand sq-miles)
  • More fossil fuels, equivalent to heating 45 thousand U.S. households for a year (3,703 billion BTU)

If we removed both beta-agonists and implants, we would need more natural resources to maintain U.S. beef production:

  • More water, equivalent to supplying 7.3 million U.S. households annually (741 billion gallons)
  • More land, equivalent to the area of Louisiana (51.9 thousand sq-miles)
  • More fossil fuels, equivalent to heating 98 thousand U.S. households for a year (8,047 billion BTU)


Beef production costs would also increase if these tools weren’t used. Feed costs would increase by 4.0% without beta-agonists, 8.1% without implants and 11.0% without both tools. These costs would be passed on through every segment of the beef supply chain (including the retailer or food service segment) and ultimately onto the consumer, making beef a less-affordable protein choice.
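As a rough consistency check on the household-equivalent figures listed above, the conversion is simple division by an assumed annual per-household figure. The constants below (roughly 101,500 gallons of water and 82 million BTU per U.S. household per year) are back-calculated from the numbers in the text and are assumptions for illustration only.

```python
# Sketch: convert aggregate extra resource use into "household equivalents".
# Per-household constants are assumptions back-calculated from the text
# (~101,500 gallons of water and ~82 million BTU per U.S. household per year).

GAL_PER_HOUSEHOLD_YR = 101_500   # assumed annual household water use (gallons)
BTU_PER_HOUSEHOLD_YR = 82e6      # assumed annual household heating energy (BTU)

scenarios = {                    # extra water (gallons), extra fossil energy (BTU)
    "no beta-agonists": (195e9, 3_123e9),
    "no implants":      (457e9, 3_703e9),
    "neither tool":     (741e9, 8_047e9),
}

for name, (water_gal, energy_btu) in scenarios.items():
    water_households = water_gal / GAL_PER_HOUSEHOLD_YR
    heating_households = energy_btu / BTU_PER_HOUSEHOLD_YR
    print(f"{name:>16}: water = {water_households / 1e6:.1f} M households, "
          f"heating = {heating_households / 1e3:.0f} thousand households")
```

Under those assumed per-household figures, the output matches the 1.9/4.5/7.3 million household water equivalents and the 38/45/98 thousand household heating equivalents quoted above.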

In a world where one in seven children currently do not have enough food, keeping food affordable is key to improving their health and well-being. If we use productivity-enhancing tools in one single animal, the extra beef produced is sufficient to supply seven schoolchildren with their beef-containing school meals for an entire year. Is that a social sustainability advantage that we can afford to lose?

Although animal welfare is paramount for all beef production stakeholders from the cow-calf operator to the retailer, it is possible that the consumer perception of productivity-enhancing tools may be harmed by negative comments in media articles relating to Zilmax™. There is no doubt that we will need to use technologies within food production in order to feed the growing global population, yet we need consumer acceptance of both the technologies that we use, and the reasons why we use them, in order to continue to secure market access for U.S. beef.

Consumer acceptance therefore needs to be a key component of our mission to continuously improve beef sustainability. That does not mean giving in to the uninformed whims of those who blithely assert that we could feed the world by returning to the production systems of the 1940s or ’50s, but it does offer an opportunity to reach out, listen to and engage in a dialogue with our friends, family, customers and colleagues about the advantages that technology offers. We have a bright future ahead, but only if we keep the torch alight.

To read more conversation about the use of technologies within beef production (including the real-life experiences of feedyard operators who use these tools) and for facts and figures relating to beef production, please check out the following websites: Feedyard Foodie, Ask a Farmer, Facts About Beef, and the U.S. Farmers and Ranchers Alliance.

Footnotes

1) Merck Animal Health have since pledged to conduct a thorough investigation into the issue and have temporarily suspended Zilmax™ sales in the U.S. and Canada.

2) Beta agonists are animal feed ingredients that help cattle maintain their natural muscle-building ability, adding about 20-30 pounds of lean muscle rather than fat. Implants (sometimes called growth promotants or growth hormones) are placed into the ear and release hormones slowly, helping cattle maintain natural muscle-building ability while also decreasing the amount of fat gained.

3) Includes beef cows, calves, bulls, replacement animals, stockers and feedlot cattle plus calves and cull cows from the dairy system.

4) Although this is a considerable amount of beef, it’s still not enough to fulfill current demand for beef in the USA and overseas. 

5) This work was presented as a poster at the Joint Annual Meeting of the American Dairy Science Association and American Society of Animal Science in Indianapolis, IN in July 2013. The poster is available for download here.

Are We Producing More Food…and Feeding Fewer People?

I’m ashamed to admit that the picture to the left is of the lunch table that a media colleague and I left last week – after spending an hour lamenting the fact that in the US, 40% of food is wasted (30% globally). Admittedly, that waste isn’t all down to restaurant portions (in our defense, we both had to fly home, so doggie bags weren’t an option) – however, according to FAO data here, consumer waste accounts for anything between 5% (in sub-Saharan Africa) and 39% (North America and Oceania) of total waste. The difference (anything from 61% to 95%) is made up of losses between production and retailing.

Losses from production to retail comprise by far the biggest contribution to waste in the developing world, which makes absolute sense – if food is your biggest household cost and hunger is a constant and real danger, the concept of wasting purchased food would seem ridiculous. In the developing world, a myriad of factors play into food insecurity including low agricultural yields, lack of producer education (particularly for women, who are often the main agricultural workers), political instability and military conflict (Pinstrup-Andersen 2000). However, possibly the biggest threat to food security is a lack of sanitary and transport infrastructure (Godfray et al. 2010) – building a milk pasteurization plant is a great opportunity to improve shelf-life, but can only be effective if producers have the facilities to refrigerate and transport milk. Improving tomato yields can reap economic dividends, but if they are transported to markets packed into plastic bags on the back of a bicycle, the wastage is huge. I’m not going to pretend I have the solutions to global food wastage, but what can we do in our own households?

Just as our grandparents learned during WWI and WWII – when food is scarce, you make the most of every single drop of milk or ounce of grain. Yet in the modern developed world, we can afford to waste almost 2/5 of our household food through not understanding expiration dates (cheese does not spontaneously combust into a listeria-ridden ooze at midnight on the day of the expiration date); throwing away the “useless” parts of food waste (radish leaves and wilted celery are actually really good in soup); or simply buying more than we need. In a recent study of greenhouse gases associated with US dairy production, the carbon footprint of a gallon of milk was increased by almost 20% simply because of the amount of “old” milk that consumers poured down the sink each day.
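As a simple illustration of how consumer waste inflates a carbon footprint (a sketch under assumed numbers, not the accounting used in the study mentioned above): if a fraction of purchased milk is poured away, the emissions attached to the wasted milk are effectively carried by the milk that is actually drunk.

```python
# Illustrative only: how discarding a fraction of purchased milk inflates the
# carbon footprint per gallon actually consumed. The waste fractions below are
# assumptions, not values from the dairy LCA study referenced in the text.

def footprint_per_consumed_gallon(footprint_per_purchased_gallon, waste_fraction):
    """Spread the emissions of all purchased milk over the share that is drunk."""
    return footprint_per_purchased_gallon / (1 - waste_fraction)

base = 1.0   # footprint per purchased gallon (arbitrary units)
for waste in (0.05, 0.10, 0.17):
    inflated = footprint_per_consumed_gallon(base, waste)
    print(f"waste {waste:4.0%}: footprint per consumed gallon x{inflated:.2f}")
```

Under this simple accounting, an increase of almost 20% corresponds to roughly one-sixth of purchased milk being poured down the sink.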

To go back to the picture above, it’s tempting to blame the restaurants – portion sizes tend to be huge, so in this carb-conscious world, it’s not “our fault” if we forgo the last 500 calories by leaving half a plateful of potato chips – they should have just served a smaller portion in the first place, right? Well, maybe. If we’re feeding dairy cows or beef cattle and seeing more than 5-10% feed unconsumed, we’ll reduce the amount fed. I’m sure that exactly the same practice would pay dividends in the restaurant world, and I’d be willing to bet that they could charge exactly the same price.

I spend most of my time myth-busting, showing that the modern beef and dairy industries are far more efficient than the farming systems of 40 or 70 years ago and that we now produce more food using far fewer resources. However, are we really feeding more people if we’re wasting 40% of our food? To suggest that we return to a practice from the WWII era feels almost heretical, but here’s an idea – rather than defining “sustainable” systems as those producing artisan cheeses from heirloom breeds cared for by hemp-wearing liberal arts graduates, why doesn’t every restaurant (or suburb) have a small herd of backyard pigs? Collect the waste food, boil it for 30 min to avoid disease issues, feed to pigs, produce bacon. What could be better? Admittedly, my mother country has banned this practice (I’m beginning to wonder if anything will be permissible in Europe soon), but let’s start the pigswill revolution! Doesn’t “You don’t have to eat that last potato, it’ll make some really good bacon and help us feed those 1 in 7 kids in our local area who don’t have enough food” sound more realistic than “Think of all the starving orphans who would enjoy your PB&J sandwich” (to which the continual smart-a** answer was “I’ll just mail it to them”)? Let’s do what the livestock industry does best – recycle waste resources to make safe, affordable, nutritious meat!

Can We Please Have Calls for Moderating Meat Consumption… in Moderation?

Do we need to moderate meat consumption in order to feed the world in 2050? Given beef producers’ track record of ingenuity, it’s possible but not probable.

A Twitter follower (Tweep? Twriend? Twquaintance?) asked yesterday whether we could really supply 9+ billion people with 250 lb of meat per capita in 2050. The question stemmed from a recent paper in which Stockholm scientists claimed that we would all have to reduce meat consumption by 75% by 2050 in order to have enough water to supply the population, and a subsequent rejoinder from the American Society of Animal Science in which several scientists noted the flaws in the Swedish paper, the importance of animal-source foods in the diet and the use of marginal land for grazing livestock.

On Twitter, the comment was made that there appear to be two distinct sides to this argument – one side (the environmentalists and anti-animal agriculture groups) warning that we need to drastically cut meat consumption in order to feed everybody, and the other (the meat industry) turning a blind eye and effectively promoting the idea that we can eat all the meat that we like without having any environmental impact.

Globally, we’re nowhere near 250 lb meat consumption per capita; even US consumers, who are often portrayed as meat-guzzling bacon-o-philes by the Huffington Post et al., have an average annual consumption of 171 lb according to the USDA. As current beef consumption is 58 lb per capita in the USA, that’s a lot of pork and chicken that will presumably make up the difference. There’s no doubt that increases in both population size and per capita income in regions such as China and India will have a significant impact on global meat consumption by 2050. However, I have to admit I find the “blind eye” comment a little hard to swallow, given, for example, the beef industry’s commitment to measuring and mitigating both resource use and carbon emissions through current life cycle analysis research, and involvement with groups such as the Global Roundtable for Sustainable Beef.

There is no doubt that beef production uses considerable amounts of land and water, yet should we expect producers to effectively shoot themselves in the foot and suggest that consumers forgo a cheeseburger in favor of an alfalfa sprout salad? Isn’t improved efficiency a characteristic of every successful industry? The motor industry is a major contributor to environmental concerns, yet automobile manufacturers aren’t saying “we’re going to produce cars in the same way that we did in the ‘50s, you’ll just have to drive less”. Instead, the message is something akin to “we’re making cars more energy-efficient so that you can continue to drive without worrying about your car’s environmental impact.”

That’s exactly what the beef industry has done, is doing and will continue to do into the future. Since 1977, the US beef industry has cut water use by 12%, land use by 33% and the carbon footprint of one lb of beef by 16%. Providing that producers are still able to use management practices and technologies that improve efficiency, further reductions should be seen in future. Yet we have to look beyond the idea that the USA can feed the world by itself. I’m writing this post from Brazil, which has a huge beef industry, yet on average, Brazilian beef cattle first calve at 4 years of age, only 67% of cows have a calf each year and beef animals take 3 years to reach slaughter weight. Comparison with the equivalent US figures (2 years, 91% and 15 months respectively) shows the potential for amazing reductions in resource use from Brazilian beef production, and this, along with other less-efficient systems, is where we have to focus in future. It’s not about forcing US-style production on every producer; it’s about enabling producers to make the best and most efficient use of resources according to their management system and region. Brazil has just approved the use of beta-agonists in beef production, which will allow the production of more beef using fewer resources. This is just one step on the road to improved efficiency.

So do we need to moderate meat consumption in order to feed the world in 2050? I’d love to be able to answer this by citing a published paper that has taken improvements in meat industry productivity over the next 40 years into account rather than assuming a “business as usual” outcome. In the absence of such a paper, I’ll give a Magic 8-Ball type answer: Given beef producers’ track record of ingenuity, it’s possible but not probable. Globally, there are huge opportunities for improved efficiency and concurrent reductions in resource use from all meat production systems – the key is not to reduce meat production but simply to produce it more efficiently.

Is Corn the Soylent Green of the Future?

I recently had the pleasure of watching the 1973 movie Soylent Green starring Charlton Heston. I won’t spoil the ending for those who haven’t seen it, but the overarching premise of a big company controlling the food supply to a hungry, overcrowded future population strongly resembles some of the current claims made by the more vehement foodies. The anti-animal-agriculture activists appear to have a similar agenda – if only food production was “sustainable” (a word with a million definitions) without any of those “factory farms (that) pose a serious threat to health” and red meats that “have been linked to obesity, diabetes, cardiovascular disease, and certain types of cancer”, life would be much sweeter.

So what’s the answer? It’s very simple. All that animal feed could simply be fed to humans. According to Pimentel at Cornell University, 800 million people could be fed with the grain that livestock eat in the USA each year. If we ignore the fact that field corn (fed to livestock) is not the same as sweet corn (the corn on the cob that we eat), and assume that field corn could easily be processed into a human foodstuff, Pimentel is right.

Given the average human nutrient requirement (2,000 kCal/day) and the energy yield of an acre of shelled corn (14.5 million kCal), one acre of corn (at 2011 US yields) could supply 20 people with their energy needs (see table below). On a global basis, we currently harvest around 393.5 million acres of corn, therefore we could supply the entire current global population (7.003 billion) using only 90% of the global corn area. Of course that’s assuming zero waste and US crop yields. If we use a more realistic scenario with global corn yields (85 bushels/acre) and 30% food wastage, we can only feed 12 people per acre and would need to increase current corn acreage by 121% to produce enough food to supply the current population. So what happens by the year 2050 when the population is predicted to reach 9.5 billion people? Assuming that we continue to see increases in yield proportional to those over the past 30 years (30% increase in US yields since 1982), that yield increases are exhibited globally, and that we can cut food waste to 10%, we could feed 15 people per acre and we’ll need to increase corn acreage by 79% to provide sufficient corn to feed the global population.
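The arithmetic behind these figures is simple enough to reproduce. The sketch below is a rough reconstruction using the yields and waste fractions stated above, with 2,000 kCal/day taken over 365 days; the 2011 US yield of roughly 147 bushels/acre is my assumption, and the exact people-per-acre values are sensitive to that choice and to where the waste fraction is applied.

```python
# Rough reconstruction of the people-per-acre arithmetic described in the text.
# The 2011 US corn yield (~147 bu/acre) is an assumption; energy per acre at
# global yields is scaled by the yield ratio, and waste is applied to the
# harvested energy. Results are indicative, not a reproduction of the original table.

KCAL_PER_PERSON_YR = 2_000 * 365     # ~730,000 kCal per person per year
US_KCAL_PER_ACRE = 14.5e6            # kCal from an acre of shelled corn at 2011 US yields
US_BU_PER_ACRE = 147.0               # assumed 2011 US corn yield (bushels/acre)
GLOBAL_BU_PER_ACRE = 85.0            # global average yield quoted in the text
GLOBAL_CORN_ACRES = 393.5e6
POPULATION = 7.003e9

def people_fed_per_acre(kcal_per_acre, waste_fraction):
    return kcal_per_acre * (1 - waste_fraction) / KCAL_PER_PERSON_YR

# Scenario 1: US yields, zero waste
best = people_fed_per_acre(US_KCAL_PER_ACRE, 0.0)            # ~20 people per acre
print(f"US yield, no waste: {best:.0f} people/acre, "
      f"{POPULATION / best / GLOBAL_CORN_ACRES:.0%} of current corn area needed")

# Scenario 2: global yields, 30% waste
global_kcal_per_acre = US_KCAL_PER_ACRE * GLOBAL_BU_PER_ACRE / US_BU_PER_ACRE
realistic = people_fed_per_acre(global_kcal_per_acre, 0.30)
print(f"Global yield, 30% waste: {realistic:.0f} people/acre, "
      f"{POPULATION / realistic / GLOBAL_CORN_ACRES:.0%} of current corn area needed")
```

Under these assumptions the first scenario gives roughly 20 people per acre and about 90% of the current global corn area, while the second requires roughly 2.2 times the current corn area – in line with the ~121% increase described above.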

If our dietary requirements can be met by corn alone, the increase in land use won’t be an issue – land currently devoted to soy, peanuts or wheat can be converted to corn. Yet this simplistic argument for vegetarianism/veganism suffers from three major flaws. Firstly, it assumes that the global population would be willing to forgo lower-yielding vegetable crops that add variety to the diet – where are the artichokes, kale or radishes in this dietary utopia?

Secondly, as noted by Simon Fairlie in his excellent book, converting to a vegan monoculture would significantly increase the reliance on chemical fertilizers and fossil fuels due to a lack of animal manures. Given current concern over dwindling natural resources, this is an inherently unsustainable proposition.

Finally, corn is currently vilified by many food pundits. The suggestion that our food supply is controlled by corporations who force monoculture corn upon hapless farmers, who are then faced with the choice of complying with big ag or being forced out of business, is the purview of food pundits (e.g. Michael Pollan and Joel Salatin) and documentaries such as Food Inc. and King Corn. Not a week goes by without another Mommy blogger or journalist publishing an article on the dangers of high-fructose corn syrup, often concluding that if only this ingredient was removed from our food supply, childhood obesity and pediatric type II diabetes would cease to be an issue for little Johnny and Janie pre-schoolers of the future.

It’s frighteningly easy to visualise the Soylent Green-esque vegan future, whereby food is doled out in pre-measured quantities according to dietary requirements – yet what happens when the whistle-blower of 2050 proclaims “it’s made from CORN!”?

Can Population Control Shrink the Yield Gap? Developing Solutions for Developing Regions

A recent article by Andrew Jack in the Financial Times (the iconic pink-colored newspaper carried by all self-respecting businessmen on London tube trains each morning) reports that developing countries in Africa will be responsible for the greatest increases in population growth over the next 90 years, with the global population predicted to hit 10 billion by 2100. Given the number of dire predictions currently being bandied around regarding population increases, food demand and climate change, it’s easy to become blasé and dismiss it as just another issue that will be solved by the next generation. After all, what difference can we possibly make to rural communities in developing countries?

Rises in per capita income are predicted for developing countries over the next century and, as Andrew Jack notes, increasing affluence results in improved healthcare, urbanisation and a decline in birth rates. Nonetheless, this pattern is not currently being exhibited in regions of Africa and S. Asia, which are unable to improve per capita income because population increases outpace economic growth.

So how do we curtail the rise in population growth? Even in a developed country such as the USA, mentioning the politically-charged phrase “population control” often leads to an uncomfortable silence, images of an Orwellian constraint on family size (1 child good, 2 children bad?) or debate over the rights and wrongs of abortion. Yet simply providing access to contraception, considered to be an inherent human right by many women in the developed world, could conceivably (pardon the pun) improve future sustainability.

Reducing the number of children born in developing countries would improve female health and lessen the burden placed on economic and environmental resources by increasing population size. This would be a crucial step forwards in mitigating future food shortages, yet it does not solve the underlying problem. One of the major drivers behind population growth is parental reliance on support from the younger generation. As discussed in Jared Diamond’s excellent book “Guns, Germs and Steel”, higher birth rates potentially allow for a greater number of children to work on the land and improve societal stability. However, the promise of future affluence is a major stimulus that causes young people to migrate from rural areas to urban regions that are already suffering from significant overcrowding.

A considerable yield gap exists between developing and developed countries, both in terms of animal and crop production. If productivity could be improved and the yield gap reduced, food security would improve in rural areas with positive effects on per capita income and infrastructure, and potential reductions in the number of inner-city migrants. Nonetheless, the major question remains unanswered – how to improve productivity?

If we use dairy farming as an example, a recent report from the Food and Agriculture Organisation noted a negative relationship between productivity and carbon footprint – as we move across the globe from developed to developing regions, productivity decreases and the carbon footprint per kg of milk increases (see graph). Carbon can also be considered a proxy for land, water and energy use, thus reduced productivity increases both resource use and environmental impact. The logical conclusion would therefore be to implement systems and management practices currently seen in N. America, Europe and Australasia in an attempt to educate farmers in S. Asia and Africa. However, sustainability cannot be improved by implementing a one-size-fits-all solution, but calls for a region-specific approach.

Identifying traits within indigenous and imported animal breeds and plant varieties that will make the best use of available resources allows development of production systems that match environmental, social and economic demands. Further implementation of continuous improvement and best management practices in developing countries is crucial in order to improve affluence, reduce population growth and mitigate environmental impact both now and in future. However, this is not a situation that can simply be solved by improving yield per acre, but requires a multifactorial approach coordinating population control, education, food security and human health and welfare. Providing contraception without a concurrent effort to improve agricultural productivity will fail in its prophylactic intent to control either population growth or world hunger.