Livestock provide food, income, education, cultural status…and hope.

African farmland

I took the photo above while travelling in South Africa last year. Whenever I’m faced with the inevitable “But we can just grow corn and soy to feed humans!” anti-livestock rhetoric (as seen in The Guardian this past week), I’m reminded of this picture. It shows tiny rural homes on the edge of a major road, along which the majority of people walk to work, dodging the traffic as they go. The land is rocky, steep and nutrient-poor, the soil capable of producing only fibrous grasses that can’t be eaten by people. Yet, another few hundred yards down the road, we came across a goat.

African goat

For many people in low-income countries, a goat is a lifeline. A source of food that improves the nutrition and health of young children, pregnant women and elderly people. A source of income to allow children to attend school and have a future career, rather than working to support their family before the age of 10. A source of security that allows for improved mental health, female independence and cultural status. Last week I spoke at a Cheltenham Science Festival panel entitled “Should we all become vegan?” It’s easy to suggest that many of us in the developed world could eat less meat. However, the myriad benefits provided by livestock to people in low-income regions should not be forgone on the grounds of foodie ideology imposed by those of us living in developed regions.

I’m pleased to see Prue Leith, Jenny Eclair, Bob Geldof, Jonathan Dimbleby and others lending their support to Send a Cow’s #UnheardVoices campaign. Let’s recognise livestock’s role in giving hope to those who need it most – and make those voices heard.

Cutting Meat? Or Just Cutting Corners?

It is equally interesting, inevitable and lamentable to see that another study has come out claiming that the only way to reduce climate change is to cut meat consumption per person.

Meat consumption appears to be the only human activity subject to continuous “we must cease/reduce this” claims on the basis of environmental impacts. Compared with other greenhouse gas sources, a considerable proportion of emissions comes from transportation. Yet rather than insisting that every car-owner cut their annual mileage by 25%, the focus has been on reducing emissions by producing more fuel-efficient vehicles. Similarly, no one has yet claimed that we should reduce household lighting by four hours per day, but the compact fluorescent lightbulb (CFL) has become the poster child for improving household energy efficiency.

We have demonstrable proof that beef and dairy producers have reduced greenhouse gas emissions (as well as land use, water use, energy use, etc.) over time through improved efficiency, and can continue to do so into the future. So why are the gains made by livestock producers dismissed, and reduced meat intakes seen as the only solution? I have an absolute hatred of conspiracy theories, but it is difficult not to see a latent agenda in the preponderance of “Cut meat consumption” papers. Jumping on the bandwagon? Promoting individual dietary opinions as science? Or simply bowing to NGO/media opinions and looking for easy funding and publicity?

As the global population increases to over 9.5 billion people by 2050, with the majority of this growth occurring in the developing world, the demand for milk, meat and eggs is going to increase by 60%. If we are serious about cutting greenhouse gas emissions, it’s time to examine the impacts of all of our actions and concentrate on further efficiency improvements rather than constraining dietary choice.

Does Nature Really “Do Its Thing” on Organic Farms?

I am lucky. My fridge is full of food: mostly produced in the UK or Europe; all nutritious, safe and affordable; and almost all produced on conventional farms, with a small amount of organic food (in my case, chocolate). Given that you’re reading this, I’ll hazard a guess that you are lucky too. 795 million other people can’t say the same thing – and feeding all the people on a planet where 1 in 9 don’t currently have enough food is, in my view, our biggest challenge.

The fact that we face this challenge makes me really irritated when celebrity chefs who could make a huge difference bow instead to popular rhetoric. In his latest blog post, mockney chef and food pundit Jamie Oliver proclaims that “…organic food is natural food, where nature has been allowed to do its thing, and I’m sure most of us will agree that putting natural ingredients into our bodies is only going to be a positive thing.”

If we ignore the nonsensical claim that natural ingredients produce positive results (Really? Let’s examine puffer fish, Solanaceae poisoning, dangerous fungi, absinthe, the many consequences of obesity…), let’s simply look at his claim that organic food is natural. Except, well, it’s not. Agriculture first developed ~12,000 years ago, and ever since then farmers have been doing their best to breed crops and animals that are best suited to their farming system, whether it’s organic or conventional. Want dairy cows that produce high-quality milk from grazing pasture; leaner pork chops; or strawberries that can survive supermarket handling? You’ve got it. All achieved through traditional breeding techniques (otherwise known as “nature doing its thing”): noting that plant or animal A has desirable characteristics and breeding it with plant or animal B to (hopefully) produce better offspring. No scary chemicals, scientists with syringes or genes in test-tubes. Every farm in the world is founded on “nature doing its thing” – not just the organic farms. We can argue whether GMO crops are natural (breeding techniques are simply more refined and specific) or not (scientists playing god…), but that argument becomes redundant in the EU and many other regions, where GMO crops are not approved.

Can organic producers use pesticides? Yes, if they’re compounds approved for organic production (e.g. highly-toxic copper-based fungicides). Can they use antibiotics and wormers? Again yes, if a proven disease problem exists (note that rules differ slightly between the UK and USA). Are organic farmers just merrily sitting back and letting their crops cross-pollinate and reseed, and their bulls run around happily “doing their thing” with whichever cow they come across? No. It’s a beautiful bucolic image to suggest that organic farmers are happily working with Mother Nature whereas conventional farmers have an evil scientist sitting on one shoulder and a big agribusiness corporation on the other, but it’s simply not true.

According to Mr Oliver, “…the simple fact is that often we don’t actually have to interfere with nature.” The idea of a world where we could feed over 7 billion people without having to invest any research dollars into improving food production is lovely, but it’s smoke and mirrors. At the most basic level, what happens if we don’t “interfere” by controlling weeds (whether by chemicals, mechanical tillage or human labour)? Crop yields are reduced, food production goes down and we feed and clothe fewer people. What happens if a cow has problems giving birth? In nature, she dies. On a farm (whether organic or conventional) both she and the calf are saved, providing milk and meat for us to eat. According to the World Organisation for Animal Health, 20% of global animal protein losses are due to diseases for which treatments already exist – we simply need to make them available to every farmer worldwide. Just think how many more people we could feed if we interfered with nature in that way.

Huge amounts of research monies are invested each year to find ways to improve food production on both organic and conventional farms worldwide. Some are highly technical, others are simple, but all are contributing to the goal of feeding the world. Unfortunately, when food pundits jump on the “let’s promote system X” bandwagon as Mr Oliver has done with organic production, using persuasive but false arguments, we lose traction in fulfilling the real goal. Rather than arguing about which foods we can/should be buying, we need to accept that there’s a place for all systems; examine the ways in which all systems can improve soil fertility, animal health and environmental impacts; and make faster progress towards feeding the world while still enjoying our food choices.

Who Needs Scientists? Just Let Mother Nature Design Your Greek Yogurt.

“How you get to 100 calories matters. Most companies use artificial sweeteners. We think Mother Nature is sweet enough.” Clever marketing from the Greek yogurt company Chobani, simultaneously disparaging alternative brands and playing the ultimate caring, sharing, natural card with the mention of “Mother Nature”. However, earlier this week, Chobani’s #howmatters hashtag set the Twitter feeds alight after their new “witty” tagline on the underside of yogurt lids was posted (below).

The wording plays beautifully into what is fast becoming a universal fear of science intruding on our food supply – we want real food; food like our grandparents ate; food from traditional breeds and heirloom varieties – provided it doesn’t take us over 2,000 calories per day or increase our cholesterol levels. Rightly or wrongly, many people blame processed foods with hidden sugars and added chemical preservatives for many health issues in developed countries – the epitome of a #firstworldproblem, given that the corresponding #thirdworldproblem is hunger and malnutrition.

However, this time the Twitter anger wasn’t from rampaging mommy bloggers or infuriated activists, but scientists. After all, without science, would Chobani have a product? Yogurt was first developed in ancient times, but the modern pasteurized, long-shelf-life Greek yogurt is rather different to the cultured milk our ancestors would have enjoyed.

I have a 100-calorie Greek yogurt from a rival brand in my fridge, so let’s examine the ingredients (left): simply pasteurized skimmed milk and live active yogurt cultures (note, no added sweeteners). Louis Pasteur, a 19th-century French scientist, developed pasteurization (in addition to his discoveries relating to vaccines and microbial fermentation); biologists developed methods to identify and classify the bacteria that ferment milk into yogurt; and food scientists experimented with the exact mixture of bacteria to produce the desired flavor, texture and color of yogurt, as well as developing the range of other processes needed to make the yogurt safe, appealing and shelf-stable.

Yes, we could make Greek yogurt without scientists – after all, the original recipe didn’t originate in a corporate experimental kitchen. But without hundreds of years of scientific input, could we make Greek yogurt that, at 100 calories per serving, is desirable to the consumer and is a safe, affordable source of vitamins, minerals and protein? No. To imply that we could does a huge disservice to food scientists.

It appears that being a modern-day scientist is somewhat equivalent to clubbing baby seals to death. Caring little for human suffering and illness, the cold and clinical scientist rubs his hands together with glee as he removes all nutrients from real food, replacing them with chemicals, additives and genetically-modified ingredients. As a sideline, he develops cocktails of toxic elements, pesticides and embalming fluid and markets them as vaccines. Yes, science is the enemy. Just remember that next time you take an aspirin for a hangover from pasteurized, fermented beverages.

How Long is Long-Term? Are We in Danger of Sacrificing Food Security to Satisfy GMO Paranoia?

My Twitter feed is being taken over by two things: 1) arguments and 2) comments that are going to cause arguments. Almost every tweet appears to draw a contrary comment – I’m tempted to post “Elephants have four legs and one trunk” just to see how many people reply “No, there’s an elephant in South Africa called Minnie who only has three legs but has two trunks…”

The latest discussions (debates? arguments? long drawn-out 140-character battles?) have related to the safety of GMOs. Without exception, the argument from the naysayers comes down to “We don’t know what the long-term effects are, so we should ban them until we can conclude that they’re safe.”

In other words, we’re trying to prove a negative – show me that there are no adverse effects whatsoever and I’ll believe it’s ok. Utterly impossible. Can you be absolutely sure that the screen you’re reading this on isn’t causing constant, minute but irreparable damage to your eyes? Water, that essential nutrient without which humans, animals and plants would die, can kill through drowning or intoxication. Even oxygen, without which brain cells are irretrievably damaged in just 10 minutes, causes seizures and death when inhaled at high pressures. Should we ban these, just in case?

Perhaps we should take a long-term approach to all new technologies. iPhones were only introduced seven years ago, yet many of us spend considerable amounts of time typing on them, or holding them to our ears when they’re not in our pockets – what health-damaging consequences could these shiny new toys confer? What about the now-ubiquitous hand sanitizer? Once only the province of hospitals and germophobes, it’s now sloshed around by the gallon. Touted to kill 99.9% of harmful bacteria – what harm could those chemicals be doing to our fragile physiology?

I’ve yet to meet anybody who, when scheduled for quadruple bypass surgery, demanded that the surgeon use only techniques developed in 1964; or a type I diabetes sufferer who would only use insulin produced from pigs, as it was originally in 1923. When I was treated for breast cancer, I jumped at the chance to be part of a clinical trial involving a new monoclonal antibody treatment, regardless of the very slight risk of heart damage. In medicine, we seem happy to trust that science has the answers – not surprisingly, we prefer to survive today and take our chances with side-effects tomorrow.

With regard to food, however, the opposite appears to be the case. The first commercial GMO (the Flavr Savr tomato) was introduced in 1994, GM corn and soy were commercialized in 1996, and not one death or disease has been attributed to any of these crops. Yet the “what are the long-term effects?” concern still persists. So how long-term is long enough? 10 years? 20? 50? Should we keep researching and testing these crops for another 80+ years before allowing them onto the market around the year 2100?

If your answer is yes, just pause for a moment and ask your parents, grandparents or even great-grandparents what life was like during the Great Depression in the USA, or World War II in Europe. Consider what life was like when food was scarce or rationed, when, for example, a British adult was only allowed to buy 4 oz of bacon, 8 oz ground beef, 2 oz each of butter and cheese, 1 fresh egg and 3 pints of milk per week. Those quantities of meat and cheese would only be enough to make two modern bacon cheeseburgers.

By 2050, the global population is predicted to be over 9 billion people. I don’t relish the idea of explaining to my grandchildren that they live with food scarcity, civil unrest (food shortages are one of the major causes of conflict) and malnutrition because public paranoia regarding GMOs meant that a major tool for helping us to improve food production was removed from use. In the developed world we have the luxury of choosing between conventional, natural, local, organic and many other production systems. However, we’re in danger of forgetting that not everybody has the same economic, physical or political freedom to choose. If you gave a basket of food to a family in sub-Saharan Africa subsisting on the equivalent of $30 per week, would they refuse it on the basis that the quinoa wasn’t from Whole Foods, the meat wasn’t organic and the tofu wasn’t labeled GMO-free?

When we have sufficient food being supplied to everybody in the world to allow them to be healthy and productive, we can then start refining the food system. Until then, the emphasis should be on finding solutions to world hunger, not forcing food system paranoia onto those who don’t have a choice.

Are We Producing More Food…and Feeding Fewer People?

I’m ashamed to admit that the picture to the left is of the lunch table that a media colleague and I left last week – after spending an hour lamenting the fact that in the US, 40% of food is wasted (30% globally). Admittedly, that waste isn’t all down to restaurant portions (in our defense, we both had to fly home, so doggie bags weren’t an option) – however, according to FAO data here, consumer waste accounts for anything between 5% (sub-Saharan Africa) and 39% (North America and Oceania) of total waste. The difference (anything from 61% to 95%) is made up of losses between production and retailing.

Losses from production to retail comprise by far the biggest contribution to waste in the developing world, which makes absolute sense – if food is your biggest household cost and hunger is a constant and real danger, the concept of wasting purchased food would seem ridiculous. In the developing world, a myriad of factors play into food insecurity, including low agricultural yields, lack of producer education (particularly for women, who are often the main agricultural workers), political instability and military conflict (Pinstrup-Andersen 2000). However, possibly the biggest threat to food security is a lack of sanitary and transport infrastructure (Godfray et al. 2010) – building a milk pasteurization plant is a great opportunity to improve shelf-life, but it can only be effective if producers have the facilities to refrigerate and transport milk. Improving tomato yields can reap economic dividends, but if the tomatoes are transported to market packed into plastic bags on the back of a bicycle, the wastage is huge. I’m not going to pretend I have the solutions to global food wastage, but what can we do in our own households?

Just as our grandparents learned during WWI and WWII – when food is scarce, you make the most of every single drop of milk or ounce of grain. Yet in the modern developed world, we can afford to waste almost 2/5 of our household food through not understanding expiration dates (cheese does not spontaneously combust into a listeria-ridden ooze at midnight on the day of the expiration date); throwing away the “useless” parts of food waste (radish leaves and wilted celery are actually really good in soup); or simply buying more than we need. In a recent study of greenhouse gases associated with US dairy production, the carbon footprint of a gallon of milk was increased by almost 20% simply because of the amount of “old” milk that consumers poured down the sink each day.
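For the curious, the arithmetic behind that last figure is straightforward: every gallon actually drunk has to carry the footprint of the milk that was produced only to be poured away. Here’s a minimal sketch (both numbers below are illustrative assumptions of mine, not the study’s own figures):

```python
# Illustrative only: how consumer-level waste inflates the footprint of each
# gallon of milk actually drunk. Both figures below are assumptions for the
# sketch, not the study's own numbers.

def footprint_per_consumed_gallon(footprint_per_produced_gallon: float,
                                  waste_fraction: float) -> float:
    # 1 / (1 - waste) gallons must be produced for every gallon consumed.
    return footprint_per_produced_gallon / (1.0 - waste_fraction)

base = 17.6  # lb CO2e per produced gallon -- hypothetical value
inflated = footprint_per_consumed_gallon(base, waste_fraction=0.167)
print(round(inflated / base, 2))  # 1.2 -- i.e. the ~20% increase cited above
```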

To go back to the picture above, it’s tempting to blame the restaurants – portion sizes tend to be huge, so in this carb-conscious world, it’s not “our fault” if we forgo the last 500 calories by leaving half a plateful of potato chips – they should have just served a smaller portion in the first place, right? Well, maybe. If we’re feeding dairy cows or beef cattle and seeing more than 5-10% feed unconsumed, we’ll reduce the amount fed. I’m sure that exactly the same practice would pay dividends in the restaurant world, and I’d be willing to bet that they could charge exactly the same price.

I spend most of my time myth-busting, showing that the modern beef and dairy industries are far more efficient than the farming systems of 40 or 70 years ago and that we now produce more food using far fewer resources. However, are we really feeding more people if we’re wasting 40% of our food? To suggest that we return to a practice from the WWII era feels almost heretical, but here’s an idea – rather than defining “sustainable” systems as those producing artisan cheeses from heirloom breeds cared for by hemp-wearing liberal arts graduates, why doesn’t every restaurant (or suburb) have a small herd of backyard pigs? Collect the waste food, boil it for 30 min to avoid disease issues, feed it to pigs, produce bacon. What could be better? Admittedly, my mother country has banned this practice (I’m beginning to wonder if anything will be permissible in Europe soon), but let’s start the pigswill revolution! Doesn’t “You don’t have to eat that last potato, it’ll make some really good bacon and help us feed those 1 in 7 kids in our local area who don’t have enough food” sound more realistic than “Think of all the starving orphans who would enjoy your PB&J sandwich” (to which the continual smart-a** answer was “I’ll just mail it to them”)? Let’s do what the livestock industry does best – recycle waste resources to make safe, affordable, nutritious meat!

Can We Please Have Calls for Moderating Meat Consumption… in Moderation?

Do we need to moderate meat consumption in order to feed the world in 2050? Given beef producers’ track record of ingenuity, it’s possible but not probable.

A Twitter follower (Tweep? Twriend? Twquaintance?) asked yesterday whether we could really supply 9+ billion people with 250 lb of meat per capita in 2050. The question stemmed from a recent paper in which Stockholm scientists claimed that we would all have to reduce meat consumption by 75% by 2050 in order to have enough water to supply the population, and a subsequent rejoinder from the American Society of Animal Science in which several scientists noted the flaws in the Swedish paper, the importance of animal-source foods in the diet and the use of marginal land for grazing livestock.

On Twitter, the comment was made that there appear to be two distinct sides to this argument – one side (the environmentalists and anti-animal agriculture groups) warning that we need to drastically cut meat consumption in order to feed everybody, and the other (the meat industry) turning a blind eye and effectively promoting the idea that we can eat all the meat that we like without having any environmental impact.

Globally, we’re nowhere near 250 lb of meat consumption per capita – even US consumers, often portrayed as meat-guzzling bacon-o-philes by the Huffington Post et al., have an average annual consumption of 171 lb according to the USDA. As current beef consumption is 58 lb per capita in the USA, that’s a lot of pork and chicken that will presumably make up the difference. There’s no doubt that increases in both population size and per capita income in regions such as China and India will have a significant impact on global meat consumption by 2050. However, I have to admit I find the “blind eye” comment a little hard to swallow, given, for example, the beef industry’s commitment to measuring and mitigating both resource use and carbon emissions through current life cycle analysis research, and involvement with groups such as the Global Roundtable for Sustainable Beef.

There is no doubt that beef production uses considerable amounts of land and water, yet should we expect producers to effectively shoot themselves in the foot and suggest that consumers forgo a cheeseburger in favor of an alfalfa sprout salad? Isn’t improved efficiency a characteristic of every successful industry? The motor industry is a major contributor to environmental concerns, yet automobile manufacturers aren’t saying “we’re going to produce cars in the same way that we did in the ‘50s, you’ll just have to drive less”. Instead, the message is something akin to “we’re making cars more energy-efficient so that you can continue to drive without worrying about your car’s environmental impact”.

That’s exactly what the beef industry has done, is doing and will continue to do into the future. Since 1977, the US beef industry has cut water use by 12%, land use by 33% and the carbon footprint of one lb of beef by 16%. Providing that producers are still able to use management practices and technologies that improve efficiency, further reductions should be seen in future. Yet we have to look beyond the idea that the USA can feed the world by itself. I’m writing this post from Brazil, which has a huge beef industry, yet on average, Brazilian beef cattle first calve at 4 years of age, only 67% of cows have a calf each year and beef animals take 3 years to reach slaughter weight. Comparisons to the equivalent US figures (2 years, 91% and 15 months respectively) show the potential for amazing reductions in resource use from Brazilian beef production, and this, along with other less-efficient systems, is where we have to focus in future. It’s not about forcing US-style production on every producer; it’s about enabling producers to make the best and most efficient use of resources according to their management system and region. Brazil has just approved the use of beta-agonists in beef production, which will allow the production of more beef using fewer resources. This is just one step on the road to improved efficiency.
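To put those Brazilian and US figures on a common footing, here’s a crude back-of-envelope sketch of my own (it ignores heifer rearing, carcass weights and much else, so it’s an illustration rather than a life cycle analysis): more cow-years per calf born, plus more years to slaughter, means roughly twice the animal upkeep – and thus feed, land and water – per animal produced.

```python
# Crude back-of-envelope: animal-years of maintenance embodied in one
# slaughter animal, using only the figures quoted above. This ignores heifer
# rearing, carcass weights and much else -- a sketch, not a life cycle analysis.

def animal_years_per_slaughter_animal(calving_rate: float,
                                      slaughter_age_years: float) -> float:
    cow_years_per_calf = 1.0 / calving_rate  # cow-years maintained per calf born
    return cow_years_per_calf + slaughter_age_years

brazil = animal_years_per_slaughter_animal(0.67, 3.0)    # ~4.5 animal-years
usa = animal_years_per_slaughter_animal(0.91, 15 / 12)   # ~2.3 animal-years
print(round(brazil / usa, 1))  # ~1.9 -- roughly double the upkeep per animal
```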

So do we need to moderate meat consumption in order to feed the world in 2050? I’d love to be able to answer this by citing a published paper that has taken improvements in meat industry productivity over the next 40 years into account rather than assuming a “business as normal” outcome. In the absence of such a paper, I’ll give a Magic 8-Ball type answer: Given beef producers’ track record of ingenuity, it’s possible but not probable. Globally, there are huge opportunities for improved efficiency and concurrent reductions in resource use from all meat production systems – the key is not to reduce meat production but simply to produce it more efficiently.

All Aboard the “Eat Less Meat” Bandwagon

One of the main criteria for publishing scientific research is that it should be novel, yet not a week goes by without yet another paper concluding that we have to reduce meat consumption in order to mitigate climate change. That’s the headline in media coverage relating to the latest paper from a researcher at the Woods Hole Research Center (published in Environmental Research Letters), which examines nitrous oxide emissions (a highly potent greenhouse gas, GHG) in 2050 under various scenarios.

It’s an interesting paper, not least for some of the assumptions buried within the model. Based on data from the FAO, the author assumes that meat consumption will increase by 14% in the developed world and 32% in the developing world by 2050. Coupled with the predicted global population increase (from the current 7 billion to 8.9 billion in 2050), it’s not surprising that a 50% reduction in meat consumption would be predicted to have a significant effect on total GHG emissions. It’s rather akin to suggesting that each person will own two automobiles in 2050, so we should reduce car manufacture.

However, the more striking result is buried in Figure 1, which shows that if the efficiency of manure management and fertilizer application were improved, this would have a more significant effect on GHG emissions than reducing meat consumption. Given the considerable improvements in cropping practices, crop genetics and yields over the past 50 years, there is absolutely no reason why this should not be achieved in the next 40 years.

Alas, a headline suggesting that agriculture needs to continue to improve manure and fertilizer efficiency just isn’t as sexy as the “eat less meat, save the planet” message so often propounded by the mass media. They say that bad news sells – it’s a shame that the lay press are so enamored with messages that denigrate ruminant production, rather than taking a broader look at the options available for mitigating future climate change.

*Thanks to Jesse R. Bussard for bringing this one to the forefront of my “to do” list.

Is Corn the Soylent Green of the Future?

I recently had the pleasure of watching the 1973 movie Soylent Green starring Charlton Heston. I won’t spoil the ending for those who haven’t seen it, but the overarching premise of a big company controlling the food supply to a hungry, overcrowded future population strongly resembles some of the current claims made by the more vehement foodies. The anti-animal-agriculture activists appear to have a similar agenda – if only food production was “sustainable” (a word with a million definitions) without any of those “factory farms (that) pose a serious threat to health” and red meats that “have been linked to obesity, diabetes, cardiovascular disease, and certain types of cancer”, life would be much sweeter.

So what’s the answer? It’s very simple. All that animal feed could simply be fed to humans. According to Pimentel at Cornell University, 800 million people could be fed with the grain that livestock eat in the USA each year. If we ignore the fact that field corn (fed to livestock) is not the same as sweet corn (the corn on the cob that we eat), and assume that field corn could easily be processed into a human foodstuff, Pimentel is right.

Given the average human nutrient requirement (2,000 kCal/day) and the energy yield of an acre of shelled corn (14.5 million kCal), one acre of corn (at 2011 US yields) could supply 20 people with their energy needs (see table below). On a global basis, we currently harvest around 393.5 million acres of corn, therefore we could supply the entire current global population (7.003 billion) using only 90% of the global corn area. Of course that’s assuming zero waste and US crop yields. If we use a more realistic scenario with global corn yields (85 bushels/acre) and 30% food wastage, we can only feed 12 people per acre and would need to increase current corn acreage by 121% to produce enough food to supply the current population. So what happens by the year 2050 when the population is predicted to reach 9.5 billion people? Assuming that we continue to see increases in yield proportional to those over the past 30 years (30% increase in US yields since 1982), that yield increases are exhibited globally, and that we can cut food waste to 10%, we could feed 15 people per acre and we’ll need to increase corn acreage by 79% to provide sufficient corn to feed the global population.
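For anyone who’d like to check the arithmetic, here’s a minimal sketch. The 14.5 million kCal/acre figure comes from the table above; my added assumption is that it corresponds to a 2011 US yield of roughly 147 bushels/acre, so that other yields can be scaled proportionally:

```python
# A sketch of the back-of-envelope arithmetic above. The 14.5 million kCal/acre
# figure is from the post; I'm assuming it corresponds to a 2011 US yield of
# roughly 147 bu/acre so that other yields can be scaled proportionally.

KCAL_PER_PERSON_PER_YEAR = 2_000 * 365   # 2,000 kCal/day
US_KCAL_PER_ACRE = 14.5e6                # energy yield at 2011 US corn yields
US_BU_PER_ACRE = 147.0                   # assumed yield behind that figure
GLOBAL_CORN_ACRES = 393.5e6              # current global corn area

def people_per_acre(bu_per_acre: float) -> float:
    """Energy-only carrying capacity of one acre, before any waste."""
    kcal = US_KCAL_PER_ACRE * bu_per_acre / US_BU_PER_ACRE
    return kcal / KCAL_PER_PERSON_PER_YEAR

def share_of_current_area(population: float, bu_per_acre: float,
                          waste_fraction: float) -> float:
    acres = population / (people_per_acre(bu_per_acre) * (1 - waste_fraction))
    return acres / GLOBAL_CORN_ACRES

# 2011 US yields, zero waste: ~20 people/acre, ~90% of the current corn area
print(round(people_per_acre(147.0), 1),                      # ~19.9
      round(share_of_current_area(7.003e9, 147.0, 0.0), 2))  # ~0.9

# Global yields (85 bu/acre): ~11.5 (the "12 people per acre" above) before
# waste; with 30% waste, ~2.2x the current area, i.e. the ~121% increase
print(round(people_per_acre(85.0), 1),                       # ~11.5
      round(share_of_current_area(7.003e9, 85.0, 0.3), 2))   # ~2.21
```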

If our dietary requirements can be met by corn alone, the increase in land use won’t be an issue – land currently devoted to soy, peanuts or wheat can be converted to corn. Yet this simplistic argument for vegetarian/veganism suffers from three major flaws. Firstly, it assumes that the global population would be willing to forgo lower-yielding vegetable crops that add variety to the diet – where are the artichokes, kale or radishes in this dietary utopia?

Secondly, as noted by Simon Fairlie in his excellent book, converting to a vegan monoculture would significantly increase the reliance on chemical fertilizers and fossil fuels due to a lack of animal manures. Given current concern over dwindling natural resources, this is an inherently unsustainable proposition.

Finally, corn is currently vilified by many food pundits. The suggestion that our food supply is controlled by corporations who force monoculture corn upon hapless farmers – who are then faced with the choice of complying with big ag or being forced out of business – is the purview of food pundits (e.g. Michael Pollan and Joel Salatin) and documentaries such as Food Inc. and King Corn. Not a week goes by without another mommy blogger or journalist publishing an article on the dangers of high-fructose corn syrup, often concluding that if only this ingredient were removed from our food supply, childhood obesity and pediatric type II diabetes would cease to be an issue for the little Johnny and Janie pre-schoolers of the future.

It’s frighteningly easy to visualise the Soylent Green-esque vegan future, in which food is doled out in pre-measured quantities according to dietary requirements – yet what happens when the whistle-blower of 2050 proclaims “it’s made from CORN!”?

Can Population Control Shrink the Yield Gap? Developing Solutions for Developing Regions

A recent article by Andrew Jack in the Financial Times (the iconic pink-colored newspaper carried by all self-respecting businessmen on London tube trains each morning) reports that developing countries in Africa will be responsible for the greatest increases in population growth over the next 90 years, with the global population predicted to hit 10 billion by 2100. Given the number of dire predictions currently being bandied around regarding population increases, food demand and climate change, it’s easy to become blasé and dismiss it as just another issue that will be solved by the next generation. After all, what difference can we possibly make to rural communities in developing countries?

Rises in per capita income are predicted for developing countries over the next century and, as Andrew Jack notes, increasing affluence results in improved healthcare, urbanisation and a decline in birth rates. Nonetheless, this pattern is not currently being exhibited by regions in Africa and S. Asia, which are unable to improve per capita income as population increases outstrip economic growth.

So how do we curtail the rise in population growth? Even in a developed country such as the USA, mentioning the politically-charged phrase “population control” often leads to an uncomfortable silence, images of an Orwellian constraint on family size (1 child good, 2 children bad?) or debate over the rights and wrongs of abortion. Yet simply providing access to contraception, considered to be an inherent human right by many women in the developed world, could conceivably (pardon the pun) improve future sustainability.

Reducing the number of children born in developing countries would improve female health and lessen the burden placed on economic and environmental resources by increasing population size. This would be a crucial step forwards in mitigating future food shortages, yet it does not solve the underlying problem. One of the major drivers behind population growth is parental reliance on support from the younger generation. As discussed in Jared Diamond’s excellent book “Guns, Germs and Steel”, higher birth rates potentially allow for a greater number of children to work on the land and improve societal stability. However, the promise of future affluence is a major stimulus that causes young people to migrate from rural areas to urban regions that are already suffering from significant overcrowding.

A considerable yield gap exists between developing and developed countries, in terms of both animal and crop production. If productivity could be improved and the yield gap reduced, food security would improve in rural areas, with positive effects on per capita income and infrastructure, and potential reductions in the number of inner-city migrants. Nonetheless, the major question remains unanswered – how do we improve productivity?

If we use dairy farming as an example, a recent report from the Food and Agriculture Organisation noted a negative relationship between productivity and carbon footprint – as we move across the globe from developed to developing regions, productivity decreases and the carbon footprint per kg of milk increases (see graph). Carbon can also be considered a proxy for land, water and energy use, thus reduced productivity increases both resource use and environmental impact. The logical conclusion would therefore be to implement systems and management practices currently seen in N. America, Europe and Australasia in an attempt to educate farmers in S. Asia and Africa. However, sustainability cannot be improved by implementing a one-size-fits-all solution, but calls for a region-specific approach.

Identifying traits within indigenous and imported animal breeds and plant varieties that will make the best use of available resources allows development of production systems that match environmental, social and economic demands. Further implementation of continuous improvement and best management practices in developing countries is crucial in order to improve affluence, reduce population growth and mitigate environmental impact both now and in future. However, this is not a situation that can simply be solved by improving yield per acre, but requires a multifactorial approach coordinating population control, education, food security and human health and welfare. Providing contraception without a concurrent effort to improve agricultural productivity will fail in its prophylactic intent to control either population growth or world hunger.