Famine has returned to Ethiopia; in truth, it never left, nearly a quarter of a century after the world’s pop stars gathered to banish it at Live Aid, raising £150m for relief efforts in 1985. Millions of impoverished Ethiopians face the threat of malnutrition and possibly starvation this winter in what is shaping up to be the country’s worst food crisis in decades. Meanwhile, the government is busy rigging ballots and consolidating its grip as the famished population works for a day’s bread in a countryside whose land has been sold to international land-grabbers for next to nothing.
Famine Cover-Ups vs. Fake Famines
By William Easterly and Laura Freschi
Is Ethiopia having a famine? As is often the case, there are two forces pulling in opposite directions that make the question hard to answer.
On the one hand, the authoritarian government wants to cover up any famine to mute criticism of its performance. Ethiopia is due for elections next year, and the government is determined not to go the way of previous regimes toppled in part because of anger at famines in the 1970s and 1990s. The government’s solution? Prohibit journalists from entering the worst-off areas, and fight tooth and nail with aid agencies to repress or delay information on humanitarian needs.
Complicating the situation further is that the government army is operating against insurgents in the suspected famine areas in the South and cites security reasons for not allowing outsiders to enter, so nobody really knows what is happening there.
On the other hand, NGOs have a well-known tendency to cry wolf and exaggerate—to see famine where there is no famine—perhaps in order to raise more money for their own organizations (I am echoing here fierce accusations of exactly this from Ethiopians I talked to during my visit who were NOT allied with the government).
For example, aid organizations and journalists saw signs of famine in Mali in the summer of 2005. Reuters reported that aid and donations were urgently needed in Mali “where the same famine that struck neighboring Niger is intensifying.” In another article, an Oxfam official was quoted saying: “They say there’s no famine in Mali, but that’s false. People aren’t able to eat for three or four days. Forget the political or academic definitions.” While Mali had suffered a series of droughts and an invasion of locusts which exacerbated the chronic food insecurity there, deaths did not approach famine levels. The predicted high numbers of deaths from famine in Niger in 2005 and Malawi in 2002 also thankfully did not materialize. It’s impossible to know how last minute appeals for funds may have affected these outcomes, but the fact remains that desperate pleas to end exaggerated famines are a blunt instrument in addressing the causes of chronic malnutrition and long term food insecurity.
In his classic book Famine Crimes, Alex de Waal observes that NGOs engage in a “habitual inflation of estimates of expected deaths.” De Waal notes that during the pre-Christmas prime fundraising season, “‘One million dead by Christmas’ … has been heard every year since 1968 and has never been remotely close to the truth.”
Add to the current mix a credulous Western media that is happy to check the “Ethiopia = famine” box and unable to handle subtleties such as the difference between chronic food insecurity or chronic malnutrition and emergency famine. Between unreliable media, NGOs, and government, it is tragically difficult to know when tragedy is happening.
GLOBAL: Analysis: What is a famine?
[Photo: Edward Parsons/IRIN: A skeletal child receives emergency food through a tube at a centre in Niger in 2005]
JOHANNESBURG, 13 May 2010 (IRIN) – Aid agencies and donors have warned of the possibility of a famine in Niger, evoking images of the last food crisis in the Sahelian country in 2005. Some media organizations have already pronounced the current crisis a famine. So, what exactly is a famine?
“There is no clear boundary or definition [of a famine],” said Christopher Barrett, a food aid expert who teaches development economics at Cornell University, in the US.
“Clearly, 1984 in Ethiopia was a famine [a million people died and an estimated eight million were on food aid]; equally clearly, 2009 in the United States was not [the US Department of Agriculture said on average 33.7 million Americans received food vouchers each month in 2009, the highest number ever].”
Barrett said the typical explanation of a famine was “greater than usual mortality that is caused by insufficient availability of or access to food, whether directly due to starvation or far more commonly, indirectly, due to disease or injury associated with severe under-nutrition.”
Stephen Devereux, author of Theories of Famine, a definitive reference book on the subject, noted that dictionary definitions such as “extreme scarcity of food” described a “few symptoms of famine” and selected some factors to “suggest causes”, but failed to provide a “comprehensive and concise” definition.
“A good working definition of famine must describe a subsistence crisis afflicting particular groups of people within a bounded region over a specified period of time,” he wrote.
Hundreds, if not thousands, of researchers, academics and humanitarian aid workers have tried defining it. Devereux quoted an academic as saying that “Famine is like insanity: hard to define, but glaring enough when recognized.”
Why is defining it important?
Controversy has dogged the application of “famine” to several recent humanitarian emergencies: Sudan in 1998, Ethiopia in 1999/2000 and 2002/03, and Malawi in 2002.
In an influential paper in 2004, Devereux and Paul Howe, both researchers at the Institute of Development Studies at the University of Sussex, proposed a scale to measure famines. “Both before and during these [African] crises, observers failed to agree on how serious the situation was, or how serious it was likely to get,” they commented.
A dire lack of food in Niger in 2005 prompted another debate: using their scale, Devereux and Howe pronounced the situation a “famine”, while pointing out that the international community used “less emotive terms, like ‘food crisis’”, possibly under pressure from the government of the day, which did not acknowledge the crisis.
Defining famine is not “merely a semantic issue – these controversies have important implications for famine response and accountability”, and a lack of consensus over the definition has delayed interventions and the distribution of resources during a crisis, Howe and Devereux wrote.
They cited the 1999/2000 food crisis in Ethiopia as an example. “The contentious issue here was that of scale: because the emergency was confined to a single region [Somali in eastern Ethiopia]”, aid agencies avoided the “F-word”, saying the term should be saved for very severe situations.
A retrospective mortality study suggested that aid agencies had responded late in drawing people into relief camps, “where communicable diseases such as measles spread rapidly, contributing to an estimated 19,900 deaths in Gode zone alone [in Ethiopia’s Somali region]”, Howe and Devereux said.
The 1984/85 Ethiopian famine was another tragic example of donors responding only when people started dying. “In the light of this ‘No corpses, no food aid’ myopia (not to mention callousness),” Devereux said in his book, he had to agree with Alex de Waal, the British writer and researcher and author of Famine Crimes: Politics & the Disaster Relief Industry in Africa, “who concludes pessimistically that there is no good definition on which to make a diagnosis of impending famine.”
The search for a definition is not merely a “technocratic or instrumentalist concern – it has political significance”, Howe and Devereux noted in their 2004 paper.
Who do you hold accountable for the deaths from a famine?
Governments and agencies [among] whose job[s] it is to prevent famines “have often exploited the ambiguities in the term to contest whether a famine has occurred, thereby evading even limited accountability for their action – or inaction.”
Accountability, even after [an estimated] 70,000 deaths related to a famine in Sudan in 1998, took the “‘soft’ form of internal agency evaluations and lesson-learning workshops”.
Most definitions of famine had centred on a lack of food, but in the past 25 years the thinking on food security has shifted to the link between poverty and vulnerability rather than low food production, Barrett wrote in a paper. This stemmed directly from a “pathbreaking” book by economist and Nobel Laureate Amartya Sen, Poverty and Famines: An Essay on Entitlement and Deprivation, published in 1981.
Sen’s “famous opening sentences underscore that ‘starvation is the characteristic of some people not having enough food to eat. It is not the characteristic of there being not enough food to eat. While the latter can be a cause of the former, it is but one of many possible causes’.”
Alternative approaches to defining a famine
In the absence of agreement on a workable definition, various approaches have evolved to help humanitarian actors respond to a crisis in time: theoretical, coping strategies, nutrition surveillance, and early warning systems, Devereux commented. According to Sen, theoretical approaches describe the situation but do not provide a “diagnosis”.
The “coping strategies” approach evolved in the 1980s and ’90s, when researchers assessed the response of people affected by a food crisis at various stages to predict a famine.
A typical early-stage response was rationing food and looking for other sources of income. If the crisis persisted, people sold assets.
The next stage was dependence on food aid. If that failed, starvation and famine followed. Howe and Devereux pointed out that these indicators could not be applied universally and were context-specific.
The “nutrition surveillance” approach relied on nutritional data [particularly measurements of children against tabulated benchmarks] and used indicators that could be applied universally. These were used by the UN clearing-house, the Refugee Nutrition Information System (RNIS), and the World Food Programme, but had limitations in predicting or defining a famine, the researchers noted.
There were “no generally accepted criteria of what rates of malnutrition or mortality indicate specifically that a famine has started. Even within the humanitarian community, some nutritionists and epidemiologists favour a crude mortality rate (CMR) of one or more deaths daily per 10,000 people as the cut-off, rather than the two deaths per 10,000 proposed by the RNIS,” Devereux and Howe said.
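The crude mortality rate quoted above is a simple ratio, and the disagreement is only over where to draw the cut-off. As a minimal sketch (the function name and the sample figures are illustrative, not from any agency's data), the calculation and the two competing thresholds look like this:

```python
def crude_mortality_rate(deaths, population, days):
    """Crude mortality rate (CMR) expressed as deaths per 10,000 people per day."""
    return deaths / population / days * 10_000

# Illustrative numbers only: 150 deaths in a population of 100,000 over 30 days.
cmr = crude_mortality_rate(150, 100_000, 30)   # 0.5 deaths per 10,000 per day

print(cmr >= 1.0)   # the stricter cut-off some epidemiologists favour -> False
print(cmr >= 2.0)   # the RNIS-proposed threshold -> False
```

The same population would need 600 deaths over that month to reach the RNIS threshold of two per 10,000 per day, which illustrates how far apart the two proposed cut-offs sit in practice.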
Most nutritional indicators dealt with children aged between six and 59 months, said the researchers, citing the Sphere Project, a voluntary initiative aimed at improving the humanitarian aid system, which noted that there were no agreed definitions and thresholds for moderate and severe malnutrition among older population groups.
Devereux pointed out that when faced with a food crisis, the adults in a household ate less to ensure that their children had enough food.
“During famines, therefore, child malnutrition might be a ‘trailing indicator’ that may not manifest itself until well after adult nutrition status has deteriorated significantly.”
Nutritional data could also not function as a reliable indicator of a famine because a child could be malnourished as a result of factors besides a lack of food, such as disease.
The “early warning systems” approach to a famine evolved from the Indian Famine Codes developed in the 1880s by the British colonial regime, according to Devereux.
The Famine Codes described three levels of food insecurity – near scarcity, scarcity, and famine – using indicators such as three successive years of crop failure, crop yields, the numbers of people affected, and food prices, but the measurement of these indicators was highly subjective.
Various agencies – the UN Food and Agriculture Organization (FAO), the Famine Early Warning Systems Network (FEWSNET), the UK-based aid agency Oxfam, and Save the Children (UK) – have developed systems to predict a famine, but most do not have a precise definition of famine.
One of the most successful prediction tools was the Turkana District Early Warning System, based on the Indian Famine Codes and used in the pastoralist areas of northern Kenya, according to Howe and Devereux.
It monitored indicators such as rainfall; market prices and availability of cereals; livestock production, purchases and sales; rangeland condition and trends; ecological changes; and enrolment in food-for-work projects. The system identified three levels of crisis – alarm, alert, and emergency – each associated with a pre-planned set of ‘off-the-shelf’ responses.
Devereux and Howe used a combination of the crude mortality rate and nutritional data to draw up their scale for measuring the intensity of a famine.
Since then, FAO had done some inter-agency work to develop the famine scales into the Integrated Phase Classification (IPC) Tool, Devereux told IRIN. It also took into account the indicators in an early warning system developed by the Food Security Analysis Unit (FSAU) [led by] the FAO in Somalia.
The IPC used a number of indicators to pronounce a famine, including acute malnutrition in more than 30 percent of children, two deaths per 10,000 people every day, pandemic illness, access to less than four litres of water and fewer than 2,100 kilocalories of food per person per day, large-scale displacement of people, civil strife, and the complete loss of assets and sources of income.
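To show how the quantitative cut-offs listed above combine, here is a minimal sketch of a threshold check. All names are illustrative, not official IPC terminology, and the real IPC also weighs qualitative evidence (pandemic illness, displacement, civil strife, asset loss) that this sketch omits:

```python
from dataclasses import dataclass

@dataclass
class AreaIndicators:
    """Hypothetical per-area figures; field names are illustrative."""
    acute_malnutrition_pct: float   # % of children acutely malnourished
    cmr_per_10k_per_day: float      # crude mortality rate
    water_litres_per_day: float     # litres of water per person per day
    kcal_per_day: float             # kilocalories of food per person per day

def meets_famine_thresholds(a: AreaIndicators) -> bool:
    """Check only the numeric cut-offs quoted in the text, not the full IPC."""
    return (a.acute_malnutrition_pct > 30
            and a.cmr_per_10k_per_day >= 2
            and a.water_litres_per_day < 4
            and a.kcal_per_day < 2100)

print(meets_famine_thresholds(AreaIndicators(35.0, 2.4, 3.0, 1800)))  # True
print(meets_famine_thresholds(AreaIndicators(12.0, 0.4, 6.0, 2300)))  # False
```

The point of combining indicators this way is precisely what the article describes: no single figure (mortality alone, malnutrition alone) is allowed to trigger the “famine” label on its own.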
Daniel Maxwell, an associate professor at the Friedman School of Nutrition Science and Policy at Tufts University in the US, said the IPC definition was the “best consensus”, and was “now widely adopted around the world by FAO”.