FOCUS: Civil War Has Broken Out Inside the Democratic Party.

Sunday, 25 June 2017 10:46
Excerpt: "America is in the middle of a major political realignment. While the focus is on the Republican party's internecine fight among corporate realists, political ideologues and the wild-card president, it is a mistake to assume that the Democrats are going to sweep into office in 2018 and 2020 to replace the corroding Republicans."
Supporters of US senator and presidential candidate Bernie Sanders cheer at a recent speech. (photo: Jim Young/AFP/Getty Images)

Civil War Has Broken Out Inside the Democratic Party.
By Heather Cox Richardson, Michael Cohen and Jean Hannah Edelstein, Guardian UK
25 June 17
Last week’s defeat in a high-profile congressional contest sparked a tough fight over the heart of the Democratic party. Heather Cox Richardson, Jean Hannah Edelstein and Michael Cohen look at what the future might hold
America is in the middle of a major political realignment. While the focus is on the Republican party’s internecine fight among corporate realists, political ideologues and the wild-card president, it is a mistake to assume that the Democrats are going to sweep into office in 2018 and 2020 to replace the corroding Republicans. The Democrats are also in a profound struggle over their future.
The 2016 election marked the end of a political era. Just as Republicans expecting an easy nomination of Jeb Bush in 2016 were blindsided by the rise of charismatic outsider Donald Trump, so too were Democrats expecting the easy nomination of Hillary Clinton surprised by a powerful challenge from elderly Vermont socialist Bernie Sanders. Both Trump and Sanders ran on powerful populist messages, slashing at politics-as-usual and bemoaning that Washington served the wealthy. Democratic primary rules put in place after the party’s disastrous nomination of South Dakota senator George McGovern in 1972 meant that, unlike Republican leaders who were incapable of stopping Trump, establishment Democrats could hold off the Sanders surge. But the insurgency opened a rift in the party.
The election of Trump exacerbated the Democrats’ intra-party conflict as Sanders supporters insisted that he could have won, while Clinton supporters dismissed those claims, pointing out that, among other things, Sanders never had to endure an opposition news dump. The two sides squared off in February, three months after the election, over the chairmanship of the Democratic National Committee. This position, contested for the first time since 1985, tossed new names to the front of the party. Ultimately, the choice came down to establishment-backed Tom Perez, President Obama’s secretary of labor, or Minnesota representative Keith Ellison, the first Muslim elected to Congress. Perez won 235 votes to Ellison’s 200, and then, acknowledging the tensions in the party, tapped Ellison to be deputy chair.
Ellison pledged support for Perez, but cooler heads have not prevailed. Last week, when 30-year-old political newcomer Jon Ossoff lost a special election to reactionary Republican Karen Handel in Georgia’s 6th district, Democratic critics laid blame for the loss not on the nature of the district (staunchly Republican – it was once Newt Gingrich’s seat) but on the toxicity of House minority leader Nancy Pelosi.
To understand what’s going on now, it might make sense to return to pre-war America, since the Democrats, like the rest of America, are coming to grips with the end of the New Deal era. The party came out of the 1930s having created a new, activist liberal state designed to prevent the return of the great depression by using the government to defend the rights of labour and level the economic playing field that had tilted so steeply toward the wealthy. This liberal state was wildly popular, so popular that Republican Dwight D Eisenhower felt obliged to adopt and expand its premises.
With the country firmly behind what was known as the “liberal consensus”, Democrats continued to expand FDR’s New Deal, recognising that economic fairness required ameliorating racial inequality. When Republicans ran the reactionary Barry Goldwater against President Lyndon Baines Johnson in 1964, the resulting landslide gave Democrats a super-majority in Congress. Working with moderate Republicans to cut racist southern Democrats out of their centrist coalition, they passed the 1964 Civil Rights Act and the 1965 Voting Rights Act, and launched LBJ’s War on Poverty.
But, in part because of the economic prosperity it created, this centre did not hold. In 1968, Republican candidate Richard M Nixon attacked it from the right by bringing white racists into his party, while Democrats destroyed it from the left by shattering over the Vietnam war. Angry at the establishment Democratic hawks who had carried the nation to war in southeast Asia, affluent American youth flocked to the standard of anti-war Minnesota Senator Eugene McCarthy, a Democrat.
The outcome was a free-for-all for the party leadership. President Johnson withdrew from the race, to be replaced by his vice-president, Hubert Humphrey; Senator Robert Kennedy jumped in to challenge McCarthy only to be assassinated. The Democratic National Convention dutifully nominated establishment candidate Humphrey, but the mayor of Chicago, where the convention was held, turned police against the protesters who descended on his city. The resulting violence enabled Republicans to tar the Democratic party as an elite establishment using tax dollars to cater to lawless thugs. The election went Nixon’s way.
In 1972 the Democrats continued to move away from their traditional defence of labour towards social issues, and they haemorrhaged voters. In that year, anti-establishment candidate Senator George McGovern won the party’s nomination with the support of young activists, only to go down to such a sweeping popular defeat that the party establishment created “superdelegates” – party warhorses and leaders who would also vote on nominees and, presumably, avert another disaster like that of 1972.
Democrat Jimmy Carter won the presidency in 1976 after Nixon’s spectacular implosion over Watergate, but the party’s crumbling coalition was no match for the rise of Movement Conservatives. Their narrative was simple: the Democrats’ New Deal government redistributed tax dollars from hardworking white men to lazy minorities and women. This easy – and false – explanation for the economic stresses of the 1970s drained working-class Americans away from the Democrats and into the party of Ronald Reagan. And there they stayed, for the most part, even as neoliberalism gutted the American middle class.
As they did so, Democrats tried to undercut Republican accusations that they were nascent communists hell-bent on redistributing wealth by moving to the centre on economic policy while mobilising voters by focusing on social issues. President Clinton famously ended “welfare as we know it” and signed the repeal of the 1933 Glass-Steagall Act, which had prevented financial bubbles by keeping commercial and investment banks from being one and the same; President Obama defended banks in the aftermath of the great recession as key to recovery.
And so, we have come to the end of an era. The destruction of the New Deal state in a time of globalism has created an American economy that looks much like that of the 1920s, with extraordinary wealth concentrated at the very top of society. Thus the populist moment of 2016, when voters on both sides set out to smash the establishment, on the one hand electing Donald Trump and, on the other, rending the Democratic party in two.
Unlike the Republicans, though, who will have to reinvent themselves if they are ever to recover from the damage of the Trump era, the Democrats have the opportunity to heal their differences for an easier transition to a new political era. Establishment Democrats are not wrong to put faith in experience: Clinton, after all, lost the electoral college, but won the popular vote by more than two points. The upstart Democrats who rallied to Sanders are, though, demanding a focus on economic fairness, one that echoes the Democratic leadership of the 1930s. “True individual freedom cannot exist without economic security and independence,” FDR said in 1944. “People who are hungry and out of a job are the stuff of which dictatorships are made.”
Heather Cox Richardson is professor of history at Boston College
We must learn from Jeremy Corbyn’s success and speak to younger voters
One might have thought that the November election would have drawn a clear line under Democratic centrism. But the defeat of Jon Ossoff in Georgia’s 6th congressional district may have been its true death knell. Even with six times as much funding as his opponent and a crazed and incompetent Republican president, Ossoff could not get enough of the district’s wealthy and well-educated Republicans to vote for him to flip the district.
When Bernie Sanders remarked that he wasn’t sure that Ossoff was a true progressive, it wasn’t a kind thing to say, but it also wasn’t inaccurate. The future of the Democratic party is not men like Ossoff. We must learn from the comeback of Jeremy Corbyn in the UK election and start putting our might and money behind candidates who are truly on the left.
We scoff at accounts of the 45th president still presenting visitors to his office with a map that lays out his electoral victory, but many Democrats are also preoccupied with the details of the election and the reasons for Hillary Clinton’s defeat. It’s clear that sexism was a significant factor, as was the intervention from ex-FBI director James Comey and possible interference from Russia. But those in the party who are willing to do real soul-searching must admit that the lack of the anticipated Democratic party landslide must also be blamed on the failure of the party’s policies to resonate with people in the states that decided the election – places in the middle of the country that have seen their livelihoods dry up, rather than flourish, under late capitalism.
Trump’s promises that he would solve the problems that plague their communities – problems such as unemployment, poverty and the opioid crisis – seem to be empty promises. But the Democrats could have done a far better job of showing that they cared about these middle-American communities: for example, through actually turning up in them. Clinton’s hobnobbing with Hollywood stars held little appeal for Americans in the middle of the country.
We need to look to movements such as the Women’s March, which inspired a record-breaking number of people to take to the streets, and the Run for Something campaign, which helps progressive people to run for office – and has elicited a huge, enthusiastic response from new candidates. They’re the best hope Democrats have of effecting change in 2018 and beyond. But only if they motivate turnout from the young voters who came out for Obama but couldn’t be bothered to vote for Clinton.
This means focusing on real issues that mean a lot to young people: education debt relief; steady employment; healthcare that makes it possible for them to afford to start families.
Though his continued engagement with the DNC shows Bernie Sanders’s ambition to promote this agenda, it’s time for him to step aside. His refusal to register as a Democrat invalidates any true claim he has to be at the party’s helm. Many of his critiques of the party are legitimate, but if Sanders is not willing to commit to working on the inside for change, he needs to support someone who is willing to do it.
Elizabeth Warren is the obvious choice: compared to the likes of Nancy Pelosi or Joe Biden, she’s an outsider, but she’s still a Democrat who has shown her commitment to the party. Her economic populism speaks to many of the same concerns that Trump claimed he would alleviate, but she offers solutions that will buoy the middle class by making the wealthy contribute more, rather than promising to drive growth through deregulation that simply makes the ultra-wealthy more so. And her commitment to progressive social values is clear, unlike Sanders, whose remark that “you just can’t exclude people who disagree with us on [reproductive rights]” elicited blowback from women on the left who do not want their rights to be regarded as something to bargain with.
As the Senate Republicans push forward a healthcare bill that will cause the death and bankruptcy of many Americans who have the misfortune to be unwell and middle-class, now should be a clear opportunity for Democrats to assert that they’ll offer a better alternative. The opportunity will be lost if we continue to debate what it means to be a Democrat. The centre had its shot. It’s time to clear a path for Warren, the left, and a party that values diversity and speaks to young people.
Jean Hannah Edelstein is a writer based in New York
Liberals should be wary of policies that will scare away the middle classes
It has been a rough couple of months for the Democratic party. As Republicans have sought to roll back the key legislative accomplishments of President Obama, it has been one disaster after another. Even with President Trump’s approval ratings at historically low levels, Democrats continue to lose special elections around the country.
But in spite of these losses, there is a clear glimmer of hope – one that could presage a significant Democratic victory in congressional elections next year. Democrats are losing, but they are losing by much smaller margins than they have in the past.
Take, for example, the special election in Georgia last week. The race, which quickly took on national import, will end up as the most expensive congressional election in US history. While the Democratic candidate lost by almost four points, the district had been solidly Republican for decades. In a race the same night in South Carolina, the Democratic candidate lost by three points – in a seat that Republicans had won by more than 20 points just last November.
What all this suggests is that there is serious enthusiasm among Democratic partisans and not as much among Republicans. If, in 2018, Democrats are able to perform as well as their candidates did in these four special elections, they would be the odds-on favourites to win back the House of Representatives.
So how do they keep that momentum going? First, they must make the 2018 election a referendum on Trump, who is singularly despised by Democrats – and increasingly by much of the country. Second, if Republicans somehow succeed in repealing Obamacare and passing legislation that will take away health insurance from more than 20 million people, it will hand Democrats a slam-dunk campaign issue. But even if they fail, Republican votes in Congress could be an albatross that Democrats can hang around the necks of Republican candidates in 2018.
But for Democrats to expand their support, they may also need to take a page from Trump. In 2016 Trump ran the nastiest and most dishonest presidential campaign in modern American history. But one thing he did effectively was convince millions of voters that he would “drain the swamp” in Washington and be a voice for the struggling middle class. That anyone believed he would actually follow through on such an agenda is strong evidence that you can fool some of the people all the time.
Democrats should take a similarly populist approach. Many liberals argue that means talking about single-payer healthcare and free college education, but it’s far from clear that those policies are what voters want. Pledging to raise taxes on the wealthy, protecting health insurance for poor and working Americans, expanding childcare and social security benefits, raising the minimum wage, making college loans more accessible and waging war on the opioid epidemic ravaging broad swaths of America will be far more effective.
Populism is key for Democrats, but it needs to be the kind of economic populism that signals to the American middle class that the party is in touch with their concerns and will fight for them if they are returned to power.
Doing so will give Democrats the opportunity to reach not just their most loyal partisans – who will be committed to vote no matter what – but also disillusioned Trump voters or those who sat out 2016.
Certainly, Republicans will have their message ready to go: harsh attacks on liberal elites that have long worked for the party and were critical to victory in the Georgia special election. In an era of intense political polarisation, pledging to stick it to the other side is still a pretty effective strategy for Republicans.
But with a fully mobilised Democratic base and a smattering of moderate and independent voters, it might just be enough to return the Democrats to power. In the end, Trump hatred will be a boon to the party, but the kind of seismic victory Democrats need may require a return to the party’s populist roots as the voice of the American middle class.
Michael Cohen is the author of American Maelstrom: the 1968 Election and the Politics of Division

Making America Scared Again Won't Make Us Safer

Written by Sally Yates, The Washington Post

Sunday, 25 June 2017 08:17
Yates writes: "There is broad consensus that the 'lock them all up and throw away the key' approach embodied in mandatory minimum drug sentences is counterproductive, negatively affecting our ability to assure the safety of our communities. But last month, Attorney General Jeff Sessions rolled back the clock to the 1980s, reinstating the harsh, indiscriminate use of mandatory minimum drug sentences imposed at the height of the crack epidemic."
Sally Yates. (photo: Carolyn Kaster/AP)

Making America Scared Again Won't Make Us Safer
By Sally Yates, The Washington Post
25 June 17
In today’s polarized world, there aren’t many issues on which Democrats and Republicans agree. So when they do, we should seize the rare opportunity to move our country forward. One such issue is criminal-justice reform, and specifically the need for sentencing reform for drug offenses.
All across the political spectrum, in red states and blue states, from Sen. John Cornyn (R-Tex.) and the Koch brothers to Sen. Patrick Leahy (D-Vt.) and the American Civil Liberties Union, there is broad consensus that the “lock them all up and throw away the key” approach embodied in mandatory minimum drug sentences is counterproductive, negatively affecting our ability to assure the safety of our communities.
But last month, Attorney General Jeff Sessions rolled back the clock to the 1980s, reinstating the harsh, indiscriminate use of mandatory minimum drug sentences imposed at the height of the crack epidemic. Sessions attempted to justify his directive in a Post op-ed last weekend, stoking fear by claiming that as a result of then-Attorney General Eric H. Holder Jr.’s Smart on Crime policy, the United States is gripped by a rising epidemic of violent crime that can only be cured by putting more drug offenders in jail for more time.
That argument just isn’t supported by the facts. Not only are violent crime rates still at historic lows — nearly half of what they were when I became a federal prosecutor in 1989 — but there is also no evidence that the increase in violent crime some cities have experienced is the result of drug offenders not serving enough time in prison. In fact, a recent study by the bipartisan U.S. Sentencing Commission found that drug defendants with shorter sentences were actually slightly less likely to commit crimes when released than those sentenced under older, more severe penalties.
Contrary to Sessions’s assertions, Smart on Crime focused our limited federal resources on cases that had the greatest impact on our communities — the most dangerous defendants and most complex cases. As a result, prosecutors charged more defendants with murder, assault, gun crimes and robbery than ever before. And a greater percentage of drug prosecutions targeted kingpins and drug dealers with guns.
During my 27 years at the Justice Department, I prosecuted criminals at the heart of the international drug trade, from high-level narcotics traffickers to violent gang leaders. And I had no hesitation about asking a judge to impose long prison terms in those cases.
But there’s a big difference between a cartel boss and a low-level courier. As the Sentencing Commission found, part of the problem with harsh mandatory-minimum laws passed a generation ago is that they use the weight of the drugs involved in the offense as a proxy for seriousness of the crime — to the exclusion of virtually all other considerations, including the dangerousness of the offender. Looking back, it’s clear that the mandatory-minimum laws cast too broad a net and, as a result, some low-level defendants are serving far longer sentences than are necessary — 20 years, 30 years, even mandatory life sentences, for nonviolent drug offenses.
Under Smart on Crime, the Justice Department took a more targeted approach, reserving the harshest of those penalties for the most violent and significant drug traffickers and encouraging prosecutors to use their discretion not to seek mandatory minimum sentences for lower-level, nonviolent offenders. Sessions’s new directive essentially reverses that progress, limiting prosecutors’ ability to use their judgment to ensure the punishment fits the crime.
That’s a problem for several reasons. First, it’s fiscally irresponsible and undermines public safety. Since 1980, the U.S. prison population has exploded from 500,000 to more than 2.2 million, resulting in the highest incarceration rate in the world and costing more than $80 billion a year. The federal prison population has grown 700 percent, with the Federal Bureau of Prisons budget now accounting for more than 25 percent of the entire Justice Department budget. That has serious public safety consequences: Every dollar spent imprisoning a low-level nonviolent drug offender for longer than necessary is a dollar we don’t have to investigate and prosecute serious threats, from child predators to terrorists. It’s a dollar we don’t have to support state and local law enforcement for cops on the street, who are the first lines of defense against violent crime. And it’s a dollar we don’t have for crime prevention or recidivism reduction within our prison system, essential components of building safe communities.
But just as significant are the human costs. More than 2 million children are growing up with a parent behind bars, including 1 in 9 African American children. Huge numbers of Americans are being housed in prisons far from their home communities, creating precisely the sort of community instability where violent crime takes root. Indiscriminate use of mandatory minimum sentencing has caused many Americans to lose faith in the criminal-justice system, undermining the type of police-community relationships that are so crucial to making our streets safer.
While there is always room to debate the most effective approach to criminal justice, that debate should be based on facts, not fear. It’s time to move past the campaign-style rhetoric of being “tough” or “soft” on crime. Justice and the safety of our communities depend on it.

The Second Amendment Didn't Save Philando Castile

Written by Jelani Cobb, The New Yorker

Sunday, 25 June 2017 08:15
|
Cobb writes: "The cycle of lethal police violence, community outrage, and legal proceedings that yield no consequences came around again last Friday in St. Paul, Minnesota."
Family and friends of Philando Castile after Jeronimo Yanez was found not guilty on all counts in the shooting death of Mr. Castile. (image: Elizabeth Flores/Star Tribune)

The Second Amendment Didn't Save Philando Castile
By Jelani Cobb, The New Yorker
25 June 17
The cycle of lethal police violence, community outrage, and legal proceedings that yield no consequences came around again last Friday in St. Paul, Minnesota. A jury acquitted a police officer, Jeronimo Yanez, of all three charges—one count of second-degree manslaughter and two counts of dangerous discharge of a firearm—arising from the shooting death, a year ago, of Philando Castile.
On Tuesday, four days after the verdict, Minnesota state investigators made public the dash-cam video from Yanez’s car. Officer Yanez had said that he saw Castile drive by, thought he resembled a suspect in a robbery case, and decided to pull him over. In the video, the officer can be heard calmly telling Castile that his brake light is broken, and asking to see his license and registration. Castile then says, also calmly, “Sir, I have to tell you I do have a firearm on me.” Listening to the audio, it seems reasonable to assume that Castile is informing the officer that he has a weapon—for which he turned out to have a valid permit—to avoid trouble rather than to court it. Still, Yanez is prompted to place his hand on his own gun, and shortly afterward he shouts, “Don’t pull it out!” Castile’s actions cannot be seen in the video, but he and his girlfriend, Diamond Reynolds, who was also in the car, along with her four-year-old daughter, tell Yanez that Castile isn’t reaching for his gun; she later says that he was getting his identification from his wallet. Within seconds, the officer fires seven shots into the car. Two of the bullets hit Castile, who is heard to say, “I wasn’t reaching.” He died half an hour later.
That video now serves as a tragic prequel to one that Reynolds live-streamed to Facebook, after the shooting, as she sat next to Castile in the front seat of his car. That video—an unnerving first-person testimony, in which she tells Yanez, with stunning composure, “You killed my boyfriend”—was viewed millions of times, and brought an inescapable notoriety to the case. Reynolds later told reporters that she and Castile had done “nothing but what the police officer asked of us” and added, of Castile, that “nothing within his body language said ‘kill me.’ ”
The decision in the Castile case differed from other, similar cases of police violence in that it highlighted a kind of divided heart of Second Amendment conservatism, at least with regard to race. David French, in National Review, called the decision a miscarriage of justice. He wrote, “Castile was following Yanez’s commands, and it’s simply false that the mere presence of a gun makes the encounter more dangerous for the police. It all depends on who possesses the gun. If he’s a concealed-carry permit-holder, then he’s in one of the most law-abiding demographics in America.” Colion Noir, an African-American gun-rights activist who serves as the face of the N.R.A.’s black-outreach campaign, also criticized the decision, writing in an online post that Yanez’s mistakes cost Castile his life, and that “covert racism is a real thing and is very dangerous.” In the days after the shooting, the N.R.A. itself had offered only a tepid response, without mentioning Castile’s name: “The reports from Minnesota are troubling and must be thoroughly investigated. In the meantime, it is important for the NRA not to comment while the investigation is ongoing. Rest assured, the NRA will have more to say once all the facts are known.” After Yanez was acquitted, it said nothing at all. Noir, in his post, also questioned whether Yanez would have had the same reaction had a white motorist identified himself as armed. The same might be asked of the N.R.A.’s non-reaction to the verdict.
The Black Lives Matter movement emerged, fundamentally, as a response to the disparate valuation that we place upon human lives. That is why the rejoinder “all lives matter” misses the point. In the hours following last week’s shocking shooting of Representative Steve Scalise and three others, in Alexandria, Virginia, the broad outpouring of concern reminded us of how society responds when people whose lives it values are harmed. In that spirit, media coverage of the shootings did not automatically focus on controversial statements that Scalise has made or votes he has cast. To do so at such a moment seemed unbefitting.
Responses to the deaths of unarmed victims of police violence, by contrast, routinely feature the victims’ failures, shortcomings, and oversights. We were told, for example, that Eric Garner, who died after police on Staten Island put him in a choke hold, had been arrested on numerous occasions for petty offenses. Representative Peter King, of New York, pointed to the factor of Garner’s physical unfitness. “If he had not had asthma and a heart condition and was so obese, almost definitely he would not have died,” King said. Imperfect victims, as feminists who fought for stronger rape laws a generation ago understood, become perfect excuses in an unequal judicial system.
Yet there was some feeling that the verdict in Philando Castile’s death would be different from the decisions in similar cases that had preceded it. That thought hinged on a belief that his status as a lawfully licensed gun-owner, his long-standing employment as a cafeteria manager at an elementary school, and his general lack of serious missteps might exempt him from the idea that his death was his own fault. And, in fact, less blame was levelled in this case: Castile had been stopped by the police fifty times in the thirteen years before his death, but that record was widely interpreted as evidence of racial profiling rather than of personal culpability.
There was also an evidentiary reason to believe that this case might turn out differently. A second officer, Joseph Kauser, who arrived at the scene before the shooting, when Yanez called for support, and approached Castile’s car with his fellow-officer, testified that Castile was “relaxed and calm” during his exchange with Yanez. Kauser said that he believed that Yanez had acted appropriately, but that he himself had not drawn his gun, and he testified that he had not felt threatened. In the end, however, the result was indistinguishable from those in previous cases. There were no appeals for a less vitriolic dialogue, no renewed hope that this time things would change. There was simply the numb reckoning that we’ll all go down this road again.

The Nazis Used It, We Use It Too: Famine as a Weapon of War

Sunday, 25 June 2017 08:11
de Waal writes: "Mass starvation as a consequence of the weather has very nearly disappeared: today's famines are all caused by political decisions, yet journalists still use the phrase 'man-made famine' as if such events were unusual."
No end in sight: Half of the country's 26 million population are now struggling to eat, according to Oxfam, as the war in the country rages on. (photo: Reuters)

The Nazis Used It, We Use It Too: Famine as a Weapon of War
By Alex de Waal, London Review of Books
25 June 17
Alex de Waal on the return of famine as a weapon of war
In its primary use, the verb ‘to starve’ is transitive: it’s something people do to one another, like torture or murder. Mass starvation as a consequence of the weather has very nearly disappeared: today’s famines are all caused by political decisions, yet journalists still use the phrase ‘man-made famine’ as if such events were unusual.
Over the last half-century, famines have become rarer and less lethal. Last year I came close to thinking that they might have come to an end. But this year, it’s possible that four or five famines will occur simultaneously. ‘We stand at a critical point in history,’ the head of the UN Office for the Co-ordination of Humanitarian Affairs, the former Tory MP Stephen O’Brien, told the Security Council in March, in one of his last statements before stepping down: ‘Already at the beginning of the year we are facing the largest humanitarian crisis since the creation of the United Nations.’ It’s a ‘critical’ point, I’d argue, not because it is the worst crisis in our lifetime, but because a long decline – lasting seven decades – in mass death from starvation has come to an end; in fact it has been reversed.
O’Brien had no illusions about the causes of the four famines, actual or imminent, that he singled out in north-eastern Nigeria, Somalia, South Sudan and Yemen. In each case, the main culprits are wars that result in the destruction of farms, livestock herds and markets, and ‘explicit’ decisions by the military to block humanitarian aid. In Nigeria, villages in the path of the war between Boko Haram and the army have been stripped of assets, income and food. As the army slowly reduces the areas under Boko Haram control, they are finding small towns where thousands starved to death last year. The counter-insurgency grinds on, and the specialists who compile the data fed into the blandly named ‘integrated food security phase classification’ (IPC) system worry that in this year’s ‘hungry season’, approximately June to October, communities in the war zones will again move up the IPC scale: from level four (‘humanitarian emergency’) to five (‘famine’). Last year in Nigeria, the UN and relief agencies could say that they didn’t appreciate the full extent of the crisis. This year we have been given due warning.
In South Sudan, the government and the rebel armies have fought much less against each other than against the civilian population. In the summer of 2016, evidence from aid agencies showed nutrition and death rates in the region that met the UN criteria for determining that a food crisis has reached famine levels. Fearing that declaring famine would antagonise the South Sudanese government, already paranoid and cracking down on international aid agencies (aid workers were being robbed, raped and murdered), the UN prevaricated. By February, even veterans of South Sudan’s horrendous famines of the 1980s were saying that this was as bad as anything in their experience, perhaps worse. The UN duly declared a famine.
Yemen, however, is the biggest impending disaster. Don’t be fooled by pictures showing hungry people in arid landscapes: the weather had nothing to do with the famine. More than seven million people in Yemen are hungry; far more are likely to die of starvation and disease than in battles and air raids. The military intervention led by Saudi Arabia and the United Arab Emirates has strangled the country’s economy. Before the war, 80 per cent of Yemen’s food was imported, mostly through the Red Sea port of al-Hudaida. At Saudi insistence, backed by the US and the UK, the UN Security Council imposed a blockade on Yemen, and while there’s an exemption for food, the inspection procedures are slow and laborious. Since Saudi aircraft bombed the container docks at al-Hudaida, all ships have to be unloaded the old-fashioned way, using derricks and stevedores. Roads, bridges and markets have been damaged or destroyed, slowing commerce to a crawl. The Central Bank of Yemen, relocated from the Houthi-controlled capital, Sana’a, to the enclave controlled by the recognised government, no longer pays salaries. The Houthi forces also impose their own blockades, laying siege to the highland city of Taizz. Food is the biggest weapon, and lack of food the biggest killer, in the Yemen war.
Unlike their blunt statements on war crimes in South Sudan, UN and aid agency statements on Yemen are muted: it’s hard to escape the conclusion that they feel unable to criticise Security Council decisions. While the famine deepens, the British and American navies persist in enforcing the blockade and diplomats at the Security Council discuss how they could recalibrate the embargo. All are in danger of becoming accessories to starvation.
Only in Somalia is drought partially responsible for the situation, though the war between a coalition of north-east African armies and the militant group al-Shabaab is primarily responsible for the immiseration of areas in the south of the country. Until this year, Somalia was the only country this century where the UN had declared the presence of famine: that was in 2011. In their recent book, Dan Maxwell and Nisar Majid describe that famine as a ‘collective failure’. Incompetence on the part of the Somali authorities and corruption are other factors. A final element in the 2011-12 famine – which still rankles with aid professionals who struggled to halt an eminently preventable disaster – was the restriction on humanitarian work imposed by the US Patriot Act of 2001. Intended to criminalise support – material or symbolic, deliberate or inadvertent – for any group on the terrorist list, the Patriot Act meant that it was practically impossible for an aid agency to operate in the famine-stricken area without risking prosecution in a US court. In principle, if al-Shabaab hijacked a truckload of food provided by the Red Cross, the Red Cross would be criminally liable. Even the threat of prosecution posed a risk to their reputation that aid agencies weren’t ready to run. Staff at USAID and the State Department worked to find a way round this provision, but the Justice Department was immovable until the UN’s declaration of famine prompted a belated attempt to find a solution. In the nine months it took the DoJ to come up with one, the world’s biggest aid donor shipped no food to Somalia. Perhaps 260,000 Somalis, mainly children, died in that time. Most of the deaths could have been prevented if the Obama administration had been more alert to a disaster caused by its decision to leave the Patriot Act untouched.
The humanitarian workaround – ‘carve out’ is the term used – of the Patriot Act is still in place. But it’s provisional and unclear, and the chilling effect of security surveillance of humanitarian actions in countries like Somalia, Syria and Yemen remains. Feeding the hungry and treating the sick are subject to security screening. It’s not only burdensome and intrusive, but deters the energetic and creative aid work needed in these crises.
Perhaps even more damaging has been the clampdown on money transfers. Remittances from the diaspora contribute at least 30 per cent of Somalia’s national income, and in the absence of a normal banking system, funds are transmitted through companies that use the hawala system. The businessmen who run these companies are interested in profit not ideology, but since 2001 counter-terrorist organisations have tended to target them as possible accomplices to terror, rather than as commercial service providers who might co-operate in a regulatory framework that serves everyone’s interests. Since November 2001, when the US shut down al-Barakaat, the biggest of these companies, on the basis of (unfounded) allegations that it was involved in terrorist financing, the Somali financial sector has been repeatedly battered by arbitrary restrictions and – as a consequence – the commercial banks have refused to do business with them.
Drought and crop failure have a part to play in this year’s hunger in Somalia, while the much more widespread drought in neighbouring Ethiopia passed off last year without famine thanks to an expeditious relief effort led by the government. At one point, the Ethiopian government and the UN World Food Programme were feeding 18 million Ethiopians, a higher number than the in-need populations of the four countries on today’s danger list combined. There’s nothing inevitable about people dying from hunger when the rains fail. That fact can never be repeated too often.
The organisation I work for, the World Peace Foundation, has compiled a catalogue of every case of famine or forced mass starvation since 1870 that killed at least 100,000 people. There are 61 entries on the list, responsible for the deaths of at least 105 million people. About two thirds of the famine deaths in this period were in Asia, about 20 per cent in Europe and the USSR, just under 10 per cent in Africa. The biggest killers were famines that resulted from political decisions, among them the Gilded Age famines, the Great War famines in the Middle East, including the forced starvation of a million Armenians, the Russian Civil War famine, Stalin’s starvation of Ukraine from 1932 until 1934 (now known as the Holodomor), the Nazi ‘hunger plan’ for the Soviet Union, the famines during the Chinese Civil War, the starvation inflicted by the Japanese during the Second World War, and by Mao’s Great Leap Forward of 1958-62, the largest famine on record, which killed at least 25 million.
*
These political famines seem scarcely to register in our collective imagination. They are strikingly absent too from the books which construct theories of famine and policies for food security. Even Amartya Sen did not take them into account when developing his ‘entitlement theory’ of famine causation in Poverty and Famine (1981), which overturned explanations of famine based exclusively on food shortage. In the WPF’s catalogue of great famines, 72 million deaths occurred when famine was being used as an instrument of genocide or recklessly inflicted by government policy. Ignoring these famines, or ascribing them to natural disasters, is a major error.
Another blind spot is even more remarkable: the neglect of starvation on the part of genocide scholars. It’s striking because the intellectual father of genocide studies, Raphael Lemkin, was keenly interested in the politics of food and famine. In fact, in Axis Rule in Occupied Europe (1944) he devoted more space to starvation and related deprivation than to mass killing. Elaborating on the physical debilitation of groups as a technique of genocide, he began by describing ‘racial discrimination in feeding’ and detailed Nazi guidelines specifying the portion of basic nutrients allocated to different groups, ranging in the case of carbohydrates from 100 per cent for Germans to 76-77 per cent for Poles, 38 per cent for Greeks and 27 per cent for Jews. The second mechanism Lemkin described was the endangering of health by overcrowding in ghettos, withholding medicine and heating fuel, and transporting people in cattle trucks and freight cars. The third was mass killings, which he described in a single paragraph.
When Lemkin began writing his book, starvation was the Nazis’ most effective instrument of mass murder. The rationale for Operation Barbarossa was that the Ukraine and southern Russia were resource-rich lands that would provide Lebensraum for the German people. Central to the planning of Barbarossa was the question of how to feed the Wehrmacht. At the post-Nuremberg trial of senior civil servants in 1947, the prosecution reproduced a document entitled ‘Memorandum on the Result of Today’s Conference with the State Secretaries concerning Barbarossa’, dated 2 May 1941, just a few weeks before the invasion. It begins: ‘1. The war can only be continued if the entire armed forces are fed from Russia during the third year of the war. 2. As a result, there is no doubt that “x” million people [zig Millionen Menschen] will starve to death if we take out from the country whatever we need.’ It was written by Herbert Backe, state secretary of the Reich Ministry for Food and Agriculture. While the memo left the number of victims blank, Backe’s arithmetic suggested that the entire urban population of the European Soviet Union – thirty million ‘surplus eaters’ – should be starved to death.
The Hungerplan, to give it its German proper name, began with the forcible starving of Soviet prisoners of war. Crowded into vast camps without any shelter, 1.3 million died in the four months after the invasion. About 2.5 million had died this way by the end of the war. But the Hungerplan proved impossible to implement fully. Starving people in large numbers is extremely hard work. Stalin’s administration of famine in Ukraine a decade earlier had called on the entire apparatus of the Communist Party, and the German invaders had no such infrastructure. They besieged Leningrad, where a million died. In the occupied cities of Kiev and Kharkov they restricted food supplies and similar numbers perished. But the peasants, who had honed their survival skills in two post-1917 famines, didn’t succumb easily. German soldiers also relied on locally grown food, and so Backe’s office ordered that peasants be permitted to carry on producing crops. The hunger planners fell short of their original target by more than twenty million.
Even at this reduced scale, the Hungerplan was a crime comparable in numerical terms to the Final Solution. Indeed, forced starvation was one of the instruments of the Holocaust. Eighty thousand Jews starved to death in the Warsaw Ghetto. Rudolf Höss, commandant of Auschwitz from May 1940 to December 1943, testifying before the Nuremberg Tribunal, estimated that ‘in the camp of Auschwitz alone in that time 2,500,000 persons were exterminated and that a further 500,000 died from disease and starvation.’ In The Taste of War: World War II and the Battle for Food Lizzie Collingham makes the point that the failure to starve ‘useless eaters’ in sufficient numbers, sufficiently quickly, became a rationale for expediting their mass murder by killing squads and gas chambers.
Backe was interrogated but by the time the Ministries Trial began in 1947, he had committed suicide, fearing he would be handed over to the Soviets. His predecessor as minister for food and agriculture, Walther Darré, an ideologue of ‘blood and soil’ and aggressive eastward expansion, was found guilty of plunder and despoliation, and sentenced to seven years in prison but released after two. Though Backe’s memo was produced as evidence, the Hungerplan was not mentioned by name. The Allies were in no hurry to criminalise famine or economic warfare.
The legal difficulties in prosecuting starvation as a crime included the need to determine whether starvation was itself unlawful, and if it was what sort of a crime it might be, and how guilt might be proved. The laws of war did not prohibit starvation in pursuit of a military goal: it was legitimate to starve a besieged city into submission, or to blockade an entire country. In the post-Nuremberg High Command Trial, American prosecutors brought charges against Field Marshal Wilhelm von Leeb for crimes committed during the siege of Leningrad. But there was no legal basis on which to find Leeb guilty of starving the city, or even of sustaining the pressure of hunger on the residents by firing at civilians trying to leave. The judges found Leeb’s orders extreme but not criminal, though they added that they wished the law were otherwise. They cited the Lieber Code – drawn up for the Union army in the American Civil War – which permitted starvation if it hastened military victory. In October 1948, Leeb was sentenced to time served, for transmitting the Barbarossa Jurisdiction Order, and released.
By the time of the war crimes trials, the British navy was already a seasoned exponent of maritime blockade. In 1909 the House of Lords refused to ratify the London Declaration on the laws of naval war, on the grounds that doing so would restrict the navy’s ability to block the flow of foodstuffs to an enemy. Establishing an international court to determine the legality of intercepting ships on the high seas, the Lords felt, would amount to a contravention of British sovereignty. Britain blockaded Germany during the First World War, and about 750,000 German civilians died of hunger. That blockade was kept in place (and tightened) for eight months after the Armistice in order to compel the Germans to sign the Versailles Treaty. In 1942 Churchill came under heavy pressure to lift the blockade on Greece, and only reluctantly and minimally relented – an episode that resulted in the foundation of the Oxford Committee for Famine Relief, now known as Oxfam. The following year, the cabinet made feeding the British Isles a higher priority than preventing famine in Bengal, a decision that cost as many as three million lives. Most tellingly, the name chosen for the aerial mining of Japanese harbours in 1945 by the US Air Force was Operation Starvation.
The Nuremberg Charter didn’t (despite Lemkin’s urging) make genocide an indictable offence, but it did include ‘crimes against humanity’. Starvation-related prosecutions were possible under Article 6, which classed ‘inhumane acts’, ‘extermination’ and ‘persecution’ as ‘crimes against humanity’. There’s a rationale for this: depriving someone of food can be a form of torture, an infliction of suffering pure and simple or with some ulterior goal in mind (such as forcing hungry persons to abandon their villages). Had the drafters of the charter made starvation a crime in its own right, there would have been uncomfortable implications for the Allies, given their own use of blockades. The final judgments at Nuremberg use the term ‘starvation’, but it is ancillary to the wider crimes committed by the Nazi leadership.
There are extraordinary evidentiary problems in prosecuting cases of starvation as murder (or extermination). Only in the case of prisoners, where the victims and their food supplies are entirely controlled by the jailer, can there be proof beyond reasonable doubt that the perpetrator is responsible for the death of the victim. In other instances, the defence could argue that the victim failed to avail himself of opportunities to find food or that he might have survived were it not for other factors over which the defendant had no control, such as crop failures, high food prices, or infectious disease. Yet no charges were brought at Nuremberg for the killing by forced starvation of millions of prisoners of war.
In 1991, I tried to persuade the Ethiopian special prosecutor to press famine-related charges against the officials of the just deposed military regime of Mengistu Haile Mariam. Although the incoming government of former guerrillas from the Ethiopian People’s Revolutionary Democratic Front was sympathetic (their own monument to the martyrs of the struggle in Mekele shows the starving alongside other victims of war), the prosecutor wouldn’t consider setting such a precedent. The International Criminal Tribunal for the Former Yugoslavia didn’t prosecute General Stanislav Galić, who supervised the siege of Sarajevo, for causing starvation on the grounds that while people had gone hungry, no Sarajevan had actually died of hunger. The best opportunity for specifying starvation as a crime arose with the court set up to try the Khmer Rouge leadership. More than a million Cambodians died from starvation, but the prosecutors took the same route as their predecessors at Nuremberg and folded famine-related crimes within other charges.
In 1977, the International Committee of the Red Cross argued successfully for the new provisions to be added to the Geneva Conventions of 1949: Article 54 of Protocol I states that ‘starvation of civilians as a method of warfare is prohibited.’ This is a bold statement of humanitarian law, but its application is limited. First, it obtains only in international conflicts, not in civil wars. And second, as the legal scholar David Marcus pointed out, the obligation on warring parties to permit relief aid ‘retreats in the face of the military necessity of blockade’. In 1998, when the Rome Statute for the International Criminal Court (ICC) was negotiated, a Cuban proposal to prohibit blockades was rejected. At precisely the same time, the US and its allies were enforcing sanctions on Iraq.
The reluctance to acknowledge famine crimes seemed to matter less as long as famines were becoming rarer and less lethal. Other measures, legal, humanitarian and political, would suffice. And because acts of starvation are invariably associated with other war crimes or crimes against humanity, outlawing and prosecuting acts that were already prohibited was a way of discouraging the use of famine. Once again, following the Nuremberg model, judges in international tribunals repeatedly expressed their abhorrence of starvation being used as a tactic, and found defendants guilty of war crimes and crimes against humanity that overlapped with faminogenesis – Marcus’s term for creating or compounding famine.
From the 1980s, international relief operations expanded hugely. For a relief worker in the field, the priority is getting assistance to the hungry: documenting and exposing the crimes that gave rise to the hunger are more than a distraction: they can be an obstacle. In 1988, at the beginning of the civil war in Sudan, thousands of southern Sudanese were dying in camps for displaced persons controlled by pro-government militia along the north-south internal border, with the worst death rates in the small town of Abyei. I argued with a relief worker about the need to condemn the army officers who were responsible for this. ‘I would sup with the devil to get food to Abyei,’ he said. The following year, James Grant, then head of Unicef, accepted a dinner invitation from General Fadallah Burma Nasir, co-ordinator of what was called the ‘militia policy’. Grant left the dinner with a life-saving agreement: Operation Lifeline Sudan was the first ever UN relief effort to cross civil war battle lines. In my book Famine Crimes, published twenty years ago, I excoriated the humanitarians for neglecting – and therefore perpetuating – the political and military causes of famine, but there is much to be said for Grant’s decision to meet immediate needs and turn a blind eye to their causes.
The success – and eventual thwarting – of apolitical humanitarianism was most starkly evident under George W. Bush. Campaigning in New Hampshire in 2000, Bush promised he would never use the denial of food as an instrument of foreign policy. He picked Andrew Natsios as his administrator of USAID: a figure with extensive experience both in official disaster assistance and as vice president of the aid agency World Vision. A few years earlier, Natsios had taken a controversial stand in favour of aiding North Korea during that country’s famine, on the grounds that it was morally right for the US to send aid to feed the hungry and might also make good political sense. When he took office, he called USAID’s senior staff together and told them to be alert to the danger signs of famine, and always to make its prevention a priority. In one of the most significant and under-acknowledged actions of his tenure, he authorised aid to Darfur in September 2003, six months before the humanitarian crisis there became a public scandal. Loudly attacked by the Save Darfur Coalition for his pragmatism and reluctance to describe Darfur as genocide, Natsios did more to save Darfurian lives than all his critics put together.
The Bush administration provides a vivid illustration of the fact that a political commitment to prevent famine can yield results. But the War on Terror and the invasion of Iraq were even more compelling demonstrations that starvation has a promising future when the norms of liberal internationalism are violated. Each of today’s famines results in part from the Bush-Cheney doctrine that national security and counter-terrorism take precedence over all other considerations. This doctrine assumes that relief aid will go to feed insurgents, or enable them to legitimise their rule over captive populations. At the same time, those groups regard Western aid as an enemy weapon that will undermine their standing with local communities – there’s some truth in this – or as a tool of espionage. The presumption that relief supplies and relief workers are neutral, that they operate in what they refer to as a ‘humanitarian space’, is vanishing. That much is evident in the Nigerian war on Boko Haram and in the Saudi-Emirati onslaught on Yemen. Somalia hasn’t recovered from the devastation of the 2011 famine, in which the precepts of counter-terrorism meant that a humanitarian response wasn’t possible until it was too late. South Sudan owes its independence, in a roundabout way, to the support extended during the 1990s to the rebels of the Sudan People’s Liberation Army (SPLA) by the Clinton administration, with the express intention of creating a new state with a ‘regime that will not let Khartoum become a viper’s nest for terrorist activities’. The SPLA leadership took this to mean that they were entitled to become a member of the club of nations but didn’t need to abide by its rules – as long as they enjoyed the status of victims, and remained enemies of the Islamists in Khartoum.
Western humanitarianism was compromised once counter-terrorism enabled the overruling of humanitarian principles by security diktat, as Peter Gill explains in Today We Drop Bombs, Tomorrow We Build Bridges: How Foreign Aid Became a Casualty of War. ‘I am serious about making sure we have the best relationship with the NGOs who are such a force multiplier for us, such an important part of our combat team,’ Colin Powell announced seven weeks after 9/11. Powell’s message was not lost on militant jihadis, who deliberately blurred the distinction between intelligence agencies and aid agencies in their clampdown on foreign relief.
Counter-humanitarianism has several motivations. Extremist groups such as Isis and al-Shabaab reject Western aid. Some regimes decide to ignore humanitarian concerns and prioritise national security, as Assad has done in Syria, or the Saudis with their blockade of Yemen. The legal and moral exceptionalism counter-terrorism’s proponents have granted themselves is a further version of this. Xenophobia is another: famine prevention is based on the now jeopardised notion that the poor, strangers and outsiders are just as worthy of assistance as friends and familiars.
Drawing on a long Anglo-American tradition of economic warfare and blockade, the counter-humanitarian trend in London and Washington is both morally distasteful and practically stupid. When international aid fails to feed the hungry and treat the sick, extremist projects flourish. If security strategists and xenophobes think that humanitarian crises will burn themselves out at a safe distance they are mistaken: the biggest demographic outcome of famine has always been migration – the Gulf countries are learning this lesson, as millions of Yemenis cross their borders. The threat to the values of the humanitarians coincides with dramatic demands on their knowledge and skills. Their best strategy is to take the initiative and propose that starvation be added to the list of crimes against humanity.

|
|