
Babies are carried away from destroyed buildings following an air strike in Syria. (photo: Ameer Alhalbi/AFP/Getty Images)


America's War on Syrian Civilians

By Anand Gopal, The New Yorker

19 December 2020


Bombs killed thousands of civilians in Raqqa, and the city was decimated. U.S. lawyers insist that war crimes weren’t committed, but it’s time to look honestly at the devastation that accompanies “targeted” air strikes.

For four months in 2017, an American-led coalition in Syria dropped some ten thousand bombs on Raqqa, the densely populated capital of the Islamic State. Nearly eighty per cent of the city, which has a population of three hundred thousand, was destroyed. I visited shortly after ISIS relinquished control, and found the scale of the devastation difficult to comprehend: the skeletal silhouettes of collapsed apartment buildings, the charred schools, the gaping craters. Clotheslines were webbed between stray standing pillars, evidence that survivors were somehow living among the ruins. Nobody knows how many thousands of residents died, or how many are now homeless or confined to a wheelchair. What is certain is that the decimation of Raqqa is unlike anything seen in an American conflict since the Second World War.

As then, this battle was waged against an enemy bent on overthrowing an entire order, in an apparently nihilistic putsch against reason itself. But Raqqa was no Normandy. Although many Syrians fought valiantly against ISIS and lost their lives, the U.S., apart from a few hundred Special Forces on the ground, relied on overwhelming airpower, prosecuting the entire war from a safe distance. Not a single American died. The U.S. still occasionally conducts conventional ground battles, as in Falluja, Iraq, where, in 2004, troops engaged in fierce firefights with insurgents. But the battle for Raqqa—a war fought from cavernous control rooms thousands of miles away, or from aircraft thousands of feet in the sky—is the true face of modern American combat.

We have been conditioned to judge the merit of today’s wars by their conduct. The United Nations upholds norms of warfare that, among other things, prohibit such acts as torture, rape, and hostage-taking. Human-rights groups and international lawyers tend to designate a war “humane” when belligerents have avoided harming civilians as much as possible. However, in “Asymmetric Killing: Risk Avoidance, Just War, and the Warrior Ethos” (Oxford), Neil Renic, a scholar of international relations, challenges this standard. He argues that, when assessing the humanity of a war, we should look not only to the fate of civilians but also to whether combatants have exposed themselves to risk on the battlefield. Renic suggests that when one side fully removes itself from danger—even if it goes to considerable lengths to protect civilians—it violates the ethos of humane warfare.

The core principle of humane warfare is that fighters may kill one another at any time, excepting those who are rendered hors de combat, and must avoid targeting civilians. It’s tempting to say that civilians enjoy this protected status because they are innocent, but, as Renic points out, civilians “feed hungry armies, elect bellicose leaders, and educate future combatants.” In Syria, home to a popular revolution, entire towns were mobilized for the war effort. Civilians—even children—acted as lookouts, arms smugglers, and spies. What really matters, then, is the type of danger that someone in a battle zone presents. The moment that a person picks up a weapon, whether donning a uniform or not, he or she poses a direct and immediate danger. This is the crucial distinction between armed personnel and civilians.

But what if the belligerents themselves don’t pose a direct and immediate danger? Renic argues that in such theatres as Pakistan, where Americans deploy remote-controlled drones to kill their enemies while rarely setting foot on the battlefield, insurgents on the ground cannot fight back—meaning that, in terms of the threat that they constitute, they are no different from civilians. It would then be just as wrong, Renic suggests, to unleash a Hellfire missile on a group of pickup-riding insurgents as it would be to annihilate a pickup-riding family en route to a picnic.

One might respond that, say, the Pakistani Taliban does pose an immediate threat to Pakistani civilians, if not to U.S. soldiers. But Renic contends that the U.S., by avoiding the battlefield, has turned civilians into attractive targets for insurgents eager for a fight. Whether this claim is correct or not, it’s clear that risk-free combat has brought warfare into new moral territory, requiring us to interrogate our old notions of battlefield right and wrong. If we can distinguish combatants from civilians only by the danger that they pose to other combatants, then the long-distance violence of modern warfare is inhumane. Renic concludes that the “increasingly sterile, bureaucratized, and detached mode of American killing” has the flavor of punishment rather than of war in any traditional sense. In Barack Obama’s recent memoir, he writes that, as President, he wanted to save “the millions of young men” in the Muslim world who were “warped and stunted by desperation, ignorance, dreams of religious glory, the violence of their surroundings.” Yet he claims that, owing to where they lived, and the machinery at his disposal, he ended up “killing them instead.” Leaving aside Obama’s crude generalizations, Renic argues that he could indeed have saved them—by “severely restricting” remote warfare.

Renic’s book is part of a broader trend of scholars and human-rights activists contending with the wreckage caused by America’s recent conflicts abroad. Their studies share a basic quest: how can we use rules to make warfare more humane? Whereas Renic focusses on moral rules, much of this other work is concerned with legal rules. In the aftermath of the Raqqa battle, Amnesty International and other organizations sifted through the rubble, carefully documenting whether this or that bombing complied with the laws of war. This work is salutary, but a troubling question looms behind it: in our drive to subject the battlefield to rules, are we overlooking deeper moral truths about the nature of war itself?

The notion that warfare should be governed by rules is ancient, and dates at least to Augustine, who argued that a legitimate ruler can wage war when he has good intentions and a just cause. In the Middle Ages, the Church attempted to ban the crossbow, and took efforts to protect ecclesiastical property and noncombatants from wartime violence. But it was only in the nineteenth century that states attempted to fashion laws and treaties to regulate wartime conduct. During the American Civil War, the Union implemented the Lieber Code, which sought to restrict the imposition of unnecessary suffering—torture or poisoning, for example—on the enemy. The code also enshrined as legal convention the principle of “military necessity”: if violence had a strategic purpose—that is, if it could help win a war—it was allowed. In the Hague Conventions of 1899 and 1907, world powers accepted vague limits on wartime conduct while upholding the principle of military necessity. States agreed to a moratorium on balloon-launched munitions, which had little tactical value, but were silent on the question of motorized aircraft.

Many nations ignored even these lax regulations. The Hague Conventions prohibited “asphyxiating gases,” but world powers flouted the treaties with abandon in the trenches of the First World War. The conventions effectively outlawed the intentional targeting of civilians, but by the Second World War belligerents had recognized the military advantage of bombing towns and villages. In 1942, British policy actually barred aircraft from targeting military facilities, ordering them instead to strike working-class areas of German cities—“for the sake of increasing terror,” as Churchill later put it. In 1943, the U.S. and British Air Forces of Operation Gomorrah rained down fire and steel upon Hamburg for seven nights, killing fifty-eight thousand civilians. Urban bombing campaigns left millions of homeless and shell-shocked Germans roaming a ravaged land that W. G. Sebald later described as the “necropolis of a foreign, mysterious people, torn from its civil existence and its history, thrown back to the evolutionary stage of nomadic gatherers.” Then came the nuclear bombs dropped on Hiroshima and Nagasaki, which killed about two hundred and fifty thousand people. In all, Allied terror raids may have claimed some half a million civilian lives. The pattern continued in the Korean War; Secretary of State Dean Rusk later recalled that the U.S. had bombed “every brick that was standing on top of another, everything that moved.”

During the Vietnam War, a powerful antiwar movement emerged for the first time since the First World War. Through television, the news of such atrocities as the My Lai massacre reached directly into American living rooms, and conscientious objectors and antiwar activists appealed to international law to justify their opposition to the carnage. They were more successful in shaping U.S. conduct than they could have ever imagined. After the war, the Pentagon revamped its arsenal with such inventions as laser-guided munitions, which could carry out “precision strikes.” The U.S. military began to follow the principles of the Hague Conventions, as well as those found in other treaties, calling these combined regulations the Law of Armed Conflict. American terror bombings became a thing of the past. In the first Gulf War, hundreds of specialist attorneys sat alongside generals at CENTCOM headquarters in Saudi Arabia, and elsewhere, to insure that the U.S. followed legal rules of warfare. It was the largest per-capita wartime deployment of lawyers in American history.

On the face of it, scrupulous adherence to the law is a victory for the cause of humane war. Yet the ruins of Syria tell a more complicated story. Not long before the U.S. assault on Raqqa, Russian and Syrian forces launched a major offensive to capture the rebel-held eastern side of Aleppo. Paying no heed to international law, they retook the city with savage efficiency, laying waste to crowded markets and hospitals. Yet the end result looked no different from Raqqa: a large civilian death toll, honeycombed apartment buildings, streets choked with rubble, entire neighborhoods flattened.

The U.S.-led coalition waged its assault on Raqqa with exacting legal precision. It vetted every target carefully, with a fleet of lawyers scrutinizing strikes the way an in-house counsel pores over a corporation’s latest contract. During the battle, the coalition commander, Lieutenant General Stephen J. Townsend, declared, “I challenge anyone to find a more precise air campaign in the history of warfare.” Although human-rights activists insist that the coalition could have done more to protect civilians, Townsend is right: unlike Russia, America does not bomb indiscriminately. The U.S. razed an entire city, killing thousands in the process, without committing a single obvious war crime.

During the summer of 2016, residents of Tokhar, a riverside hamlet in northern Syria, gathered every night in four houses on the community’s edge, hoping to evade gunfire and bombs. This was the farthest point from a front line, a mile away, where U.S.-backed forces were engaging ISIS fighters. Every night, a drone hovered over Tokhar, filming the villagers’ procession from their scattered homes to these makeshift bunkers. The basements became crowded with farmers, mothers, schoolgirls, and small children. On July 18th, at around 3 A.M., the houses exploded. Thick smoke covered the night sky. Limbs were strewn across the rubble. Children were buried under collapsed walls.

People from surrounding villages spent two weeks digging out bodies. The coalition, meanwhile, announced that it had destroyed “nine ISIL fighting positions, an ISIL command and control node, and 12 ISIL vehicles” in the area that night. Eventually, after reports surfaced that many civilians had died, the coalition admitted to killing twenty-four. When a colleague and I visited, a year after the raid, we documented at least a hundred and twenty dead civilians, and found no evidence that any ISIS members had been present near the four houses. A mother told me that some small children were obliterated, their bodies never found.

“We take all measures during the targeting process . . . to comply with the principles of the Law of Armed Conflict,” U.S. Marine Major Adrian J. T. Rankine-Galloway said. The essence of this legal code is that militaries cannot intentionally kill civilians. It is true that no one in the chain of command wished to massacre civilians that night—not the pilot or the targeteers or the lawyers. The U.S. points to this fact in calling the Tokhar incident an error, regrettable but not illegal. Yet, though it is reasonable to invoke intention when referring to the mind-set of an individual—this is the idea behind the legal concept mens rea—it seems odd to ascribe a mental state to a collective actor like an army or a state. It is clear, however, that the coalition could have foreseen the outcome of its actions: it had filmed the area for weeks, and intelligence indicating that the village was populated would not have been difficult to gather. During the coalition’s campaign against ISIS, it often based its bombing decisions on faulty assumptions about civilian life; in Mosul, it targeted a pair of family homes after failing to observe civilians outdoors over the course of a few afternoons. Iraqis typically avoid the blazing midday heat. Four people died. The Law of Armed Conflict excuses genuine errors and proscribes intentional killing, but most American warfare operates in a gray zone, which exists, in part, because the law itself is so vague.

A second pillar of the legal code is the rule of proportionality: states can kill civilians if they are aiming for a military target, as long as the loss of civilian life is proportional to the military advantage they gain by the attack. What this means is anyone’s guess: how do you measure “military advantage” against human lives? During the Mosul battle, snipers went onto the roof of the home of Mohammed Tayeb al-Layla, a former dean of engineering at Mosul University. According to neighbors, he and his wife rushed upstairs, pleading with them to leave. In a flash, a warhead flattened the home, killing the snipers, al-Layla and his wife, and their daughter, who was downstairs. It’s nearly impossible to say how one would weigh two dead snipers against a dead family, but most conventions would consider the killing lawful. Much of the destruction in Raqqa follows the example of the al-Layla household: death by a thousand proportional strikes.

American officials are quick to point out that ISIS deserves a good share of the blame: militants dispersed themselves throughout schools and apartment buildings, and otherwise lived among the civilian population. Yet this does not necessarily absolve the U.S. When counter-insurgency doctrine was in vogue during the conflicts in Iraq and Afghanistan, American forces sought to win “hearts and minds” by embedding in population centers. For an Afghan, few sights stirred as much dread as a column of beige armored Humvees snaking through a crowded market. If a suicide bomber attacked the Humvees, Americans would rightly condemn him for his disregard for the surrounding civilians—even if he had the force of the law, in the guise of proportionality, behind him.

The contradictions of U.S. military conduct don’t go unnoticed. Human-rights organizations frequently accuse the U.S. of committing war crimes, including in the Raqqa battle. In nearly every case, though, the U.S. can muster a convincing defense. What is in dispute is not whether or not the U.S. killed civilians but the interpretation of the law: the U.S. uses a much looser interpretation of intentionality and proportionality than most human-rights groups do. After such deaths occur, no independent arbiter adjudicates the U.S.’s actions—only vanquished forces ever get dragged before an international tribunal. The Pentagon is left to judge itself, and, unsurprisingly, almost always finds in its own favor. The law’s ambiguities allow the U.S. to classify atrocities like that in Tokhar as accidents, even if the deadly results were foreseeable, and therefore avoidable.

How many civilian deaths in Raqqa were avoidable? In Tokhar, it was possible to reconstruct the evidence, but often it is not. Without transparency in the targeting process, the military usually has the final word. Yet there is one way we can intuitively know when an armed force has an alternative to causing civilian suffering. When U.S. forces are faced with a pair of ISIS gunmen on the roof of an apartment building, they can call in a five-hundred-pound laser-guided bomb—or they can approach the enemy on foot, braving enemy fire, and secure the building through old-fashioned battle. In the past, armies have sometimes chosen the harder path: during the Second World War, when Allied French pilots carried out bombing raids on Vichy territory—part of their homeland—they flew at lower altitudes, in order to avoid striking civilians, even though it increased the chances that they’d be shot down. For the U.S. military, however, the rules are blind to the question of risk. The law doesn’t consider whether an armed force could have avoided unnecessary civilian suffering by exposing itself to greater danger. For Neil Renic, wars waged exclusively through drones, therefore, point to the “profound discord between what is lawful on the battlefield and what is moral.”

This may be why the U.S. military today tends to downplay the old martial virtue of courage. Historically, though, the concept was so central to the idea of good soldiering that weapons or tactics lacking in valor sparked objections from the ranks. Renic writes that when aircraft first entered the modern arsenal, in the nineteen-tens, fighter pilots engaged in dogfights reminiscent of the gallantry of a medieval duel. But such long-distance tactics as mortar fire and aerial bombardment had little to do with valor. A pilot from the First World War recalled, “You did not sit in a muddy trench while someone who had no personal enmity against you loosed off a gun, five miles away, and blew you to smithereens.” He concluded, “That was not fighting; it was murder. Senseless, brutal, ignoble.” A British airman from the Second World War wrote, “I was a fighter pilot, never a bomber pilot, and I thank God for that. I do not believe I could ever have obeyed orders as a bomber pilot; it would have given me no sense of achievement to drop bombs on German cities.”

Though sniping causes far less devastation, it has long aroused a similar unease. In the First World War, a British brigadier-general denounced the practice as “an act of cold murder, horrible to the most callous, distasteful to all but the most perverted.” During the American Revolution, a young British officer trained his rifle’s sights on a target, only to decide that “it was not pleasant to fire at the back of an unoffending individual who was acquitting himself very coolly of his duty.” The individual in question was George Washington.

In 2014, the bio-pic “American Sniper” ignited a debate about whether its protagonist, a legendary marksman, had fabricated parts of his story. But, Renic points out, nobody questioned the moral legitimacy of sniping itself, an indication of the extent to which courage has vanished as a battlefield norm in today’s wars. Even if he is overstating the role of valor historically, it’s clear that the U.S. military today goes to great lengths to avoid risk, justifying its conduct instead by extolling the Law of Armed Conflict. A military that emphasizes courage may wind up protecting more civilians, but with bravery comes body bags—and, the moment that body bags arrived in the U.S., we would be forced to contend with the hard questions that the law lets us ignore. Were those deaths of Americans worth it? What is the purpose of this war? Should it be fought, and, if so, fought differently? These are conversations that neither the military nor human-rights organizations appear interested in having.

Critics might say that the ruins of Syria reveal the limited value of the laws of war: two armies, operating under greatly differing norms, produced nearly identical results in Raqqa and Aleppo. Defenders might retort that such rules, even when vague or overly permissive, are better than none at all. Probably both views are correct, but the focus on legality may have lulled us into a comfort with war itself. Human-rights groups have found the U.S. guilty of dozens of war crimes in Afghanistan, but most American killing has been lawful: a housewife wandering too close to a convoy, a farmer gunned down on faulty assumptions, a family made victim to the rule of proportionality. Americans seem to become exercised about the miseries of combat only when the rules are flagrantly violated; as long as they are not, a war quietly slides into the background—even into a permanent state of being. If the Afghan war continued for another twenty years, it’s doubtful whether it would arouse much domestic opposition, even though the over-all suffering may be as great as a wanton slaughter that ended in a decisive victory. The U.S. cannot carry out such a slaughter without violating the law and provoking widespread opposition, and so the conflict remains at a perpetual low boil. The U.S. finds itself in a peculiar situation in which it can neither win nor lose its wars.

Faced with this bitter truth, some thinkers espouse the doctrine of realism, which bluntly states that the battlefield is no place for moral strictures. But this doctrine can be used to excuse terrible and unnecessary suffering. Another approach is pacifism, which, for all its merits, asks us to condemn both the tyrant and those violently resisting tyranny. That leaves the moral tradition of “just war,” which maintains that warfare is a fixture of human existence, so the best we can hope for is to regulate when and how it is waged. This is the essential idea informing the laws of war.

Yet, although armed conflict is not disappearing anytime soon, that doesn’t mean we must reduce war solely to a question of legal violations and battlefield rules. Even if we can never abolish war, Immanuel Kant argued, we should act as if we could, and design our institutions accordingly. Today in America, we could work to insulate the Pentagon’s decisions from defense contractors and other vested interests; more important, we could revert the decision to make war to democratic control. After 9/11, Congress passed the Authorization for the Use of Military Force, which Presidents have since invoked to justify at least thirty-seven military activities in fourteen countries, including the U.S. war in Syria, without formal declaration or public debate. Whether this or that pile of rubble was produced lawfully, or whether or not American boots touched Syrian soil, is not nearly as important as the fact that the U.S. was free to raze a foreign city with no public discussion or accountability. Perhaps only when our foreign adventures are subject to democratic constraints will we view the starting and ending of wars—not just their conduct—as a matter of life and death.


Kareem Abdul-Jabbar. (photo: Etienne Laurent/Shutterstock)


On the Hubris of Self-Destructing Stars

By Kareem Abdul-Jabbar, The Hollywood Reporter

18 December 2020


No matter their previous achievements, celebrities deserve legacy-killing backlash when they spread ignorance: "Great success in one field can lead to the delusion that all your thoughts are great."

The only thing people enjoy more than watching a celebrity's rocketing ascent to international fame is watching an aging celebrity's flaming plummet to the hard, cold ground of disgrace and obscurity. It is both a warning against hubris — believing you're too famous to fall — and a reminder that the same people who made you popular can turn on you. Most celebrities nod in understanding at Harvey Dent's observation in The Dark Knight: "You either die a hero or live long enough to see yourself become the villain." Some, like Bill Cosby and Harvey Weinstein, committed heinous acts that obliterated their achievements. But social media has provided a weapon for others to commit instantaneous career suicide and destroy any good-faith legacy they spent a lifetime building. Like Howard Hughes, whose contributions to aviation and filmmaking were overshadowed by such eccentricities as collecting his own nail clippings in jars, these figures are obscuring their own careers.

Rudy Giuliani once graced the cover of Time as "America's Mayor" for his post-9/11 demeanor of calm authority. But the aura began to dim a few years later, and he head-butted the final nail into the coffin of that noble legacy Nov. 18 as he blathered on in a cringeworthy news conference about unproven conspiracies while black streaks streamed from his hair. This came only a few weeks after he was captured by a hidden camera in the latest Borat movie with his hand down his pants while lying on a bed in the presence of a teenage girl. (Giuliani insists he was tucking in his shirt.)

Sadly, Giuliani is not alone in his stumble from grace. Few are more beloved than J.K. Rowling, whose Harry Potter books make up the best-selling series in history. Yet her anti-trans tweets may not only damage the Potter and Fantastic Beasts franchises, they could end up tainting her entire literary legacy. Even the stars of the movies — Daniel Radcliffe, Emma Watson, Rupert Grint and Eddie Redmayne — have spoken out against her position. John Cleese's tone-deaf defense of Rowling left many fans bitterly disappointed, tarnishing his reputation.

It would be tempting to dismiss this self-mutilation as merely the triggering of overly sensitive "cancel culture." But some of this public braying does immediate harm to the foundation of society. Giuliani's attacks on the integrity of the 2020 elections, without any substantive evidence, have undermined the democratic process. A post-election poll indicated that 77 percent of Republicans think Joe Biden won because of fraud. Since no credible proof has ever been shown, this opinion can only be held because they practice flat-earther, anti-vaxxer cult-think: Someone in authority told me what I want to hear, so it must be true. Unfortunately, they include celebrities as "authorities." (Yes, I'm aware that I am a sports celebrity, but I have been writing books and articles about history, culture and politics for 30 years to establish my credibility.)

Actors seem especially intent on self-implosion. Roseanne Barr had achieved the near impossible, sabotaging her career not once but twice. After she left her top-rated sitcom, she faded into irrelevance with out-of-left-field political musings. Seeking to connect to the Trump demographic, ABC gave Roseanne new life, but her character was killed off after she went on a racist rant. James Woods, winner of a Golden Globe and Emmy, was once considered a dynamic actor. Now, after his caustic social commentary tweets, he's viewed as the cranky geezer who won't let you get your ball from his yard. Jon Voight, once a shining star among actors, recently posted a rambling video calling the political left "Satan" and promoting conspiracies about the election, reducing him from brilliant Oscar winner to cultural dumpster diver. Black Panther actor Letitia Wright posted a link to a YouTube video questioning the COVID-19 vaccine and vaccines in general. After a tsunami of social media backlash, she wrote: "My intention was not to hurt anyone. My ONLY intention of posting the video was it raised my concerns with what the vaccine contains and what we are putting in our bodies. Nothing else." At best, that's naive, and at worst, disingenuous. If someone wants to raise concerns — that's legitimate — they need to do basic research: Find facts, statistics and qualified authorities. Because the reality is that when she posts, readers believe she endorses the false conclusions — and that can't be undone.

Social media companies have begun slapping warnings on some messages that are false, incite violence or cause harm to society. But this needs to be done with more consistency and vigilance. Studies indicate that when readers see these warnings, they are less likely to read or believe the flagged content. However, as another study showed, there can be a backfire effect in which content that isn't flagged, even when inaccurate, is perceived as true.

Many Americans imbue stars with political and social intelligence they just don't have. Great success in one field can lead to the delusion that all your thoughts are great. It doesn't help to be surrounded by fawning people whose job it is to agree with everything you say. The irresponsibility of tweeting irrational and harmful opinions to millions, regardless of the damaging consequences to their country or people's lives, proves that those stars deserve the harsh backlash. Unfortunately, the long-term result may be that their professional legacies could become brief footnotes to the memory of their collection of mason jars filled with their excreted opinions.


People wait in a long line to receive a food bank donation at the Barclays Center on May 15, 2020, in the Brooklyn borough of New York City. (photo: Stephanie Keith/Getty)


In the Wealthiest Country in History, Americans Are Desperately Struggling With Hunger

By Luke Savage, Jacobin

18 December 2020


Tens of millions of Americans are struggling to feed themselves, as cases of shoplifting to obtain basic food staples surge worldwide. The pandemic is wreaking economic havoc while Congress dithers.

For a few short weeks last spring, it seemed like the tectonic plates of America’s political consensus could be about to shift. Amid talk of massive public spending, enhanced support for the unemployed, and even the conscription of private industry under the Defense Production Act, it momentarily appeared that the pandemic might at least put a dent in decades of bipartisan aversion to social democracy and activist government.

Following an initial burst of activity, of course, Washington’s chronic addiction to markets and hostility to social welfare provisions soon returned, with resurgent spikes in COVID cases and lockdowns doing little to shake Beltway elites into a renewed sense of urgency. Amid continued economic turmoil, Americans haven’t received direct cash aid since a wave of $1,200 checks went out last spring. Enhanced unemployment benefits, which saved countless people from careering over a financial cliff, were quickly scaled back as senior Democrats and Republicans alike mused that they might make it harder for employers to hire. As Congress continues a series of fraught and tedious deliberations almost certain to yield an inadequate level of aid, both a federal moratorium on evictions and remaining unemployment assistance for 12 million people are set to expire by year’s end.

There can be no doubt that a more aggressive and activist federal response, including measures similar to those undertaken in parts of Europe and Asia, could have prevented a great deal of needless hardship. Whatever ultimately emerges from the current round of relief talks, the human consequences of Beltway complacency are becoming more visible every day, and recent reporting from the Washington Post underscores just how dire the situation has become.

According to a recent analysis by the paper, more Americans are currently going hungry than at any point in at least the past twenty-two years (1998 being the first year the Census Bureau started collecting the relevant data). According to a Census Bureau survey conducted in late October and early November, one out of every eight people reported not having enough to eat “sometimes or often” throughout the past week. Widespread hunger is now affecting some 26 million American adults, and the bureau’s top-line figure narrows to one in six in households with children.

These, of course, are only generalized figures; particular areas, groups, and populations are disproportionately affected by hunger.

Some 22 percent of black households reported going hungry in the survey, almost twice the overall rate. Much of the Post’s reporting centers on Houston, Texas, home to the country’s largest food bank and one of the cities worst hit by the growing hunger epidemic. In October, the Houston Food Bank reported a 45 percent increase in the amount of food it distributed compared with the same period the previous year. One in five adults in the city is reportedly going hungry. Though the worst outbreaks of hunger are mostly in Southern states like Alabama, Louisiana, and Mississippi, higher-than-average rates are also scattered across the Midwest and even some coastal regions.

The situation has become so severe that many retailers and police departments are reporting a spike in shoplifting — particularly of consumables and household staples like bread, pasta, and baby food.

For a country whose political class so proudly clings to a story of national exceptionalism, these developments should be a dramatic wake-up call. While Americans have never benefited from social democratic institutions as robust as those found in many less wealthy European countries, decades of neoliberal orthodoxy have successfully ground their equivalents down to dust. During boom periods, the result is an economy that still leaves millions struggling and financially insecure.

Amid a global pandemic, the country’s aversion to welfarism and lack of a functioning social safety net are quite literally pushing tens of millions toward starvation.


People walk by the Pfizer world headquarters in New York on November 9, 2020. (photo: Kena Betancup/Getty)


Pfizer Helped Create the Global Patent Rules. Now It's Using Them to Undercut Access to the Covid Vaccine.

By Sarah Lazare, In These Times

18 December 2020


The pharmaceutical company is opposing a proposal at the World Trade Organization to expand vaccine access to poor countries.

The pharmaceutical giant Pfizer, whose Covid-19 vaccine with German partner BioNTech was approved December 11 for emergency use in the United States, has emerged as a vocal opponent of a global effort to ensure poor countries are able to access the vaccine. In October, India and South Africa put forward a proposal that the World Trade Organization (WTO) pause enforcement of patents for Covid-19 treatments, under the organization’s intellectual property agreement, “Trade-Related Aspects of Intellectual Property Rights,” or TRIPS. Now supported by nearly 100 countries, the proposal would allow for the more affordable production of generic treatments during the duration of the pandemic. As wealthy countries hoard vaccine stocks, and one study warns a quarter of the world’s population won’t get the vaccine until 2022, the proposal—if approved—could potentially save countless lives in the Global South.

But so far, the United States, the European Union, Britain, Norway, Switzerland, Japan and Canada have successfully blocked this proposal, in a context where delay will almost certainly bring more deaths. The pharmaceutical industry, concerned with protecting its profits, is a powerful partner in this opposition, with Pfizer among its leaders. “The (intellectual property), which is the blood of the private sector, is what brought a solution to this pandemic and it is not a barrier right now,” Albert Bourla, chief executive of Pfizer, declared last week. And in a December 5 article in The Lancet, Pfizer registered its opposition to the proposal, saying, “a one-size-fits-all model disregards the specific circumstances of each situation, each product and each country.”

Pfizer’s appeals make it sound as though the framework of intellectual property rules and pharmaceutical monopolies is a common-sense global order whose benefits to human society are apparent. But, in reality, these international norms are relatively recent, and were shaped, in part, by Pfizer itself. From the mid-1980s to the early 1990s, the company played a critical role in establishing the very WTO intellectual property rules that it is now invoking to argue against freeing up vaccine supplies for poor countries. The “blood of the private sector” that Bourla appeals to is not some natural state of affairs, but reflects a global trade structure the company helped create—to the detriment of poor people around the world who seek access to life-saving drugs.

A corporate campaign

In the mid-1980s, Edmund Pratt, then chairman of Pfizer, had a mission: He wanted to ensure that strong intellectual property (IP) protections were included in the Uruguay Round of the General Agreement on Tariffs and Trade (GATT) talks—the multinational trade negotiations that would result in the establishment of the WTO in 1995. His calculus was simple: Such protections were vital for protecting the global “competitiveness”—or bottom line—of his company and other U.S. industries.

To his great advantage, Pratt had considerable institutional power beyond his immediate corporate rank. As authors Charan Devereaux, Robert Z. Lawrence and Michael D. Watkins note in their book, Case Studies in U.S. Trade Negotiation, Pratt served on the Advisory Committee on Trade Negotiations for the Carter and Reagan administrations. In 1986, he co-founded the Intellectual Property Committee (IPC), which would go on to build relationships with industries across Europe and Japan, meet with officials from the World Intellectual Property Organization of the United Nations, and lobby aggressively—all for the purpose of ensuring IP was included in the trade negotiations.

Both globally and domestically, Pfizer played an important role in promoting the idea that international trade should be contingent on strong intellectual property rules, while casting countries that do not follow U.S. intellectual property rules as engaging in “piracy.” As Peter Drahos and John Braithwaite note in their book Information Feudalism, “Like the beat of a tom-tom, the message about intellectual property went out along the business networks to chambers of commerce, business councils, business committees, trade associations and business bodies. Progressively, Pfizer executives who occupied key positions in strategic business organizations were able to enroll their support for a trade-based approach to intellectual property.”

It was not a given, at the time, that intellectual property would be included in trade negotiations. Many Third World countries resisted such inclusion, on the grounds that stronger intellectual property rules would protect the monopoly power of corporations and undermine domestic price controls, as explained in Case Studies in U.S. Trade Negotiation. In 1982, Indian Prime Minister Indira Gandhi told the World Health Assembly, “the idea of a better ordered world is one in which medical discovery will be free of all patents and there will be no profiteering from life and death.” The Christian Science Monitor reported in 1986, “Brazil and Argentina have spearheaded a group that has blocked US attempts to include intellectual property protection in the new round of talks.”

But Pratt had powerful allies, including IBM chairman John Opel, and their efforts played an important role in securing the inclusion of TRIPS—which sets intellectual property rules—in the GATT negotiations. Pratt, for his part, took some credit for the development. “The current GATT victory, which established provisions for intellectual property, resulted in part from the hard-fought efforts of the U.S. government and U.S. businesses, including Pfizer, over the past three decades. We’ve been in it from the beginning, taking a leadership role,” Pratt declared, according to the book, Whose Trade Organization? A Comprehensive Guide to the WTO.

During the TRIPS negotiations, the IPC played an active role in organizing corporate leaders in the United States, as well as Europe and Japan, to support strong intellectual property rules. By the time the WTO was formally established, and the TRIPS Agreement concluded, Pratt was no longer chairman of Pfizer. But his contribution, and the role of Pfizer, was still strongly felt. As Devereaux, Lawrence and Watkins note, one U.S. negotiator said it was Pratt and Opel who “basically engineered, pushed, and cajoled the government into including IP as one of the topics for negotiation” in the first place.

The WTO’s TRIPS Agreement, which went into effect in 1995, would go on to be the “most important agreement on intellectual property of the 20th century,” Drahos and Braithwaite write. It brought most of the world under minimum standards for intellectual property, including patent monopolies for pharmaceutical companies, with some limited safeguards and flexibility.

Dean Baker, economist and co-founder of the Center for Economic and Policy Research (CEPR), a left-leaning think tank, tells In These Times, “TRIPS required developing countries, and countries around the world, to adopt a U.S.-type patent and copyright rule. Previously, both had been outside trade agreements, so countries could have whatever rules they want. India already had a well-developed pharmaceutical industry by the 1990s. Pre-TRIPS, India didn’t allow drug companies to patent drugs. They could patent processes, but not drugs.”

Cutting off access to medicines

TRIPS brought profits to pharmaceutical companies and “raised pharmaceutical costs in the U.S. and further restricted the availability of lifesaving drugs in WTO developing countries,” according to corporate watchdog group Public Citizen. This dynamic played out ruthlessly during the AIDS crisis, which was in full swing as the WTO was created. “It took the South African government almost a decade to break the monopolies held by foreign drug companies that kept the country hostage, and kept people there dying,” wrote Achal Prabhala, Arjun Jayadev and Dean Baker in a recent piece in the New York Times.

It is difficult to think of a clearer case for suspending intellectual property laws than a global pandemic, a position that is certainly not fringe in today’s political context. In addition to a swath of global activists, mainstream human rights groups and UN human rights experts have added their voices to the demand for a suspension of patent laws. Their calls follow the global justice movement of the 1990s and early 2000s, which focused on the tremendous role of the WTO, along with other global institutions like the World Bank and International Monetary Fund, in expanding the power of corporations to undermine domestic protections, from labor to environment to public health. The outsized power of the United States and U.S. corporations in the WTO—on display in the blocking of the proposal for a patent waiver—has been a key point of criticism.

Pfizer is not alone in staking out its opposition to pausing intellectual property rules. Pharmaceutical industry trade groups and individual companies—including Moderna, which is behind another leading Covid-19 vaccine—have all come out in full force against the proposal for reprieve from stringent intellectual property rules.

“The influence of the pharmaceutical industry is enormous,” Baker tells In These Times. “Needless to say, Trump is going to go with the pharmaceutical industry. Even Biden is going to be hearing from the pharmaceutical industry and will be hard pressed to do something they don’t like. There’s no one other than the pharmaceutical industry who’s going to stand up against this. They’re the ones that are pushing it.”

The pharmaceutical industry is fighting to hoard life-saving information about vaccines and Covid-19 treatments despite the tremendous role of public funds in enabling their development. Pfizer’s partner BioNTech, for example, received significant public funding from Germany. But at an estimated cost of $19.50 per dose for the first 100 million doses, the vaccine is likely too costly for many poor countries, particularly in light of its expensive storage requirements. Pharmaceutical company AstraZeneca, which produced a vaccine with Oxford, has made some commitments to increase access to poor countries, and it says it won’t make a profit from the vaccine during the pandemic. But it “has retained the right to declare the end of the pandemic as early as July 2021,” Prabhala, Jayadev and Baker note.

Indeed, the data emerging indicates what could have been predicted months ago: One could make a map of global poverty, lay it over a map of vaccine access, and it would be a virtual one-to-one match. “The U.S., Britain, Canada and others are hedging their bets, reserving doses that far outnumber their populations,” the New York Times reports, “as many poorer nations struggle to secure enough.” This is a logical outcome for a system designed from the onset to reinforce long-existing power structures informed by an entrenched legacy of colonialism. Regardless of “intent,” once again majority black and brown countries, by and large, are left to suffer and die while wealthy Global North countries far exceed their needed capacity (although this is no guarantee of equitable distribution within Global North countries).

Given the risk that we could see a global apartheid of vaccine distribution, in which poor countries continue to face devastating loss while rich countries pursue herd immunity, vague assurances of corporate benevolence are not enough. As Baker puts it, “Why wouldn’t you want every vaccine available as widely as possible?”


Rep. Deb Haaland is President-elect Biden's pick for Secretary of the Interior. (photo: Getty)


Biden Pick, Native American Deb Haaland to Head Department That Once Vowed to "Civilize or Exterminate" Her People

By Juan Cole, Informed Comment

18 December 2020

 

Rep. Deb Haaland (D-NM) will be president-elect Joe Biden’s nominee to head the Department of the Interior. She is the first Native American to head the department, and her appointment is full of ironies. The Department, formed in 1849, includes the Bureau of Indian Affairs and has a long history of abusing and cheating American Indians (that is what most of them in the US prefer to be called).

Donald A. Grinde, Jr. writes that in 1851, then Secretary of the Interior Alexander H. H. Stuart said that American Indians were “encompassed by an unbroken chain of civilization . . . and the only alternatives left are, to civilize or exterminate.” Historian Frederick Jackson Turner declared the frontier over in a famous address to the American Historical Association in 1893, but Stuart appears to have backdated it to mid-century. Stuart also exemplified the frankly genocidal mindset of the white elite of the US.

The Department of the Interior saw the roughly half a million Native Americans who remained after European pandemics and genocides had ravaged them as in need of a European-style education and instruction in how to farm. Professor Grinde points out that Native Americans had plenty of culture; it was just oral and was discounted by the whites:

“Elders as well as people knowledgeable about specific ideas and techniques instructed members of their societies about a broad range of topics including history, religion, arts and crafts, literature, geography, zoology, botany, medicine, law, political science, astronomy, soil science, and theater.”

They were stewards of North America for thousands of years and they didn’t mess it up with fossil fuels, global heating, chemical pollution and micro-plastics.

American Indian boarding schools were set up to separate children from their families and cultures and indoctrinate them in white ways, teaching them that their own religion and culture was inferior.

Despite the First Amendment guarantee of freedom of religion, the federal government interfered with and demeaned the practice of Native American religion as “heathenish” and federal boarding schools actively proselytized for Christianity.

National Geographic says that in 1883, Secretary of the Interior Henry M. Teller wrote a letter to Indian Affairs commissioner Hiram Price complaining about American Indian sacred dance as “heathenish,” and attacking medicine men for discouraging children from adopting European ways or going off to white-run schools. NG continues,

“In response, Price promulgated a set of rules that became known as the “Code of Indian Offences.” These outlawed many traditional Native-American religious practices. It established a Court of Indian Offences and outlawed “the sun-dance, the scalp-dance, and the war-dance,” and associated activities. Participation in such activities would lead to withholding of rations for 10 days. A second offence would result in withholding rations for up to 30 days, or incarceration for up to 30 days.”

It was not until the 1970s that Native American religious practice was recognized as legitimate by Congress.

Since the Republicans in Congress nowadays are so het up about protecting religious sensibilities even at the expense of other civil liberties, maybe they’d like to apologize for their forebears’ assault on Native American religion?

The Department of the Interior was also supposed to oversee hundreds of treaties made between the native Nations and the US government. Corruption and arrogance led to many of those treaties being broken by the whites, writes Sarah Pruitt at History.com.

The US government organized the Great Sioux Reservation on the territory of and around the Black Hills of Dakota. The Sioux and Arapaho people had exclusive rights to it (the Sioux include the Dakota, Lakota and Nakota).

But then gold was found in the Black Hills, and there was a white gold rush, and the hills were essentially stolen from their rightful owners despite the treaty. That was what caused the Battle of Little Big Horn in 1876.

In the 1970s, the Supreme Court offered the Sioux $100 million in reparations for their loss of the Black Hills. The tribe rejected the offer, Pruitt writes, saying that the hills are sacred and were never for sale.

In 2017, Trump signed an order greatly reducing the size of Bears Ears National Monument, endangering American Indian heritage and burial sites and provoking a lawsuit from Native American tribes. Trump was the true heir of Alexander Stuart and Hiram Price.

I don’t think we really recognize the true historical significance of Deb Haaland heading the Department of the Interior. Here is an opportunity for rethinking policy toward the first Americans, and for forms of healing and restitution.
