How Revolutionary Pacifism Can Preserve the Species
Written by Noam Chomsky, AlterNet
Sunday, 25 August 2013 14:57

Chomsky writes: "Can we proceed to at least limit the scourge of war? One answer is given by absolute pacifists, including people I respect though I have never felt able to go beyond that."

Author, historian and political commentator Noam Chomsky. (photo: Ben Rusk/flickr)



Modern warfare capabilities have taken humanity to the brink. It starts with accepting violence as a solution.

The following is the text of a lecture given by Chomsky upon being awarded the Sydney Peace Prize, November 1, 2011. It remains one of the most powerful and persuasive arguments for recognizing the dangers that modern, industrialized warfare poses to the future of humankind.

As we all know, the United Nations was founded "to save succeeding generations from the scourge of war." The words can only elicit deep regret when we consider how we have acted to fulfill that aspiration, though there have been a few significant successes, notably in Europe.

For centuries, Europe had been the most violent place on earth, with murderous and destructive internal conflicts and the forging of a culture of war that enabled Europe to conquer most of the world, shocking the victims, who were hardly pacifists, but were "appalled by the all-destructive fury of European warfare," in the words of British military historian Geoffrey Parker.

And [it] enabled Europe to impose on its conquests what Adam Smith called "the savage injustice of the Europeans," England in the lead, as he did not fail to emphasize. The global conquest took a particularly horrifying form in what is sometimes called "the Anglosphere," England and its offshoots, settler-colonial societies in which the indigenous societies were devastated and their people dispersed or exterminated.

But since 1945 Europe has become internally the most peaceful and in many ways most humane region of the earth - which is the source of some of its current travail, an important topic that I will have to put aside.

In scholarship, this dramatic transition is often attributed to the thesis of the "democratic peace": democracies do not go to war with one another. Not to be overlooked, however, is that Europeans came to realise that the next time they indulge in their favorite pastime of slaughtering one another, the game will be over: civilization has developed means of destruction that can only be used against those too weak to retaliate in kind, a large part of the appalling history of the post-World War II years.

It is not that the threat has ended. US-Soviet confrontations came painfully close to virtually terminal nuclear war in ways that are shattering to contemplate, when we inspect them closely. And the threat of nuclear war remains all too ominously alive, a matter to which I will briefly return.

Can we proceed to at least limit the scourge of war? One answer is given by absolute pacifists, including people I respect though I have never felt able to go beyond that. A somewhat more persuasive stand, I think, is that of the pacifist thinker and social activist A.J. Muste, one of the great figures of 20th century America, in my opinion: what he called "revolutionary pacifism."

Muste disdained the search for peace without justice. He urged that "one must be a revolutionary before one can be a pacifist" - by which he meant that we must cease to "acquiesce [so] easily in evil conditions," and must deal "honestly and adequately with this ninety percent of our problem" - "the violence on which the present system is based, and all the evil - material and spiritual - this entails for the masses of men throughout the world."

Unless we do so, he argued, "there is something ludicrous, and perhaps hypocritical, about our concern over the ten per cent of the violence employed by the rebels against oppression" - no matter how hideous they may be. He was confronting the hardest problem of the day for a pacifist, the question whether to take part in the anti-fascist war.

In writing about Muste's stand 45 years ago, I quoted his warning that "The problem after a war is with the victor. He thinks he has just proved that war and violence pay. Who will teach him a lesson?" His observation was all too apt at the time, while the Indochina wars were raging. And on all too many other occasions since.

The allies did not fight "the good war," as it is commonly called, because of the awful crimes of fascism. Before their attacks on western powers, fascists were treated rather sympathetically, particularly "that admirable Italian gentleman," as FDR called Mussolini.

Even Hitler was regarded by the US State Department as a "moderate" holding off the extremists of right and left. The British were even more sympathetic, particularly the business world. Roosevelt's close confidant Sumner Welles reported to the president that the Munich settlement that dismembered Czechoslovakia "presented the opportunity for the establishment by the nations of the world of a new world order based upon justice and upon law," in which the Nazi moderates would play a leading role.

As late as April 1941, the influential statesman George Kennan, at the dovish extreme of the postwar planning spectrum, wrote from his consular post in Berlin that German leaders have no wish to "see other people suffer under German rule," are "most anxious that their new subjects should be happy in their care," and are making "important compromises" to assure this benign outcome.

Though by then the horrendous facts of the Holocaust were well known, they scarcely entered the Nuremberg trials, which focused on aggression, "the supreme international crime differing only from other war crimes in that it contains within itself the accumulated evil of the whole": in Indochina, Iraq, and all too many other places where we have much to contemplate.

The horrifying crimes of Japanese fascism were virtually ignored in the postwar peace settlements. Japan's aggression began exactly 80 years ago, with the staged Mukden incident, but for the West, it began 10 years later, with the attack on military bases in two US possessions.

India and other major Asian countries refused even to attend the 1951 San Francisco Peace Treaty conference because of the exclusion of Japan's crimes in Asia - and also because of Washington's establishment of a major military base in conquered Okinawa, still there despite the energetic protests of the population.

It is useful to reflect on several aspects of the Pearl Harbor attack. One is the reaction of historian and Kennedy advisor Arthur Schlesinger to the bombing of Baghdad in March 2003. He recalled FDR's words when Japan bombed Pearl Harbor on "a date which will live in infamy." "Today it is we Americans who live in infamy," Schlesinger wrote, as our government adopts the policies of imperial Japan - thoughts that were barely articulated elsewhere in the mainstream, and quickly suppressed: I could find no mention of this principled stand in the praise for Schlesinger's accomplishments when he died a few years later.

We can also learn a lot about ourselves by carrying Schlesinger's lament a few steps further. By today's standards, Japan's attack was justified, indeed meritorious. Japan, after all, was exercising the much lauded doctrine of anticipatory self-defense when it bombed military bases in Hawaii and the Philippines, two virtual US colonies, with reasons far more compelling than anything that Bush and Blair could conjure up when they adopted the policies of imperial Japan in 2003.

Japanese leaders were well aware that B-17 Flying Fortresses were coming off the Boeing production lines, and they could read in the American press that these killing machines would be able to burn down Tokyo, a "city of rice-paper and wood houses." A November 1940 plan to "bomb Tokyo and other big cities" was enthusiastically received by Secretary of State Cordell Hull. FDR was "simply delighted" at the plans "to burn out the industrial heart of the Empire with fire-bomb attacks on the teeming bamboo ant heaps of Honshu and Kyushu," outlined by their author, Air Force General Chennault.

By July 1941, the Air Corps was ferrying B-17s to the Far East for this purpose, assigning half of all the big bombers to this region, taking them from the Atlantic sea-lanes. They were to be used if needed "to set the paper cities of Japan on fire," according to General George Marshall, Roosevelt's main military adviser, in a press briefing three weeks before Pearl Harbor. Four days later, New York Times senior correspondent Arthur Krock reported US plans to bomb Japan from Siberian and Philippine bases, to which the Air Force was rushing incendiary bombs intended for civilian targets. The US knew from decoded messages that Japan was aware of these plans.

History provides ample evidence to support Muste's conclusion that "The problem after a war is with the victor, [who] thinks he has just proved that war and violence pay." And the real answer to Muste's question, "Who will teach him a lesson?," can only be domestic populations, if they can adopt elementary moral principles.

Even the most uncontroversial of these principles could have a major impact on ending injustice and war. Consider the principle of universality, perhaps the most elementary of moral principles: we apply to ourselves the standards we apply to others, if not more stringent ones. The principle is universal, or nearly so, in three further respects: it is found in some form in every moral code; it is universally applauded in words, and consistently rejected in practice. The facts are plain, and should be troublesome.

The principle has a simple corollary, which suffers the same fate: we should distribute finite energies to the extent that we can influence outcomes, typically on cases for which we share responsibility. We take that for granted with regard to enemies. No one cares whether Iranian intellectuals join the ruling clerics in condemnation of the crimes of Israel or the United States. Rather, we ask what they say about their own state. We honored Soviet dissidents on the same grounds.

Of course, that is not the reaction within their own societies. There dissidents are condemned as "anti-Soviet" or supporters of the Great Satan, much as their counterparts here are condemned as "anti-American" or supporters of today's official enemy. And of course, punishment of those who adhere to elementary moral principles can be severe, depending on the nature of the society.

In Soviet-run Czechoslovakia, for example, Vaclav Havel was imprisoned. At the same time, in US-run El Salvador his counterparts had their brains blown out by an elite battalion fresh from renewed training at the John F. Kennedy School of Special Warfare in North Carolina, acting on explicit orders of the High Command, which had intimate relations with Washington. We all know and respect Havel for his courageous resistance, but who can even name the leading Latin American intellectuals, Jesuit priests, who were added to the long bloody trail of the Atlacatl brigade shortly after the fall of the Berlin Wall - along with their housekeeper and daughter, since the orders were to leave no witnesses?

Before we hear that these are exceptions, we might recall a truism of Latin American scholarship, reiterated by historian John Coatsworth in the recently published Cambridge University History of the Cold War: from 1960 to "the Soviet collapse in 1990, the numbers of political prisoners, torture victims, and executions of nonviolent political dissenters in Latin America vastly exceeded those in the Soviet Union and its East European satellites." Among the executed were many religious martyrs, and there were mass slaughters as well, consistently supported or initiated by Washington. And the date 1960 is highly significant, for reasons we should all know, but I cannot go into here.

In the West all of this is "disappeared," to borrow the terminology of our Latin American victims. Regrettably, these are persistent features of intellectual and moral culture, which we can trace back to the earliest recorded history. I think they richly underscore Muste's injunction.

If we ever hope to live up to the high ideals we passionately proclaim, and to bring the initial dream of the United Nations closer to fulfillment, we should think carefully about crucial choices that have been made, and continue to be made every day - not forgetting "the violence on which the present system is based, and all the evil - material and spiritual - this entails for the masses of men throughout the world." Among these masses are 6 million children who die every year because of lack of simple medical procedures that the rich countries could make available within statistical error in their budgets. And a billion people on the edge of starvation or worse, but not beyond reach by any means.

We should also never forget that our wealth derives in no small measure from the tragedy of others. That is dramatically clear in the Anglosphere. I live in a comfortable suburb of Boston. Those who once lived there were victims of "the utter extirpation of all the Indians in most populous parts of the Union" by means "more destructive to the Indian natives than the conduct of the conquerors of Mexico and Peru" - the verdict of the first Secretary of War of the newly liberated colonies, General Henry Knox.

They suffered the fate of "that hapless race of native Americans, which we are exterminating with such merciless and perfidious cruelty...among the heinous sins of this nation, for which I believe God will one day bring [it] to judgement" - the words of the great grand strategist John Quincy Adams, intellectual author of Manifest Destiny and the Monroe Doctrine, long after his own substantial contributions to these heinous sins. Australians should have no trouble adding illustrations.

Whatever the ultimate judgment of God may be, the judgment of man is far from Adams's expectations. To mention a few recent cases, consider what I suppose are the two most highly regarded left-liberal intellectual journals in the Anglosphere, the New York and London Reviews of Books.

In the former, a prominent commentator recently reported what he learned from the work of the "heroic historian" Edmund Morgan: namely, that when Columbus and the early explorers arrived they "found a continental vastness sparsely populated by farming and hunting people ... In the limitless and unspoiled world stretching from tropical jungle to the frozen north, there may have been scarcely more than a million inhabitants."

The calculation is off by tens of millions, and the "vastness" included advanced civilizations, facts well known decades ago to those who chose to know them. No letters appeared reacting to this truly colossal case of genocide denial. In the companion London journal a noted historian casually mentioned the "mistreatment of the Native Americans," again eliciting no comment. We would hardly accept the word "mistreatment" for comparable or even much lesser crimes committed by enemies.

Recognition of heinous crimes from which we benefit enormously would be a good start after centuries of denial, but we can go on from there. One of the main tribes where I live was the Wampanoag, who still have a small reservation not too far away. Their language disappeared long ago. But in a remarkable feat of scholarship and dedication to elementary human rights, the language has been reconstructed from missionary texts and comparative evidence, and now has its first native speaker in 100 years, the daughter of Jessie Little Doe, who has become a fluent speaker of the language herself.

She is a former graduate student at MIT, who worked with my late friend and colleague Kenneth Hale, one of the most outstanding linguists of the modern period. Among his many accomplishments was his leading role in founding the study of Aboriginal languages of Australia. He was also very effective in defending the rights of indigenous people, and was a dedicated peace and justice activist. He was able to turn our department at MIT into a center for the study of indigenous languages and active defense of indigenous rights in the Americas and beyond.

Revival of the Wampanoag language has revitalised the tribe. A language is more than just sounds and words. It is the repository of culture, history, traditions, the entire rich texture of human life and society. Loss of a language is a serious blow not only to the community itself but to all of those who hope to understand something of the nature of human beings, their capacities and achievements, and of course a loss of particular severity to those concerned with the variety and uniformity of human languages, a core component of human higher mental faculties. Similar achievements can be carried forward, a very partial but significant gesture towards repentance for heinous sins on which our wealth and power rests.

Since we commemorate anniversaries, such as the Japanese attacks 70 years ago, there are several significant ones that fall right about now, with lessons that can serve for both enlightenment and action. I will mention just a few.

The West has just commemorated the tenth anniversary of the 9/11 terrorist attacks and what was called at the time, but no longer, "the glorious invasion" of Afghanistan that followed, soon to be followed by the even more glorious invasion of Iraq. Partial closure for 9/11 was reached with the assassination of the prime suspect, Osama bin Laden, by US commandos who invaded Pakistan, apprehended him and then murdered him, disposing of the corpse without autopsy.

I said "prime suspect," recalling the ancient though long-abandoned doctrine of "presumption of innocence." The current issue of the major US scholarly journal of international relations features several discussions of the Nuremberg trials of some of history's worst criminals.

There we read that the "U.S. decision to prosecute, rather than seek brutal vengeance was a victory for the American tradition of rights and a particularly American brand of legalism: punishment only for those who could be proved to be guilty through a fair trial with a panoply of procedural protections." The journal appeared right at the time of the celebration of the abandonment of this principle in a dramatic way, while the global campaign of assassination of suspects, and inevitable "collateral damage," continues to be expanded, to much acclaim.

Not, to be sure, universal acclaim. Pakistan's leading daily recently published a study of the effect of drone attacks and other US terror. It found that "About 80 per cent [of] residents of [the tribal regions] South and North Waziristan agencies have been affected mentally while 60 per cent people of Peshawar are nearing to become psychological patients if these problems are not addressed immediately," and warned that the "survival of our young generation" is at stake.

In part for these reasons, hatred of America had already risen to phenomenal heights, and after the bin Laden assassination increased still more. One consequence was firing across the border at the bases of the US occupying army in Afghanistan - which provoked sharp condemnation of Pakistan for its failure to cooperate in an American war that Pakistanis overwhelmingly oppose, taking the same stand they did when the Russians occupied Afghanistan. A stand then lauded, now condemned.

The specialist literature and even the US Embassy in Islamabad warn that the pressures on Pakistan to take part in the US invasion, as well as US attacks in Pakistan, are "destabilizing and radicalizing Pakistan, risking a geopolitical catastrophe for the United States - and the world - which would dwarf anything that could possibly occur in Afghanistan" - quoting British military/Pakistan analyst Anatol Lieven.

The assassination of bin Laden greatly heightened this risk in ways that were ignored in the general enthusiasm for assassination of suspects. The US commandos were under orders to fight their way out if necessary. They would surely have had air cover, maybe more, in which case there might have been a major confrontation with the Pakistani army, the only stable institution in Pakistan, and deeply committed to defending Pakistan's sovereignty.

Pakistan has a huge nuclear arsenal, the most rapidly expanding in the world. And the whole system is laced with radical Islamists, products of the strong US-Saudi support for the worst of Pakistan's dictators, Zia ul-Haq, and his program of radical Islamisation. This program, along with Pakistan's nuclear weapons, is among Ronald Reagan's legacies. Obama has now added the risk of nuclear explosions in London and New York, had the confrontation led to leakage of nuclear materials to jihadis, as was plausibly feared - one of the many examples of the constant threat of nuclear weapons.

The assassination of bin Laden had a name: "Operation Geronimo." That caused an uproar in Mexico, and was protested by the remnants of the indigenous population in the US. But elsewhere few seemed to comprehend the significance of identifying bin Laden with the heroic Apache Indian chief who led the resistance to the invaders, seeking to protect his people from the fate of "that hapless race" that John Quincy Adams eloquently described. The imperial mentality is so profound that such matters cannot even be perceived.

There were a few criticisms of Operation Geronimo - the name, the manner of its execution, and the implications. These elicited the usual furious condemnations, most unworthy of comment, though some were instructive. The most interesting was by the respected left-liberal commentator Matthew Yglesias.

He patiently explained that "one of the main functions of the international institutional order is precisely to legitimate the use of deadly military force by western powers," so it is "amazingly naïve" to suggest that the US should obey international law or other conditions that we impose on the powerless. The words are not criticism, but applause; hence one can raise only tactical objections if the US invades other countries, murders and destroys with abandon, assassinates suspects at will, and otherwise fulfills its obligations in the service of mankind. If the traditional victims see matters somewhat differently, that merely reveals their moral and intellectual backwardness. And the occasional Western critic who fails to comprehend these fundamental truths can be dismissed as "silly," Yglesias explains - incidentally, referring specifically to me, and I cheerfully confess my guilt.

Going back a decade to 2001, from the first moment it was clear that the "glorious invasion" was anything but that. It was undertaken with the understanding that it might drive several million Afghans over the edge of starvation, which is why the bombing was bitterly condemned by the aid agencies that were forced to end the operations on which 5 million Afghans depended for survival.

Fortunately the worst did not happen, but only the most morally obtuse can fail to comprehend that actions are evaluated in terms of likely consequences, not actual ones. The invasion of Afghanistan was not aimed at overthrowing the brutal Taliban regime, as later claimed. That was an afterthought, brought up three weeks after the bombing began. The explicit reason for the invasion was that the Taliban were unwilling to extradite bin Laden without evidence, which the US refused to provide - as later learned, because it had virtually none, and in fact still has little that could stand up in an independent court of law, though his responsibility is hardly in doubt.

The Taliban did in fact make some gestures towards extradition, and we since have learned that there were other such options, but they were all dismissed in favor of violence, which has since torn the country to shreds. It has reached its highest level in a decade this year according to the UN, with no diminution in sight.

A very serious question, rarely asked then or since, is whether there was an alternative to violence. There is strong evidence that there was. The 9/11 attack was sharply condemned within the jihadi movement, and there were good opportunities to split it and isolate al-Qaeda. Instead, Washington and London chose to follow the script provided by bin Laden, helping to establish his claim that the West is attacking Islam, and thus provoking new waves of terror.

The senior CIA analyst responsible for tracking Osama bin Laden from 1996, Michael Scheuer, warned right away and has repeated since that "the United States of America remains bin Laden's only indispensable ally."

These are among the natural consequences of rejecting Muste's warning, and the main thrust of his revolutionary pacifism, which should direct us to investigating the grievances that lead to violence, and when they are legitimate, as they often are, to address them. When that advice is taken, it can succeed very well. Britain's recent experience in Northern Ireland is a good illustration. For years, London responded to IRA terror with greater violence, escalating the cycle, which reached a bitter peak. When the government began instead to attend to the grievances, violence subsided and terror has effectively disappeared. I was in Belfast in 1993, when it was a war zone, and returned a year ago to a city with tensions, but hardly beyond the norm.

There is a great deal more to say about what we call 9/11 and its consequences, but I do not want to end without at least mentioning a few more anniversaries. Right now happens to be the 50th anniversary of President Kennedy's decision to escalate the conflict in South Vietnam from vicious repression, which had already killed tens of thousands of people and finally elicited a reaction that the client regime in Saigon could not control, to outright US invasion: bombing by the US Air Force, use of napalm, chemical warfare soon including crop destruction to deprive the resistance of food, and programs to send millions of South Vietnamese to virtual concentration camps where they could be "protected" from the guerrillas who, admittedly, they were supporting.

There is no time to review the grim aftermath, and there should be no need to do so. The wars left three countries devastated, with a toll of many millions, not including the miserable victims of the enormous chemical warfare assault, including newborn infants today.

There were a few at the margins who objected - "wild men in the wings," as they were termed by Kennedy-Johnson National Security Adviser McGeorge Bundy, former Harvard Dean. And by the time that the very survival of South Vietnam was in doubt, popular protest became quite strong. At the war's end in 1975, about 70% of the population regarded the war as "fundamentally wrong and immoral," not "a mistake," figures that were sustained as long as the question was asked in polls. In revealing contrast, at the dissident extreme of mainstream commentary the war was "a mistake" because our noble objectives could not be achieved at a tolerable cost.

Another anniversary that should be in our minds today is of the massacre in the Santa Cruz graveyard in Dili just 20 years ago, the most publicised of a great many shocking atrocities during the Indonesian invasion and annexation of East Timor. Australia had joined the US in granting formal recognition to the Indonesian occupation, after its virtually genocidal invasion. The US State Department explained to Congress in 1982 that Washington recognized both the Indonesian occupation and the Khmer Rouge-based "Democratic Kampuchea" regime. The justification offered was that "unquestionably" the Khmer Rouge were "more representative of the Cambodian people than Fretilin was of the Timorese people" because "there has been this continuity [in Cambodia] since the very beginning," in 1975, when the Khmer Rouge took over.

The media and commentators have been polite enough to let all this languish in silence, not an inconsiderable feat.

A few months before the Santa Cruz massacre, Australian foreign minister Gareth Evans made his famous statements dismissing concerns about the murderous invasion and annexation on the grounds that "the world is a pretty unfair place ... littered ... with examples of acquisitions by force," so we can therefore look away as awesome crimes continue with strong support by the western powers. Not quite look away, because at the same time Evans was negotiating the robbery of East Timor's sole resource with his comrade Ali Alatas, foreign minister of Indonesia, producing what seems to be the only official western document that recognizes East Timor as an Indonesian province.

Years later, Evans declared that "the notion that we had anything to answer for morally or otherwise over the way we handled the Indonesia-East Timor relationship, I absolutely reject" - a stance that can be adopted, and even respected, by those who emerge victorious. In the US and Britain, the question is not even asked in polite society.

It is only fair to add that in sharp contrast, much of the Australian population, and media, were in the forefront of exposing and protesting the crimes, some of the worst of the past half-century. And in 1999, when the crimes were escalating once again, they had a significant role in convincing US president Clinton to inform the Indonesian generals in September that the game was over, at which point they immediately withdrew allowing an Australian-led peacekeeping force to enter.

There are lessons here too, for the public. Clinton's orders could have been delivered at any time in the preceding 25 years, terminating the crimes. Clinton himself could easily have delivered them four years earlier, in October 1995, when General Suharto was welcomed to Washington as "our kind of guy." The same orders could have been given 20 years earlier, when Henry Kissinger gave the "green light" to the Indonesian invasion, and UN Ambassador Daniel Patrick Moynihan expressed his pride in having rendered the United Nations "utterly ineffective" in any measures to deter the Indonesian invasion - later to be revered for his courageous defense of international law.

There could hardly be a more painful illustration of the consequences of the failure to attend to Muste's lesson. It should be added that in a shameful display of subordination to power, some respected western intellectuals have actually sunk to describing this disgraceful record as a stellar illustration of the humanitarian norm of "right to protect."

Consistent with Muste's "revolutionary pacifism," the Sydney Peace Foundation has always emphasized peace with justice. The demands of justice can remain unfulfilled long after peace has been declared. The Santa Cruz massacre 20 years ago can serve as an illustration. One year after the massacre the United Nations adopted The Declaration on the Protection of All Persons from Enforced Disappearance, which states that "Acts constituting enforced disappearance shall be considered a continuing offence as long as the perpetrators continue to conceal the fate and the whereabouts of persons who have disappeared and these facts remain unclarified."

The massacre is therefore a continuing offence: the fate of the disappeared is unknown, and the offenders have not been brought to justice, including those who continue to conceal the crimes of complicity and participation. Only one indication of how far we must go to rise to some respectable level of civilized behavior.


Chelsea Manning and Muhammad Ali
Written by Andrew O'Hehir, Salon
Sunday, 25 August 2013 14:47

O'Hehir writes: "I can already hear the Internet howling that it's outrageous to compare a transgender Army private who leaked classified documents to a beloved star athlete. "

Pfc. Manning (left) and Muhammad Ali. (photo: AP/David Bookstaver)



Consider a polarizing political and cultural figure who is seen by many people as a hero and by others as a traitor – and who has a powerful symbolic importance to members of a persecuted minority. This person adopts a new name and a new identity, which transforms his or her relationship to mainstream culture and confuses both the media and many members of the public. Many people refuse to accept the new name and identity, or treat it as a nickname or a passing fad. At age 25, this person is prosecuted for an act of conscience-driven defiance against America’s war machine, one that arguably serves as a political turning point but also strikes many people as an unpatriotic or treasonous betrayal.

Obviously I’m talking about Chelsea Manning, at least in part. The Army leaker and whistle-blower, who was recently convicted on 17 charges of espionage and theft, declared this week that she identifies as female and no longer wants to be known as Bradley. It’s an event that feels, at this early stage, like a cultural touchstone of our young century. Among other things, as my colleagues Katie McDonough and Mary Elizabeth Williams have addressed, Manning’s “Today” show statement opened up an epistemological and taxonomical can of worms in the mainstream media, which seemed startlingly unprepared to deal with the identity preference of a high-profile transgender person.

For members of the general public to feel confused about the identity issues raised by Manning’s case, and to struggle with the question of how we decide what gender someone is – is it a matter of self-definition, of a legal piece of paper, of genetics or of physical anatomy? – is not especially surprising. That kind of confusion might even be productive, were it dealt with honestly and without hateful or dismissive rhetoric. But this is by no means a new issue for LGBT activists or media professionals; the gender transition of “Matrix” co-director Lana Wachowski (formerly Larry) was handled with far more grace last year. On the Manning story, major news outlets have appeared unsure whether to lead or follow. Most have continued to refer to Manning as a man named Bradley, although the Guardian, the Daily Mail, MSNBC, Rolling Stone, Slate, the Huffington Post and Salon were among the prominent exceptions. The current Wikipedia page is temporarily locked as “Chelsea Manning,” but was apparently changed back and forth five times within a three-hour period on Thursday. Debate continues to rage on the associated Talk page while a seven-day survey in search of consensus progresses. (Seriously, someone needs to preserve that conversation as a key historical document of 2013.)

But everything I said in that first paragraph applies not just to Manning but also to Muhammad Ali, who shocked the nation by claiming a new name and a new identity almost 50 years ago and went on, like Manning, to become an important figure in the domestic resistance to American militarism. After knocking out Sonny Liston and winning the heavyweight championship in February of 1964, the boxer until then known as Cassius Clay announced that he was a Black Muslim who was rejecting his “slave name” for a true one issued by Elijah Muhammad, leader of the Nation of Islam. That’s certainly not the same thing as a person who has previously lived as a man proclaiming that she’s a woman, but in the context of that time, it may have been even more startling.

Ali was stripped of one of his two heavyweight titles, on some outrageous pretext, and barred from boxing in numerous states. Most mainstream reporters and TV commentators openly mocked his new name, treating it as a bizarre affectation or, at best, adding it as an “aka” in parentheses. Many continued to refer to him as Cassius Clay into the 1970s. (One early adopter was Howard Cosell of ABC Sports, who parlayed his combative friendship with Ali into a long broadcasting career.) In Bill Siegel’s bracing new documentary “The Trials of Muhammad Ali,” Robert Lipsyte of the New York Times says his editors decreed that since Ali had become famous under his given name, that had to remain his name indefinitely. That’s exactly the same style-book rationale the Times invoked this week for continuing to call Manning “Bradley.”

I can already hear the Internet howling that it’s outrageous to compare a transgender Army private who leaked classified documents to a beloved star athlete. With the freshness and urgency of unexplored history, Siegel’s film allows us to set aside the goggles through which we view Muhammad Ali today, and see that the social forces that have pilloried and demonized Chelsea Manning in 2013 – not just flag-waving right-wingers, but mainstream politicians and pundits — did exactly the same thing to Ali in the 1960s.

It’s hard to imagine two more different individuals, at least on the surface: On one hand, the loquacious, extroverted star athlete who became the suave, trash-talking ideal of African-American masculinity; on the other, a shy and diminutive white kid from rural Oklahoma, gifted at science and computers, who evidently felt uncomfortable raised as a boy. (We need to be cautious in making assertions about Manning’s personality; she’s had little opportunity, either as Bradley or as Chelsea, to speak for herself.) But if we go a little deeper, some of the parallels between Manning and Ali, and maybe some of the differences too, are instructive. Both were talented and intelligent young people with limited formal education who grew up to be iconoclastic and enormously courageous adults, willing to make unpopular choices and face the consequences.

“The Trials of Muhammad Ali” reminds us that the Ali of the 1960s – a brash, handsome loudmouth who hung around with black radicals — has largely been lost to us through a kind of ex post facto beatification. Mostly silenced and immobilized by Parkinson’s disease in his later life, Ali has become seen as a kind of secular national saint. It wasn’t always that way. In a hard-hitting opening sequence, Siegel first shows us Ali being excoriated on television by talk-show host David Susskind, who was generally considered a liberal. This would have been after Ali’s 1967 conviction for refusing the draft (subsequently overturned by the Supreme Court on technical grounds), which was seen as spreading dangerous and un-American ideas in the black community. Susskind berates Ali as a fool, a traitor and a disgrace to his country and his race. Then we see the latter-day Ali of 2005 at the White House, receiving the Presidential Medal of Freedom from George W. Bush. I’m not saying that any such transformation is remotely likely for Chelsea Manning, but it’s entirely plausible that posterity will view her story differently than we do now.

There have been several memorable documentaries about Ali’s life and boxing career, including Leon Gast’s Oscar-winning “When We Were Kings” and the mid-‘90s TV movie “Muhammad Ali: The Whole Story,” which runs almost six hours in its home-video version. But Siegel, who co-directed the excellent historical documentary “The Weather Underground” (an Oscar nominee in 2004), draws on all kinds of file footage and illuminating interviews to reclaim Ali as an incendiary political figure and a social lightning rod. In exploring what drew Ali to the Nation of Islam, how he negotiated the split between Elijah Muhammad and Malcolm X, and what led him to refuse military induction – instead of, say, taking a cushy job in the reserves or the National Guard, as athletes and entertainers often did — Siegel sheds fascinating new light on one of the most elusive and complicated public figures of the 20th century. Anyone who wants to understand what happened to America during those turbulent years needs to see this film, at least as much as they need to see “Lee Daniels’ The Butler.” And watching this story about how Cassius Clay became Muhammad Ali, in the same week when Bradley Manning became Chelsea, strikes me as one of those cosmic coincidences that can’t be ignored.

After winning a gold medal at the 1960 Rome Olympics, Cassius Clay had a clear opportunity to become an all-American hero in the mold of Joe Louis. By the time he beat Liston – still one of the biggest upsets in sports history – it was clear that he had chosen a different path. Reporters and boxing insiders understood that Clay had already become a member of the Nation of Islam, but promoters did their best to squelch the story. They convinced him not to adopt a Muslim name and to stay away from Malcolm X, then the Nation’s leading intellectual, until the fight was over. Much of the pre-fight coverage was infused with a barely concealed racial anxiety. Liston had long been depicted in the mainstream press as a thuggish, fearsome ex-con – “the big Negro in every white man’s hallway,” as James Baldwin put it. But Clay’s brash, motor-mouth intelligence, and his association with what was often described as a “white-hate group,” suddenly seemed far more dangerous.

So if Cassius Clay was not quite the Bradley Manning of 1964, he was already a divisive figure who pushed a lot of white people’s buttons, well before he “came out” as a Black Muslim. The uproar that followed that announcement, as we see in Siegel’s film, had much of the same consternation, bewilderment and doublespeak we saw this week around Manning. In both cases, many people seemed to feel personally insulted or attacked by this shift of names and identities, although the reasons why are not immediately apparent. It costs us almost nothing to address another person by the name (or the gender) he or she has chosen, and can be viewed as no more than common courtesy. Do we really believe that anyone would make such a major life decision frivolously?

Then as now, some people have tried to insist on legal documents or court orders as the basis for how we name someone. In the film, Lipsyte notes that nobody ever demanded those technicalities from John Wayne, whose legal name from birth to death was Marion Morrison. But John Wayne and Tom Cruise and Snoop Dogg and all the other manufactured showbiz identities carry no hint of political or cultural rebuke. They don’t threaten ingrained and unquestioned ideas about race and gender and the immutable nature of identity quite the way that Muhammad Ali and Chelsea Manning do. We only ask the question “Does that person have the right to call themselves any damn thing they want?” when the answer makes us uncomfortable.

A man who rejects his birth name as a poisonous remnant of slavery might also refuse to fight an overseas war against a people who, as Ali observed, had never hurled racial slurs at him. A person who releases secret documents exposing American war crimes, and is willing to go to prison for it, might also claim the right to live openly under a new name and her true gender identity, and expect us to respect that choice. One of those rebels we have not merely forgiven for his declaration of freedom and his acts of resistance, but chosen to honor and venerate. The other one faces a long and lonely wait. Chelsea, we will not forget.


FOCUS | Accepting Chelsea Manning
Written by Emily Greenhouse, The New Yorker
Sunday, 25 August 2013 12:37

Greenhouse writes: "Today, the United States is friendlier to lesbians, gays, and bisexuals than at any point in its history. But there is often still a discomfort toward and about the population represented by the final letter in the acronym L.G.B.T."

Chelsea Manning. (photo: U.S. Army/AP)



In 2010, the WikiLeaks source Bradley Manning confided to Adrian Lamo, a former hacker who eventually turned Manning in to the government, that “I wouldn’t mind going to prison for the rest of my life, or being executed so much, if it wasn’t for the possibility of having pictures of me plastered all over the world press as a boy.”

On Thursday, a day after he was sentenced to thirty-five years in prison, Manning released a statement in which he said, “I am Chelsea Manning. I am a female. Given the way that I feel, and have felt since childhood, I want to begin hormone therapy as soon as possible.” He continued, “I hope that you will support me in this transition.”

Today, the United States is friendlier to lesbians, gays, and bisexuals than at any point in its history. But there is often still a discomfort toward and about the population represented by the final letter in the acronym L.G.B.T. Western society long ago decided that gender is immutable, that men are men and women women in perpetuity—that, somehow, nature or God or whomever is responsible for determining the formation of genitalia and chromosomes in utero is incapable of error. We may know today that this is not true, but we have not figured out how to deal with that fact.

Read More: Accepting Chelsea Manning


NSA Abuses Include Stalking Ex-Girlfriends
Written by Juan Cole, Informed Comment
Sunday, 25 August 2013 08:07

Cole writes: "The NSA has dealt with the spying scandal with the classic techniques of government manipulation of the public: Deny for as long as possible, then make few gradual small admissions."

Juan Cole. (photo: Informed Comment)



We have HUMINT, or human intelligence gathered from agents. We have SIGINT, or signals intelligence. And now we have LOVEINT, or NSA analysts occasionally reading the emails of ex-lovers. It doesn't happen a lot, the NSA told the WSJ, but often enough that there is a word for it.

The NSA only admitted this abuse to the Senate Intelligence committee a few days ago.

The NSA has dealt with the spying scandal with the classic techniques of government manipulation of the public: Deny for as long as possible, then make a few gradual, small admissions, so that when the big abuses come out the press views the story as stale and is unconcerned about the new scale of abuse being revealed.

  1. First, deny everything. Say it is impossible to access individual Americans' email as they are typing.

  2. Use the difference between statute (laws passed by Congress) and Ronald Reagan's 1981 Executive Order (which responded to earlier intel abuses and forbids spying on Americans) to deny that any "laws" have been broken. (An Executive Order has the force of law but isn't exactly a law.) Notorious authoritarians like Mike Rogers (R-MI), head of the House Intelligence Committee, have used this ploy. Rogers wouldn't know a civil liberty if he tripped over it.

  3. Admit the capability but insist there are strict controls absolutely preventing abuse.

  4. Insist that the FISA court and the House and Senate intelligence committees have full oversight.

  5. Admit that the NSA repeatedly lied to the FISA court.

  6. Admit a few tens of thousands of in-country US emails were collected before the FISA judges found out and stopped it.

  7. Admit you haven't actually been telling Congress about the abuses.

  8. Admit that you've been sharing info on Americans gained through warrantless surveillance with the Drug Enforcement Agency and local law enforcement, who then lied about how they came by the evidence.

  9. Admit that just a handful of LOVEINT stalking abuses have occurred.

Yet to come: revelations that the British GCHQ, having been paid $150 million to do it, spies on Americans for the NSA and then shares the info.

And: perhaps the 'handful of times' the NSA has engaged in insider trading and affected stock movements.

And: perhaps the handful of times the NSA has blackened the reputations of politicians it didn't like.

And other handfuls of times Secret Government, which is always tyranny, has trumped democracy and the Constitution.


It's No Longer 'Sarah Palin's Alaska'
Written by Daniel Kenealy, Guardian UK
Saturday, 24 August 2013 14:00

Daniel writes: "A former Republican mayor of Valdez (Bill Walker) and an incumbent Democratic state senator (Bill Wielechowski) are considering joining forces to run for Governor and Lt Governor, respectively. Their mission: to defeat incumbent Governor Sean Parnell..."

Former Alaska governor Sarah Palin. (photo: KPA/Zuma/Rex Features)



Contrary to the Palin stereotypes, Alaskan politics is now a model of bipartisanship

In 2008 John McCain, the Republican nominee for president, flirted with selecting Joe Lieberman, an independent, who had been on the Democratic presidential ticket in 2000, as his running mate. Disconcerted by the reaction of his own base, he gambled on Sarah Palin instead. The rest is history. Palin, whose transformation from disastrous vice-presidential candidate to the best actress in US politics continues to baffle and amaze in equal measure, may be Alaska's most famous political export.

But all that could change in 2014. For what McCain was almost courageous enough to do might actually happen in Alaska. A former Republican mayor of Valdez (Bill Walker) and an incumbent Democratic state senator (Bill Wielechowski) are considering joining forces to run for Governor and Lt Governor, respectively. Their mission: to defeat incumbent Governor Sean Parnell, who ascended to the governor's mansion following Palin's memorable, ahem, resignation in 2009 (to be fair, he was elected in his own right in November 2010).

While Parnell might not look that vulnerable on paper, there is a sense that the state is reaching a moment - not unlike the US as a whole - where a set of mounting public policy challenges need to be addressed. For the first time in years Alaska posted a deficit. And that is likely to be repeated next year. One of the reasons why the fiscal position has deteriorated is the passage of SB 21, a bill that gave tax breaks to oil and gas companies. Supporters of the bill, including Governor Parnell, who championed it with vigour, claimed it would trigger economic growth through increased production. Opponents see it as no more than a giveaway of Alaska's natural resources and have successfully campaigned to have a vote on whether to repeal SB 21 placed on the primary ballot next August.

The state is also wrestling with such "wicked" problems as rural poverty, domestic violence, and drug abuse. And a natural gas pipeline, long a talking point, is nowhere close to being developed. A dynamic partnership between a Republican and a Democrat who happen to see eye-to-eye on many of these most pressing public policy challenges could be the silver bullet required to make progress.

It's an interesting testing ground for politics elsewhere in the US. The potential coalition of Alaskan voters that a Walker-Wielechowski ticket would target comprises independents, progressives, and moderates from both major parties. Moderate Democrats may find a bipartisan ticket such as this more appealing than the likely futile candidacies of Ethan Berkowitz, Les Gara or Hollis French, one of whom will emerge as the party's nominee. And there are plenty of registered Republicans, as I learned during a recent visit to Alaska, angry enough at Parnell's SB 21 giveaway to give Walker a second look.

Fair enough, this sort of independent, bipartisan politics might play better in a state where more than half of all registered voters are 'Independents'. Alaska, despite the stereotypes that may prevail (stereotypes that Palin does little to dispel), was quick to shun the Tea Party. Joe Miller, the Palin-supported Tea Party candidate, may have won the Republican senate primary in 2010, but he was taken down by a write-in campaign by the moderate Republican incumbent Lisa Murkowski. Palin has seen her popularity - in a state that once gave her an over 80% approval rating - tumble so low that in a hypothetical presidential match-up Alaskan voters would be more supportive of Hillary Clinton.

It may be that such pragmatism is the precursor to bipartisanship, in which case Washington DC may be doomed for several electoral cycles to come. Yet that same coalition of voters - moderate Republicans and Democrats, progressives, and independents - exists across the US. Indeed this is something that Charles Wheelan identified in a recent, and excellent, book, The Centrist Manifesto. What they need are standard-bearers.

Walker-Wielechowski still might not happen. And if it does, taking on and defeating an incumbent with Parnell's strengths will not be easy. But, given that it offers the prospect of the sort of bipartisan pragmatism that US politics so desperately needs, let us hope that it does happen.

With Palin herself hinting at a prospective run for one of Alaska's two US Senate seats, the 2014 election cycle in Alaska could prove to be just as fascinating as anything going on in the lower 48.
