Today's news won't be viewed the same way tomorrow. (photo: NYMag)


Nothing You Think Matters Today Will Matter the Same Way Tomorrow

By Frank Rich, New York Magazine

22 October 14

 

ISIS, Khorasan, Ferguson, Gaza, Putin: The summer of 2014 had been deemed America’s “worst ever” well before Ebola, the Ray Rice video, and the Secret Service debacle kicked in. One sees the point even if it requires historical amnesia about other bad summers (like, say, that one with the Battle of Gettysburg). But you also have to ask: What was a great American summer, exactly? Lazy, hazy 2001, when a peaceful country and its new president nodded off through Labor Day, worrying about little more than an alleged uptick in shark attacks?

The fact is that we can’t write history while we’re in it — not even that first draft of history that journalists aspire to write. While 2014 may have a shot at eternal infamy, our myopia and narcissism encourage us to discount the possibility that this year could be merely an inconsequential speed bump on the way to some greater catastrophe or unexpected nirvana. This was brought home to me when, in a quest for both a distraction from and a perspective on our current run of dreadful news, I revisited 1964, the vintage American year that has been serving as an unofficial foil, if not antidote, to 2014.

It’s easy to travel back there, thanks to the many media retrospectives marking 1964’s half-centenary in ironic counterpoint to this year’s woes. You could feast on the commemorations of Lyndon Johnson’s Civil Rights Act — The Bill of the Century, as one new book had it — to learn how Washington once accomplished great things. You could relive The Night That Changed America, CBS’s canonization of the Beatles’ first Ed Sullivan Show appearance. PBS fielded two documentaries: Freedom Summer, chronicling that courageous civil-rights movement in Mississippi, and 1964, which, like its inspiration, Jon Margolis’s book The Last Innocent Year, distinguished the relatively upbeat 1964 from the apocalyptic ’60s to come. The Times and Daily News saluted the 1964 World’s Fair, with its quaint American Century faith in the inexorability of progress. The fair’s slogan was “Peace Through Understanding,” and its equally naïve signature anthem was “It’s a Small World (After All).” As the soundtrack for a popular ride cross-promoting UNICEF and Pepsi, the song may not have advanced world peace, but it did do its bit to hook a generation of kids on sugary drinks.

If you were young in 1964, you may have fond memories of that year as well. I have never forgotten how the Beatles, who landed in my hometown of Washington, D.C., for their first American concert in February, chased away the Kennedy-assassination hangover, giving me and my ninth-grade peers permission to party again. From my own immature, hormonally addled perspective, the world kept leaping forward throughout that year as if a stiff wind were at its back—culminating with the election in which Johnson buried the opponent my elders deemed a trigger-happy proponent of nuclear Armageddon. It was a time when many in my boomer generation fell in love with the idea that change was something you could believe in—a particularly liberal notion that has taken hold in other generations, too, whether in the age of Roosevelt or Obama. Even as we recognize that the calendar makes for a crude and arbitrary marker, we like to think that history visibly marches on, on a schedule we can codify.

The more I dove back into the weeds of 1964, the more I realized that this is both wishful thinking and an optical illusion. I came away with a new appreciation of how selective our collective memory is, and of just how glacially history moves, despite the can-do optimism of a modern America besotted with the pursuit of instant gratification. Asked at the time of the 1964 World’s Fair to anticipate 2014, Isaac Asimov got some things right (miniaturized computers, online education, flat-screen television, and what we now know as Skype), but many of his utopian predictions were delusional. His wrong calls included not just his interplanetary fantasies but his vision of underground suburbs that would protect mankind from war, rampaging weather, and the tyranny of the automobile. Asimov also thought birth control would find international acceptance. It was no doubt beyond even his imagination that a half-century hence American lawmakers would introduce “personhood” amendments attempting to all but outlaw contraception.

The screenwriter William Goldman famously summed up Hollywood in three words: “Nobody knows anything.” Would that this aphorism were applicable, as he intended, solely to the make-believe of show business. It often seems that nobody knew anything about anything in 1964. Most everyone was certain that the big political developments of the time, epitomized by LBJ’s victories for civil rights and against Goldwater, would be transformational. Many of the same seers saw the year’s cultural upheavals, starting with the Beatles, as ephemera. More often than not, the reverse has turned out to be true. Are we so much smarter in 2014?

To try to simulate how 1964 played out in real time, I didn’t rely on historians’ or public television’s subsequent interpretations but instead used as a baseline the Times, then as now the liberal newspaper of record. Among the digital gadgets with which the paper tries to retain subscribers in the post-print age is what it calls TimesMachine: full replicas of past editions (from 1851 to 1980) that you can leaf through in facsimile page by page online, much as one used to do with microfilm. Be warned: You can plug yourself into this machine and not pry yourself loose for hours. The acres of classified ads alone are New York City’s answer to the Dead Sea Scrolls. I was hooked from the moment I summoned up the paper of January 1, 1964.

As it happened, I did so in August, just as racial turmoil was riveting the nation on the heretofore obscure St. Louis suburb of Ferguson. I got all the way to page ten of this antique New Year’s Day edition before I was stopped short by a headline over a brief AP story: “St. Louis Holds 24 in Racial Protest.” Elsewhere in that same paper was an unintended paradigm of a culture war we’ve lived with ever since. A page-one story announced that the 55-year-old New York governor and presidential candidate Nelson Rockefeller was expecting a child by his 37-year-old second wife. The front-page play, combined with the detailed chronology of the expectant parents’ marital histories, amounted to an editorial message: The divorced Rockefeller was no shoo-in with a Leave It to Beaver electorate that couldn’t watch married couples share the same bed on network television. (This taboo would be killed off later that year by Bewitched.) As if to rub it in, the same Times contained the prominent announcement of a June wedding for Arizona senator Barry Goldwater’s daughter. The juxtaposition of these two aspiring First Families, one epitomizing the old Eastern Establishment and the other a new conservative order rising across the Sun Belt, would climax in July with the raucous booing of Rockefeller by the victorious Goldwater forces at the Republican National Convention.

At the back of the January 1, 1964, paper was yet another cultural artifact that seems contemporary a half-century later: an ad for NBC’s coverage of the Tournament of Roses Parade, graced by a photo of a star attraction, Betty White. Who would have imagined that she would outlast half the Beatles? A dueling ad for CBS coverage of the same festivities in Pasadena featured Ronald Reagan, then a Hollywood has-been, sharing the bill with the former Miss America turned game-show panelist Bess Myerson. Reagan’s true star turn of autumn 1964 — the prime-time preelection speech that ignited his political career — would barely be noted by the Times.

Steeping myself in the remaining days of 1964, I was struck by recurring patterns. Almost every contemporaneous sighting of an imminent resolution of a domestic or international conflict was premature. Almost every political division and social injustice that continues to plague America today was visible then. Almost every lasting cultural innovation, from the experimental-film revolution in the East Village to Pop Art uptown, was ignored, ridiculed, condescended to, or dismissed by the Times and its Establishment peers. While there has been epic progress on some major fronts in the 50 years since — the Soviet Union and Jim Crow both collapsed, for starters — there has not been nearly as much forward movement as imagined by those who were there then.

It was 1964 when the tantalizing prospect of a woman president was raised by the first female presidential candidate entering a major party’s primary, the much-admired Senator Margaret Chase Smith, Republican of Maine. But neither that front-page development nor Betty Friedan’s 1963 The Feminine Mystique, then reaching a wide audience in paperback, fast-tracked a dream that may finally be realized in our own time. (Even now, men hold some 71 percent of all political offices in America.) Action on the public-health menace of smoking was on another slow track. When a government committee’s finding of the link between cigarettes and cancer became a front-page story in January, the Times added a sidebar explaining that this discovery was already old news: The new report echoed previous studies dating back to 1859 in France and 1936 in America. Nonetheless, it would not be until 1971 that the first substantial government regulation, barring cigarette ads on television, would go into effect.

Some less culturally loaded, eminently fixable American maladies identified in 1964 have been left to fester ever since — as exemplified by a subheadline beneath the Times banner announcing the Warren Commission Report: “Security Steps Taken by Secret Service Held Inadequate.” A 1964 best seller, Vance Packard’s The Naked Society, sounded the alarm that “the surveillance of citizens” was growing as a result of the rise of “peering electronic eyes, undercover agents, lie detectors, hidden tape recorders, bureaucratic investigators, and outrageously intrusive questionnaires,” not to mention wholesale domestic spying by the departments of Defense and Justice. Packard even foresaw internet snooping in the prospect of “cabled TV” collecting its users’ personal information. But in contrast to muckrakers like Upton Sinclair and Ida Tarbell during the Progressive Era, Packard could not spark a scintilla, let alone an age, of reform. A half-century later, we’re more exposed than ever.

The most explosive conflicts of 1964 remain entirely intact in America today: race, war, and the ideological battle over the role of government. The struggle for voting rights that led to the murder of three civil-rights workers in Mississippi that summer — and to the Voting Rights Act of 1965 — continues to be fought in 2014, state by state and court by court, as if the bloody victories of a half-century ago were only provisional skirmishes in a never-ending civil war. Confrontations between white police and minority populations remain on a parallel continuum. Among the countless antecedents of this summer’s Ferguson unrest is a now half-forgotten incident from July 1964 — a days-long Harlem riot set off when an off-duty white police officer shot a black teenager. (This crisis was itself a replay of a 1943 Harlem riot that left five dead and 500 injured after a police officer shot a black soldier.) Yet faith in racial progress in 1964 was pervasive, at least among white liberals, despite such prominent dissenting voices as James Baldwin, whose The Fire Next Time was published a year earlier, and the new heavyweight champion, Cassius Clay, soon to be known as Muhammad Ali. Shocking as Clay’s victory over the favorite, Sonny Liston, was to the sports press, even more puzzling was his solidarity with Malcolm X and his contempt for Martin Luther King Jr.’s civil-rights movement. “I’m a citizen already” was how Clay put it.

Such sour notes could be dismissed by whites who saw racial progress conspicuously at hand. In 1964, King received the Nobel Peace Prize and Sidney Poitier became the first African-American to win an Oscar for Best Actor (for Lilies of the Field). Once LBJ signed the Civil Rights Act in July, the Times happily reported that “despite some bitter resistance, compliance is generally good” in the South. (The Mississippi murder victims’ bodies would be found a month later.) The legendary Times reporter Homer Bigart, on the ground in Savannah, wrote that Goldwater’s opposition to the civil-rights law notwithstanding, “most observers gave him scant chance” of winning Georgia, with its reputation for moderation in racial matters. In fact, Goldwater would win the state by a margin of 54 to 46 percent and sweep the rest of the Deep South by a landslide more lopsided than Johnson’s in the rest of the nation.

Resistance to desegregation was hardly limited to the old Confederacy. You will search mostly in vain for blacks (or other racial minorities) in newspaper and magazine advertisements of 1964, the heyday of the Mad Men era. There was “not a single Negro or Puerto Rican” among some 200 administrators at the World’s Fair, Robert Caro writes in The Power Broker, his biography of the fair’s vainglorious impresario, Robert Moses. In a new history of the fair, Tomorrow-Land, Joseph Tirella observes that “the glittering pavilions of Flushing Meadow’s manicured Fairgrounds and its utopian slogans had little—if anything—to do with the political turmoil of the city around it.” Third World countries were widely represented in Queens, but Moses vetoed Transit Authority plans that would have made the fair more accessible to low-income and minority visitors from the other boroughs. Still, the Illinois “Land of Lincoln” pavilion at the World’s Fair extolled the Great Emancipator, in the form of a Disney audio-animatronic robot, much as Broadway in 2014 would suit up Bryan Cranston to portray LBJ as Lincoln’s successor in All the Way. Then as now, a happy ending to America’s racial drama can always be had for the price of a ticket.

At least race was recognized as a battleground by Americans in 1964, albeit one where a truce was mistakenly seen on the horizon. The war in Vietnam was not seen as a real war, or a subject of debate, except to some scattered, under-the-radar draft protesters. When murky confrontations between U.S. destroyers and North Vietnamese torpedo boats in the Gulf of Tonkin moved Congress in August to give President Johnson a blank check to respond, the public yawned. The Tonkin resolution passed the Senate 88 to 2 after nine hours of debate following a 416-to-0 ratification in the House. The Times’ Washington wise man James Reston was baffled by the apathy of a populace unruffled by “a president who announces that he is willing to risk war with a quarter of the human race in China in order to save Vietnam” and “even more indifferent to cries from the Republican opposition for a policy that is even bolder.”

But Johnson’s vanquishing of the super-hawk Goldwater in November 1964 quieted fears that America would be caught up in a wider war in Southeast Asia. Anxieties about greater racial turmoil could also be put aside thanks to Johnson’s decisive victory. Analyzing the election returns, the Times found no sizable white backlash outside the old Confederacy; Reston wrote that “even the Middle Western Bible Belt” had turned against Goldwater. Johnson’s Great Society and War on Poverty had been vindicated by the voters, too. The election was “as fundamental in its way as the ratification of the New Deal 28 years ago,” according to the Times. The public had firmly rejected the right’s “assault on ‘big government’ and the social and welfare programs of the last thirty years.” No doubt Republican moderates who had outperformed the national ticket, rising stars like George Romney and John Lindsay, would end the GOP’s “revolutionary break with the centrist tradition of American politics” and lead the GOP “back into the sunlight of modernism.”

As we know now, just about every one of these conclusions was soon — in some cases very soon — proved wrong. By the end of 1965, there would be 180,000 American troops in Vietnam. Four years later, the metastasizing conflicts in Southeast Asia and at home would motivate a fair number of Americans both black and white to take to the streets to try to burn the place down.

Looking back at 1964 from the vantage point of 2014, you realize that one revolt was already in full swing and having a visible impact: the radical transformation of American culture. The February night that the Beatles first played The Ed Sullivan Show was hardly the night that changed America, but it was part of a wave that was unstoppable, no matter how much the powers that be, from show-business moguls to plain old middle-class suburban parents, tried to resist it. Revisiting 1964, you can see the old order fracture week by week.

The Times started mobilizing against the British Invasion even before the Beatles landed in February. When a filmed segment of the band turned up on The Jack Paar Program after New Year’s, the television critic Jack Gould declared that “on this side of the Atlantic it is dated stuff.” Once the Beatles appeared live in New York on Ed Sullivan, Gould dismissed them again, likening their shaggy hair to the wig worn by the morning-children’s-show host Captain Kangaroo. A front-page news story theorized that the Beatlemania “craze” faced “an awful prospect of demise.”

Such reaction was universal, across the cultural and political spectrum. At The New Yorker, “The Talk of the Town” subcontracted Beatles commentary to a young music fan who pronounced the group inferior to the Everly Brothers. The Washington Post went to the Beatles’ debut concert and branded them a “commonplace, rather dull act that hardly seems to merit mentioning.” William F. Buckley Jr. found the Beatles “not merely awful” but “God awful” and “appallingly unmusical.” The Nation dismissed them as a “very safe” diversion for the complacent upper-middle class. Robert Moses refused to book the Beatles at the World’s Fair, though they would soon fill the new Shea Stadium next door. He instead stuck with nightly concerts by Guy Lombardo and the Royal Canadians, whose pre-jazz brand of easy-listening band music dated back to the late ’20s. Lombardo, Moses insisted, was “a favorite with many of the kids, if not the wildest ones.”

The effort to push back against an incipient counterculture extended well beyond music. Lenny Bruce was arrested for indecency at Café au Go Go in the Village. Pop Art was seen as a juvenile prank: When the architect Philip Johnson commissioned an Andy Warhol mural for the fair’s New York State pavilion, the result, a jamboree of homoerotic police mug shots titled 13 Most Wanted Men, had to be painted over within days to spare Rockefeller a potential political embarrassment. The perennially best-selling novelist John O’Hara was ubiquitous at the Times and New Yorker, but the upstart Frank O’Hara, whose Lunch Poems was published in 1964, barely registered. A nascent revolution in Hollywood also flummoxed cultural gatekeepers. Confronted with the breakthrough black comedy of the Cold War era, Stanley Kubrick’s Dr. Strangelove, or: How I Learned to Stop Worrying and Love the Bomb, the Times critic Bosley Crowther admitted it was funny but also condemned it as “dangerous.” “Is Nothing Sacred?” was the headline of one of two follow-up pieces he wrote for the Sunday paper. How, he wondered, could a movie have the gall to mock “top-level scientists,” diplomats, “the experts,” prime ministers, and “even the president of the United States” as “fuddy-duds or maniac monsters who are completely unable to control the bomb”? As if to accentuate how far removed such Establishment taste was from this anarchic new fever loose in the American bloodstream, the same editions of the Times featured ads for the notorious Elizabeth Taylor–Richard Burton turkey Cleopatra, crowing that Crowther had named it “one of the year’s ten best” in 1963.

Resistance to this cultural sea change crumbled swiftly. Even Crowther beat a hasty retreat, putting Strangelove on his 1964 ten-best list in December despite having labeled it “defeatist and destructive of morale” in February. ABC, eager to ride the youth wave, preempted its long-running Western series Wagon Train in November for a Beatles concert. The network also pitted a new rock-and-roll series, Shindig!, against CBS’s cornpone hit The Beverly Hillbillies, ending its reign as the No. 1 Nielsen show. The squarest of television variety hours, The Hollywood Palace and The Red Skelton Hour, felt compelled to compete by booking another new English invader, the Rolling Stones—much to the annoyance of Dean Martin, who insulted the Stones on-camera in his role as Hollywood Palace emcee. It’s hard to believe that a Belgian novelty act, the Singing Nun, had been at the top of the charts when 1964 began. Meet the Beatles would soon outsell the best-selling popular-music long-playing disc of the postwar era, the Broadway cast album of My Fair Lady. Columbia, the august label that had released My Fair Lady and often dominated the popular and classical music industries, released two new Bob Dylan LPs in 1964 alone. Whatever critics or Robert Moses had to say, the cultural marketplace was racing to go where the customers were and commodify every development bubbling up from below. Capitalism trumped any objections from the doomed status quo.

In 2014, our rapid cultural transformations again seem as real as those of a half-century ago, even if the biggest revolution of our era is digital, not musical—a culture spawned in the environs of Cupertino’s Apple rather than Abbey Road’s. But all bets are off on every other event and phenomenon we regard as seismic, game-changing, and historic in this year. It’s humbling and in some instances a bit reassuring to know that our current hunches about what history holds are likely to be as wrongheaded as many of the definitive judgments of 1964. “Readers of the full report,” a Times editorial intoned about the Warren Commission, “will find no basis for questioning the commission’s conclusion that President Kennedy was killed by Lee Harvey Oswald, acting alone.”

Today you must wonder whether speculation by Leon Panetta that we could have a 30-year-long war against ISIS and its terrorist brethren may be no more on the mark than the Johnson administration’s insistence that no “wider” war would come to Vietnam. The warnings that ISIS is a bigger foe than Al Qaeda — almost immediately rendered inoperative by estimates that Khorasan is more threatening than ISIS — may be worth little more than the intelligence that inflated the hostilities in the Gulf of Tonkin. Putin’s seizing of Crimea — likened in more than a few quarters to Nazi Germany’s invasion of Czechoslovakia — is not necessarily the start of a new hot conflict but can also be seen as a desperate feint by a doomed regime. The NFL, presumed to be invulnerable to domestic-abuse and concussion scandals, could yet go the way of boxing, whose heavyweight championship was considered the “most valuable commodity in the world of sports” at the time of the Clay-Liston fight, in the words of the Times columnist Arthur Daley. Isaac Asimov’s old dream of space colonization — if ever achieved — may be realized by Jeff Bezos, Elon Musk, or India rather than the United States, Russia, or China. And for all our hyperventilating over who’s winning every morning in politics, America’s right and left may fight each other to a standoff in perpetuity. The undying punditocracy notion of a “centrist tradition” in America remains as much a mirage in 2014 as in 1964.

As James Baldwin said presciently in November 1963, “Americans are the youngest country, the largest country, and the strongest country, we like to say, and yet the very notion of change, real change, throws Americans into a panic.” No matter how many urgent reports on climate change are handed down, or how many vigils call for more gun regulation, reform may advance at a pace as halting as that which Baldwin foresaw for our endless struggle over race. Change we can believe in is less likely to happen overnight than on an installment plan. Even items left behind in the 1964 World’s Fair time capsule (credit cards, a ballpoint pen, plastic wrap, tranquilizers) have barely aged since. If anything, the most representative artifact of 1964 may be a television show that had its debut the month before the fair did: Jeopardy!, yet another American institution that, 50 years on, has proved resistant to all but cosmetic changes and periodic adjustments for inflation.

Edward Snowden. (photo: The New Yorker)


How Edward Snowden Changed Journalism

By Steve Coll, The New Yorker

21 October 14

 

“Citizenfour,” the new documentary about Edward Snowden, by Laura Poitras, is, among other things, a work of journalism about journalism. It opens with quotations from correspondence between Poitras and a new source who identifies himself only as Citizenfour. This source turns out to be Snowden. Soon, Poitras and Glenn Greenwald, at the time a columnist for the Guardian, travel to Hong Kong to meet Snowden in a hotel room.

They don’t know, at this point, if Snowden is who he says he is. They don’t know if his materials are authentic. Yet Poitras turns on her camera right away. Greenwald, who attended law school, questions Snowden, quite effectively. Gradually, Snowden’s significance becomes clear. The sequence is enclosing and tense and has many remarkable facets. One is that we witness a historically significant exercise in reporting and source validation as it happens. It is as if Bob Woodward had filmed his initial meeting, in a garage, with Deep Throat.

Snowden comes across in the film as shrewd, tough, and hard to read. (My colleague George Packer, in his recent Profile of Poitras, captures the film’s range brilliantly. Snowden also spoke to Jane Mayer remotely at this year’s New Yorker Festival.) Snowden has said that he had never spoken to a journalist before he contacted Poitras. “I knew nothing of the press,” he told the Guardian last summer. “I was a virgin source, basically.” This is not entirely persuasive: he may never have talked to a journalist, but he behaved with exceptional sophistication, both then and later— he is very far from the proverbial “naïve source.”

In fact, one of the least remarked upon aspects of the Snowden matter is that he has influenced journalistic practice for the better by his example as a source. Famously, when Snowden first contacted Greenwald, he insisted that the columnist communicate only through encrypted channels. Greenwald couldn’t be bothered. Only later, when Poitras told Greenwald that he should take the trouble, did Snowden take him on as an interlocutor.
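
To make concrete what “encrypted channels” can mean in practice, here is a minimal sketch of public-key messaging, written with the PyNaCl library purely for illustration. The library choice, key handling, and message are assumptions made for this example, not the actual tooling Snowden and the journalists used (which reportedly centered on PGP email and related utilities).

    # Illustrative sketch only -- not the actual tools used by Snowden or Greenwald.
    from nacl.public import PrivateKey, Box

    # Each party generates a keypair; the public halves are exchanged in advance.
    # In real source protection, exchanging and verifying keys is the hard part.
    journalist_key = PrivateKey.generate()
    source_key = PrivateKey.generate()

    # The source encrypts a (hypothetical) message to the journalist's public key.
    source_box = Box(source_key, journalist_key.public_key)
    ciphertext = source_box.encrypt(b"Documents to follow. Verify my key first.")

    # Only the journalist's private key can decrypt it; an eavesdropper on the
    # wire sees ciphertext and metadata, not the contents.
    journalist_box = Box(journalist_key, source_key.public_key)
    plaintext = journalist_box.decrypt(ciphertext)
    assert plaintext == b"Documents to follow. Verify my key first."

The cryptography in such a sketch is routine; the operational point, as the rest of this piece argues, is that habits like this buy a source and a newsroom time and privacy while a story is still being verified.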

It had been evident for some time before Snowden surfaced that best practices in investigative reporting and source protection needed to change—in large part, because of the migration of journalism (and so many other aspects of life) into digital channels. The third reporter Snowden supplied with National Security Agency files, Barton Gellman, of the Washington Post, was well known in his newsroom as an early adopter of encryption. But it has been a difficult evolution, for a number of reasons.

Reporters communicate copiously; encryption makes that habit more cumbersome. Most reporters don’t have the technical skills to make decisions on their own about what practices are effective and efficient. Training is improving (the Tow Center for Digital Journalism, at Columbia Journalism School, where I serve as dean, offers a useful place to start), but the same digital revolution that gave rise to surveillance and sources like Snowden also disrupted incumbent newspapers and undermined their business models. Training budgets shrank. In such an unstable economic and audience environment, source protection and the integrity of independent reporting slipped down some newsrooms’ priority lists.

Snowden has now provided a highly visible example of how, in a very high-stakes situation, encryption can, at a minimum, create time and space for independent journalistic decision-making about what to publish and why. Snowden did not ask to have his identity protected for more than a few days—he seemed to think it wouldn’t work for longer than that, and he also seemed to want to reveal himself to the public. Yet the steps he took to protect his data and his communications with journalists made it possible for the Guardian and the Post to publish their initial stories and bring Snowden to global attention.

It took an inside expert with his life and liberty at stake to prove how much encryption and related security measures matter. “There was no risk of compromise,” Snowden told the Guardian, referring to how he managed his source relationship with Poitras and the others before their meeting in Hong Kong. “I could have been screwed,” but his encryption and other data-security practices insured that it “wasn’t possible at all” to intercept his transmissions to journalists “unless the journalist intentionally passed this to the government.”

In fashioning balanced practices for reporters, it is critical to ask how often and in what ways governments—ours and others—systematically target journalists’ communications in intelligence collection. For all his varied revelations about surveillance, this is an area where Snowden’s files have been less than definitive. It seems safe to assume the worst, but, as for the American government’s practices, there are large gaps in our understanding. White House executive orders, the Patriot Act, and the Foreign Intelligence Surveillance Act might all be grounds for targeting journalists for certain kinds of collection. Yet the government has never disclosed its policies, or the history of its actual practices following the September 11th attacks. (For a chilling sense of how vulnerable a journalist’s data would be if targeted by sophisticated surveillance, read “Dragnet Nation,” by Julia Angwin, an investigative reporter, formerly at the Wall Street Journal and now at ProPublica.)

In September, the Reporters Committee for Freedom of the Press and more than two dozen media organizations asked the Privacy and Civil Liberties Oversight Board, an independent federal body, to look into these questions and report their findings publicly. “National security surveillance programs must not be used to circumvent important substantive and procedural protections belonging to journalists and their sources,” their letter said. “Sufficient details about these programs must be disclosed to the public so that journalists and sources are better informed about the collection and use of their communications.”

From a working journalist’s perspective, the Edward Snowdens of this world come around about as often as Halley’s Comet. It is not possible to report effectively and routinely while operating as though every communication must be segregated in a compartment within a compartment. The question of what constitutes best practices is a work in progress, as is the protection of personal privacy more broadly.

To win it requires a much larger effort in West Africa than the outside world has so far pledged. (photo: AFP)


Will the US Go to "War" Against Ebola?

By Karen Greenberg, TomDispatch

21 October 14

 

These days, two “wars” are in the headlines: one against the marauding Islamic State and its new caliphate of terror carved out of parts of Iraq and Syria, the other against a marauding disease and potential pandemic, Ebola, spreading across West Africa, with the first cases already reaching the United States and Europe. Both wars seemed to come out of the blue; both were unpredicted by our vast national security apparatus; both have induced fears bordering on hysteria and, in both cases, those fears have been quickly stirred into the political stew of an American election year.

The pundits and experts are already pontificating about the threat of 9/11-like attacks on the homeland, fretting about how they might be countered, and in the case of Ebola, raising analogies to the anthrax attacks of 2001. As the medical authorities weigh in, the precedent of 9/11 seems not far from their minds. Meanwhile, Thomas Frieden, the director of the Centers for Disease Control and Prevention (CDC), has tried to calm the country down while openly welcoming “new ideas” in the struggle against the disease. Given the almost instinctive way references and comparisons to terrorism are arising, it’s hard not to worry that any new ideas will turn out to be eerily similar to those that, in the post-9/11 period, defined the war on terror.

The differences between the two “wars” may seem too obvious to belabor, since Ebola is a disease with a medical etiology and scientific remedies, while ISIS is a sentient enemy. Nevertheless, Ebola does seem to mimic some of the characteristics experts long ago assigned to al-Qaeda and its various wannabe and successor outfits. It lurks in the shadows until it strikes. It threatens the safety of civilians across the United States. Its root causes lie in the poverty and squalor of distant countries. Its spread must be stopped at its region of origin -- in this case, Guinea, Liberia, and Sierra Leone in West Africa -- just as both the Bush and Obama administrations were convinced that the fight against al-Qaeda had to be taken militarily to the backlands of the planet from Pakistan’s tribal borderlands to Yemen’s rural areas.

Perhaps we shouldn’t be surprised then that, while President Obama was sending at least 1,600 military personnel (and the drones and bombers) to fight ISIS, his first response to the Ebola crisis was also to send 3,000 troops into Liberia in what the media has been calling an “Ebola surge” (a reflexive nod to the American troop “surge” in Iraq in 2007). The Obama administration’s second act: to beef up border protections for the screening of people entering the United States (a move whose efficacy has been questioned by some medical experts), just as the authorities moved swiftly in the wake of 9/11 to turn airports and borders into massive security zones. The third act was to begin to trace points of contact for those with Ebola, which, while logical and necessary, eerily mimics the way the national security state began to build a picture of terror networks, establish watch lists, and the like.

The next step under consideration for those who might have been exposed to Ebola, quarantine (that is, detention), is controversial among medical experts, but should similarly remind us of where the war on terror went after 9/11: to Guantanamo. As if the playbook for the post-9/11 response to terrorism were indeed the playbook for Ebola, Pennsylvania Congressman Tim Murphy, questioning Dr. Frieden, noted that, without putting policies of surveillance, containment, and quarantine in place, “we still have a risk.”

While any of these steps individually may prove sensible, the ease with which non-medical authorities seem to be falling into a familiar war on terror-style response to the disease should be examined -- and quickly. If it becomes the default template for Ebola and the country ends up marching down the road to “war” against a disease, matters could be made so much worse.

So perhaps it’s time to refresh our memories about that war on terror template and offer four cautionary lessons about a road that should never be taken again, not in developing a policy against the latest non-state actors, nor in pursuit of the containment of a disease.

Lesson One: Don’t turn the “war” on Ebola into another set of programs that reflect the national security establishment’s well-developed reliance on intelligence, surveillance, and the military. Looking, for instance, for people complaining about Ebola-like symptoms in private or searching the metadata of citizens for calls to doctors would be a fool’s errand, the equivalent of finding needles in a field full of haystacks.

And keep in mind that, as far as we can tell, from 9/11 on, despite the overblown claims of its adherents, the surveillance system they constructed has regularly failed to work as promised. It did not, for instance, stop the Shoe Bomber, the Times Square bomber, or the Boston Marathon bombers. Nor did the intelligence authorities, despite all the money invested since 9/11, prevent the Benghazi attack or the killing of seven CIA agents by a suicide bomber believed to be an American double agent in Khost, Afghanistan, in December 2009, or predict the rise of ISIS for that matter. Similarly, it is hard to imagine how the usual military might, from drones and special ops teams to those much-discussed boots on the ground, will help solve the problem of Ebola.

In the post-9/11 era, military solutions have often prevailed, no matter the problem at hand. Yet, at the end of the day, from the invasions of Afghanistan and Iraq to the air operation in Libya to the CIA’s drone campaigns across tribal backlands, just about no militarized solution has led to anything approximating victory -- and the new war against the Islamic State in Syria and Iraq is already following the same dismal pattern. Against a virus, the U.S. military is likely to be even less successful at anything more than aiding health workers and officials in disease-ridden areas.

The tools that the national security state has relied on in its war on terror not only didn’t work then (and are highly unlikely to work when it comes to the present Middle Eastern conflict either), but applied to Ebola would undoubtedly prove catastrophic. And yet -- count on it -- they will also prove irresistible in the face of fear of that disease. They are what the government knows how to do even if, in the war on terror itself, they created a vulnerability so much greater than the sum of its parts, helped foster the growth of jihadist movements globally, and eroded the sense of trust that existed between the government and the American people.

Lesson Two: Keep public health professionals in charge of what needs to be done. All too often in the war on terror, professionals with areas of expertise were cast aside by the security establishment. The judicial system, for instance, was left in the lurch when it came to dealing with accused al-Qaeda operatives, while the expertise of those who found no evidence of weapons of mass destruction in Iraq in 2002-2003 was ignored.

Only by trusting our medical professionals will we avoid turning the campaign against Ebola over to the influence of the security state. And only by refusing to militarize the potential crisis, as so many others were in the post-9/11 era, will we avoid the usual set of ensuing disasters. The key thing here is to keep the Ebola struggle a primarily civilian one. The more it is left in the hands of doctors and public health experts who know the disease and understand what it means practically to commit the government to keeping people as safe as possible from the spread of the virus, the better.

Lesson Three: Don’t cloak the response to Ebola in secrecy. The architects of the war on terror invoked secrecy as one of the prime pillars of their new state of being. From the beginning, the Bush administration cavalierly hid its policies under a shroud of secrecy, claiming that national security demanded that information about what the government was doing should be kept from the American people for their own “safety.” Although Barack Obama entered the Oval Office proclaiming a “sunshine” presidency, his administration has acted ever more fiercely to keep the actions of both the White House and the national security state under wraps, including, to mention just two examples, its justifications for policies surrounding its drone assassination campaigns and the extent of its warrantless surveillance programs.

As it happened, that wall of secrecy proved endlessly breachable, as leaks came flooding out of that world. Nonetheless, the urge to recreate such a state of secrecy elsewhere may be all too tempting. Don’t be surprised if the war on Ebola heads into the shadows, too -- and that’s the last thing the country needs or deserves when it comes to a public health crisis. To date, with medical professionals still at the forefront of those dealing publicly with Ebola, this impulse has yet to truly rise to the surface. Under their aegis, information about the first Ebola cases to reach this country and the problems involved hasn’t disappeared behind a cloak of secrecy, but don’t count on transparency lasting if things get worse. Yet keeping important facts about a potential pandemic under wraps is guaranteed to lead to panic and a rapid deterioration of trust between Americans and their government, a relationship already sorely tested in the war on terror years.

Realistically, secrecy and allied tools of the trade would represent a particularly inauspicious starting point for launching a counter-Ebola strategy at a time when it would be crucial for Americans to know about failures as well as successes. Outbreaks of panic enveloped in hysteria wrapped in ignorance are no way to stop a disease from spreading.

Lesson Four: Don’t apply the “black site” approach to Ebola. The war on terror was marked by the creation of special prisons or “black sites” beyond the reach of the U.S. justice system for the detention (in the case of Ebola think: isolation and quarantine) of terrorist suspects, places where anything went. There can, of course, be no question that Ebola patients, once diagnosed with the disease, need to be isolated. Protective gear and isolation units are already being used in treating cases here.

The larger issue of quarantine, however, looms as potentially the first major public policy debate of the Ebola era. Keep an eye on this. After all, quarantine-style thinking is already imprinted in the government’s way of life, thanks to the war on terror, so moving toward quarantines will seem natural to its officials.

Quarantine is a phenomenon feared by civil libertarians and others as an overreaction that will prove ineffective when it comes to the spread of the disease. It stands to punish individuals for their associations, however inadvertent, rather than dealing with them when they actually display signs of the disease. To many, though, it will seem like a quick-fix solution, the Ebola counterpart to Guantanamo, a facility for those who were deemed potential carriers of the disease of terrorism.

The fears a threat of massive quarantines can raise will only make things harder for health officials. So, too, will increasing calls for travel bans for those coming from West African countries, a suggestion reminiscent of sweeping police profiling policies that target groups rather than individuals. Avoiding such bans is not just a matter of preserving civil liberties, but a safety issue as well. Fears of broad quarantines and blanket travel bans could potentially lead affected individuals to become far more secretive about sharing information on the disease and far more deceptive in their travel planning. It could, that is, spread, not halt the dissemination of Ebola. As Thomas Frieden of the CDC argues, “Right now we know who’s coming in. If we try to eliminate travel, the possibility that some will travel over land, will come from other places, and we don’t know that they’re coming in will mean that we won’t be able to do multiple things. We won’t be able to check them for fever when they leave. We won’t be able to check them for fever when they arrive. We won’t be able, as we do currently, to take a detailed history to see if they were exposed when they arrive.” In other words, an overly aggressive reaction could actually make medical deterrence exponentially more difficult.

The United States is about to be tested by a disease in ways that could dovetail remarkably well with the war on terror. In this context, think of Ebola as the universe’s unfair challenge to everything that war bred in our governmental system. As it happens, those things that the U.S. did, often ineffectively and counterproductively, to thwart its enemies, potential enemies, and even its own citizenry will not be an antidote to this “enemy” either. It, too, may be transnational, originate in fragile states, and affect those who come in contact with it, but it cannot be stopped by the methods of the national security state.

Countering Ebola will require a whole new set of protections and priorities, which should emerge from the medical and public health communities. The now sadly underfunded National Institutes of Health and other such organizations have been looking at possible pandemic situations for years. It is imperative that our officials heed the lessons of their research as they have failed to do many times over with their counterparts in public policy in the war on terror years. To once again invoke the powers of the state to address fantasies and fears rather than the realities of a spreading disease would be to recklessly taunt the fates.

Economist, professor, author and political commentator Robert Reich. (photo: Richard Morgenstein)


Getting a Grip on Ebola

By Robert Reich, Robert Reich's Blog

21 October 14

 

We have to get a grip. Ebola is not a crisis in the United States. One person has died, and two people have been infected through contact with his body fluids.

The real crisis is the hysteria over Ebola that’s being fed by media outlets seeking sensationalism and politicians posturing for the midterm elections.

That hysteria is causing us to lose our heads. Parents have pulled their children out of a middle school after learning the school’s principal had traveled to Zambia. Zambia happens to be in Africa but it has not even had a single case of Ebola.

A teacher at an elementary school has been placed on paid leave because parents were concerned he might have contracted the Ebola virus. When and how? During a recent trip to Dallas for an educational conference.

Are we planning to quarantine Dallas next?

Some politicians from both parties are demanding an end to commercial flights between the United States and several West African countries. But there are no direct flights to the U.S. from Liberia, Sierra Leone, and Guinea, where Ebola is taking its biggest toll.

So do they want to ban all commercial flights that might contain someone from any of these countries, who might have transferred planes? That would cover just about all commercial flights coming from outside the United States.

The most important thing we can do to prevent Ebola from ever becoming a crisis in the United States is to help Liberia, Sierra Leone, and Guinea, where 10,000 new cases could crop up weekly unless the spread of the virus is slowed soon.

Isolating these poor nations would only make their situation worse. Does anyone seriously believe we could quarantine hundreds of thousands of infected people a continent away who are infecting others?

The truth is quite the opposite. If the disease is allowed to spread in these places, the entire world could be imperiled.

These nations desperately need medical professionals in the field, more medical resources, isolation facilities, and systems in place to detect early cases.

Even at this stage, that’s not an impossible task. Nigeria is succeeding in checking the spread of the disease. It has not had a new case of Ebola in over a month.

But I’m worried about America. I’m not worried about Ebola. I’m worried about our confidence and courage.

Every time a global crisis arises these days – the drug war in Latin America, terrorism in the Middle East, climate change that’s straining global food and water supplies and threatening many parts of the world with flooding – the knee-jerk response of some Americans is to stop it at our borders.

As if we have the option. As if we live on another planet.

What’s wrong with us? We never used to blink at taking a leadership role in the world. And we understood leadership often required something other than drones and bombs.

We accepted global leadership not just for humanitarian reasons but also because it was in our own best interest. We knew we couldn’t isolate ourselves from trouble. There was no place to hide.

After World War II, we rebuilt Europe and Japan. Belatedly, we achieved peace in Kosovo. We almost eradicated polio. We took on tuberculosis, worldwide.

Now even Cuba is doing more on the ground in West Africa than we are. It’s dispatching hundreds of doctors and nurses to the front lines. The first group of 165 arrived in Sierra Leone in the past few days.

Where are we?

We’re not even paying attention to health crises right under our own noses.

More people are killed by stray bullets every day in America than have been killed by Ebola here. More are dying because of poverty and hunger.

More American kids are getting asthma because their homes are located next to major highways. One out of three of our children is obese, at risk of early-onset diabetes.

We’re not even getting a flu shot to all Americans who need one.

Instead, we bicker. For the last eighteen months, Republicans have been blocking confirmation of a Surgeon General.

Why? Because the President’s nominee voiced support for expanded background checks for gun purchases, and the National Rifle Association objected.

We’ve got to get our priorities straight. Media outlets that are exploiting Ebola because they want a sensational story and politicians using it to their own ends ought to be ashamed.

Public fear isn’t something to be played with.

There’s a huge job to be done, here and abroad. Let’s roll up our sleeves and get on with it.



How Voting in Large Numbers Dramatically Improves Society

By Carl Gibson, Reader Supported News

21 October 14

 

Do you want a higher minimum wage? Free childcare? Universal healthcare? A free college education? Then vote for politicians who support those things, and vote out those who don’t. It may not seem that simple, but if enough people adhere to this strategy with the patience of several more elections, all those things will happen here. Denmark has proven that to the world by having a model society made possible by a consistently high voter turnout rate. The last election in Denmark brought out 87 percent of the population. To compare, the 2012 presidential election only saw a 57 percent turnout of registered voters in the U.S.

I wrote a column earlier this month, called “The Election Is a Month Away. Fucking. Vote.” I laid out my case for voting this year to reject a culture of gridlock and corruption, but some still argued that a vote for any candidate was a vote for a rigged system, and that even if decent politicians were to be elected, they would be hamstrung by other members of their legislative body. Others chastised me for swearing at people in my headline and making people feel bullied into voting. So instead of my trying to intimidate you into voting, just take a look at Denmark’s quality of life, which is made possible by a consistently high voter turnout.

The minimum wage in Denmark is $21 an hour, and McDonald’s workers make enough to provide for their families and even have enough left over to save for retirement. This isn’t because McDonald’s has incredibly nice franchise owners in Denmark, but because a unionized workforce demanded it. Thanks to the union negotiating on behalf of workers, McDonald’s employees in Denmark also enjoy paid vacations, guaranteed overtime pay, and two days off per week. Compare that to the U.S., where the minimum wage is still $7.25 for most McDonald’s workers. Even in Seattle, which has the highest minimum wage in the country at $15 an hour, McDonald’s employees are still making a full six dollars less per hour than their Danish counterparts.

There’s a growing movement in the U.S. to guarantee all fast-food and retail workers a $15 an hour minimum wage and the right to organize a union. President Obama even acknowledged the efforts of these workers and applauded their efforts to fight for a living wage. If we were to demand that all candidates asking for our vote support $15 and a union, and if we voted out all the ones who didn’t, fast-food workers here could enjoy the same pay and benefits as the workers in Denmark. And the American people largely support it – Chicago voters overwhelmingly support a $15 an hour minimum wage by an 87 to 13 margin. And as of this past June, 70 percent of Americans support increasing the minimum wage to at least $10.10 an hour. If all of those Americans voted for candidates who supported a higher minimum wage, it would happen quickly.

Danes also enjoy free college: higher education is seen as a basic right and is funded with tax dollars. Two of the world’s top 100 universities are also in Denmark. Compare that to America, where students are expected to take on decades’ worth of debt to get a college education, just for the chance to have a good job that pays, say, $21 an hour. And to pay off this debt, graduates often have to take jobs outside the field they studied. The student debt bubble in America has surpassed the $1.2 trillion mark, and many students are putting off traditional life milestones, like marriage, home ownership, and child-rearing. Our government already spends some $69 billion on various forms of student aid. By spending just $62 billion, public universities, which educate about 76 percent of American college students, could be completely tuition-free. All we need to do is elect people to office who promise to do this, and vote in large numbers.

Speaking of child rearing, Denmark’s government recognizes the economic burden of providing daycare for children of working parents, and provides it free of charge to all taxpayers. With an abnormally low birthrate, Denmark was weighing the possibility of cutting the budget for local nurseries. The free daycare initiative started in 2012 as an incentive to get more parents to have babies. A similar program in the U.S. could help ease the burden on working parents who do not have the financial means to hire a full-time sitter or pay for daycare. All we need to do is elect politicians who promise to do everything they can to make sure mothers and fathers with full-time jobs don’t have to worry about the additional expense of daycare.

Additionally, Denmark views healthcare as a human right for all citizens. Prescription medication is free for all Danes under the age of 18, and extremely affordable for adults. Everyone can choose his own doctor, and patient satisfaction with the Danish healthcare system is much higher than in the U.S. Our healthcare system was rated as the most expensive and inefficient by the World Health Organization, and even though the Affordable Care Act successfully expanded Medicaid to middle-class families and made health insurance more affordable for several million Americans, it’s still captive to the for-profit healthcare industry. Denmark spends just 11 percent of its GDP on healthcare, while healthcare costs suck up 18 percent of the U.S. economy. If we want better healthcare, we have to hold candidates to the promise of fighting for universal health care, or at the very least, a public option like Medicare for all.

It should be obvious that if we want the society that most Americans polled want – universal healthcare, free college, free childcare, and a $21 an hour minimum wage – all we need to do is elect politicians who stand for these things. It may seem impossible now, but look at how far we’ve come as a society in a relatively short amount of time. Before we abolished slavery it would have seemed impossible that, 150 years later, we would re-elect the first black president of the United States. At the turn of the 20th century, it seemed impossible that women would be able to vote. Now, Hillary Clinton is being talked about as a potential front-runner in the 2016 elections. All we need to do is muster the willpower to get the social reforms we deserve.



Carl Gibson, 26, is co-founder of US Uncut, a nationwide creative direct-action movement that mobilized tens of thousands of activists against corporate tax avoidance and budget cuts in the months leading up to the Occupy Wall Street movement. Carl and other US Uncut activists are featured in the documentary "We're Not Broke," which premiered at the 2012 Sundance Film Festival. He currently lives in Madison, Wisconsin. You can follow him on Twitter at @uncutCG.

Reader Supported News is the Publication of Origin for this work. Permission to republish is freely granted with credit and a link back to Reader Supported News.
