
Cohen writes: "Silicon Valley research that started with an admirable sense of utopian promise eventually changed into practical, market-based technology."

AI and computer algorithms pose a major threat to freedom and justice. (photo: iStock)


There's No App for Justice

By Noam Cohen, New Republic

25 April 18


The Silicon Valley startups remaking legal practice

Back in the 1980s, before Americans worried that algorithms were ruling their lives, the sociologist and psychologist Sherry Turkle asked a group of MIT students to consider what they would think if they encountered a computer judge. Would it be the ideal of blind justice—a truly unbiased brain deducing the fairest outcome for any set of circumstances? Or simply a machine that asserted its authority without mercy?

Turkle found that the student responses varied based on race. White students were generally wary. “Judges have to have compassion for the particular circumstances of the people before them,” one told Turkle for her 1995 book, Life on the Screen. “Computers could never develop this quality.” African American students saw things differently, not because they had greater confidence in computers but because of what they knew about people in power. A computer judge, they told Turkle, “is not going to see a black face and assume guilt. He is not going to see a black face and give a harsher sentence.”

They weren’t the first to see promise in legal technology. The law has attracted artificial intelligence researchers since the field emerged in the 1950s. When the Stanford researcher John McCarthy was asked in a 1973 debate with Joseph Weizenbaum, the MIT computer science professor and artificial intelligence skeptic, “What do judges know that we cannot eventually tell a computer?” his answer was an emphatic, “Nothing.” In a 1977 law review article, Northwestern law professor Anthony D’Amato described the idea of computer judges as a laudable, if lofty, goal—a chance to live up to the idea of the United States as a country governed by “the rule of law, not the rule of men.”

Alas, there are still no robot judges. Instead, Silicon Valley research that started with an admirable sense of utopian promise eventually changed into practical, market-based technology. There are now at least 600 legal tech startups operating in the United States, many of them using AI to organize bankruptcy filings, search for new patent filings, and more generally help lawyers make the strongest possible case for their clients by showing connections between past court decisions, the law, and legal arguments. Several of the most promising startups—like Ravel Law, a legal research and analytics firm, which was conceived in a Stanford Law School dorm room in 2012 and went on to raise $14 million in venture capital—have been snatched up by the legal database and search giants Westlaw and LexisNexis. (Lexis bought Ravel for an undisclosed price last year.)

The question is whether these new legal startups will lessen inequality or magnify it. Worryingly, in other fields AI often seems to deepen existing imbalances. Racially biased search results on Google, as Safiya Umoja Noble revealed in her book Algorithms of Oppression, reinforce inequities in the flow of information; the advance of robotic technology steadily eliminates jobs for the working class and beyond. When it comes to the law, this could have severe consequences.

Seven years after her original study, Turkle returned to the question of the computer judge, with startling results. By 1990, minority students had realized that computers often internalize human biases, and they now believed a computer judge would “carry all of the terrible prejudices of the real ones,” as one student told Turkle. “It would not improve the situation, but it would freeze the bad one.” Today’s legal research startups may pose the same risks, potentially perpetuating, and perhaps even exacerbating, structural inequalities in America’s legal system.

***

This has happened before. Founded in the 1970s, Lexis and Westlaw were the startups of their day, bringing new access to legal research through technological innovation, and turning days-long trips to courthouses and legal libraries into something that could be completed in moments, while at a desk. But these were (and remain) expensive services, imposing yet another layer of fees onto the already costly practice of law—and giving yet another advantage to richer firms over poorer.

The law today still seems ripe for the so-called disruption that these newer technology companies promise. “The legal system is not meeting the needs of those it is intended to serve,” says Robert Ambrogi, a Massachusetts lawyer who has been tracking the growth of legal tech.

In recent years, algorithmic legal work has found the greatest traction in discovery, the exchange of documents relevant to a legal case. This labor-intensive process can involve searching for a keyword in a pile of documents or for a certain recipient in thousands of emails. Having humans do this work no doubt led to unexpected discoveries, but it was also inefficient. Companies like Blackstone Discovery (“Born in Silicon Valley, Built to Win”), Everlaw, and GoldFynch have turned it into a largely automated process, reducing the need for a certain type of worker—the paralegals who sift through documents before a big case. Computers are probably better at these tasks, and definitely cheaper. But the shift means fewer jobs at the bottom rungs of a law firm for paralegals, who are often women—its own form of inequity.
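To make that shift concrete, here is a minimal sketch, in Python, of the keyword-screening step such platforms automate. The documents, file names, and search terms are invented for illustration; real e-discovery tools layer far more sophisticated search and machine-learning review on top of anything this simple.

```python
# Illustrative only: a toy keyword screen over invented documents.
documents = {
    "email_001.txt": "Re: Q3 forecast, see the attached memo on the merger.",
    "email_002.txt": "Lunch on Friday?",
    "memo_017.txt": "Draft merger agreement circulated to outside counsel.",
}

def find_responsive(docs, keywords):
    """Return names of documents that contain any of the search keywords."""
    hits = []
    for name, text in docs.items():
        lowered = text.lower()
        if any(kw.lower() in lowered for kw in keywords):
            hits.append(name)
    return hits

print(find_responsive(documents, ["merger"]))
# -> ['email_001.txt', 'memo_017.txt']
```

Even this toy version makes the economics plain: a filter that runs in milliseconds replaces hours of human review, which is precisely why the paralegal work described above is disappearing.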

The startups don’t see it that way, of course. Their tools, they argue, improve the practice of the law by eliminating preventable human error like citing the wrong case or simply misunderstanding the relevant law. “Efficiency and justice go hand in hand,” says Itai Gurari, the chief executive of Judicata, a San Francisco–based startup he helped found in 2012.

Judicata’s software does things no paralegal can: It analyzes judges and their decisions with big-data precision, using massive pools of information to help answer specific questions. Who does a judge tend to side with, the defense or the party bringing a suit? How does she tend to rule when on a panel with another judge? Does she tend to side with defendants in certain types of cases or areas of the law? The software suggests the most effective precedents, in addition to scouring the legal brief for errors. Finally, Judicata also has a grading system for briefs, predicting the likelihood that an argument will prevail. Echoing the language of scientific certainty, its founders call this “mapping the legal genome.”
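Judicata has not published its methods, but the basic judge-analytics idea can be illustrated with a toy sketch. Assuming a hypothetical table of past case records, the snippet below computes how often each judge sided with the party bringing the suit; the data, names, and function are all invented for illustration and are not the company's actual system.

```python
# Illustrative only: hypothetical case records and a toy win-rate calculation.
from collections import defaultdict

# (judge, prevailing_party, area_of_law) -- invented data
cases = [
    ("Judge A", "plaintiff", "employment"),
    ("Judge A", "defendant", "employment"),
    ("Judge A", "plaintiff", "contract"),
    ("Judge B", "defendant", "employment"),
    ("Judge B", "defendant", "contract"),
]

def plaintiff_win_rate_by_judge(records):
    """Share of each judge's cases in which the plaintiff prevailed."""
    totals = defaultdict(int)
    wins = defaultdict(int)
    for judge, winner, _area in records:
        totals[judge] += 1
        if winner == "plaintiff":
            wins[judge] += 1
    return {judge: wins[judge] / totals[judge] for judge in totals}

print(plaintiff_win_rate_by_judge(cases))
# Judge A: 2 of 3 cases for the plaintiff (~0.67); Judge B: 0 of 2 (0.0)
```

Scaled up to millions of real decisions, with far richer features than a single prevailing-party field, this is the general kind of pattern-finding that such tools market to law firms.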

Helpful and elegant this may be, but it doesn’t improve the legal system so much as hack it. Granted, successful lawyers have often analyzed the tendencies of the judges they face. Many even attempt to try their cases in circuit courts thought to be more sympathetic to their arguments. The attorneys general trying to overturn the Trump administration’s travel ban last year, for example, wanted the case tried in the Ninth Circuit, which covers the Western United States and skews liberal; during Barack Obama’s administration, lawyers chose to challenge his policies in the conservative Northern District of Texas.

The new legal startups have simply mechanized that process via AI—but the result is that the decisions of law and justice are now turning on who has the biggest computer with the best algorithm. This matters. You can play the stock market without computerized investment advice—millions do—but the amateur day trader can’t rival the firms with optimizing algorithms seeking arbitrage opportunities. The law may be entering a new and dangerous phase in which technology becomes a weapon to be wielded among elite law firms.

