The letter claims that totally autonomous killing machines could become a reality within 'years, not decades.' (photo: Getty Images)
Ban 'Killer Robots' With AI, Urge Stephen Hawking, Noam Chomsky and Thousands More in Open Letter
28 July 15
More than 1,000 robotics experts and artificial intelligence (AI) researchers - including physicist Stephen Hawking, technologist Elon Musk, and philosopher Noam Chomsky - have signed an open letter calling for a ban on "offensive autonomous weapons", or as they are better known, 'killer robots'.
Other signatories include Apple co-founder Steve Wozniak and hundreds of AI and robotics researchers from top-flight universities and laboratories worldwide.
The letter, put together by the Future of Life Institute, a group that works to mitigate "existential risks facing humanity", warns of the danger of starting a "military AI arms race".
These robotic weapons may include armed drones that can search for and kill certain people based on their programming, the next step from the current generation of drones, which are flown by humans who are often thousands of miles away from the warzone.
The letter says: "AI technology has reached a point where the deployment of such systems is - practically if not legally - feasible within years, not decades."
It adds that autonomous weapons "have been described as the third revolution in warfare, after gunpowder and nuclear arms".
It says that the Institute sees the "great potential [of AI] to benefit humanity in many ways", but believes the development of robotic weapons, which it says would prove useful to terrorists, brutal dictators, and those wishing to perpetrate ethnic cleansing, would not be.
Such weapons do not yet truly exist, but the technology that would allow them to be used is not far away. Opponents, like the signatories to the letter, believe that by eliminating the risk of human deaths, robotic weapons (the technology for which will become cheap and ubiquitous in coming years), would lower the threshold for going to war - potentially making wars more common.
Last year, South Korea unveiled similar weapons - armed sentry robots that are currently installed along the border with North Korea. Their cameras and heat sensors allow them to detect and track humans automatically, but the machines need a human operator to fire their weapons.
The letter also warns of the possible public-image impact on the peaceful uses of AI, which could potentially bring significant benefit to humanity. It warns that building robotic weapons could provoke a public backlash that curtails the genuine benefits of AI.
It sounds very futuristic, but this field of technology is advancing at a rapid rate, and opposition to the violent use of AI is already growing.
The Campaign to Stop Killer Robots, a group formed in 2012 by a coalition of NGOs including Human Rights Watch, works to preemptively ban robotic weapons.
They are currently working to get the issue of robotic weapons onto the agenda of the Convention on Certain Conventional Weapons in Geneva, a UN-linked body that seeks to prohibit or restrict the use of certain conventional weapons such as landmines and blinding laser weapons. Blinding laser weapons were preemptively banned in 1995, as the Campaign hopes autonomous weapons will be.
The Campaign is trying to get the Convention to set up a group of governmental experts which would look into the issue, with the aim of having such weapons banned.
Earlier this year, the UK opposed a ban on killer robots at a UN conference, with a Foreign Office official telling The Guardian that they "do not see the need for a prohibition" of autonomous weapons, adding that the UK is not developing any such weapons.