'Killer robots' are not about Terminator
A drone that can strike people based on facial recognition is technically feasible (Photo: Ricardo Gomez Angel)
By Peter Teffer
Photo editors of news media worldwide must have had a field day on Monday (21 August) when selecting an image to accompany the news that technology experts had warned the world about killer robots.
Elon Musk, the billionaire CEO of space and electric car companies, and more than a hundred other technology company leaders published an open letter warning against an arms race in lethal autonomous weapon systems.
Many media outlets decided to use still photos of one of the Terminator films, in which robots rise up against humanity. Others chose to use the scene of a malfunctioning law enforcement robot in the film RoboCop.
While these images may help attract page views, Soren Transberg Hansen told EUobserver that this framing also leads to wider misunderstandings.
Transberg Hansen is one of the 41 signatories who represent companies based in the EU.
“It's a bit problematic that this Terminator picture comes up all the time, because then it's not taken seriously,” said Transberg Hansen, CEO of Brainbotics, a software company from Denmark.
“That is not what it is about. We are nowhere near that level of technology,” he added.
Although there is no universal definition of lethal autonomous weapons, the Danish robotics expert summarises them as “autonomous and semi-autonomous computer systems that can be used for triggering some armed device”.
“The concept of using robots in warfare is not a particularly new one, but the technology is advancing really fast,” said Transberg Hansen.
“There is surprisingly little debate about some of the consequences.”
With the letter, the signatories want to spur that debate.
Pandora's box
The text, only 278 words long, said that lethal autonomous weapons could become “the third revolution in warfare” - the first one being the creation of gunpowder, and the second the invention of nuclear bombs.
“Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend,” the letter said.
“We do not have long to act. Once this Pandora’s box is opened, it will be hard to close,” it went on.
By some definitions, autonomous weapons are already in use.
South Korea is said to have deployed several automated guns that can fire autonomously.
Worldwide, it is the United States that is “leading the way” in this field, Transberg Hansen said.
“It has to do with the funding programmes,” he argued.
“Where does the money come from? In the US it very often comes from the military, whereas in Europe we have a civil funding scheme, Horizon 2020, that has much more focus on civil problems.”
Horizon 2020 is an EU grant programme for research and innovation, which has funded several robotics projects for civil use.
It would be “very unlikely” for such funds to be used for military projects, said Rich Walker, another signatory of the open letter.
Walker is managing director of the UK-based Shadow Robot Company, which builds robot hands used in research and industry.
He told EUobserver that it is “difficult to answer” which European countries are currently developing lethal autonomous weapons, because nations typically do not want to disclose their military plans.
Drones
While the EU has for decades had little to do with military issues, the European Commission recently announced it would set up a European Defence Fund, which could finance drone projects.
A drone that can strike people based on facial recognition is technically feasible, said Transberg Hansen, although he noted that, as far as he knew, European countries were not developing such weapons.
“Not officially. … But I wouldn't be surprised if some states would have some degree of autonomous decision-making,” he said, noting that the fight against terrorism in particular could push politicians towards using more robotics and artificial intelligence.
A group of international experts was due to debate the issue this week in a United Nations forum in Geneva.
However, the meeting was postponed, according to the open letter, “due to a small number of states failing to pay their financial contributions to the UN”.
An EU official confirmed that a “lack of sufficient funds” caused the delay, but could not elaborate. The UN's press office did not immediately respond.