Scientists call for ban on killer robots at international convention

  • Terrifying footage was made by advocacy group Campaign to Stop Killer Robots 
  • In the film palm-sized drones armed with explosives find and attack people 
  • Campaigners behind the film warn a preemptive ban on the technology is needed
  • More than 100 nations are part of the international Convention on Certain Conventional Weapons that will debate a ban on ‘killer robots’ in Geneva today 

Phoebe Weston For Mailonline

AI experts have put together a seven-minute film that depicts a terrifying future where tiny killer drones are programmed to carry out mass killings.

Made by an advocacy group called Campaign to Stop Killer Robots, the footage shows palm-sized drones armed with explosives finding and attacking people without human supervision.

These tiny drones can kill with ruthless efficiency and campaigners warn a preemptive ban on the technology is needed to stop a new era of horrific mass destruction.


AI experts have put together a seven-minute film that depicts a terrifying future where killer drones are programmed to carry out mass killings



More than 100 nations are part of the International Convention on Certain Conventional Weapons that will debate a ban on so called ‘killer robots’ in Geneva today.

In the film, which will be shown at the convention, machines can spot activists in lecture halls and kill them by propelling an explosive into their head, the Guardian reports.

The film, called ‘Slaughterbots’, starts with a developer introducing the new technology, saying these drones can react 100 times faster than a human.

He says these drones have wide field cameras, face recognition, special sensors and shaped explosives.

‘Let’s watch the weapons make decisions’, he says, claiming they can penetrate people, cars and trains, and evade bullets.

The video then cuts out of the developer’s presentation and goes to mock news stories about horrific destruction in the aftermath of an attack. 

Experts warn this futuristic scenario could happen unless we halt the development of such drones, which could be devastating for human security.

Stuart Russell, a leading AI scientist from the University of California, Berkeley, will be part of the team that shows the film.

‘Pursuing the development of lethal autonomous weapons would drastically reduce international, national, local, and personal security,’ Dr Russell said.

‘The technology illustrated in the film is simply an integration of existing capabilities’, he said.

KILLER ROBOTS

In June the Pentagon awarded an $11 million (£8.4 million) contract to build a ‘combined-arms squad’ of human and robotic capabilities.

From unmanned trucks and aircraft, to ‘ghost fleets’ of underwater drones, the military has in many ways turned its sights on autonomous technology to improve capabilities.

And, a similar shift can be seen all around the world.

Russia, for example, has also been working on ways to integrate combat robots into battle, including armed sentry drones. 

‘Intelligent robotic weapons – they’re a reality, and they will be much more of a reality by 2030,’ said John Bassett, a former British intelligence officer.

‘At some point around 2025 or thereabouts, the US Army will actually have more combat robots than it will have human soldiers.’

‘It is not science fiction. In fact, it is easier to achieve than self-driving cars, which require far higher standards of performance.’

At their five-year review conference, the 123 nations are meant to agree to formalise their efforts to deal with the challenges raised by weapons systems that would select and attack targets without meaningful human control. 

The military has been one of the biggest funders of AI technology and has created robots that can scan video footage for a target with more precision than the human eye.

Now experts warn the technology is so sophisticated that drones could be armed with explosives to kill people without needing a human controller to have the final say.

‘The governments meeting in Geneva took an important step toward stemming the development of killer robots, but there is no time to lose,’ said Steve Goose, arms director of Human Rights Watch and a co-founder of the Campaign to Stop Killer Robots, last year.

‘Once these weapons exist, there will be no stopping them.’

Made by advocacy group Campaign to Stop Killer Robots, the footage shows palm-sized drones armed with explosives finding and attacking people (pictured)

In the film, which will be shown at the convention, machines can spot activists in lecture halls and kill them by propelling an explosive into their head


WILL ROBOTS GET AWAY WITH WAR CRIMES? 

If a robot unlawfully kills someone in the heat of battle, who is liable for the death? 

In a 2016 report, Human Rights Watch highlighted the rather disturbing answer: no one.

The organisation says that something must be done about this lack of accountability – and it is calling for a ban on the development and use of ‘killer robots’. 

The report, called ‘Mind the Gap: The Lack of Accountability for Killer Robots’, details the legal hurdles raised by allowing robots to kill without being controlled by humans.

‘No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party,’ said Bonnie Docherty, senior Arms Division researcher at the HRW and the report’s lead author.

‘The time to act on a pre-emptive ban is now.’

In August, robotics and artificial intelligence experts signed an open letter demanding the UN prohibit the use of such weapons internationally.

Among the 116 signatories are Tesla founder Elon Musk and Mustafa Suleyman, head of applied AI at Google’s DeepMind.

The weapons, including lethal microdrone swarms, are on the verge of development and have the potential to create global instability, they warn.

‘Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend,’ the letter reads.

‘These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.

‘We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.’

Technology allowing a pre-programmed robot to shoot to kill, or a tank to fire at a target with no human involvement, is only years away, experts say.

The Conference on Disarmament will open five days of talks on the weaponry, but those calling for a ban will not be satisfied, said Indian ambassador Amandeep Gill, who is chairing the meeting.

‘It would be very easy to just legislate a ban but I think… rushing ahead in a very complex subject is not wise,’ he told reporters.

‘We are just at the starting line.’

He said the discussion, which will also include civil society and technology companies, will be partly focused on understanding the types of weapons in the pipeline.

Experts warn the technology is so sophisticated that military drones could be armed with explosives to strike people without needing a human controller to have the final say

Mr Gill said there was agreement among nations that ‘human beings have to remain responsible for decisions that involve life and death’.

But, he said, there are varying opinions on the mechanics through which ‘human control’ must govern deadly weapons.

The International Committee of the Red Cross, which is mandated to safeguard the laws of conflict, has also not called for a ban, but has underscored the need to place limits on autonomous weapons.

‘Our bottom line is that machines can’t apply the law and you can’t transfer responsibility for legal decisions to machines’, Neil Davison of the ICRC’s arms unit told AFP.

He highlighted the problematic nature of weapons systems where there are major variables in the timing or location of an attack – for example, a system deployed for multiple hours and programmed to strike whenever it detects an enemy target.

‘Where you have a degree of unpredictability or uncertainty in what’s going to happen when you activate this weapons system, then you are going to start to have problems for legal compliance’, he said.