Prof Noel Sharkey on Autonomous Robots: “we don’t want to sleepwalk into further technological disasters” #NIHRF

Northern Ireland-born Prof Noel Sharkey from the University of Sheffield’s Centre for Robotics addressed a crowd of forty who had managed to battle through the disrupted Belfast traffic to St Mary’s University College this evening. (Older readers may remember him from BBC Two’s Robot Wars!)

As part of the NI Human Rights Festival, he delivered a talk on

From War to Surveillance: Human Rights and Autonomous Weapons

Sharkey began by explaining the scale and scope of military drones, which he described as “more surgically precise than flying over with a B-52 bomber” but still not precision weapons.

He believes that the CIA has around 80 drones in its own air force, and he hinted at a UK role in relaying signals from drone operators in the US to the countries in which the drones are operating.

But Sharkey believes that even worse than human-guided drones is the removal of humans from the loop: programming aerial vehicles to fly themselves, identify targets and fire weapons.

While an advocate for the good that robots can do, Sharkey is very uncomfortable with “killer robots”. What computer can distinguish between someone holding a firearm and a child holding a toy gun? He argues that human rights are at risk when computer systems cannot make proportionate judgements and lack situational awareness and deliberative reasoning.

Autonomous robots aren’t just up in the air. A robotic submarine designed to sink other submarines is being developed, along with armed all-terrain vehicles that have already competed in DARPA challenges.

Together with other activists, Sharkey formed the Campaign to Stop Killer Robots (@stoptherobotwar), which has succeeded in getting expert meetings staged at the UN through the Convention on Conventional Weapons (CCW, or more formally the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons) to look into the total prohibition of lethal autonomous robots. Since human-controlled drones are already deployed, some countries are unwilling even to debate restrictions on their use. So the campaign focuses on future weaponry: autonomous robots.

Sharkey notes, though, that even if a restriction on the use of these lethal weapons in war were agreed, it would not extend to their use by police in day-to-day society.

Sharkey is struck by how seemingly innocuous uses of drones lead to a loss of civilian privacy during surveillance operations. He talked about freedom of information requests revealing that drones were being used to police mundane offences such as fly posting rather than serious criminal activity.

Back in March 2013, the then PSNI Chief Constable Matt Baggott explained to me in an interview why the local police force was buying drones.

Fairly inoffensive, peaceful robots can be quickly repurposed for military and lethal means. Sharkey says “the internet took us by surprise”. The original academic network held no clues to today’s commercial use of the internet: spying on people, supporting the sale of pornography and allowing paedophilic images to be swapped.

“We don’t want to sleepwalk into further technological disasters.”

Sharkey talked about the lack of rigour in existing methods of targeting “combatants” in war zones, which could only get worse without human oversight.

Professionally well informed about robotics through his years of academic research, Prof Sharkey is deeply wary of the perils of autonomous weapons. At times he comes across as alarmist, and his exaggerated and imprecise language weakens his message.

But behind the superlatives, Sharkey is picking out abuses that can be predicted today and highlighting the need for further analysis and debate. Though he admits that he may no longer be alive if and when the UN ever manage to ban “killer robots” under the CCW.

In the Q&A after Sharkey’s illustrated talk, one member of the audience asked if we – society at large – weren’t complicit in accelerating the development of drone technology and its onward application to autonomous robotic vehicles through increased consumer purchase and use.

In my opinion, the reality is that there are good uses and evil abuses of all technology. Civilian and commercial applications have military use, and vice versa. Holding governments to account and upholding ethical standards is the difficult problem …

Look out for an article on Prof Noel Sharkey in the Irish News this week.


  • Superfluous

    Philosophers have been pondering the Trolley Problem http://en.wikipedia.org/wiki/Trolley_problem for quite some time, and this becomes more important as we start to get autonomous cars on our roads (in a high-speed crash, if the automatic car has to miss the pregnant woman to the left, or the old, almost dead man to the right – do we program it to take out the latter?).

    The truth is that most crashes on our roads these days are down to human error – and therefore ‘morally aligned’ robots could actually have a pretty funky utilitarian case. What happens if we get to the same stage in war – where the robot actually makes a better decision, on average, than the human operator? As eerie as it sounds, we could automatically avoid all children and possibly rule out some of the psychopaths – like those exposed by Wikileaks – who have got behind the controls in the past. Obviously, even so, we must be very cautious – but that doesn’t mean we should completely rule it out.

  • Practically_Family

    I don’t think it matters. If it’ll kill more “them” for less investment of blood & treasure belonging to “us” then it’ll happen.

  • Reader

    If a robotic car is good enough at avoiding accidents in the first place, it would be beneficial even if it can’t make the much rarer moral calculations in the last seconds before impact. It’s not as though humans are very good at those either – I am sure people have been killed by drivers swerving to avoid cats.
    Here’s a one second quiz: In the final moment before the inevitable 40mph crash, should you hit the full family car, or the lone pedestrian?

  • Dec

    Funny how deadly robots weren’t such a big issue for Sharkey when he was getting paid to appear on Robot Wars.

  • Reader

    Those weren’t robots. They were remote controlled, like drones.

  • Starviking

    Well, the problem with these philosophical quandaries is that they assume the person questioned has an omniscient view. That is, they know – from their godlike grasp of mechanics, human behaviour and probably a host of other disciplines – that the situation they are in offers only a binary choice: hit one, or the other.

    How someone with such godlike abilities got themselves into such a situation is beyond me.

    In real life, with a family car* and a pedestrian as possible casualties, there is not going to be enough time to ponder philosophy. I would be trying to avoid both.

    *and is there enough time to realise that the family car is full, let alone absorb the import of the information?

  • Reader

    Yep – that was sort of my point – people aren’t infallible, therefore we shouldn’t wait until robot cars are. The time to let robot cars onto the road is when they are safer than the average driver. Then, as the robots get better and better, make the driving test tougher and tougher for human drivers.

  • terence patrick hewett

    They are just machines. Power of all sorts (for good or for ill) is gradually leaving government and economic institutions and being transferred to the individual. We as a society will deal with this as we dealt with the social and military implications of the 1st and 2nd Industrial Revolutions. Never underestimate an academic in pursuit of funding: first scare the giblets out of everyone, then put the bite on the b*ggers. Works every time.