On 5 December 2012, the APPG on Drones heard a fascinating presentation from Professor Noel Sharkey, who is Professor of Artificial Intelligence and Robotics and Professor of Public Engagement at the University of Sheffield.
He began his presentation by outlining how he had become involved in drones and robotic warfare more broadly. He was clear that the use of armed drones set a dangerous precedent which undermined international law; most recently, this had been seen in the US attack on Libya, carried out without Congressional approval. The second point he raised was proliferation. He was currently monitoring 51 countries with access to drones, including China and Iran, although US documents he had seen suggested that as many as 72 countries could have access to this technology. He also discussed the potential sale and export of drones, and argued that safeguards such as the MTCR (Missile Technology Control Regime) were not fit for purpose. Professor Sharkey tracked the move from 'man in the loop' to 'man on the loop', in which a single operator supervises a whole swarm of drones. This was also representative of the move toward more automated war: as a recent report by Human Rights Watch, to which Professor Sharkey was an adviser, had shown, these were steps toward 'man outside the loop'.
Autonomous warfare had a number of perceived advantages: it is faster and cheaper, and when technologically sophisticated nations fight, signals will be jammed, preventing the use of drones controlled by humans. In defining autonomy, Professor Sharkey pointed to the ability of drones to take off and land, to navigate toward targets and to avoid obstacles. Broadening the focus, he highlighted the example of the Crusher unmanned ground vehicle and of small automated devices currently used to defuse bombs, and showed what happened when such machines were armed. He also highlighted the use of fully autonomous submarines. Attention was drawn to the fact that a key problem with any automated weapon system is its inability to operate within International Humanitarian Law. For example, such systems cannot distinguish between military personnel and civilians (the principle of distinction). Nor can robots reason or exhibit the situational awareness necessary to make proportionality decisions, i.e. to judge whether expected civilian harm would be excessive in relation to the anticipated military advantage. Further, autonomous robots disrupt the chain of accountability and responsibility for mishaps in warfare. Attention was also drawn to the vulnerability of automated weapons to 'spoofing', in which attackers transmit a counterfeit signal that a drone's receiver mistakes for a genuine GPS satellite signal; the drone then miscalculates its position and can be drawn off course or made to crash.
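The capture mechanism behind spoofing can be sketched in a few lines of Python. This is a deliberately simplified toy, not how any real receiver is implemented, and the signal strengths, coordinates and function names are invented for illustration; the underlying fact is that civilian GPS signals carry no authentication, so a receiver that locks onto the strongest plausible signal can be captured by a marginally stronger counterfeit.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    source: str          # "satellite" or "spoofer"
    power_dbm: float     # received signal strength (invented values)
    implied_fix: tuple   # position solution this signal implies (toy model)

def acquire(signals):
    """Toy receiver: lock onto the strongest signal.

    Real receivers track a correlation peak per satellite, but because
    civilian GPS signals are unauthenticated, a counterfeit signal that
    is slightly stronger can capture the tracking loop in much the
    same way.
    """
    return max(signals, key=lambda s: s.power_dbm)

genuine = Signal("satellite", power_dbm=-130.0, implied_fix=(34.05, 62.50))
counterfeit = Signal("spoofer", power_dbm=-124.0, implied_fix=(34.05, 62.10))

lock = acquire([genuine, counterfeit])
print(f"locked onto: {lock.source}; drone believes it is at {lock.implied_fix}")
# The stronger counterfeit wins, so the drone now computes its course
# from a position tens of kilometres from where it really is.
```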
Professor Sharkey discussed the development of the X-47B drone, which had recently completed catapult launch tests and been taken aboard the aircraft carrier USS Harry S. Truman for carrier trials, and DARPA's hypersonic Falcon HTV-2, also in development. He noted how the development of these aircraft would give the US the capability to strike anywhere in the world within one hour. Discussion focused on whether a drone travelling at such speeds could be kept under full control, and on the danger of a situation in which the equipment can only be recalled once it has already caused damage.
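As a rough sanity check on the one-hour figure (the arithmetic here is ours, not the presentation's): HTV-2-class vehicles are designed for roughly Mach 20, about 6 km/s, and no point on Earth is more than about 20,000 km away along a great circle, so

```latex
t \approx \frac{20\,000\ \text{km}}{6\ \text{km/s}} \approx 3\,300\ \text{s} \approx 55\ \text{minutes},
```

which is comfortably consistent with the claimed capability.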
He also explored what happens when two algorithms meet, to give a sense of the power of automated weapons and how far they could spiral beyond human control: if opposing teams of autonomous robots met, their combined behaviour would be unpredictable and could cause unjustifiable harm (a toy illustration of such a feedback loop is sketched below). Professor Sharkey concluded by arguing that there was nothing wrong with autonomous robots as such; the problems arise once they are armed, and from the current lack of adequate legal instruments to prevent the use of such weapons.
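To make the two-algorithms point concrete, a frequently cited non-military analogy (not raised at the meeting itself) is the 2011 incident in which two Amazon booksellers' pricing bots, each reacting only to the other, drove a biology textbook's price above $23 million. The multipliers below are those reported in analyses of the incident; the starting price is invented.

```python
# Toy reconstruction of the 2011 Amazon pricing spiral: two bots react
# only to each other. Neither rule is unreasonable in isolation; the
# runaway emerges purely from their interaction.
def price_spiral(start=100.0, rounds=52):
    a = b = start
    for day in range(1, rounds + 1):
        a = 1.27059 * b    # bot A prices just above bot B
        b = 0.9983 * a     # bot B tracks just below bot A
        if day % 10 == 0 or day == rounds:
            print(f"round {day:2d}: A=${a:,.2f}  B=${b:,.2f}")

price_spiral()
# Combined loop gain is 1.27059 * 0.9983 ~ 1.268 > 1, so prices grow
# exponentially; by round 52 they exceed $23 million, roughly where the
# real incident peaked before a human intervened. Opposing autonomous
# weapons coupled in the same way would escalate with force, not prices.
```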
Following Professor Sharkey's presentation, the floor was opened to discussion. A participant raised the question of whether there was merit in developing a specific treaty for automated weapons, with particular reference to Article 36 of the 1977 Additional Protocol I to the Geneva Conventions of 1949 and its requirement that compliance be assessed 'during the development' of new weapons. The approach of the military to the use of drones was queried, with reference made to the attitudes of those who built the nuclear bomb and the divisions amongst the military on its use. Professor Sharkey commented that within the military there was support for his position: recently the French military had condemned the use of drones, and the majority of former Soviet Bloc countries, excluding Albania, had suggested they would never use them. In contrast, the Israelis were very keen on drones; for example, they had been used in Gaza the previous month and to patrol Israel's borders. Professor Sharkey commented that in the US the emphasis was on a fear of 'trial by CNN', whereas in the UK there was a much greater emphasis on avoiding civilian deaths and on the Code of Ethics.
In response to a question about the power of drone technology, Professor Sharkey highlighted recent experiments on moths, in which pupae were implanted with chips so that the adult moths could be remotely controlled. Although that experiment ultimately failed, more recent experiments on giant African beetles, based on the same principle, had been successful.
A question was asked about the role that could be played by promoting a corporate social responsibility agenda. Professor Sharkey responded that although export restrictions were in place, the components of drones were often not, in themselves, in breach of such restrictions. He highlighted a recent problem with US drones in which parts were found to have been manufactured in China and packed in Taiwan; during this process an electronic 'backdoor' had been created, enabling the chips controlling the drones to be hacked.
The relationship between funding for academic research into drones and the emphasis on their military use was hugely significant in the United States, and this in turn was shaping the degree of emphasis placed on ethics in the use of automated weapons. Attention was drawn to DARPA, the Defense Advanced Research Projects Agency in the US, which was pumping money into research and development.
In response to a question about the role Parliamentarians could play on the issue, the position was put forward that the UK should get behind a prohibition on the use of autonomous weapons systems. Citing the Norwegian government's leading role in securing the ban on cluster munitions, Professor Sharkey proposed that the UK government should play a similar role on this issue.