Office of Technology Assessment at the German Bundestag

Information on the project

Autonomous weapons systems

Thematic area: Technology, society, innovation
Analytical approach: TA project
Topic initiative: Committee on Education, Research and Technology Assessment
Status: In process of approval
Current project phase: Report completed. Approval by the Committee on Education, Research and Technology Assessment pending
Duration: 2017 to 2018

Background and central aspects of the topic

In recent years, impressive progress has been made in robotics and in research on artificial intelligence (AI) – advances that extend to the military sector as well. Unmanned aerial systems (so-called drones) have long been an integral part of military operations. Moreover, unmanned systems are already being used for special ground operations (e.g. explosive ordnance disposal) and in or under water (e.g. maritime mine countermeasures). Although the autonomy of these systems – which differ considerably in structure and purpose – is currently still limited and they are as a rule controlled by humans, the intensive research and development activities in this field make it likely that the degree of autonomy of robotic and AI systems will increase and that their military use will intensify significantly.

The currently feasible degree of autonomy allows, for example, a drone to fly over a certain area autonomously in order to gather reconnaissance data and send them to a base station. There, human operators analyse the data and take the final decisions on complex mission steps or a possible weapons deployment (for the Predator drone, e.g., these operators are a pilot and two sensor specialists).

In the case of a fully autonomous weapons system, however, target selection, the distinction between combatants and civilians, the decision to attack and finally the weapons deployment itself would all be carried out autonomously by the system, without any human intervention. The military interest in increased autonomy rests on two decisive advantages: First, an autonomous system can continue to act even if the communication link to the base station is interrupted. Second, it allows a faster response in combat situations, because the delays caused by the radio transmission of sensor data and control commands, and by the human operator's decision-making process leading to the order to engage, are eliminated. For these reasons, some key actors give high priority to increased autonomy of military systems, including weapons systems, in their strategic considerations.

From an ethical point of view, this development is the subject of controversial debate, particularly with regard to fully autonomous armed systems: The debate focuses on the question of to what extent it is ethically acceptable and should be politically permissible to let machines decide autonomously about the life and death of humans in combat. Against this background, in an open letter in 2015, more than 20,000 people – among them numerous AI researchers and well-known personalities such as Stephen Hawking and Elon Musk – called for a ban on offensive autonomous weapons. Human rights organisations (Human Rights Watch, for example, coordinates the campaign »Stop Killer Robots« initiated by a network of national and international NGOs) and the German Federal Government (according to its coalition agreement) are also calling for a ban on such weapons systems.

Objectives and approach

As early as 2011, the TAB presented an inventory and impact assessment of the military use of unmanned systems (TAB working report no. 144 »Status quo and perspectives of the military use of unmanned systems«), which, however, did not focus specifically on autonomous combat missions of such systems. Building on this previous study, three thematic areas are to be examined in the new TA project: technical aspects, ethical issues and international policy issues.

Technical aspects: The objective is to investigate how the technological maturity and the development prospects of autonomous weapons systems have evolved since 2011. On that basis, the assessments made in the TAB report of 2011 are to be updated and supplemented, with a focus on (fully) autonomous combat missions.

Ethical issues: In view of the very different degrees of autonomy and intended purposes of unmanned military systems, the objective is to engage in a differentiated discussion of their ethical implications. Special emphasis is to be placed on connecting the ethical debate to legal and practical questions, e.g. questions of responsibility and liability as well as international humanitarian law and human rights.

International policy issues: The objective is to investigate which security policy implications might result from the potential availability of autonomous weapons systems. For example, there is reason to fear that the decision to use weapons in a conflict would be taken more lightly if autonomous weapons systems could be deployed instead of human soldiers. This might have a destabilising effect and make wars more likely. From an arms control perspective, the question is what options for action Germany has at the international level to promote the intended ban on lethal autonomous weapons.

Project progress

Each of the three thematic areas mentioned above has been examined in an external expert analysis. On this basis, a first draft of the final report was prepared and submitted for review.