The Game of Drones: Drones, Laws and Ethics
Here are some reflections drawn from a work by Colonel (PhD) Gjert Lage Dyndal, Lieutenant Colonel (PhD) Tor Arne Berntsen and Assistant Professor Sigrid Redse-Johansen. Although their work focuses on the development of military drones, we consider these reflections and debates necessary, since such technologies are destined to occupy a central place in our societies in the near future.
The delegation of life-or-death decisions to non-human agents is a recurring concern of those who oppose autonomous weapon systems, since allowing a machine to "decide" to kill a human being undermines the very value of human life. From this perspective, human life is of such significant value that it is inappropriate for a machine to proceed with its termination; in other words, the development and use of autonomous drones is intrinsically immoral.
Even if these autonomous systems were capable of discriminating between targets and non-targets, there is reasonable doubt as to whether they could assess whether an attack is proportional or whether it would cause unnecessary suffering. However, beyond the uncertainty about the technological capabilities autonomous drones will possess in the future to make such distinctions, it can also be argued that if these weapon systems cannot operate within certain parameters, they are unlikely to be deployed, at least in operational environments where the risk of causing excessive harm to civilians is high.
From the opposite perspective, it is argued that the use of autonomous drones is not only morally acceptable but even morally preferable to the use of human soldiers. Autonomous drones could process more incoming sensory information than human soldiers and could therefore make more informed decisions. And since machine judgments would not be clouded by emotions like fear and rage, they could reduce the risk of war crimes that might otherwise be committed by human soldiers.
The use of autonomous drones could also improve many aspects of humanitarian missions, benefiting the civilians being assisted and reducing the risks to soldiers. Using autonomous systems to survey hazardous areas or perform high-risk tasks, such as defusing bombs or clearing a house, would eliminate the risk of human soldiers being injured or killed. On the other hand, limiting the risk to soldiers by removing them from the battlefield altogether could make war too "easy," reducing it to a low-cost technological enterprise that no longer requires any public or moral commitment.
From the standpoint of ethics in the debates on autonomous drones, compliance with the law is fundamental to any military, civil or commercial policy, and law and ethics often overlap. Yet important issues are at stake, particularly in the case of emerging military technologies, that are not adequately addressed in current legislation. Ethical reflection, in other words, can complement the law by providing normative guidance in these "gray areas." It may also be important to identify when ethical obligations should exceed legal duties in the interest of good political governance.
It is difficult to predict the future, but it is clear that autonomous drones raise important legal and ethical questions about liability for unintentional damage. These technologies create moral accountability gaps: when autonomous military systems are deployed, it is less clear how responsibility is to be apportioned, and these potential liability gaps need to be adequately addressed through technical solutions and legal regulation.
The debate is open and we welcome all reflections. At @mybeebetv, together with stephan metral 🐝 Innovative Brand Ambassador, we look forward to exchanging opinions with you.