Autonomous Weapons
In: Dražan Djukić and Nicolò Pons, eds., The Companion to International Humanitarian Law, Leiden: Brill Nijhoff, 2018
In: T.M.C. Asser Institute for International & European Law, Asser Research 2023-03
In: Max Planck Encyclopedia of Public International Law, Rüdiger Wolfrum, ed., Oxford University Press
In: Marine Corps Gazette: the Marine Corps Association newsletter, Vol. 99, No. 3, p. 43
ISSN: 0025-3170
Contents:
Acknowledgements
Chapter 1: Ethics and the Autonomous Weapons Debate
What This Book Is Not About
Autonomous Weapons and the Ethics of Cyber Warfare
Autonomous Weapons, Ethics, and 'Super-Intelligence'
Autonomous Weapons and the Ethics of Counterterrorism Operations
Autonomous Weapons Technology and Armed Conflict: An Ethical Perspective
The Use of Force: Permission and Restriction
Jus ad bellum and Autonomous Weapons
Jus in bello and Autonomous Weapons
Jus post bellum and Autonomous Weapons
This book is amongst the first academic treatments of the emerging debate on autonomous weapons. Autonomous weapons are capable, once programmed, of searching for and engaging a target without direct intervention by a human operator. Critics of these weapons claim that 'taking the human out of the loop' represents a further step towards the de-humanisation of warfare, while advocates of this type of technology contend that the power of machine autonomy can potentially be harnessed in order to prevent war crimes. This book provides a thorough and critical assessment of these two positions. Written by a political philosopher at the forefront of the autonomous weapons debate, the book clearly assesses the ethical and legal ramifications of autonomous weapons, and presents a novel ethical argument against fully autonomous weapons.
In: Obrana a strategie: Defence & Strategy, Vol. 22, No. 1, pp. 35-54
ISSN: 1802-7199
Since 2013, debates on lethal autonomous weapons systems (LAWS) and their treatment under international law have taken place within the framework of the Convention on Certain Conventional Weapons (CCW), and since 2017 within the Group of Governmental Experts under the CCW. Given these ongoing debates and the importance of the topic, the author summarises the results achieved in the discussions under the CCW: the main steps, the challenges successfully resolved, and the shifts in countries' positions since 2013, as well as the issues that remain open in regulating LAWS under this international convention. Part of the article is dedicated to identifying common features among countries sharing a position on LAWS at CCW conferences, as well as the challenges and possible implications of China's position, which supports a ban on the deployment of LAWS but not on their research and production.
Governments across the globe have been quick to adapt developments in artificial intelligence to military technologies. Prominent among the many changes recently introduced, autonomous weapon systems pose important new questions for our understanding of conflict generally, and coercive diplomacy in particular. These weapons dramatically decrease the cost of employing military force: in human terms on the battlefield, in financial and material terms, and in political terms for leaders who choose to pursue conflict. In this article, we analyze the implications of these new weapons for coercive diplomacy, exploring how they will influence the course of international crises. We argue that drones have different implications for relationships between relatively equal states than they do for unbalanced relationships where one state vastly overpowers the other. In asymmetric relationships, these weapons exaggerate existing power disparities. In these cases, the strong state is able to use autonomous weapons to signal credibly, avoiding traditional and more costly signals such as tripwires. At the same time, the introduction of autonomous weapons puts some important forms of signaling out of reach. In symmetric conflicts where states maintain the ability to inflict heavy damage on each other, autonomous weapons will have a relatively small effect on crisis dynamics. Credible signaling will still require traditional forms of high-cost signals, including those that by design put military and civilian populations at risk.
In: Marine Corps Gazette: the Marine Corps Association newsletter, Vol. 97, No. 12, pp. 79-82
ISSN: 0025-3170
In: Journal of International Humanitarian Legal Studies, Vol. 6, No. 2, pp. 247-283
ISSN: 1878-1527
Given swift technological development, the first truly autonomous weapons systems may be expected to become available soon. Once deployed, these weapons will use artificial intelligence to select and attack targets without further human intervention. Autonomous weapons systems raise the question of whether they could comply with international humanitarian law. The principle of proportionality is sometimes cited as an important obstacle to the use of autonomous weapons systems in accordance with the law. This article assesses whether the rule on proportionality in attacks would preclude the legal use of autonomous weapons. It analyses aspects of the proportionality rule that would militate against the use of autonomous weapons systems, and aspects that would appear to benefit the protection of the civilian population if such weapons systems were used. The article concludes that autonomous weapons are unable to make proportionality assessments on an operational or strategic level on their own, and that humans should not be expected to be completely absent from the battlefield in the near future.
Unlike conventional weapons or remotely operated drones, autonomous weapon systems can independently select and engage targets. As a result, they may take actions that look like war crimes—the sinking of a cruise ship, the destruction of a village, the downing of a passenger jet—without any individual acting intentionally or recklessly. Absent such willful action, no one can be held criminally liable under existing international law. Criminal law aims to prohibit certain actions, and individual criminal liability allows for the evaluation of whether someone is guilty of a moral wrong. Given that a successful ban on autonomous weapon systems is unlikely (and possibly even detrimental), what is needed is a complementary legal regime that holds states accountable for the injurious wrongs that are the side effects of employing these uniquely effective but inherently unpredictable and dangerous weapons. Just as the Industrial Revolution fostered the development of modern tort law, autonomous weapon systems highlight the need for "war torts": serious violations of international humanitarian law that give rise to state responsibility.
In: Journal of International Humanitarian Legal Studies, Vol. 2015, No. 2
In: 105 Cornell L. Rev. Online 233 (2020)
In: Research & Politics: R&P, Vol. 2, No. 4
ISSN: 2053-1680
Autonomous weapons would have the capacity to select and attack targets without direct human input. One important objection to the introduction of such weapons is that they will make it more difficult to identify and hold accountable those responsible for undesirable outcomes such as mission failures and civilian casualties. I hypothesize that individuals can modify their attribution of responsibility in predictable ways to accommodate this new technology. The results of a survey experiment are consistent with this: subjects continue to hold political and military leaders responsible and accountable when autonomous weapons are used, but also attribute responsibility to the designers and programmers of such weapons.