1. Autonomins baksida : En kvantitativ studie om blivande officerares syn på hur graden av autonomi påverkar den etiska försvarbarheten i en attack (The Downside of Autonomy: A quantitative study of prospective officers' views on how the degree of autonomy affects the ethical defensibility of an attack). Axelsson, Marcus, January 2023.
The development and application of artificial intelligence (AI) for military purposes are increasing rapidly in many parts of the world. Military powers are pursuing programs aimed at the advantages that AI can generate, while ethical questions arise concerning autonomous military systems. This study aims to clarify how future Swedish officers with different backgrounds within the profession relate to the ethical issues that accompany the use of autonomous weapon systems. The respondents are presented with two fictitious scenarios, based on the principles of distinction and proportionality, describing ethically problematic attacks that affect civilians. In each scenario, respondents are asked to take a stance on attacks carried out with different degrees of autonomy. The results show that future officers consider the ethical defensibility of an attack to decrease as the degree of autonomy in the weapon system increases.

2. Autonoma vapensystem : Argumentationsanalys av den deontologiska argumentationen (Autonomous Weapon Systems: An argumentation analysis of the deontological arguments). Olausson, Per, January 2022.
The ethical implications of autonomous weapon systems are a highly debated topic. While research and development of autonomous weapon systems continue, non-governmental organizations seek to ban the technology, and ethicists give conflicting answers as to what is right and what is wrong. Arguments opposing the use of autonomous weapon systems seem to dominate the debate, particularly when deontological arguments against the technology are weighed against those in favor of it. The purpose of this study is to evaluate deontological arguments opposing the use of autonomous weapon systems using argument analysis, in order to assess the deontological case against the technology. The findings are that, although influential deontological arguments opposing autonomous weapon systems outnumber supporting ones, the deontological case against autonomous weapon systems is weak in both tenability and relevance. The main tenability concerns are the application of theory in premises and conceptual incoherence; the main relevance concern is variation in how autonomous weapon systems are defined. These weaknesses show that the analysed deontological arguments opposing the use of autonomous weapon systems should not alone dictate the direction of the ethical debate.

3. Robotar och krigets lagar : en analys av autonoma vapensystems kompatibilitet med den internationella humanitära rätten / Robots and the Laws of War: An analysis of the compatibility of autonomous weapons with international humanitarian law. Einarsson, Gustav, January 2024.
No description available.

4. Diskursanalys av autonoma vapensystem : Med Sverige i fokus / Discourse Analysis of Autonomous Weapons Systems: With Sweden in focus. Andersson, Ellen, January 2022.
Military developments suggest that autonomous weapons systems will be the future of warfare. It is therefore important to understand how the concept is defined and how people talk about it. This paper analyzes how important actors in Sweden talk about autonomous weapon systems: how the concept is constructed, and what differences and similarities there are in how it is expressed. The central question is whether any discourse has reached hegemonic status. The expressions of actors such as political parties and research bodies including FOI, SIPRI and the Swedish Peace and Arbitration Society are investigated. Social constructivism and discourse analysis provide the theoretical tools for the analysis. The discourses are organized into operational and financial, human involvement, legal, and political discourses. The analysis shows how important actors try to give meaning to the phenomenon, for example through economics, human involvement and fear, which shape the different discourses. The conclusions indicate that the legal discourse has reached hegemonic status, because all actors connect autonomous weapons to legal frameworks.

5. Potentiella ledarskapsutmaningar ur ett moraliskt stressperspektiv vid implementering av autonoma vapensystem : Krav och påverkan / Potential Leadership Challenges from a Moral Stress Perspective when Implementing Autonomous Weapon Systems: Demands and impacts. Malmborg, Karolina, January 2019.
The purpose of this study was to gain a greater understanding of the potential challenges, from a moral stress perspective, that Swedish military leaders may face when implementing autonomous weapon systems. Two questions were asked: what demands may arise, and how can leaders be affected? The study was conducted as a literature study; data from nine peer-reviewed articles and a research report from the Swedish Defence Research Agency (FOI) were analyzed thematically. The results suggest that lack of control, lack of trust, and the difficulty of holding an autonomous weapon system responsible create moral leadership challenges. Without control over the autonomous weapon system, the consequences of its actions risk going against the leader's morals, which creates problems for how leadership can and should be conducted. The study also suggests that autonomous weapon systems can have a moral impact on leaders, since such systems risk increasing distancing and contributing to increased violence. Given the moral leadership challenges and the moral impact made visible in this study, there appears to be a considerable risk of moral stress, and even moral injury, if autonomous weapon systems are used for actions that go against the leader's morality.

6. Artificiell intelligens och krigets lagar : Kan skyddet i internationell humanitärrätt garanteras? (Artificial Intelligence and the Laws of War: Can the protection in international humanitarian law be guaranteed?). Öholm, Emma, January 2023.
Artificial intelligence (AI) is one of the fastest-developing technologies globally. AI has recently entered warfare and has thus taken a place in international law, today chiefly through machine learning and autonomous weapon systems. Autonomous weapons are expected to play a decisive role in future warfare and will therefore have a major impact on both civilians and combatants. This motivates an examination of the role of artificial intelligence, machine learning and autonomous weapon systems in international law, specifically international humanitarian law (IHL). The purpose and main research question of the thesis is to examine how the use of AI, machine learning and autonomous weapon systems is regulated within international law, and whether those regulations can sufficiently ensure the protection guaranteed within IHL or whether additional regulation is needed. The question is answered by examining the relevant rules of IHL, compliance with the protection stated in the principles of distinction, proportionality and precautions in attack, and the consequences for civilians and combatants. The conclusions are that the rules of IHL are both applicable and, in theory, sufficient to regulate autonomous weapon systems. However, the weapon itself must be capable of following IHL, and to guarantee this, additional regulation of the use of autonomous weapons is needed. The use of autonomous weapon systems does not necessarily violate the principles of distinction, proportionality and precautions in attack; on the contrary, it could ensure that the principles are respected even further. This, however, depends on the actual capabilities of autonomous weapon systems and whether they can make the complex judgments each principle requires. It remains important to ensure that the element of human control is never completely lost.
The issue that keeps returning is the potential loss of human control. Human control must be guaranteed at all times, so that the final decision always remains with a human. If humanity in warfare is lost, the consequences and risks for civilians will increase: not only a possible increase in the use of violence, but also an increase in indiscriminate attacks. The rules of IHL aim to protect civilians as well as combatants, and the use of this new weapon will make armed situations harder for combatants to navigate. This will increase the suffering of civilians, but also risks overriding the protection of combatants that IHL ensures.