Title: Artificial Intelligence as a Weapon of War
Author: Roe, Elena
Date Accessioned: 2025-08-07
Date Available: 2025-08-07
Date Issued: 2025-06-27
URI: https://hdl.handle.net/10919/137097

Abstract:
This case study examines the rising use of Artificial Intelligence (AI) and Autonomous Weapons Systems (AWS) in modern warfare, with particular focus on the humanitarian and moral implications of AI-driven military technology. Beginning with a discussion of the arms race among global powers to develop AWS, the case analyzes how AI systems, such as drones and facial recognition software, are used to identify and engage targets with minimal human intervention. It traces the increasing reliance on AWS in conflicts such as the Russia-Ukraine war and critically assesses the Israeli military's application of AI in Gaza. The case centers on technologies like "Lavender," which assigns Palestinian individuals a score estimating their likelihood of being linked to Hamas based on opaque patterns in data, and "The Gospel," which designates infrastructure for bombing. The case reveals the mechanisms through which AI can produce bias, misidentification, and civilian casualties, and raises urgent concerns about algorithmic warfare, accountability, and international law. With over 50,000 Palestinians reported killed since October 2023, the case argues that the use of AI in targeting, when it renders the distinction between combatants and civilians opaque, is highly perilous and demands strong regulation. Students are invited to discuss the moral obligations of AI developers, the civil liberties at stake in pattern-of-life surveillance, and the need for global regulation of cyber warfare.

Extent: 7 pages
Format: application/pdf
Language: en
Keywords: AI Warfare; Autonomous Weapons Systems; Ethics & Civilian Harm
Type: Report
Publisher: Virginia Tech
Rights: In Copyright (InC). This Item is protected by copyright and/or related rights. Some uses of this Item may be deemed fair and permitted by law even without permission from the rights holder(s). For other uses you need to obtain permission from the rights holder(s).