Hughes, Joshua (2019) The Law of Armed Conflict Issues Created by Programming Automatic Target Recognition Systems Using Deep Learning Methods. In: Yearbook of International Humanitarian Law. T.M.C. Asser Press, The Hague, pp. 99-135. ISBN 9789462653429
Abstract
Deep learning is a method of machine learning which has advanced several headline-grabbing technologies, from self-driving cars to systems recognising mental health issues in medical data. Due to these successes, its capabilities in image and target recognition are currently being researched for use in armed conflicts. However, this programming method contains inherent limitations, including the inability of the resultant algorithms to comprehend context and the near impossibility for humans to understand the decision-making processes of those algorithms. This can create the appearance that the algorithms are functioning as intended even when they are not. This chapter examines these problems, amongst others, with regard to the potential use of deep learning to programme automatic target recognition systems, which may be used in an autonomous weapon system during an armed conflict. It evaluates how the limitations of deep learning affect the ability of these systems to perform target recognition in compliance with the law of armed conflict. Ultimately, this chapter finds that whilst there are some very narrow circumstances in which these algorithms could be used in compliance with targeting rules, there are significant risks of unlawful targets being selected. Further, these algorithms impair the exercise of legal duties by autonomous weapon system operators, commanders, and weapons reviewers. As such, this chapter concludes that deep learning-generated algorithms should not be used for target recognition by fully autonomous weapon systems in armed conflicts unless they can be made in such a way as to understand the context of targeting decisions and to be explainable.