The world stands at the threshold of a technological breakthrough in both civil and military development. While some states pursue weapons “modernization” programs to carry their arms technology into the next century, others insist on a path of balance and stability while maintaining a measured pace of strategic defense buildup. Meanwhile, recent military conflicts have demonstrated autonomous weapons in action, such as the use of drones in the Armenia-Azerbaijan and Russia-Ukraine wars. These developments in weapons technology call for an informed debate on the risks, concerns, and implications for global and regional stability. The induction of Lethal Autonomous Weapon Systems (LAWS) will profoundly affect military planning and global strategic stability.
LAWS are a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon to engage and destroy it, without manual human control of the system. The cyber domain is at the forefront of this revolution, with advances in military technologies such as facial recognition and computer vision, autonomous navigation in crowded environments, and cooperative autonomy, or swarming. The merits and demerits of LAWS must be weighed when assessing the stakes involved in the continued development or deployment of these weapons, with emphasis on the technical, military, legal, and ethical issues raised by the weaponization of increasingly autonomous technologies. Advanced disruptive technologies such as directed energy weapons (DEWs), which belong to the domain of future technologies, coincide with the fourth industrial revolution now underway.
The vital implications of LAWS for the conventional and strategic domains have a multi-layered impact on global and regional strategic stability. Three core aspects underscore the technicalities and risks associated with LAWS.
The first aspect comprises two technical viewpoints. The first holds that these in-built systems and their technical efficacy offer precise targeting and reduce the human factor on the battlefield. The other view, more realistically, sees the risks of unintentional escalation, miscalculation, misperception, and global unrest. Moreover, loosely controlled weapons carry the danger of easy access by non-state actors for malicious ends.
The second aspect is the threat posed by LAWS to the global and regional strategic stability calculus. Hypothetically, if a military crisis involves autonomous and AI elements comprising high-precision tactical and hypersonic weapons with novel warheads, military engagement will change dramatically and intensify in the nuclear sphere. Similar dynamics apply in space, where unmanned spacecraft, low-orbit surveillance, and satellite communication systems are in use, while cyber weaponry and automated hacking systems are also becoming more common. This is a concern for both nuclear and non-nuclear states, as such systems are likely to heighten vulnerabilities in controlling escalation-prone crisis situations.
The third aspect is the high-impact risk arising from the dramatically reduced time allotted for strategic decision-making within military Command, Control, and Communications (C3) and Intelligence, Surveillance, and Reconnaissance (ISR) systems. The main disadvantage of human oversight, compared with a computer, is that the human intellect takes longer to analyze a situation and reach the correct conclusion; yet that same deliberation becomes an advantage when it comes to averting devastating catastrophes across the world.
In this regard, the Pentagon’s Maven, COMPASS, and Diamond Shield are just a few of the many military programs that aim to use supercomputers for data analysis and scenario development for political and military leadership. These entail the risk of time-bound decision-making with limited human control over strategic decisions, which rest on machine learning and mathematical algorithms rather than human judgment. The development and employment of LAWS can therefore turn prevailing military balances between countries into imbalances, which not only leaves states vulnerable in responding to a military crisis but also fuels an arms race. In the Eastern European security framework, for example, emerging technologies appear to pose an additional challenge to neutralizing conflict dynamics; the same applies to the Korean Peninsula and the issues between North and South Korea.
In the present state of play, LAWS are in their initial phase of development, with significant gaps remaining around definitions, risks, concerns, aspirations, and objectives. They carry possible effects on strategic stability and nuclear risk both in regional contexts and between great powers.
The High Contracting Parties (HCPs) — states that have signed or ratified the Convention on Certain Conventional Weapons (CCW) — are deeply divided on the question of regulating versus prohibiting LAWS. Four groups have emerged in the CCW as a result of this divide, and there have been discussions at the CCW on an additional protocol on LAWS.
The first group consists of states that consider LAWS a new tool for stability and for the promotion of responsible state behavior. They believe these technologies offer speed, accuracy, and flawless coordination, serving as a powerful “force multiplier” and helping to reduce collateral harm.
The second group advocates a pre-emptive ban, arguing that LAWS cannot grasp the shifting realities of actual warfare, such as crisis escalation, decision-making, information processing, precautions, proportionality, the chain of command, and target identification, selection, and engagement.
The third group of states emphasizes building consensus and common ground on key concepts and definitions before deciding on regulation or a ban.
The fourth and largest group of states is the Non-Aligned Movement (NAM), which calls for a legally binding instrument stipulating prohibitions and regulations on such weapons to ensure meaningful human control over the critical functions of a weapon system.
Several regulatory challenges stand in the way of a framework for LAWS. Global regulatory initiatives surrounding LAWS, now nearly a decade old, are at an impasse: annual debates on how to regulate these weapons continue, yet a regulatory framework has failed to materialize.
Lethal autonomous robots (LARs) were the subject of a ground-breaking report in 2013, and a few years later the conversation about LAWS was taken up at the Convention on Certain Conventional Weapons and formalized through a Group of Governmental Experts (GGE). The GGE on LAWS has met annually since 2017 but, by the end of its mandate in 2021, had been unable to agree on a normative and operational framework. Although the Sixth Review Conference of the CCW extended the mandate in 2021 to continue discussions, there were no noteworthy breakthroughs in 2022, and meetings will recommence in 2023. LAWS still await formal definitional clarity and the establishment of a regulatory process. Alongside the Conference on Disarmament (CD), the GGE on LAWS remains the multilateral venue where the issue is being debated.
The consensus-based approach of the GGE, which requires that every HCP in the Group concur on each aspect of the GGE’s processes, is one structural gap that may prolong the impasse.
Moreover, each HCP at the GGE has its own national agenda and interests regarding LAWS. For states such as Russia, South Korea, Israel, and the United States, which are presently researching, producing, testing, deploying, and/or trading these weapons, the absence of any rules may provide a favorable environment. On the other side are states that may wish to see a strong regulatory framework built quickly, whether because they lack the resources or national interest to develop such weapons, because they are at risk from an adversary’s use of LAWS, or both. The few guiding principles agreed so far are predicated on the lowest common denominator so as to be acceptable to a number of opposing HCPs. As a result, the GGE largely remains an exclusive group in which LAWS are continually discussed at the policy level, around fundamental issues such as definitions, concepts of autonomy, meaningful human control, and technical and ethical concerns.
In conclusion, arms control regimes and mechanisms are tools and processes that serve the objectives of peace, risk reduction, easing of tensions, crisis management, and conflict resolution. Even though the discussion of LAWS originated in the UN Human Rights Council (UNHRC), and some members have insisted on returning it there, many governments seek to strike a regulatory balance between military necessity and humanitarian considerations at a security-oriented body such as the Group of Governmental Experts.
Globally, assessing the strategic significance of LAWS underlines several aspects with direct and indirect impact on the security and strategic discourse. Strategic experts, regardless of their origin, agree that LAWS are a double-edged sword. On the one hand, the Artificial Intelligence underpinning LAWS could enhance nuclear command and control, early warning, ISR, and the physical security of nuclear capabilities, among other areas, and could improve states’ sense of security. On the other hand, the same advances could cast doubt on the survivability of states’ second-strike capabilities, stimulating more aggressive nuclear postures and increasing nuclear risk.
For the foreseeable future, LAWS will form part of military capabilities, and the global and regional strategic security calculi will therefore grow more complex. Among states that share borders, compete, or confront one another with aggressive postures, bilateral and multilateral dynamics may deteriorate. Defense and security experts and forums are thus required to examine these emerging issues in order to maintain regional and global strategic stability.
This article was originally published in another form at https://cscr.pk/explore/themes/defense-security/lethal-autonomous-weapons-conundrum-and-the-state-of-play/
Ms. Huma Rehman
Ms. Huma Rehman is currently working as an Associate Director at the Center for International Strategic Studies (CISS) Islamabad.