By Ginger Matchett
The term “autonomous weapons” has emerged as a buzzword on the security and defense policy chessboard as the future of warfare centers on capabilities enhanced with artificial intelligence, quantum, hypersonic, and robotic technologies. Thomas X. Hammes writes extensively on the evolving character of war and the defense industrial base, and his pieces underline key issues often overlooked in debates over autonomous weapons. In his recent work, “Autonomous weapons are the moral choice,” he argues that because the line between semi-autonomous and autonomous weapons is blurrier than one might think, and because lethal autonomous weapon systems (LAWS) are no new phenomenon, the case for the American defense industrial base developing autonomous weapons outweighs the “morality” argument for limiting their employment.
Hammes is right that all autonomous defense systems will continue to include a human element in their creation: capabilities such as uncrewed aerial vehicles (UAVs) and autonomous drones will never completely remove human oversight, whether through the programming of the technology or the physical manufacturing of the equipment. Critics counter that artificial intelligence-powered drones can independently survey their surroundings, report real-time analysis, and even launch an attack without a human operator. Yet the bottom line is that someone must still program the artificial intelligence that enables these capabilities; advanced technology does not appear from thin air. LAWS require scientists and computer programmers to create the algorithms and sensor suites that identify an object as hostile or threatening, formulate a decision to engage, and guide a weapon to the target. Humans will therefore remain involved in high-tech defense strategy planning and manufacturing, and prioritizing autonomous weapons development could even expand the job market.
A further point worth making is that autonomous weapons are already appearing on the battlefield despite the ongoing morality debate and opposition to their use. Though the United Nations’ 2023 New Agenda for Peace concludes that states should regulate autonomous weapons systems and prohibit LAWS that operate without direct human control or violate international humanitarian law, it is unrealistic to believe that aggressive states will cease, let alone curb, their use of autonomous weapons. In Russia’s invasion of Ukraine, notable cases of LAWS emerged in March 2022, when open-source analysts identified the Russian KUB-BLA loitering munition system, which is supported by artificial intelligence visual identification (AIVI) technology. Though there is no confirmed evidence of the drone operating in its fully autonomous mode of automatically selecting and striking targets, its presence underscores that autonomous weapons systems are being employed, for better or for worse, and that there is no checkmate in sight for halting their operationalization.
Hammes’ article also discusses the ethical reasoning behind autonomous weapons, arguing that their opponents are largely unpersuasive in their logic. Although targeting and launch may lack direct or immediate human deliberation, an attack from a high-tech weapon is not unique in “limiting freedom, reducing the quality of life, and creating suffering.” Human-operated conventional weapons do exactly the same. At the end of the day, autonomous weapons create just as much violence as the age-old systems currently funded and treated as critical to a country’s protection. Modern capabilities may even pose less of a threat than conventional weapons, since their increased precision can reduce civilian casualties and protect soldiers.
However, autonomous weapons are not without their flaws. Even when it is possible to pinpoint the individuals or forces who programmed, built, and launched a system, accountability for the acts committed by high-tech weapons may remain murky. Hammes notes that, within Western military concepts, commanders are responsible for the actions of their forces and for individual operators’ use of weapons. A critical look at the history of US military investigative and judicial cases, however, shows a system that often fails to hold high-level officers accountable for violations and struggles to condemn its own battlefield misconduct. In a Chatham House interview, a NATO naval commander speculated that this stems from US military officers’ hesitancy to criticize the failures and gaps of their own institution.
Hammes’ piece is thoroughly argued and reaches fair conclusions, but it is clearly America-focused. The briefly mentioned yet imperative issue that deserves more attention in any discussion of the future of defense innovation is Europe. Because Europe is the US’s biggest ally, NATO is the world’s most effective military alliance, and Russia’s invasion of Ukraine has revitalized European defense budgets, the conversation about autonomous weapons should be grounded in the transatlantic relationship. The US should not seek to entrench its position as a dominant defense powerhouse while ignoring collaboration with its allies; that path leads to a queen with no rooks, knights, or bishops. Though Europe’s defense industry remains fragmented, the case for autonomous weapons, and for accelerating the manufacture of innovative defense technology, is far stronger when the US and Europe stand side by side. Together, they can provide practical political support for a new, interoperable, and tech-dominated defense posture.
It is clear that advanced technologies are paramount in today’s warfare, and it is highly unlikely that these high-tech defense systems will be sent back to the labs and stockpiles to collect dust on outdated munitions shelves. The US and Europe must take this seriously. If adversaries continue using autonomous weapons, abusing human rights, and perpetuating violence, there must be some “good guys” on the chessboard fielding the same capabilities to level the playing field and prevent further loss of life.