Military AI Drones Could Take Over Future Warfare
In March 2020, as civil war raged below, a fleet of quadcopter drones bore down on a Libyan National Army truck convoy. The kamikaze drones, designed to detonate their explosive payloads against enemy targets, hunted down and destroyed several trucks—trucks driven by human beings. Chillingly, the drones conducted the attack entirely on their own—no humans gave the order to attack.
"At some point," Samuel Bendett, Adjunct Senior Fellow at the Center for New American Security, tells Popular Mechanics , "military autonomy will become cost effective enough to be fielded in large numbers. At that point, it may be very difficult for humans to attempt to defend against such robotic systems attacking across multiple domains—land, air, sea, cyber, and information—prompting the defenders to utilize their own autonomy to make sense of the battlefield and make the right decision to use the right systems. Humans will still be essential, but some key battlefield functions may have to pass to non-human agents."
Advances in weaponry have disrupted warfare time and time again, granting huge advantages to the countries that adopted them first. Artificial intelligence is now rapidly entering not only the civilian sector but the military one as well. Ships, aircraft, and ground vehicles can operate on their own: robotic drones like the Navy's MQ-4C Triton spy on adversary forces, robotic ground vehicles carry equipment for soldiers, and robotic helicopters resupply friendly forces across great distances.
There is one limit, however, on what autonomous military drones may do: they must wait for explicit permission from a human controller before opening fire. The consensus among today's armed forces is that a human must remain in the decision-making loop. A human must review the available imagery, radar returns, and any other evidence that a target is legitimate and hostile, and order the drone to stand down if it is not. That human may be the only thing standing between blurry sensors, buggy software, faulty AI logic, or other unforeseen factors, and tragedy.
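To make the idea of a human in the loop concrete, here is a minimal sketch, in Python, of how such an authorization gate could work in principle: the software detects and tracks targets on its own, but the fire decision blocks until a person reviews the evidence. Every name here (Target, request_human_authorization, and so on) is invented for illustration; this does not depict any real weapon system's software.

```python
# Hypothetical sketch of a human-in-the-loop engagement gate.
# All names and data are invented for illustration only.

from dataclasses import dataclass
from enum import Enum, auto


class Decision(Enum):
    ENGAGE = auto()
    STAND_DOWN = auto()


@dataclass
class Target:
    track_id: str
    sensor_confidence: float      # 0.0-1.0, from imagery/radar fusion (hypothetical)
    hostile_indicators: list[str]


def request_human_authorization(target: Target) -> Decision:
    """Present the evidence to a human operator and block until they decide.

    In a real system this would be a link to a ground-station operator;
    here it is simulated with console input.
    """
    print(f"Track {target.track_id}: confidence={target.sensor_confidence:.2f}")
    print(f"Indicators: {', '.join(target.hostile_indicators) or 'none'}")
    answer = input("Authorize engagement? [y/N] ").strip().lower()
    return Decision.ENGAGE if answer == "y" else Decision.STAND_DOWN


def engagement_loop(candidates: list[Target]) -> None:
    for target in candidates:
        # The autonomy may detect, classify, and track on its own,
        # but it never fires without an explicit human decision.
        decision = request_human_authorization(target)
        if decision is Decision.ENGAGE:
            print(f"Engaging {target.track_id} (human-authorized).")
        else:
            print(f"Standing down on {target.track_id}.")


if __name__ == "__main__":
    engagement_loop([
        Target("T-001", 0.87, ["convoy movement", "weapon signature"]),
        Target("T-002", 0.42, []),
    ])
```

The key design point is that the blocking call sits between tracking and firing: no confidence score, however high, lets the software skip it.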