This week, a debate is heating up at the Convention on Certain Conventional Weapons in Geneva. It’s about when and whether we will outsource to machines not just the act of killing, but the act of deciding when to kill. Sounds a little sci-fi, right? Science? Yes. Fiction? Not really.
“Some estimates say that we could have fully autonomous war machines operating within battlefields within 20 to 30 years,” says tech critic Molly Wood. “The technology to have at least semi-autonomous machinery exists now.”
Here’s an example: the MK 15 Phalanx close-in weapon system, a gun mount on aircraft carriers today. The MK 15’s nickname, by the way, is R2-D2. The gun can search, detect, track, and make a kill assessment.
The “kill assessment” function touted in the MK 15 is part of why organizations like Human Rights Watch and Amnesty International are worried. Just last month, almost 300 industry researchers from 37 countries also signed a statement backing a ban on weapons that fire without human decision-making. They say there are big questions about whether we can ever program or trust a machine to make the ultimate call in battle.
“If a robot has a black-and-white order, it says, carry out this mission,” says Wood. “But it may make mistakes, because it won’t make a value judgment if it sees, for example, human shields.”
Another issue is how quickly machines can make decisions. A human might hesitate, verify, or hold back when something feels wrong. You have to program that into a machine. The convention is expected to decide this week whether to take up the issue next year.