“On the surface, who could disagree with quashing the idea of supposed killer robots?” writes Slashdot reader Lasrick. “Dr. Larry Lewis, who spearheaded the first data-based approach to protecting civilians in conflict, wants us to look a bit closer.” From the Bulletin of the Atomic Scientists:
The proponents of a UN ban are in some respects raising a false alarm. I should know. As a senior advisor for the State Department on civilian protection in the Obama administration, I was a member of the US delegation in the UN deliberations on lethal autonomous weapons systems… Country representatives have met every year since 2014 to discuss the future possibility of autonomous systems that could use lethal force. And talk of killer robots aside, several nations have mentioned their interest in using artificial intelligence in weapons to better protect civilians. A so-called smart weapon — say a ground-launched, sensor-fused munition — could more precisely and efficiently target enemy fighters and deactivate itself if it does not detect the intended target, thereby reducing the risks inherent in more intensive attacks like a traditional air bombardment. I’ve worked for over a decade to help reduce civilian casualties in conflict, an effort sorely needed given that most of those killed in war are civilians. I’ve looked, in great detail, at the possibility that automation in weapons systems could in fact protect civilians. Analyzing over 1,000 real-world incidents in which civilians were killed, I found that humans make mistakes (no surprise there) and that there are specific ways AI could be used to help avoid them. There were two general kinds of mistakes: either military personnel missed indicators that civilians were present, or civilians were mistaken for combatants and attacked in that belief. Based on these patterns of harm from real-world incidents, artificial intelligence could be used to help avert such mistakes… Artificial intelligence may make weapons systems, and the future of war, less risky for civilians than they are today. It is time to talk about that possibility.
Read more of this story at Slashdot.