People fear drones. People fear “killer robots.” People fear death by push button. People need to put away their fears and remember that computing power, coupled with automation and rules-based decision-making, has saved many lives and is likely to save many more than any runaway robot will ever take.
That was the general consensus of a group of experts here at the annual meeting of the military’s electronic warfare experts discussing the provocatively titled subject: Can “Skynet” and “the Borg” Solve EW and Cyber Challenges?
The short answer was yes, probably. The long answer was yes, probably, but it’s a long way off. Panelist Harry Wingo, former counsel to Google and to the Senate Commerce Committee, said he wished he were smart enough to come up with Wingo’s Law, which would predict the rate at which robots would save lives.
“There are going to be so many instances of robotics saving lives,” he said, offering the example of robotic defibrillators paired with emergency apps. A passerby who witnessed what looked like a heart attack could open an emergency app and hail a defibrillator, which would respond more quickly and efficiently than an emergency medical technician could.
But on the issue of automated weapons that can kill people, the questions of fear and ethics are clearly still being grappled with at the Pentagon and in society at large. For example, Friedman pointed to the Flash Crash of May 6, 2010, when the market fell almost 1,000 points in a matter of minutes. “We have absolutely no idea what happened,” he noted of the automated trading disaster. That, of course, leaves commanders and policymakers wary of automated actions.
David Slayton, fellow at the conservative Hoover Institution, citing conversations with senior officials, said “there is a certain degree of discomfort with determining where you are going to achieve control over that kill chain.” That raises the question of whether we then “allow the next step to an automated system.”
Automated systems, of course, don’t get angry and are unlikely to order a retributive strike or to lose control and keep shooting after the target is killed. “Machines have the advantage over us in that they are a bit cooler and dispassionate, because they are machines,” Wingo said.
The issue is greatly complicated because there are clearly different types of attacks in the electronic world, which is what this panel was most closely focused on. Most attacks in the electronic world would be considered espionage or criminal activity. The financial world is the most advanced in launching automated electronic actions, as everyone who has ever read about high-speed trading and hedge funds knows. Problems with those systems have wreaked havoc whose causes are still not completely understood, said panelist Allan Friedman, an expert at the Brookings Institution.
Extend automated capabilities to electronic warfare and cyber weapons, and you raise few questions distinct from those any military must grapple with in deciding when and where to kill an enemy or to destroy its weapons.
In an interesting piece, the Air Force’s top lawyer offered his opinion on “killer robots” last year in an official blog: link.
I am hoping and praying this is meant as an ironic twist just for Halloween.
Right? Right???? Right??????!?!?!?!?!