Picture an Air Force officer, sitting at a control panel at a Nevada military base, piloting a well-armed drone (a “remotely piloted aircraft” or RPA) flying thousands of miles away over Afghanistan, looking for Taliban targets on his strike list. Some pilots are situated closer to their targets, but wherever located, the pilot has life-or-death decisions to make, may be coordinating a potential strike with forces on the ground far away, and has probably been working very long hours. To put it mildly, morale in this field is low, while stress is huge.
Air Force Secretary Deborah Lee James tells us that these pilots are under “significant stress from what is an unrelenting pace of operations… Now, these pilots, just to give you a little color on this, fly six days in a row,” she said. “They are working 13, 14 hour days on average. And to give you a contrast, an average pilot in one of our manned Air Force aircraft flies between 200 and 300 hours per year.” Huffington Post, 1/16/15. And with drone operations moving away from the C.I.A. to the military, finding enough pilots is beyond challenging.
Even as the Air Force has sweetened pay levels for these volunteers, being a drone pilot carries a lot of emotional stress – antiseptic, long-distance killing is disquieting to say the least… and recruiting isn’t easy. Most of these officers never thought that they would ever be drone pilots. They aspired to something more dramatic, taking personal risks and rising to the challenge. Instead, they now can kill with absolutely no risk to themselves.
Sleep deprivation, insufficient training, and the strain of operating in an entirely different time zone afflict these soldiers, who are often given the most demanding and deadly missions in the entire US military. Pilots and their “sensor operators,” who man the cameras, generate data and track targets, sit side-by-side. Whatever else is said and done, RPAs seem destined slowly to replace manned aircraft, generating ethical issues by the planeload and challenging those who want waging war to remain difficult and costly as a deterrent. War should be our last choice, say many.
The moral issues of remote killing are legion. With a human controller, the questions are large. Without that direct human supervision – a death robot, if you will, with targeting preprogramming or artificial intelligence – the moral issues are even larger. Oh sure, we’ve had “intelligent” cruise missiles, programmed with a priority of targets (images of tanks, missile batteries, etc.), taking out the highest-profile target available… automatically. But war without individual consequences does become a bit too easy.
There may even be a bigger issue looming back home as well. As police departments have received massive transfers of governmental military equipment – from armored personnel carriers and assault vehicles to other gear and tactics designed for the battlefield – with more than a few developing their own drone-strike capacity, we are watching excessive force being deployed in some fairly minimally threatening situations (like breaking down a drug dealer’s door with military precision). The line between “us” and “them,” evidenced in recent racially-charged cop-on-black or black-on-cop shootings, is increasingly stark… increasing the distance between the police and the community. It’s a trend that needs to be reversed… and quickly.
The willingness of police to deploy death-technology in stand-off situations has drawn a critical eye, particularly from observers in nations where the death penalty has been abolished. The Dallas sniper attack didn’t end well for the murderous perpetrator, Micah X. Johnson. He was killed by a blast from an explosive, delivered to where he had sequestered himself by a police-guided robot (an Andros F6B like the one pictured above). To some, the police became judge, jury and executioner, dispensing lethal justice without the benefit of a trial.
The July 12th BBC.com puts this all in perspective, from a British point of view: “The use of a robot to deliver an explosive device and kill the Dallas shooting suspect has intensified the debate over a future of ‘killer robots.’
“While robots and unmanned systems have been used by the military before, this is the first time the police within the US have used such a technique with lethal intent… ‘Other options would have exposed our officers to greater danger,’ the Dallas police chief said.
“Robots are spreading fast. What might that mean? … Remote killing is not new in warfare. Technology has always been driven by military application, including allowing killing to be carried out at distance - prior examples might be the introduction of the longbow by the English at Crecy in 1346, then later the Nazi V1 and V2 rockets…
“South Korea pioneered using robots to guard the demilitarised zone with North Korea. These are equipped with heat and motion detectors as well as weapons… The advantage, proponents say, is that the robots do not get tired or fall asleep, unlike human sentries… When the Korean robot senses a potential threat, it notifies a command centre… Crucially though, it still requires a decision by a human to fire.
“And this gets back to the crucial point about the Dallas robot. It was still under human control… The real challenge for the future is not so much the remote-controlled nature of weapons but automation - two concepts often wrongly conflated… Truly autonomous robotic systems would involve no person taking the decision to shoot a weapon or detonate an explosive.
“The next step for the Korean robots may be to teach them to tell friend from foe and then fire themselves… Futurologists imagine swarms of target-seeking nano-bots being unleashed pre-programmed with laws of warfare and rules of engagement…
“And there are still risks to remote-controlled as well as fully automated systems… The military uses encrypted channels to control its ordnance disposal robots, but - as any hacker will tell you - there is almost always a flaw somewhere that a determined opponent can find and exploit.
“We have already seen cars being taken control of remotely while people are driving them, and the nightmare of the future might be someone taking control of a robot and sending a weapon in the wrong direction… The military is at the cutting edge of developing robotics, but domestic policing is also a different context in which greater separation from the community being policed risks compounding problems.
“The balance between risks and benefits of robots, remote control and automation remain unclear… But Dallas suggests that the future may be creeping up on us faster than we can debate it.” Warfare isn’t pretty, no matter where it is waged. But it gets worse when the warfare is waged at home… by local police. Perhaps Micah Johnson was doomed to die anyway, in a shootout by refusing to surrender… but was a police-imposed death sentence consistent with our own values, even our constitutional “due process” requirements? What are your thoughts?
I’m Peter Dekom, and it is simply too easy to let technology accelerate before we’ve designed the ethical and legal restraints to define the relevant limits and conditions, particularly when lives are at stake.