Marc Goodman of the Future Crimes Institute and Singularity University shares his thinking on the promise — and threat — of drones.
For most people, drones are flying robots irrevocably associated with killing, warfare or even war crimes in the skies over Iraq and Afghanistan. Yet what began as a purely military technology is rapidly migrating into our everyday lives.
The Department of Homeland Security (DHS) and its Customs and Border Protection agency have been flying drone missions along the southern US border to enforce everything from fish and game violations to illegal immigration and narcotics smuggling. The FAA has authorized state and local police agencies in Texas, Miami, Seattle — even Mississippi State University — to fly drones in federal airspace. Moreover, police forces in South Africa, Kenya, Italy, Australia and elsewhere are now using drones for a variety of civilian law enforcement purposes.
What’s behind this rapid adoption of drone technologies? Mostly, Moore’s Law. Drones such as Northrop Grumman’s RQ-4 Global Hawk were once the sole purview of elite military forces. Each one cost billions of dollars to develop and hundreds of millions to buy. Today, thanks to incredible advances in computing, many of the Global Hawk’s capabilities, though certainly not all, can be replicated for thousands, if not hundreds, of dollars.
What does this mean? It means these devices will become pervasive in our lives and that they are here to stay.
Innovation cannot be stopped, and that means the drones are coming. It also means we need some safe and sane guideposts for these discussions. Robots hold the potential to bring untold good to the world. But it strikes me that our knowledge of robotics is developing far more rapidly than our maturity or understanding of the ethical and moral implications of these technologies.
We have to understand and appreciate that criminals and terrorists are often early adopters of technology, and the latest global trends in robotics have not been lost on them. In other words, sadly, criminals and terrorists can fly drones too. In 2011, the FBI arrested an al Qaeda affiliate who planned to launch three GPS-guided miniature jet aircraft laden with C4 explosives against the US Capitol and the Pentagon. Hobbyists have successfully mounted firearms on home-built drones, including a remote-controlled helicopter fitted with a .45 caliber handgun. What will our future look like when the average “active shooter” madman begins to exploit these technologies?
Criminals have also been using drones to circumvent our current security paradigms. Prisons use tall, often electrified, fences to isolate criminals for public safety. In Brazil, organized crime gangs used drones to fly cell phones and other contraband right over those fences. Modern security design is failing to keep pace with the flying robots.
Of course, it’s not just criminals who are taking on government institutions with drones. In September 2013, members of Germany’s Pirate Party flew a drone directly toward Chancellor Angela Merkel as she delivered a speech. The stunt was a prank, and the drone landed harmlessly a few feet from her. But what if it had been armed with an explosive device?
Many, including Microsoft co-founder Bill Gates and MIT’s Cynthia Breazeal, have noted the coming ubiquity of personal robots. These devices, whether air-, sea- or ground-based, may have untold benefits for society. Applications include delivering medicines to rural Africa via Matternet [see today’s talk by the company’s Andreas Raptopoulos], responding to disasters in Haiti, helping farmers with precision agriculture, battling wildfires in Yosemite National Park, or even delivering Domino’s pizzas… Drones and robots will undoubtedly form an integral part of our future world. The question is: how best to use these devices?
Frankly, I am concerned about their misuse, both by criminals and by governments. Organizations such as the ACLU and the Electronic Frontier Foundation rightfully pose significant questions about the privacy implications of drones. And there are other reasons to be concerned.
All drones today are hackable. They are nothing more than flying computers that will liberate cybercrime from behind today’s computer screens and launch it into our everyday physical world. And they’ll do so in a way for which most of us are entirely unprepared.
There have already been numerous cases of drones being hacked. Just one example: in 2012, students at the University of Texas at Austin, who for months had warned DHS officials of their drones’ vulnerability, staged a live demonstration. To the disbelief of the assembled government officials, the students commandeered a Homeland Security drone before their very eyes.
As we march into a world of ubiquitous robotics, what are the public policy and moral implications of launching drones (with or without weapons systems)? What questions should we be asking now, given that these devices are inherently insecure and subject to hacking? How do we weigh the risks versus benefits of these systems? And where, oh where, are the robot-ethicists?
The FAA alone cannot be the final arbiter of these questions. Though understandably cautious about an influx of unmanned flying robots roaming the national airspace, the agency is under intense pressure from drone companies looking to make their fortunes in the new world of unmanned robotics. With the market estimated to be worth nearly $90 billion within the next decade, that pressure is beginning to tell: the FAA has already started to loosen its restrictions. A series of new rules will soon go into effect as part of the FAA Modernization and Reform Act of 2012.
It is troubling that some officials within the FAA believe that privacy issues regarding drones are “outside its mission.” So who does oversee drone-related privacy infringements? It’s unclear. A 2012 Government Accountability Office (GAO) audit found that “no federal agency has been statutorily designated with specific responsibility to regulate privacy matters relating to Unmanned Aerial Systems within the entire federal government.”
Who, then, should be responsible for the obvious legal, moral, ethical, privacy and public safety issues associated with drones? In short, we all should. I am not, as some would have it, anti-robot or anti-drone, and I clearly see the manifold benefits these devices may bring. I am here to urge us all to step up and take responsibility for what’s coming — and not to assume that someone else is looking out for our rights and security in this new, uncharted territory. The revolution in robotics will be every bit as significant as the computing revolution that preceded it. What that future looks like, and whether it is to humanity’s ultimate benefit, is up to each and every one of us.
Marc Goodman is the founder of the Future Crimes Institute and chair for Policy, Law and Ethics at Singularity University. At TEDGlobal in Edinburgh in 2012, Goodman shared a sobering look at the dark side of technology in his talk, “A vision of crimes in the future.”