Leaders, executives, head researchers, and founders from nearly every major artificial intelligence, robotics, and machine learning company in the world have signed an open letter to the United Nations voicing their urgent concerns about the development of autonomous weapons, often dubbed "killer robots."
The experts warn that the world does not have long to act and that once this Pandora's box is opened, it will be very difficult to close. They state that they want to prevent an arms race in autonomous weapons, which could be used to terrorize civilians and to fight conflicts at a pace beyond human comprehension.
In the open letter, published by the Future of Life Institute, the signatories welcome the United Nations' decision to establish a Group of Governmental Experts (GGE), chaired by Ambassador Amandeep Singh Gill of India. The group was established under the United Nations' Convention on Certain Conventional Weapons, and its purpose is to explore the regulation or potential limitation of the development of lethal autonomous weapon systems around the world.
The signatories include Elon Musk and Mustafa Suleyman of DeepMind. Musk in particular has long been wary of this kind of development; he recently declared that Facebook's Mark Zuckerberg, who doesn't share his fears about AI, had a limited understanding of the technology and how dangerous it could become.
As AI and robots master an ever-growing list of tasks, such as Google's DeepMind teaching itself to walk in virtual space or Boston Dynamics' highly mobile Handle robot, it no longer seems far-fetched or irrational to worry about the capacity for such technologies to be weaponized.
Already this year, ISIS was found to be using commercial drones to drop grenades in Mosul. Let us hope that the fears of these AI experts can be addressed and that the further automation of killing machines can be avoided.
Source: Future of Life Institute | Image via Future of Life Institute