As part of the international Campaign to Stop Killer Robots, NPA calls for new international law to appropriately regulate autonomous weapons systems.
We fear a dehumanized future where people are reduced to data points, jeopardizing human rights and crossing moral and ethical boundaries.
Autonomy in weapons systems is a profoundly human problem. Killer robots change the relationship between people and technology by handing over life and death decision-making to machines. They challenge human control over the use of force, and risk creating legal and accountability gaps.
But technologies are designed and created by people. We have a responsibility to establish boundaries between what is acceptable and what is unacceptable. We have the capacity to do this, to protect our humanity and ensure that the society we live in, and continue to build, is one in which human life is valued – not quantified.
The development and use of autonomous weapons, where the human is taken ‘out-of-the-loop’ with respect to targeting and attack decisions on the battlefield, would represent a dramatic turning point in the conduct of warfare and raise serious humanitarian, legal, and ethical questions.
These weapons would cross a fundamental moral threshold by allowing machines to select and engage targets and make life or death decisions without any direct human control.
We are concerned that the development of fully autonomous weapons could have devastating consequences. It must be prevented before countries enter into arms races, and before significant investment, technological momentum, and acceptance into military doctrine make it more likely that these weapons will be widely used on the battlefield.