Don't make killer robots, scientists warn

Drones are already being used heavily by the military to conduct deadly missile strikes -- autonomous weapons could be next.

Dirk Trudeau | Jan 22, 2016

As artificial intelligence grows in complexity and ability, concerns are rising that we face a Terminator-like future, especially if we create killer robots for warfare.

At a recent gathering of experts at the Swiss ski resort of Davos, panelists discussed the need for rules governing the creation of weaponized robots, particularly those driven by AI, according to a Daily Mail report.

Angela Kane, the German diplomat who formerly served as the UN High Representative for Disarmament Affairs, said it may already be too late for such discussions, but that some effort must still be made to take preemptive measures to protect humanity.

Drones are already used heavily by the military to conduct deadly missile strikes, a practice often criticized because it partly removes the human from the act of killing. Developing weapons that are even more autonomous could be very useful to the military, but it would also carry grave risks. A drone at least has a human pilot at the controls; a fully autonomous weapon would have no human behind it, leaving an AI to decide who and what to kill.

About 1,000 science and technology experts signed an open letter last July warning against the development of such autonomous weapons. Among them was British physicist Stephen Hawking, one of the most famous and outspoken critics of taking AI too far.
