Russia is beginning a programme to develop a new generation of weapons with built-in AI, according to defense officials and military weapons manufacturers. In essence, these ‘smart weapons’ could choose their own targets, taking warfare to a frighteningly dangerous new level.
At the moment, the most advanced weapons in modern warfare can already "make decisions" using built-in sensors and targeting tools, but crucially they cannot choose their own targets; the technology merely assists the operator.
But now Russia’s new goal is to develop autonomous weapons.
“Work in this area is under way,” Tactical Missiles Corporation CEO Boris Obnosov said at the MosAeroShow (MAKS-2017) on July 20, the TASS Russian News Agency reported. “This is a very serious field where fundamental research is required. As of today, certain successes are available, but we’ll still have to work for several years to achieve specific results.”
Russia aims to emulate the capabilities of Raytheon’s Block IV Tomahawk cruise missile, which the U.S. deployed in Syria, and hopes to reach that level within a few years.
Russia is also working on AI-driven drones that operate in “swarms,” which would make them extremely difficult to combat.
Just because it can be done, should it?
The importance of developing sound policy to guide AI development cannot be overstated, not least to prevent humans from pairing such technology with devastating weaponry for nefarious purposes. Alarm bells should be ringing around the world when AI is mixed with military combat.
The US has long had plans to incorporate AI into long-range anti-ship missiles, and China is reportedly working on its own AI-powered weapons — so Russia is not alone.
While it is possible to build these weapons, it’s not clear that we should. Many, including industry experts, warn that AI could be the harbinger of humanity’s destruction.
If military weapons become autonomous, it certainly will not help calm those fears.