Fears of ‘killer robots’ using artificial intelligence to choose human targets are growing as the war between Russia and Ukraine intensifies

AUTONOMOUS robots are transforming modern warfare as nations race to develop weapons that fire on targets without human input, alarming experts.

AI-controlled weapons could prove more destructive than nuclear arms and may confuse civilians with combatants, experts warn.

Ukrainian soldiers participating in a military exercise. Credit: AFP – Getty

Russian President Vladimir Putin has publicly expressed interest in growing Russia's AI sector. Credit: Getty Images – Getty

An AI system that powers a drone can scan a battlefield and select targets for destruction, like a pistol aiming and firing itself.

Autonomous weapons can reduce the amount of risk human soldiers face – giving an obvious strategic advantage to a country at war.

But James Dawes, an expert on AI weaponry, wrote a scathing critique of autonomous weapons and their potential for The Conversation:

“When choosing a target, will autonomous weapons be able to distinguish between enemy soldiers and 12-year-olds playing with toy guns? Between civilians fleeing a place of conflict and rebels taking a tactical retreat?”

Even worse, a robot cannot be held responsible for errors in combat – accountability has nowhere to go, Dawes argues.

Fortunately, killer robots do not yet appear to be widespread, and there are no confirmed deaths caused by an autonomous weapon.

But a UN report last year said an autonomous killer drone was deployed on a battlefield in March 2020 during the second Libyan civil war, NPR reports.

The battle was between the UN-recognized Government of National Accord and forces loyal to General Khalifa Haftar.

The UN report stated: “The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition.”

It is unknown if the drones killed anyone.

An Obama-era US policy on autonomous weapons sets such strict parameters that no killer robots have been submitted for review, New Scientist reports.

But other nations have been less restrictive.

A US intelligence report showed that Russia has more than 150 AI-powered military systems in various stages of development.

“The Russian military is seeking to take the lead in arming AI technology,” an intelligence official told National Defense.

Russia boycotted a February 2022 conference on autonomous arms control and will skip discussions that continue this month.

For experts, the most likely trigger for an autonomous arms race would be Russia's use of killer robots against the Ukrainians.

“I can guarantee that if Russia uses these weapons, some people in the US government will ask, ‘Do we have now, or will we later need, comparable capabilities for effective deterrence?’” diplomacy expert Gregory Allen told New Scientist.

Attempts to regulate autonomous weapons have run into walls erected by resistant states, even as human rights organizations plead for a ban.

Meanwhile, a survey conducted by the Campaign to Stop Killer Robots found that 61% of respondents across 26 countries oppose the use of lethal autonomous weapons.

A US policy on autonomous weapons set under the Obama administration calls for a planned 10-year review. Credit: Getty Images – Getty
