
Killer Robots: The Future of Warfare

Experts fear an arms race to develop autonomous AI weapons that, if left unregulated, could be the end of humanity.

Recently, the United Nations convened a meeting of the Convention on Certain Conventional Weapons (CCW), featuring experts in artificial intelligence, military strategy, humanitarian law, and disarmament, to discuss the future of “lethal autonomous weapons systems” (LAWS) – what some have dubbed “slaughterbots.” An effort to ban these killer robots failed, even though some 125 member states said they wanted new laws governing the development and use of this frightening new field of technology. The United States, Russia, China, the United Kingdom, and others were strongly opposed to a ban, making the required unanimous agreement impossible. So, for now, no regulations on AI weapons will be established.

What on Earth Is a Slaughterbot?


Slaughterbots are autonomous weapons that select and apply force to targets without human intervention. Built on artificial intelligence, this new weaponry is deeply controversial: such killer robots carry lethal arms and execute attacks without a human conscience weighing in on the decision. The CCW took effect in 1983 and has convened regularly to restrict the world’s most unethical or cruel weaponry. However, despite outcry from experts, this year’s meeting failed to add killer robots to the list.

From 2016 to 2020, the United States budgeted $18 billion for its autonomous weapons program, and it was not alone. Militaries around the world, including those of Russia and China, have invested heavily in research and development. According to a report to the U.N. Security Council, last year marked the first time humans were killed by such armaments, during the Libyan civil war. Kargu drones, developed by Turkish defense firm STM, offer “precision strike capabilities for ground troops.” Strapped with guns, they were deployed by Tripoli’s government against militia fighters.

Turkey is not the only country to deploy frightening AI weapons. The Korean Demilitarized Zone is patrolled by self-firing machine guns, and Israel has used its Harop unmanned combat aerial vehicle (UCAV) to find and target Hamas terrorists.

Russia has a new stealth fighter called Checkmate, a robot weapon that pairs AI systems with a human pilot, and it is developing a pilotless version that will rely solely on the technology. China has developed and tested armed robot submarines that can track, hunt, and destroy enemy ships autonomously. It has also produced drone swarms and anti-submarine drones that can carry medium-size cruise missiles and are designed for long-endurance missions at high altitudes.

Concern From Experts

Human rights and humanitarian organizations are desperate to establish prohibitions on such munitions. Now that the first cases of use have emerged, it is clear these weapons sit at the top of a slippery slope.

Companies across the world are making drones with AI systems able to detect a human target through thermal imaging or facial recognition. Distinguishing between a civilian and a combatant demands extreme accuracy and precision, yet these weapons operate without a human brain, relying instead on algorithms and independently operating AI.

Max Tegmark, a professor at MIT and president of the Future of Life Institute, has warned that gangs and cartels will use slaughterbots when they become affordable and accessible. In a recent interview with CNBC, he said the drones are “going to be the weapon of choice for basically anyone who wants to kill anyone … be able to anonymously assassinate anyone who’s pissed off anybody.” Tegmark added that “if you can buy slaughterbots for the same price as an AK-47, that’s much preferable for drug cartels.”

Avoiding an Arms Race

Experts are drawing similarities between bioweapons and this new line of LAWS. Both can be made cheaply and at scale, but, as the United States and Russia came to realize, bioweapons are inefficient and too imprecise. It is hoped the same conclusion will be reached about slaughterbots.

Professor James Dawes of Macalester College drew parallels between the future of LAWS and the nuclear arms race. He warned that “the world should not repeat the catastrophic mistakes of the nuclear arms race. It should not sleepwalk into dystopia.”

Technology has outpaced regulation in nearly every industry, and the military-political arena is no exception. Tegmark told Wired, “[W]e’re heading, by default, to the worst possible outcome.” It may seem unrealistic, or almost laughable, to the average person, but a robot apocalypse and the elimination of entire cities are conceivable down this road, according to experts such as Dawes.

This arms race could be our last. The dangers include AI developing a mind of its own, operating independently and uncontrollably. These machines are unpredictable and prone to algorithmic errors. Yet unless limits are placed on the development and spread of these weapons, they are likely to be armed with biological, chemical, or nuclear payloads. And once we reach that point, there is no going back.

~Read more from Keelin Ferris.
