
Preparing for autonomous warfare


Popular culture has done its part to warn of killer robots, making the case that weapons and artificial intelligence (AI) do not mix well. At least not for humanity. It is no surprise then that, with modern advancements in AI, the United Nations (UN) is meeting to discuss autonomous weaponry, with over 20 countries having already called for a total ban on it.

The meeting follows in the wake of a recently published letter calling for a boycott of a South Korean university over its new lab, opened in collaboration with a major defence company.

Korea Advanced Institute of Science and Technology (KAIST) is partnering with Hanwha Systems to develop “national defence technology”. The letter’s signatories fear “the third revolution in warfare”, following the inventions of gunpowder and nuclear arms. Sung-Chul Shin, KAIST’s president, responded: “KAIST will not conduct any research activities counter to human dignity including autonomous weapons lacking meaningful human control”.

It is no surprise then that with modern advancements in AI, the UN is meeting to discuss autonomous weaponry

In spite of his assurances, however, the letter’s efforts may be in vain. South Korea’s Dodaam Systems already produces a fully autonomous “combat robot”, and China, the U.S., and Russia are engaged in an AI arms race. BAE Systems, here in the UK, is developing autonomous battlefield machines. These developments are unlikely to stop soon; AI has already seen success in surveillance.

Much like how Facebook and YouTube use machine-learning software to flag explicit or copyrighted content, militaries are using AI to scan drone footage. Faster and more effective than humans, AI is not only able to find pre-specified objects and people but also to spot patterns humans may have missed, locating previously unknown terrorist safe-houses and finding new suspects. This same tracking could be developed to work in real time, helping soldiers better locate and identify threats whilst in the field. Autonomous robots could also be used in non-combat roles, for example providing on-site medical care or transporting the wounded to safety.

Faster and more effective than humans, AI is not only able to find pre-specified objects and people but also to spot patterns humans may have missed

However, despite their potential usefulness, there is still cause for concern. Machines can make mistakes. At worst, YouTube’s algorithm may erroneously take down a video; a drone equipped with an autonomous weapon could kill a civilian it wrongly flagged as a threat. For this reason, there are currently self-imposed restrictions on weapons, requiring that a human make any lethal shot. Hacking, too, presents a sizable risk: an autonomous system could be hijacked and used to cause harm. Experts also fear complete automation would allow for warfare on an unprecedented scale.

So, should we be worried? Given the possibilities AI offers in so many civilian fields, it was inevitable that it would find its place in military roles too. Yet this may not be the worst development. Robots are more expendable than people. Rather than putting lives in danger, militaries could wage wars without casualties. Furthermore, whilst machines can malfunction, people make mistakes too. With the correct programming, an autonomous drone could be more accurate at identifying threats and be safer to deploy than a human soldier. Increased government spending on AI could also mean that we see its benefits in regular daily life. Countries spend billions on the development of military technology; GPS, computers and digital cameras are part of the long list of gadgets which we have because of it. Perhaps robot assistants are next to be added to that list.

With the correct programming, an autonomous drone could be more accurate at identifying threats and be safer to deploy than a human soldier

The real question is instead one of morality: is it right for an unfeeling, unthinking program to take a life? Should we literally strip warfare of all humanity? Hopefully, the UN and the international community will find the right answer.
