

Nations Meet At UN For 'Killer Robot' Talks (reuters.com)
An anonymous reader quotes a report from Reuters: Countries are meeting at the United Nations on Monday to revive efforts to regulate the kinds of AI-controlled autonomous weapons increasingly used in modern warfare, as experts warn time is running out to put guardrails on new lethal technology. Autonomous and artificial intelligence-assisted weapons systems are already playing a greater role in conflicts from Ukraine to Gaza. And rising defence spending worldwide promises to provide a further boost for burgeoning AI-assisted military technology.
Progress towards establishing global rules governing their development and use, however, has not kept pace, and internationally binding standards remain virtually non-existent. Since 2014, countries that are party to the Convention on Conventional Weapons (CCW) have been meeting in Geneva to discuss a potential ban on fully autonomous systems that operate without meaningful human control, and regulation of others. U.N. Secretary-General Antonio Guterres has set a 2026 deadline for states to establish clear rules on AI weapon use. But human rights groups warn that consensus among governments is lacking. Alexander Kmentt, head of arms control at Austria's foreign ministry, said that must quickly change.
"Time is really running out to put in some guardrails so that the nightmare scenarios that some of the most noted experts are warning of don't come to pass," he told Reuters. Monday's gathering of the U.N. General Assembly in New York will be the body's first meeting dedicated to autonomous weapons. Though not legally binding, diplomatic officials want the consultations to ramp up pressure on military powers that are resisting regulation due to concerns the rules could dull the technology's battlefield advantages. Campaign groups hope the meeting, which will also address critical issues not covered by the CCW, including ethical and human rights concerns and the use of autonomous weapons by non-state actors, will push states to agree on a legal instrument. They view it as a crucial litmus test on whether countries are able to bridge divisions ahead of the next round of CCW talks in September. "This issue needs clarification through a legally binding treaty. The technology is moving so fast," said Patrick Wilcken, Amnesty International's Researcher on Military, Security and Policing. "The idea that you wouldn't want to rule out the delegation of life or death decisions ... to a machine seems extraordinary."
In 2023, 164 states signed a U.N. General Assembly resolution calling for the international community to urgently address the risks posed by autonomous weapons.
Black Mirror level scary shit (Score:2)
Re: Black Mirror level scary shit (Score:3)
As it should. Look up Israel's use of Lavender AI, a tool they used to profile and target suspected Hamas members for drone strikes. There was almost no human oversight, and the "acceptable losses" included entire households of civilians per target.
Re: (Score:3)
Out of three evils: Hamas Islamist attacks, Israeli human military targeters, Israeli AI-assisted military targeters. I’m not sure which I prefer. Probably the Israeli human military targeters, to be honest (at least in the current conflict). I know it’s not a popular view, but the double-tap war
Need to transcend scarcity perspective (Score:1)
... when using technologies of abundance. Otherwise, rules and regulations can't fix the irony of scarcity-minded people using advanced technology to destroy what people have and create more artificial scarcity instead of creating abundance for all. Nothing can be "reasonable" if it is all ironic. See: https://pdfernhout.net/recogni... [pdfernhout.net]
Or a little ironic story I wrote in 2010 about trying to talk the USA out of collective suicide from scarcity fears called "Burdened by Bags of Sand" (sadly, all too predictive
Post-scarcity "Downfall" parody of bunker scene (Score:1)
Another humorous perspective-shifting attempt by me from 2009: https://groups.google.com/g/op... [google.com]
====
Dialog of alternatively a military officer and Hitler:
"It looks like there are now local digital fabrication facilities here, here, and here."
"But we still have the rockets we need to take them out?"
"The rockets have all been used to launch seed automated machine shops for self-replicating space habitats for more living space in space."
"What about the nuclear bombs?"
"All turned into battery-style nuclear powe
There is no problem here (Score:2)
Re: (Score:2)
Bender from Temu, is that you?
AI's got a gun (Score:2)
Write a parody ("AI's Got a Gun") of Aerosmith's "Janie’s Got a Gun" based on the proliferation of AI-based weapons of war.
What, you thought I'd post the AI generated lyrics? Nah, half the fun is getting to participate in the carnage yourself!
Re: (Score:2)
half the fun is getting to participate in the carnage yourself!
Just not like that [vice.com].
Re: (Score:2)
Just not like that [vice.com].
I saw that a while back. Considering that a very small minority of people actually have sex with their cars, I can't say I'm surprised.
That horse has long ago left the barn (Score:5, Insightful)
Re: (Score:2)
"robots" (mostly in the form of flying drones) that pretty much already locate their target and enter kill mode autonomously once they reached their target area
That's pretty much bullshit. Drones are piloted, targets are identified by humans, targets are chosen by humans.
You've "pretty much" described any anti-aircraft missile, or a guided anti-tank missile. They "locate their target and kill autonomously". They aren't just chucked into the air on blow up whatever the microprocessor happens to find and let god sort it out missions, and neither are drones.
Killer "robots" are not a problem, and they are inevitable. The problem is throwing something deadly at ... the
Re: (Score:2)
Not to mention that the UN can pontificate all it wants, but the Security Council is the only group that has any power, and even then it only takes one nation to torpedo any resolution. The place is a waste of valuable real estate.
Banning things. (Score:3)
Re: (Score:2)
Mod parent up.
The view that countries will make these weapons regardless of international agreements, while true, does not make such international agreements worthless.
Nuclear weapons are a good example. The international consensus (to the extent that it is a consensus) means that the use of nuclear weapons is universally considered an extreme measure. This is fortunate, but it didn't necessarily have to turn out that way. There were many in the immediate post-WWII era who wanted to use these weapons in th
Re: (Score:2)
Heck, torture has not only been "banned", but also proven time and again to not provide any benefit, and yet neither the western nor the eastern bully nations, nor the old and new tin-pot dictators have stopped using it, if only for their personal perverse satisfaction.
Deeper issue is they are all ironic devices (Score:2)
As I suggested in 2010: https://pdfernhout.net/recogni... [pdfernhout.net]
====
Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead?
Nuclear weapons are ironic because they are about using space age systems to fight over oil and land. Why not just use advanced materials as found in nuclear missiles to make renewable energy sources (like windmills or solar panels) to replace
Why bother? (Score:2)
Can they be programmed to target Amnesty International employees and donors?
Kill bots (Score:2)
Re: (Score:2)
Re: (Score:2)
That's probably why they tested so many in New Jersey... they wanted test flights at scale before putting them all away in warehouses or automated motherships.
The truth is out there!