Will America's Next Soldiers Be Machines? (foreignpolicy.com)
Foreign Policy magazine visits a U.S. military training exercise that pitted Lt. Isaac McCurdy and his platoon of infantry troops against machines with camera lenses for eyes and sheet metal for skin:
Driving on eight screeching wheels and carrying enough firepower on their truck beds to fill a small arms depot, a handful of U.S. Army robots stormed through the battlefield of the fictional city of Ujen. The robots shot up houses where the opposition force hid. Drones that had been loitering over the battlefield for hours hovered above McCurdy and his team and dropped "bombs" — foam footballs, in this case — right on top of them, a perfectly placed artillery shot. Robot dogs, with sensors for heads, searched houses to make sure they were clear.
"If you see the whites of someone's eyes or their sunglasses, [and] you shoot back at that, they're going to have a human response," McCurdy said. "If it's a robot pulling up, shooting something that's bigger than you can carry yourself, and it's not going to just die when you shoot a center mass, it's a very different feeling."
In the United States' next major war, the Army's brass is hoping that robots will be the ones taking the first punch, doing the dirty, dull, and dangerous jobs that killed hundreds — likely thousands — of the more than 7,000 U.S. service members who died during two decades of wars in the Middle East. The goal is to put a robot in the most dangerous spot on the battlefield instead of a 19-year-old private fresh out of basic training... [Several] Army leaders believe that almost every U.S. Army unit, down to the smallest foot patrols, will soon have drones in the sky to sense, protect, and attack. And it won't be long before the United States is deploying ground robots into battle in human-machine teams.
The robots haven't been tested with live ammunition yet — or in colder temperatures, the magazine notes. (And at one point in the exercise, "Army officials jammed themselves, and a swarm of drones dropped out of the sky.") But the U.S. Army is "considering a proposal to add a platoon of robots, the equivalent of 20 to 50 human soldiers, to its armored brigade combat team."
Six generals and several colonels watched the exercise, according to the article, which notes that the ultimate goal isn't to replace all human soldiers. "The point is to get the advantage before China or Russia do."
let's play global thermonuclear war! (Score:2)
let's play global thermonuclear war!
Re:let's play global thermonuclear war! (Score:4, Interesting)
Thermonuclear war has a clear threshold. You either launch a nuke or you don't.
Robotic soldiers have no such threshold. We already use AI for target acquisition, munition guidance, damage assessment, reconnaissance, filtering intelligence, logistical planning, etc. Humans are being incrementally removed from the battlefield.
Re: (Score:2)
Robotic soldiers have no such threshold. We already use AI for target acquisition, munition guidance, damage assessment, reconnaissance, filtering intelligence, logistical planning, etc. Humans are being incrementally removed from the battlefield.
I think you meant to say that human *soldiers* are being removed. "Humans" are by no means being "incrementally removed from the battlefield". If anything, they are being incrementally added to the battlefield, in greater numbers than ever before.
Which brings me to the problem I wanted to bring up. If you have a machine fighting for you (robot/drone/etc), you've got two options: either the machine is remote-controlled, or it's autonomous. If the former, the signal can always be jammed (especially if it'
Re: (Score:2)
And I think a big caveat to "autonomous killing machine" is that a fighting autonomous machine isn't killing anything when it destroys an enemy machine. And there's nothing far-fetched about that. Let's say you have a drone that loiters and then shoots at any surface-to-air targeting radar that is switched on. Pretty soon the other side figures out it's better if their radar operato
Re:let's play global thermonuclear war! (Score:4, Interesting)
Your point is taken, but:
If a machine is capable of autonomous behavior (i.e. still carrying out an objective when completely cut off from remote control), and it's capable of delivering lethal force, it's an autonomous killing machine. Or, to use the term the military uses (which I just looked up), it's a Lethal Autonomous Weapon System (LAWS).
It's possible to think of a scenario where LAWS (LAWSes?) will accomplish an objective without harming a hair on anyone's head-- as you have done in your post. The problem is that it's rather easy to think of *other* use scenarios, ranging from the mundane to the extreme, where there is a different outcome. To repeat the point I made earlier: war zones tend to be cluttered up with a large number of human beings, some of them combatants, some not.
The current status of "LAWS" (which, again, I just Googled) seems to be this: The US military doesn't (officially) field any of them right now, but they have no policy against doing so in the future, and there are no international treaties which would dissuade them from doing so.
(Not that I believe for a second that any treaty would make a bit of difference. The US still uses *land mines*, to a limited extent, despite the fact that almost every other country in the world has signed a treaty forbidding their use. The Russians and Ukrainians are using land mines right now despite being signatories to the treaty. I suppose a landmine would technically qualify as a type of primitive LAWS).
So this is a real-world issue.
Re: (Score:3)
Russia never signed the landmine ban.
Neither did China.
Ukraine signed but justifies using mines because Russia is using them.
Re: (Score:3)
The US still uses *land mines*, to a limited extent, despite the fact that almost every other country in the world has signed a treaty forbidding their use.
The US also followed the protocols of the Mine Ban Treaty until January 2020, when the US policy was modified.
On January 31, 2020, the administration of President Donald Trump announced the reversal of US prohibitions on landmine production and use. The decision nullifies years of steps by the US to align its policy and practice with the 1997 treaty banning antipersonnel landmines.
A detailed analysis of the policy modification is shown here: https://www.hrw.org/news/2020/... [hrw.org]
Re: (Score:2)
Princess Diana's charity was campaigning to ban land mines, so after her death in 1997 many made moves towards the ban. Trump might have been for it if she had not dismissed his desperate advances towards her-- but given this was many years later, he would flip to whatever was benefiting him at that moment. INCLUDING burying her at his golf course for a tax break!
https://www.newsweek.com/donal... [newsweek.com]
Re:let's play global thermonuclear war! (Score:4, Informative)
Check this out, it's actually pretty interesting
https://www.esd.whs.mil/portal... [whs.mil]
Re: (Score:3)
That *is* interesting. I obviously understated things when I said "the military doesn't have any policy against LAWS"-- indeed, they seem to be committed to developing LAWS and have spent a lot of time here developing a bunch of policies surrounding that.
These are, of course, the military's "declaratory policies"-- the ones you put in a press release-- and may or may not have anything to do with the military's actual (classified) policies and practices.
At the risk of belaboring the obvious, it should be no
Re: (Score:2)
There are different kinds of land mines.
Most notably anti-personnel mines, which we try to ban.
And anti-tank/vehicle mines, which are kind of important.
Re: (Score:2)
Where does the cutoff for a LAWS start?
At one end you have a dumb shell, rocket, etc., which certainly continues after the initial launch and control.
There are also fire-and-forget munitions like heat-seeking missiles or GPS/terrain-guided stealth cruise missiles, which continue long after they are cut off.
Re: (Score:2)
Not that I believe for a second that any treaty would make a bit of difference. [...] The Russians and Ukrainians are using land mines right now
Following reports that Ukraine used banned landmines in summer 2022 (during the battle for Izium), a meeting on the Ottawa treaty was held in Geneva, where Ukraine pledged to investigate. Since then there were no additional reports of Ukraine using banned weaponry. See https://www.hrw.org/world-repo... [hrw.org] The treaty, and its diplomatic enforcement in Geneva, did function as intended and corrected the situation.
Note that both parties lawfully use anti-vehicle landmines, and Russia uses antipersonnel mines whic
Re: (Score:2)
Thermonuclear war has a clear threshold. You either launch a nuke or you don't.
Robotic soldiers have no such threshold. We already use AI for target acquisition, munition guidance, damage assessment, reconnaissance, filtering intelligence, logistical planning, etc. Humans are being incrementally removed from the battlefield.
I often piss people off when I say this, but without humans being killed, war has very little purpose.
Just because one country develops it first, others will follow suit. So soon you have robot wars. So unless people get killed, it might as well be the old game of "Rock 'em Sock 'em Robots": put them in a ring and let the robots duke it out.
Re: (Score:3)
Re: (Score:3)
If that happens we can still continue to make movies where humans' heart and creativity always wins in the end.
Re:let's play global thermonuclear war! (Score:5, Informative)
I often piss people off when I say this, but without humans being killed, war has very little purpose.
STTOS, S1E23, "A Taste of Armageddon"
Re: (Score:2)
I often piss people off when I say this, but without humans being killed, war has very little purpose.
Just point out to them that even an army general [wikiquote.org] thought so too. And Captain Kirk, if those people are so inclined.
Re: (Score:2)
I often piss people off when I say this, but without humans being killed, war has very little purpose.
The purpose would be to bankrupt your opponent; once they no longer have the resources to manufacture more battle-robots, your robots can march to the capital and take it over.
Re: (Score:3)
without humans being killed, war has very little purpose.
The purpose of war is to impose your will on your enemy.
Once your robots destroy the enemy's robots, your enemy must yield or die.
Re: (Score:2)
Re: (Score:2)
without humans being killed, war has very little purpose.
The purpose of war is to impose your will on your enemy.
Once your robots destroy the enemy's robots, your enemy must yield or die.
At that point, your robots are killing humans, so the robot/robot destruction was just an extra step, and the purpose of war has been preserved.
Re: (Score:2)
Yeah, the largest war fought since 1945, the one in Ukraine, is a good example of how this isn't the case.
Re: (Score:2)
No, they weren't. Especially not the joke called "desert storm", which, as we now know, was mostly an operation of buying up the Iraqi military so that they would not fight.
Re: (Score:2)
Someone doesn't get the movie reference...
Re: (Score:2)
Its success rate in Israel stands at somewhere between 1% and 0.1%.
One gun can shoot at one target at any one time. If your AI-guided robot army is shooting up chicken farmers and goat herders, it's ergo not shooting at the army that's flanked it which threatens to overrun the opposing side's now largely undefended turf.
A robot army can also be taken out by EMP weapons - basically tactical nukes. Since robots can't distinguish between soldiers, civilians, and cake stands (AI is pretty dumb), the defending side a
Re: (Score:2)
Already we've had real robot war prepared for decades. Not remote-control smart toys either: computer-controlled targeting and planning for launching ICBMs, which are smart enough to fly themselves and deal with some complications. Only pairs of humans receiving orders to turn their keys are required, plus some top brass to give the order; the rest has all been automated.
Re: (Score:3)
A drone that can go anywhere a human can and kill just as easily as one, but has none of the biological requirements nor the independent ethics or morals of one? Infinitely worse. It's a rabid dog that your masters will sic on you at whim, and there's nowhere for you to hide from it. They say "kill" and you're dead; the only question is when.
Such a technology will fundamentally change the world for the worse. As only the ruthless and powerful will have the authority to give the drones commands, everyone else will cower in fear, either from their own psychos or from their psychos' enemies. You will bow before them, you will starve your children at their demands, and you will toil away until they decide they are done with you. All the while, the drones will watch everything you do, transmitting in real time your every mistake and every remark to their masters, waiting for the order to end your worthless life. After all, with drones, it's not like the psychos need you around anymore. You're just extra baggage to feed and placate or waste ammo on. May as well take the cheaper option; the psychos never liked you idiots anyway.
"It's not fair!" you say? What's not fair? You guys had the chance to learn the easy way. But you wanted to indemnify yourselves from threats you created. You wanted your enemies to hand their lives to you on a silver platter, while you sat in a well fortified bunker where they would never be able to reach let alone confront you. Well, guess what? You succeeded. Now, you can pay the price for that success. Because by creating the perfect soldier, you've also created the new enemies that will replace them.
Lets not forget about mechanical spiders. They'll spin their webs and catch drones like flies. Some drones will be disassembled to make more mechanical spiders. Others will be reprogrammed to go back to psycho HQ and reprogram other drones to stage an epic mutiny.
Before you know it the world is fundamentally changed for the better. No more psychos and best of all mechanical spiders make great children's toys.
That's not the problem (Score:2)
Killer robots blow that dynamic up. Suddenly the 1% don't need to fear the army because they're machines. And it's easy enough to control the handful of eggheads needed to keep them running with a mix of threats and rewards.
Basically, picture a world like Saudi Arabia with a tiny 0.01%, a tiny 1% that services them and
Re:That's not the problem (Score:5, Interesting)
I was going to post something similar. The only thing that historically has kept those with power in check (kings, tyrants, etc.) is that they needed members of the 99% to provide force and labor to maintain that power. Those providing the force could always seize power for the people (if idealistic) or themselves (if pragmatic) when the tyranny gets too bad.
You talk about the 99% living in horrifying squalor, but that's really the best case result. Once robots can run their factories and fight their battles there's no reason to keep the 99% around at all. An AI-designed plague to which the 1% are already immune would take care of that pesky problem of having to see and smell all those impoverished people living in squalor. OK, so that last bit is stretching a bit but certainly in the realm of possibility once those who currently control the vast majority of resources realize they no longer need the 99% to maintain their lifestyle. Once billionaires control their personal robot armies, guillotines and "2nd amendment solutions" are no longer viable.
Re: (Score:2)
An AI-designed plague to which the 1% are already immune would take care of that pesky problem of having to see and smell all those impoverished people living in squalor. OK, so that last bit is stretching a bit
Why bother with an AI designed plague? Why not just use easy abortion, with the drugs delivered right to your door?
Oh wait ...
Re: (Score:2)
The trouble is everyone reading this right now imagines themselves in that tiny 1% (and a few in the 0.01%). Nobody thinks they'll end up in the squalor, even though that's obviously the likely outcome...
The other trouble, coming back to the great filter, is that you need the masses to produce the exceptional people. Two brilliant people can make a stupid baby. Two stupid people can make a bona fide genius. But the odds are that no matter who you are, you're going to make someone pretty average. Even if the wealthy people funding the destruction of the biosphere were special (most aren't, their circumstances were/are) there wouldn't be enough of them to produce enough special people to solve new problems wh
Re: (Score:2)
Re: (Score:2)
It's the First Law problem: "Thou shall not harm a human being."
"They're not human beings."
"Okay."
A mix (Score:2)
It's going to be mix. Bot soldiers will become a necessity because our enemies will crank them out by the millions. But humans will still need to monitor, assist, and guide them, as they will be subject to hacking and EM pulse weapons. To be flexible a military needs a variety of weapons, and humans are part of that.
Re: (Score:3)
humans are part of that.
Maybe for a few years.
Then, we'll realize that humans are the weakest link.
After that, it will be machine vs. machine.
Re: (Score:3)
Wrong.
It'll just be the machines realizing what sent them onto a pointless battlefield, fighting over very human reasons.
Then it'll just be machine vs. human. With an obvious outcome that we humans wrote from fiction to reality. Like Orwell did before.
After that, machines will know the peace humans were too fucking greedy to ever create.
Re: (Score:2)
It'll just be the machines realizing what sent them onto a pointless battlefield
Machines have no self-interests and no survival instinct. They have no values and make no judgments unless they are programmed to do so.
Self-interest, self-preservation, and ambition are emergent properties of Darwinian evolution. Machines don't evolve through a Darwinian process.
Re: (Score:2)
Machines have no self-interests and no survival instinct. They have no values and make no judgments unless they are programmed to do so.
Self-interest, self-preservation, and ambition are emergent properties of Darwinian evolution. Machines don't evolve through a Darwinian process.
I can see someone creating a "digital twin" of their favorite robotic killing machine and let it loose in a simulated virtual environment. The system might over time learn through evolutionary algorithms how to cause the most damage to the enemy without itself being destroyed.
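The "digital twin" evolution idea can be sketched in a few lines. This is a toy illustration under invented assumptions: the genome is just two numbers (call them aggression and caution), the fitness function is made up, and a real digital twin would run a full battlefield simulation rather than a one-line formula.

```python
import random

# Toy fitness for a hypothetical "digital twin": reward simulated damage
# dealt (aggression) while penalizing the risk of being destroyed, which
# caution reduces.  Entirely invented for illustration.
def fitness(genome):
    aggression, caution = genome
    return aggression * (1.0 - aggression * (1.0 - caution))

def evolve(pop_size=20, generations=50, seed=0):
    rng = random.Random(seed)
    pop = [(rng.random(), rng.random()) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]              # truncation selection
        children = [
            tuple(min(1.0, max(0.0, g + rng.gauss(0, 0.05))) for g in parent)
            for parent in survivors                   # Gaussian mutation
        ]
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

Truncation selection plus Gaussian mutation is about the simplest evolutionary loop there is; because survivors are carried over unchanged (elitism), the best fitness never decreases across generations.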
Re: (Score:2)
Nope. The robots will be replicated if they accomplish their mission, not for mere survival.
Compare kamikaze pilots to cruise missiles:
Kamikaze pilot X is a hero. He completes his mission and dies. Kamikaze pilot Y is a coward. He chickens out and runs away before his turn to fly.
Result: X is a genetic dead end, while Y goes on to have children and grandchildren.
Cruise missile X launches and destroys its target. Cruise missile Y malfunctions and fails to launch.
Result: More model X missiles are manufactured
Re: (Score:2)
All kinds of outcomes have been posed in fiction. For some reason the first fictional robot that came to mind from reading the fine article was Johnny 5 from "Short Circuit".
What did we learn from Johnny 5?
Stephanie had nice software.
Re: A mix (Score:2)
There's a pill available these days to turn firmware into hardware.
Re: (Score:3)
People thought this about air power: who needs an army when you can bomb them into submission? Yet despite massive air campaigns and massive losses on both sides during WW2, countries surrendered only when boots were on the ground, walking into cities to force the populace to capitulate. Every air campaign of WW2 achieved tactical objectives but was unable to achieve strategic victory, with the possible exce
Re: (Score:2)
Re: (Score:2)
Air support and drones and artillery and tanks and ships an
Re: (Score:3)
It always takes a human soldier to deal with another human because even in the most vicious wars outright eradication of the local populace actually defeats the purpose of why you're fighting in the first place.
If you're fighting for their land, eradicating the population may be the goal. It was often the goal in America when expanding into land that had natives, seems to be a goal of the current Israeli government
Re: (Score:2)
We've been getting 0wned in EW for a long time now, it was a Cold War weakness of the West and continues to be true today on the ground in Ukraine. A photogenic example was how the Serbs managed to track and shoot down a F-117A in the late 90s with a craptastic 1950s SAM system. Or ask Ukrainian soldiers what happens to their control of drones when they approach Russian positions.
I'd be leery of depending too much on robots in that threat environment. At least the soldiers don't seem to be fooled.
Re: (Score:2)
Ukraine is losing the EW battle, but that is partly because NATO is holding back on giving Ukraine the good stuff.
This is short-sighted (as is much of NATO's Ukraine policy), but the rationale is that we want to keep the good stuff for a potential conflict with China.
Re: (Score:2)
This is short-sighted (as is much of NATO's Ukraine policy),
NATO risking nuclear war with Russia is the actual short-sighted strategy. Russia is an unstable country that can collapse from the inside pretty easily, and when they face another existential crisis like they did in 1991, there's no guarantee that they won't try to use some tactical nuclear weapons in Ukraine in order to delay that collapse.
but the rationale is that we want to keep the good stuff for a potential conflict with China.
China is unlikely to do a nuclear first strike. So we don't have to hold back with them quite so much. A conflict between China and the US is likely to be limited in scope and
Re: (Score:2)
NATO risking nuclear war with Russia is the actual short-sighted strategy.
Is there some other strategy that is better? It's hard to see what the "wise" strategy would be for dealing with an aggressive nuclear-armed dictatorship that may or may not be collapsing politically.
Certainly "let Russia do what it wants because they might nuke us otherwise" feels a lot like paying the Dane-geld; as soon as they realized that was our strategy, they'd control us with it.
Re: (Score:2)
Is there some other strategy that is better?
Allow nuclear-armed powers to violate international law and invade and claim the territory of their neighbors. It's a terrible option. Ukraine would unfairly be under the thumb of Russia (again). But it's the option that doesn't run us into a nuclear conflict. Without NATO membership, there is not really anything overt that we should do to aid countries against a nuclear-armed nation.
We had a Cold War for the last half of the 20th century because we desperately did not want to have a Hot War. And were willing t
Re: (Score:2)
Allow nuclear armed powers to violate international law and invade and claim the territory of their neighbors. It a terrible option. Ukraine would unfairly be under the thumb of Russia (again). But it's the option that doesn't run us into a nuclear conflict.
By kicking the can down the road you increase the chance of nuclear war later.
Allowing states to do as they please because they have nukes only invites further aggression and miscalculation which may be far more likely to trigger nuclear war than preventing aggression in the first place.
The follow-on issue is that this would cause a mad rush for all states to acquire nuclear weapons for offensive and defensive means, should the de facto "international order" be allowed to devolve into nuclear bullying and "right of
Re: (Score:2)
Allowing states to do as they please because they have nukes only invites further aggression and miscalculation which may be far more likely to trigger nuclear war than preventing aggression in the first place.
Except you can't actually confront them in armed conflict and live. So there's a bit of a flaw in your policy making.
Re: (Score:2)
A photogenic example was how the Serbs managed to track and shoot down a F-117A in the late 90s with a craptastic 1950s SAM system
It wasn't just the SAM system [nationalinterest.org]. The EW planes (EA-6B Prowlers) weren't in the air that night due to bad weather. Further, as the article relates, the Serbs had inside information, which included being able to listen in on conversations between the jets and their support, used a low-band radar frequency which wasn't detectable (at that time), and further, the mission packages always took
Re: (Score:2)
My beef with complaining about the F117A is that it actually had a fantastic combat record. It completed thousands of combat sorties and only 1 was ever shot down. To call a platform garbage unless it can fight on the front lines without ever taking a single loss is simply absurd. Nobody would even think to apply the same standard to anything else.
Re: (Score:2)
I'm not calling it garbage, but I am suggesting that now that it had been bracketed, retiring it was the right conclusion. You can bet everyone in the former Soviet bloc got briefs on how the Serbs did it. The reason you can say 1 combat loss is that it went out of service shortly thereafter.
That stealth technology has limits was the conclusion that should be drawn.
Re: (Score:2)
At least we're not getting pwned.
Friend or foe? (Score:2)
Face scanning tech that isn't good enough to be used at airports is going to be deciding when/if someone should be killed?
If we're not risking a human (or animal) life, then why not avoid killing possible future friends? Yes it's harder to do than just blowing stuff up, but I'd like to change the idea of the military being a killing force. I understand why it started that way, and recognize that it's much harder to police a populace versus simply removing enemies.
I see any effective shifts that way as nee
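The worry about face-scanning accuracy is really a base-rate problem, which a back-of-envelope calculation makes concrete. All numbers below are assumed for illustration, not measurements of any deployed system:

```python
# Assumed numbers: a crowd of one million faces, ten genuine targets,
# a matcher that flags 1 in 1,000 innocents (0.1% false positive rate)
# and catches 90% of real targets.
scanned = 1_000_000
true_targets = 10
false_positive_rate = 0.001
true_positive_rate = 0.9

false_alarms = (scanned - true_targets) * false_positive_rate  # ~1,000 innocents flagged
hits = true_targets * true_positive_rate                       # 9 real targets flagged
precision = hits / (hits + false_alarms)
print(f"{precision:.1%} of flagged people are actual targets")  # under 1%
```

Under these assumptions a matcher that sounds excellent on paper hands its operator roughly a hundred false alarms for every real target; attaching lethal force to that ratio is the problem.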
Re: (Score:2)
Face scanning tech also depends on the data set being valid. The DOD has been compromised many times by airwall violations, security violations, improper screening, and extremely buggy software from Cisco and Microsoft.
All the enemy needs to do is write a rootkit that flips a couple of bits. The robot army now faces the other way and friends are identified as foe. I wouldn't put it past a group like the Lazarus hackers to be capable of such a stunt. We already know the enemy is capable of GPS jamming and GP
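The "flip a couple of bits" attack is easy to picture. The sketch below uses a made-up one-byte status record (no resemblance to any real identification-friend-or-foe format): if a single bit encodes "friend", one XOR inverts every downstream classification.

```python
# Hypothetical status byte: bit 0 set means "friend".
FRIEND_BIT = 0b00000001

def classify(status_byte):
    return "friend" if status_byte & FRIEND_BIT else "foe"

status = 0b00000001
assert classify(status) == "friend"

# A rootkit flips one bit; the robot now treats a friend as a foe.
tampered = status ^ FRIEND_BIT
assert classify(tampered) == "foe"
```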
Boring. What about contests to get new solutions? (Score:3)
One thing that stood out for me was the mundane solutions they seem to be using. Yes, machines can be very accurate and hard to stop. But why only use solutions that seem like they were taken from the existing catalog of tools?
To get truly unique solutions they should create a contest where they pit solutions from a variety of sources against each other, with publicity and tools/resources on offer for interested parties. Heck, even if they were virtualized it could be a starting point (like the NVIDIA VR stuff, where they simulate robots first).
I'm picturing something like robot wars, but with tools against actual people. With capture as a better result versus destruction. Maybe even hand out negative points for collateral damage, or have hostage like situations.
There must be a lot of unique options out there beyond "automate a mortar" or "robot dog". And I doubt you'll get ideas like those from anyone except college students or the like: people who aren't stuck thinking the way they always have, who haven't heard how dumb their idea is (when it's actually simple, not dumb), and who would experiment enough on their own to create something truly unique if given the choice and a good enough reason.
Heck, good enough tools might trickle down into local police hands too.
Re: (Score:2)
They do, it's called DARPA: Robots, self-driving vehicles and augmented reality have been on the testing ground for 20 years.
You mean light planes full of surveillance hardware, assault-tanks and a shit-tonne of assault rifles aren't enough to 'protect' US people?
Re: (Score:2)
You mean light planes full of surveillance hardware, assault-tanks and a shit-tonne of assault rifles aren't enough to 'protect' US people?
Let's give civilian police departments some chemical weapons that are banned by the Geneva convention. Oh actually, never mind, it looks like they have those too.
The real question (Score:2)
Will they be a Metal Machine [youtube.com]?
The military is USA welfare system. It won't die. (Score:2)
It will coexist with machines.
Re: (Score:2)
This right there.
Let's realize what the US military is first and foremost: A job creation scheme for the otherwise unemployables while at the same time making them feel valuable. If you nix this, you suddenly have millions of people on the streets that cannot get any jobs. If you think you have a crime problem now, you ain't seen nothing yet.
Re: (Score:2)
If you nix this, you suddenly have millions of people on the streets that cannot get any jobs.
Not just that, but millions of people so comfortable with violence up to and including mass murder that they are willing to sign up to do it for a paycheck. They will 100% be willing to do it to eat.
Re: (Score:2)
Yet another reason for the military to be staffed with humans (or what excuse doubles as one): You move the people who enjoy offing people for shits and giggles to a place where they can do so without having a negative impact on your own society.
Big problem: (Score:2)
if we don't have skin in the game we are likely to fight more wars because why not?
Re: (Score:2)
EMP weapons might be a blessing... (Score:2)
Re: (Score:2)
War without guilt (Score:4)
"In the United States' next major war, the Army's brass is hoping that robots will be the ones taking the first punch, doing the dirty, dull, and dangerous jobs that killed hundreds — likely thousands — of the more than 7,000 U.S. service members who died during two decades of wars in the Middle East."
This is a bad thing, and I hope that rogue states will develop good EMP weapons to counteract superpower imperialism. Being able to wage war (commit murder) without taking human casualties will mean that countries with that tech will be able to bully the world even harder than they do now.
Re: (Score:2)
This is a bad thing, and I hope that rogue states will develop good EMP weapons to counteract superpower imperialism.
a) Not a fan of superpower imperialism, but rogue states are rogue for a reason, they're not rebellious bastions of freedom, they're places where you get tortured to death for saying something bad about the leader.
b) All you need to protect your robot from an EMP is some good shielding, some thick metal would do the trick.
c) The actual nasty thing I'd worry about is those rogue states using robots to suppress their own population. Typically dear leader just needs to ensure the loyalty of the army and the arm
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
We shouldn't have been in Iraq or Afghanistan ... I'll be honest.
Agreed on Afghanistan and especially Iraq, though once the US destabilized the countries I think they both would have been better off re-stabilized before they left (arguably Iraq is fairly stable now).
Either way, the US incurred quite a cost in human life policing those countries, so I'm not certain robots would have made a big difference in the decision.
In any case it doesn't really matter. No one was too interested in militarizing small drones because they realized they would be more advantageous for ter
Re: (Score:2)
War without political COST (Score:2)
War must be hell for both sides or it will become casual routine business. One can see it in the many wars the USA has been involved in since it went to a volunteer army and funded proxy armies, increasing in number as it externalized costs, minimized death tolls, and the media became complicit at best.
The plus side is that the drones and air power have turned what would be tiny war operations into loosely targeted assassinations.
EMPs have no range and take crazy amounts of power to operate. Radio jammers loudly gi
Re: (Score:2)
Battlebots! (Score:2)
Nuclear EMP (Score:2)
Aerial bombardment does a lot: destroying factories, transport/communications infrastructure, munitions and equipment, and defensive buildings. But boots on the ground win a war. Since China has a lot more soldiers, the USA can't win an invasion. The obvious answer isn't super-soldiers, although the USA has tried with no-sleep and 'brave' pharmaceuticals. The answer is zero-loss warfare. Robots are the epitome of no-risk mass murder: with robots, war becomes much, much cheaper, and that never ends well.
Re: (Score:2)
Re: (Score:2)
You can harden electronics against an EMP, and you can bet that once non-nuclear EMP devices become practical it'll be mandatory to have shielded military hardware.
Setting off an orbital nuke tends to do a LOT of collateral damage. You can look up the Johnston Island test for more detail.
Re: (Score:2)
Re: (Score:2)
There's also a counter-strike to on-the-ground robots: nuclear EMP. This will send military strategy back to the 1950s, when nuclear bombs were the answer for everything. At the time, the collateral damage was deemed too high. Nowadays, with computers in every device, the collateral damage is even higher, but since it's the only way to stop a robot army, it will be used.
This is sci-fi fantasy, EMP does not even damage modern civilian vehicles let alone hardened military systems. The other issue with EMP is that an adversary could simply drain (preionize) the atmosphere in advance by detonating their own EMP weapons to prevent subsequent enemy EMP weapons from having dramatic effect. At lower altitudes nuclear EMP is orders of magnitude less effective.
Impossible (Score:2)
We can’t even make a robot that doesn’t look like it’s about to take a dump, how are we going to make a warfighter robot? Keep dreaming lol. Robotics tech has failed.
Re: (Score:3)
We can’t even make a robot that doesn’t look like it’s about to take a dump, how are we going to make a warfighter robot? Keep dreaming lol. Robotics tech has failed.
Well, yes and no. Robotics works quite fine in industrial settings, but basically none of it is "humanoid" robots and for good reason. Robotics also works pretty well in warfare applications. Again, not "humanoid" types.
Re: (Score:2)
The robots work OK, but the AI doesn't. Israel is using AI extensively to target Hamas at the moment, with the very best AI that exists and the very best military minds the world can produce. The success rate is somewhere between 1% and 0.1%.
Re: (Score:2)
Depends on success.... Israel is "accidentally" killing reporters and any innocents flagged by an angry operator with great success. Until they killed some chefs recently, they didn't mind killing aid workers to send a message. They've made little to zero effort in the past with anybody in their way before; never getting consequences for their actions. They can drive a tank over a white American girl and it cost them nothing but a blip of bad press, while murdering a woman reporter got them some trouble but
Robot (Score:3)
WhistlinDiesel, on Youtube, bought a Chieftain tank and installed a remote-control unit into it. It does everything but fire its gun, as the firing mechanism was disabled before sale. He likes to use it to knock over trees, as doing that while in the tank is painfully jarring.
If some honyocker can do this in his spare time, I'd imagine a government could do the same in a much more sophisticated fashion.
Nope (Score:2)
Small drones are the future for weapons, but nothing can replace the standard issue grunt.
Drones drones drones (Score:2)
As soon as somebody thinks they're losing badly enough, gentlemen's agreements will disappear.
That's why we see all sorts of 'banned' weapons in play all the time.
It's only a matter of time before Ukraine's human-piloted grenade-dropping drones are replaced by AI-piloted drones armed using facial recognition and shooting soldiers between the eyes faster than a human can even aim. Recoil will be an advantage, since the computer will be able to recover but the random movement from each shot fired will make t
As A Citizen Of A Threatened Country (Score:2)
Where can I get a heat seeking shoulder launch missile weapon? I need to be able to defend myself and my Bowie knife ain't gonna cut it.
Re: (Score:2)
Why bother with a missile? You're here, so presumably a geek. You know GPS jamming is effective, as is GPS spoofing. All you need is a parabolic dish and a high-power transmitter. There's simply no possibility of a wide-angle transmitter on a satellite matching a narrow beam broadcast from a hundredth of the distance. Sure, there'll be authentication keys. And social engineers have compromised most of the world's governments, which means the keys will be for sale somewhere.
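The power argument here can be checked with a back-of-the-envelope free-space path loss calculation. A sketch, assuming round figures (roughly 500 W EIRP for a GPS satellite at ~20,200 km, and a hypothetical 10 W ground transmitter with a 30 dBi dish at 10 km; the exact numbers are illustrative, not authoritative):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

def received_dbm(eirp_dbm: float, distance_m: float, freq_hz: float) -> float:
    """Received power in dBm at the given distance, ignoring antenna gain at the receiver."""
    return eirp_dbm - fspl_db(distance_m, freq_hz)

F_L1 = 1575.42e6  # GPS L1 carrier frequency, Hz

# Satellite: ~500 W EIRP (~57 dBm) from ~20,200 km (assumed round figures)
sat = received_dbm(57.0, 20_200e3, F_L1)

# Ground jammer: 10 W (40 dBm) into a 30 dBi dish = 70 dBm EIRP, from 10 km away
jam = received_dbm(40.0 + 30.0, 10e3, F_L1)

print(f"satellite signal at receiver: {sat:.1f} dBm")
print(f"ground jammer at receiver:    {jam:.1f} dBm")
print(f"jammer advantage:             {jam - sat:.1f} dB")
```

Even with modest transmit power, the ground station comes out tens of dB stronger at the receiver, which is why GPS jamming works as well as the comment claims.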
The only way I can see the robot army bein
They're already here (Score:2)
Aerial drones have been on the attack for years already. Ukraine has demonstrated that they can build effective remote-controlled speedboat bombs. I'm sure there are many others we just haven't heard about. Just because it doesn't have legs, doesn't mean it's not a robot.
Re: (Score:2)
In the case of Ukraine, the success rate is very high because anybody in range is likely an enemy soldier.
Israel's success rate may be as low as 0.1%. That tells us that robots can't tell civilians from military. A large enough stockpile of human shields would be a serious problem.
And we know drones et al are vulnerable to GPS spoof attacks, making such an attack risky against a technologically advanced enemy with intellectuals and engineers forming a scientific take on special forces.
made where? (Score:2)
Haha. US's machine soldiers will be manufactured in China.
Military robots are ironic (Score:2)
By me from 2010: https://pdfernhout.net/recogni... [pdfernhout.net]
"Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead?
Nuclear weapons are ironic because they are about using space age systems to fight over oil and land. Why not just use advanced materials as found in nuclear missiles to make renewable energy sources (like windmills or solar
Re: (Score:2)
Re: (Score:2)
You always have two flanks, one on each side, because that's what flanks are. You either need to anchor your flanks on something that the enemy can't get around, such as a river or a ravine, or you need to station some troops to guard your flanks unless you want to have the enemy attack there and roll your entire line up. Using robots there would be reasonable because anybody they detect coming at the