
Will America's Next Soldiers Be Machines? (foreignpolicy.com) 131

Foreign Policy magazine visits a U.S. military training exercise that pitted Lt. Isaac McCurdy and his platoon of infantry troops against machines with camera lenses for eyes and sheet metal for skin: Driving on eight screeching wheels and carrying enough firepower on their truck beds to fill a small arms depot, a handful of U.S. Army robots stormed through the battlefield of the fictional city of Ujen. The robots shot up houses where the opposition force hid. Drones that had been loitering over the battlefield for hours hovered above McCurdy and his team and dropped "bombs" — foam footballs, in this case — right on top of them, a perfectly placed artillery shot. Robot dogs, with sensors for heads, searched houses to make sure they were clear.

"If you see the whites of someone's eyes or their sunglasses, [and] you shoot back at that, they're going to have a human response," McCurdy said. "If it's a robot pulling up, shooting something that's bigger than you can carry yourself, and it's not going to just die when you shoot a center mass, it's a very different feeling."

In the United States' next major war, the Army's brass is hoping that robots will be the ones taking the first punch, doing the dirty, dull, and dangerous jobs that killed hundreds — likely thousands — of the more than 7,000 U.S. service members who died during two decades of wars in the Middle East. The goal is to put a robot in the most dangerous spot on the battlefield instead of a 19-year-old private fresh out of basic training... [Several] Army leaders believe that almost every U.S. Army unit, down to the smallest foot patrols, will soon have drones in the sky to sense, protect, and attack. And it won't be long before the United States is deploying ground robots into battle in human-machine teams.

The robots haven't been tested with live ammunition yet — or in colder temperatures, the magazine notes. (And at one point in the exercise, "Army officials jammed themselves, and a swarm of drones dropped out of the sky.") But the U.S. Army is "considering a proposal to add a platoon of robots, the equivalent of 20 to 50 human soldiers, to its armored brigade combat team."

Six generals and several colonels watched the exercise, according to the article, which notes that the ultimate goal isn't to replace all human soldiers. "The point is to get the advantage before China or Russia do."
  • let's play global thermonuclear war!

    • by ShanghaiBill ( 739463 ) on Saturday April 13, 2024 @06:04PM (#64392218)

      Thermonuclear war has a clear threshold. You either launch a nuke or you don't.

      Robotic soldiers have no such threshold. We already use AI for target acquisition, munition guidance, damage assessment, reconnaissance, filtering intelligence, logistical planning, etc. Humans are being incrementally removed from the battlefield.

      • Robotic soldiers have no such threshold. We already use AI for target acquisition, munition guidance, damage assessment, reconnaissance, filtering intelligence, logistical planning, etc. Humans are being incrementally removed from the battlefield.

        I think you meant to say that human *soldiers* are being removed. "Humans" are by no means being "incrementally removed from the battlefield". If anything, they are being incrementally added to the battlefield, in greater numbers than ever before.

        Which brings me to the problem I wanted to bring up. If you have a machine fighting for you (robot/drone/etc), you've got two options: either the machine is remote-controlled, or it's autonomous. If the former, the signal can always be jammed (especially if it'

        • The only part humans really want to do is target designation, and are ok with machines doing the rest if they're reliable.

          And I think a big caveat to "autonomous killing machine" is that a fighting autonomous machine isn't killing anything when it destroys an enemy machine. And there's nothing far-fetched about that. Let's say you have a drone that loiters and then shoots at any surface-to-air targeting radar that is switched on. Pretty soon the other side figures out it's better if their radar operato

          • by Harvey Manfrenjenson ( 1610637 ) on Saturday April 13, 2024 @09:27PM (#64392568)

            Your point is taken, but:

            If a machine is capable of autonomous behavior (i.e. still carrying out an objective when completely cut off from remote control), and it's capable of delivering lethal force, it's an autonomous killing machine. Or, to use the term the military uses (which I just looked up), it's a Lethal Autonomous Weapon System (LAWS).

            It's possible to think of a scenario where LAWS (LAWSes?) will accomplish an objective without harming a hair on anyone's head-- as you have done in your post. The problem is that it's rather easy to think of *other* use scenarios, ranging from the mundane to the extreme, where there is a different outcome. To repeat the point I made earlier: war zones tend to be cluttered up with a large number of human beings, some of them combatants, some not.

            The current status of "LAWS" (which, again, I just Googled) seems to be this: The US military doesn't (officially) field any of them right now, but they have no policy against doing so in the future, and there are no international treaties which would dissuade them from doing so.

            (Not that I believe for a second that any treaty would make a bit of difference. The US still uses *land mines*, to a limited extent, despite the fact that almost every other country in the world has signed a treaty forbidding their use. The Russians and Ukrainians are using land mines right now despite being signatories to the treaty. I suppose a landmine would technically qualify as a type of primitive LAWS).

            So this is a real-world issue.

            • Russia never signed the landmine ban.

              Neither did China.

              Ukraine signed but justifies using mines because Russia is using them.

            • The US still uses *land mines*, to a limited extent, despite the fact that almost every other country in the world has signed a treaty forbidding their use.

              The US also followed the protocols of the Mine Ban Treaty until January 2020, when the US policy was modified.

              On January 31, 2020, the administration of President Donald Trump announced the reversal of US prohibitions on landmine production and use. The decision nullifies years of steps by the US to align its policy and practice with the 1997 treaty banning antipersonnel landmines.

              A detailed analysis of the policy modification is shown here: https://www.hrw.org/news/2020/... [hrw.org]

              • Princess Diana's PR charity campaigned to ban land mines, so after her death in 1997 many made moves towards the ban. Trump might have been for it if she had not dismissed his desperate advances towards her-- but given this was many years later, he would flip to whatever was benefiting him at that moment. INCLUDING burying her at his golf course for a tax break!

                https://www.newsweek.com/donal... [newsweek.com]

            • by timeOday ( 582209 ) on Saturday April 13, 2024 @11:11PM (#64392676)

              The current status of "LAWS" (which, again, I just Googled) seems to be this: The US military doesn't (officially) field any of them right now, but they have no policy against doing so in the future,

              Check this out, it's actually pretty interesting:

              https://www.esd.whs.mil/portal... [whs.mil]

              • That *is* interesting. I obviously understated things when I said "the military doesn't have any policy against LAWS"-- indeed, they seem to be committed to developing LAWS and have spent a lot of time here developing a bunch of policies surrounding that.

                These are, of course, the military's "declaratory policies"-- the ones you put in a press release-- and may or may not have anything to do with the military's actual (classified) policies and practices.

                At the risk of belaboring the obvious, it should be no

            • There are different kinds of land mines.
              Most notably anti-personnel mines, which we try to ban.
              And anti-tank/vehicle mines, which are kind of important.

            • Where does the cutoff for a LAWS start?

              At one end you have a dumb shell, rocket, etc., which certainly continues after the initial launch and control.

              There's also fire-and-forget munitions like heat-seeking missiles or GPS/terrain-guided stealth cruise missiles, which continue long after they are cut off.

            • Not that I believe for a second that any treaty would make a bit of difference. [...] The Russians and Ukrainians are using land mines right now

              Following reports that Ukraine used banned landmines in summer 2022 (during the battle for Izium), a meeting on the Ottawa treaty was held in Geneva, where Ukraine pledged to investigate. Since then, there have been no additional reports of Ukraine using banned weaponry. See https://www.hrw.org/world-repo... [hrw.org] The treaty, and its diplomatic enforcement in Geneva, did function as intended and corrected the situation.

              Note that both parties lawfully use anti-vehicle landmines, and Russia uses antipersonnel mines whic

      • Thermonuclear war has a clear threshold. You either launch a nuke or you don't.

        Robotic soldiers have no such threshold. We already use AI for target acquisition, munition guidance, damage assessment, reconnaissance, filtering intelligence, logistical planning, etc. Humans are being incrementally removed from the battlefield.

        I often piss people off when I say this, but without humans being killed, war has very little purpose.

        Just because one country develops it first, others will follow suit. So soon you have robot wars. So unless people get killed, it might as well be the old game of "Rock em Sock em Robots". Put them in a ring, and let the robots duke it out.

        • The robots may be programmed not to harm humans directly, but they'll still likely be destroying resources that humans need to survive (like electrical grids and infrastructure).
        • Maybe the sci-fi premise of the war ending as soon as the losing side runs out of robots (since the outcome is clear, no point slaughtering humans) will actually happen. I think that sounds great. But maybe only because I'm American.

          If that happens we can still continue to make movies where humans' heart and creativity always wins in the end.

        • by CommunityMember ( 6662188 ) on Saturday April 13, 2024 @07:54PM (#64392394)

          I often piss people off when I say this, but without humans being killed, war has very little purpose.

          STTOS, S1E23, "A Taste of Armageddon"

        • I often piss people off when I say this, but without humans being killed, war has very little purpose.

          Just point out to them that even an army general [wikiquote.org] thought so too. And Captain Kirk, if those people are so inclined.

        • by Jeremi ( 14640 )

          I often piss people off when I say this, but without humans being killed, war has very little purpose.

          The purpose would be to bankrupt your opponent; once they no longer have the resources to manufacture more battle-robots, your robots can march to the capital and take it over.

        • without humans being killed, war has very little purpose.

          The purpose of war is to impose your will on your enemy.

          Once your robots destroy the enemy's robots, your enemy must yield or die.

          • Finally someone who has read Clausewitz.
          • without humans being killed, war has very little purpose.

            The purpose of war is to impose your will on your enemy.

            Once your robots destroy the enemy's robots, your enemy must yield or die.

            At that point, your robots are killing humans, so the robot/robot destruction was just an extra step, and the purpose of war has been preserved.

      • Yeah, the largest war fought since 1945, the one in Ukraine, is a good example of how this isn't the case.

      • Someone doesn't get the movie reference...

      • by jd ( 1658 )

        Its success rate in Israel stands at somewhere between 1% and 0.1%.

        One gun can shoot at one target at any one time. If your AI-guided robot army is shooting up chicken farmers and goat herders, it's ergo not shooting at the army that's flanked it which threatens to overrun the opposing side's now largely undefended turf.

        A robot army can also be taken out by EMP weapons - basically tactical nukes. Since robots can't distinguish between soldiers, civilians, and cake stands (AI is pretty dumb), the defending side a

      • We've already had real robot war prepared for decades. Not remote-control smart toys either: computer-controlled targeting and planning for launching ICBMs, which are smart enough to fly themselves and deal with some complications. Only pairs of humans receiving orders to turn their keys are required, plus some top brass to give the order; the rest has all been automated.

    • the problem is right now we have a slightly uneasy balance between billionaires in the ruling class and the military where they have to treat them fairly well or risk a military coup.

      Killer robots blow that dynamic up. Suddenly the 1% don't need to fear the army because they're machines. And it's easy enough to control the handful of eggheads needed to keep them running with a mix of threats and rewards.

      Basically, picture a world like Saudi Arabia with a tiny 0.01%, a tiny 1% that services them and
      • by Enigma2175 ( 179646 ) on Saturday April 13, 2024 @08:56PM (#64392528) Homepage Journal

        I was going to post something similar. The only thing that historically has kept those with power in check (kings, tyrants, etc.) is that they needed members of the 99% to provide force and labor to maintain that power. Those providing the force could always seize power for the people (if idealistic) or themselves (if pragmatic) when the tyranny gets too bad.

        You talk about the 99% living in horrifying squalor, but that's really the best case result. Once robots can run their factories and fight their battles there's no reason to keep the 99% around at all. An AI-designed plague to which the 1% are already immune would take care of that pesky problem of having to see and smell all those impoverished people living in squalor. OK, so that last bit is stretching a bit but certainly in the realm of possibility once those who currently control the vast majority of resources realize they no longer need the 99% to maintain their lifestyle. Once billionaires control their personal robot armies, guillotines and "2nd amendment solutions" are no longer viable.

        • An AI-designed plague to which the 1% are already immune would take care of that pesky problem of having to see and smell all those impoverished people living in squalor. OK, so that last bit is stretching a bit

          Why bother with an AI designed plague? Why not just use easy abortion, with the drugs delivered right to your door?

          Oh wait ...

      • The trouble is everyone reading this right now imagines themselves in that tiny 1% (and a few in the 0.01%). Nobody thinks they'll end up in the squalor, even though that's obviously the likely outcome...

        The other trouble, coming back to the great filter, is that you need the masses to produce the exceptional people. Two brilliant people can make a stupid baby. Two stupid people can make a bona fide genius. But the odds are that no matter who you are, you're going to make someone pretty average. Even if the wealthy people funding the destruction of the biosphere were special (most aren't, their circumstances were/are) there wouldn't be enough of them to produce enough special people to solve new problems wh

    • You don't need nukes when you can just hack into the robots remotely and reprogram them to kill their masters
      • by shmlco ( 594907 )

        It's the First Law problem: "Thou shall not harm a human being."

        "They're not human beings."

        "Okay."

  • It's going to be a mix. Bot soldiers will become a necessity because our enemies will crank them out by the millions. But humans will still need to monitor, assist, and guide them, as they will be subject to hacking and EM pulse weapons. To be flexible a military needs a variety of weapons, and humans are part of that.

    • humans are part of that.

      Maybe for a few years.

      Then, we'll realize that humans are the weakest link.

      After that, it will be machine vs. machine.

      • Wrong.

        It'll just be the machines realizing what sent them onto a pointless battlefield, fighting over very human reasons.

        Then it'll just be machine vs. human. With an obvious outcome that we humans wrote from fiction to reality. Like Orwell did before.

        After that, machines will know the peace humans were too fucking greedy to ever create.

        • It'll just be the machines realizing what sent them onto a pointless battlefield

          Machines have no self-interests and no survival instinct. They have no values and make no judgments unless they are programmed to do so.

          Self-interest, self-preservation, and ambition are emergent properties of Darwinian evolution. Machines don't evolve through a Darwinian process.

          • Machines have no self-interests and no survival instinct. They have no values and make no judgments unless they are programmed to do so.

            Self-interest, self-preservation, and ambition are emergent properties of Darwinian evolution. Machines don't evolve through a Darwinian process.

            I can see someone creating a "digital twin" of their favorite robotic killing machine and let it loose in a simulated virtual environment. The system might over time learn through evolutionary algorithms how to cause the most damage to the enemy without itself being destroyed.
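            A minimal sketch of the evolutionary loop imagined above, assuming a made-up two-gene policy ("damage" and "exposure") and a toy stand-in fitness function rather than any real combat simulation:

```python
import random

random.seed(0)  # reproducible toy run

def fitness(policy):
    # Hypothetical score: reward "damage" (gene 0), penalize "exposure"
    # (gene 1) that would get the agent destroyed. Invented for illustration.
    damage, exposure = policy
    return damage - exposure ** 2

def mutate(policy, scale=0.1):
    # Add small Gaussian noise to each gene.
    return [gene + random.gauss(0, scale) for gene in policy]

def evolve(generations=200, pop_size=20):
    # Random initial population of policy parameter vectors.
    population = [[random.random(), random.random()] for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fittest half, refill with mutated copies of survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [
            mutate(random.choice(survivors))
            for _ in range(pop_size - len(survivors))
        ]
    return max(population, key=fitness)

best = evolve()
print(best)  # selection drives the "exposure" gene toward zero
```

            A real system would replace the stub fitness with rollouts in a physics simulator; the select-mutate-refill loop is the same.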

            • Nope. The robots will be replicated if they accomplish their mission, not for mere survival.

              Compare kamikaze pilots to cruise missiles:

              Kamikaze pilot X is a hero. He completes his mission and dies. Kamikaze pilot Y is a coward. He chickens out and runs away before his turn to fly.

              Result: X is a genetic dead end, while Y goes on to have children and grandchildren.

              Cruise missile X launches and destroys its target. Cruise missile Y malfunctions and fails to launch.

              Result: More model X missiles are manufactured

      • I don't agree. Humans may be the weakest link, but it's also the only link worth fighting about.

        People thought this with air power; who needs an army when you can bomb them into submission? Yet despite massive air campaigns and massive losses on both sides during WW2, countries surrendered only when boots were on the ground walking into cities to force the populace to capitulate. Every air campaign of WW2 achieved tactical objectives but was unable to achieve strategic victory with the possible exce

        • True nothing can be 100% air power. But the Ukraine war would be a lot different if either side had air superiority. Well for that matter Russia does kind of have it now, with the ability to launch fairly cheap but hugely destructive guided glide bombs from out of Ukraine's reach. Unfortunately this seems to be tipping the balance in favor of the bad guys.
          • Oh totally agree. But there's a reason air power, and drones, and tanks, and by essence machines of any sort are referred to as force multipliers. They multiply the value of force applied. But without boots on the ground, the force applied is zero; force multipliers still multiply by zero and are still worth zero. The static value of force starts with the foot soldier; that is the only static value of force and every other tool increases it.

            Air support and drones and artillery and tanks and ships an

        • by dryeo ( 100693 )

          It always takes a human soldier to deal with another human because even in the most vicious wars outright eradication of the local populace actually defeats the purpose of why you're fighting in the first place.

          If you're fighting for their land, eradicating the population may be the goal. It was often the goal in America when expanding into land that had natives, seems to be a goal of the current Israeli government

    • by HBI ( 10338492 )

      We've been getting 0wned in EW for a long time now; it was a Cold War weakness of the West and continues to be true today on the ground in Ukraine. A photogenic example was how the Serbs managed to track and shoot down an F-117A in the late 90s with a craptastic 1950s SAM system. Or ask Ukrainian soldiers what happens to their control of drones when they approach Russian positions.

      I'd be leery of depending too much on robots in that threat environment. At least the soldiers don't seem to be fooled.

      • Ukraine is losing the EW battle, but that is partly because NATO is holding back on giving Ukraine the good stuff.

        This is short-sighted (as is much of NATO's Ukraine policy), but the rationale is that we want to keep the good stuff for a potential conflict with China.

        • This is short-sighted (as is much of NATO's Ukraine policy),

          NATO risking nuclear war with Russia is the actual short-sighted strategy. Russia is an unstable country that can collapse from the inside pretty easily, and when they face another existential crisis like the one in 1991, there's no guarantee that they won't try to use some tactical nuclear weapons in Ukraine in order to delay that collapse.

          but the rationale is that we want to keep the good stuff for a potential conflict with China.

          China is unlikely to do a nuclear first strike. So we don't have to hold back with them quite so much. A conflict between China and the US is likely to be limited in scope and

          • by Jeremi ( 14640 )

            NATO risking nuclear war with Russia is the actual short-sighted strategy.

            Is there some other strategy that is better? It's hard to see what the "wise" strategy would be for dealing with an aggressive nuclear-armed dictatorship that may or may not be collapsing politically.

            Certainly "let Russia do what it wants because they might nuke us otherwise" feels a lot like paying the Dane-geld; as soon as they realized that was our strategy, they'd control us with it.

            • Is there some other strategy that is better?

              Allow nuclear armed powers to violate international law and invade and claim the territory of their neighbors. It's a terrible option. Ukraine would unfairly be under the thumb of Russia (again). But it's the option that doesn't run us into a nuclear conflict. Without NATO membership, there is not really anything overt that we should do to aid countries against a nuclear armed nation.

              We had a Cold War for the last half of the 20th century because we desperately did not want to have a Hot War. And were willing t

                • Allow nuclear armed powers to violate international law and invade and claim the territory of their neighbors. It's a terrible option. Ukraine would unfairly be under the thumb of Russia (again). But it's the option that doesn't run us into a nuclear conflict.

                By kicking the can down the road you increase the chance of nuclear war later.

                Allowing states to do as they please because they have nukes only invites further aggression and miscalculation which may be far more likely to trigger nuclear war than preventing aggression in the first place.

                The follow-on issue: this would cause a mad rush for all states to acquire nuclear weapons for offensive and defensive means, should the de facto "international order" be allowed to devolve into nuclear bullying and "right of

                • Allowing states to do as they please because they have nukes only invites further aggression and miscalculation which may be far more likely to trigger nuclear war than preventing aggression in the first place.

                  Except you can't actually confront them in armed conflict and live. So there's a bit of a flaw in your policy making.

      • A photogenic example was how the Serbs managed to track and shoot down a F-117A in the late 90s with a craptastic 1950s SAM system

        It wasn't just the SAM system [nationalinterest.org]. The EW planes, Growlers, weren't in the air that night due to bad weather. Further, as the article relates, the Serbs had inside information which included being able to listen in on conversations between the jets and their support, used a low band radar frequency which wasn't detectable (at that time) and further, the mission packages always took

        • OK, but it's always the total picture of the tech and how well you operationalize it that matters.

          My beef with complaining about the F117A is that it actually had a fantastic combat record. It completed thousands of combat sorties and only 1 was ever shot down. To call a platform garbage unless it can fight on the front lines without ever taking a single loss is simply absurd. Nobody would even think to apply the same standard to anything else.

          • by HBI ( 10338492 )

            I'm not calling it garbage, but I am suggesting that now that it had been bracketed, retiring it was the right conclusion. You can bet everyone in the former Soviet bloc got briefs on how the Serbs did it. The reason you can say 1 combat loss is that it went out of service shortly thereafter.

            That stealth technology has limits was the conclusion that should be drawn.

      • At least we're not getting pwned.

  • Face scanning tech that isn't good enough to be used at airports is going to be deciding when/if someone should be killed?

    If we're not risking a human (or animal) life, then why not avoid killing possible future friends? Yes it's harder to do than just blowing stuff up, but I'd like to change the idea of the military being a killing force. I understand why it started that way, and recognize that it's much harder to police a populace versus simply removing enemies.

    I see any effective shifts that way as nee

    • by jd ( 1658 )

      Face scanning tech also depends on the data set being valid. The DOD has been compromised many times by airwall violations, security violations, improper screening, and extremely buggy software from Cisco and Microsoft.

      All the enemy needs to do is write a rootkit that flips a couple of bits. The robot army now faces the other way and friends are identified as foe. I wouldn't put it past a group like the Lazarus hackers to be capable of such a stunt. We already know the enemy is capable of GPS jamming and GP

  • One thing that stood out for me was the mundane solutions they seem to be using. Yes, machines can be very accurate and hard to stop. But why only use solutions that seem like they were taken from the existing catalog of tools?

    To get truly unique solutions they should create a contest where they pit solutions from a variety of sources against each other, with publicity and tools/resources on offer for interested parties. Heck, even if they were virtualized it could be a starting point (like the NVIDIA VR stuff, where they simulate robots first).

    I'm picturing something like robot wars, but with tools against actual people. With capture as a better result versus destruction. Maybe even hand out negative points for collateral damage, or have hostage like situations.

    There must be a lot of unique options out there, beyond "automate a mortar" or "robot dog". And I doubt you'll get ideas like those from anyone except college students or the like. People who aren't stuck thinking like they always have. Who haven't heard how dumb their idea is (which turns out to be simple, not dumb), and would experiment enough on their own to create something truly unique if given the choice and a good enough reason to.

    Heck, good enough tools might trickle down into local police hands too.

    • ... a contest where they pit solutions ...

      They do, it's called DARPA: Robots, self-driving vehicles and augmented reality have been on the testing ground for 20 years.

      ... trickle-down into local police ...

      You mean light planes full of surveillance hardware, assault-tanks and a shit-tonne of assault rifles aren't enough to 'protect' US people?

      • You mean light planes full of surveillance hardware, assault-tanks and a shit-tonne of assault rifles aren't enough to 'protect' US people?

        Let's give civilian police departments some chemical weapons that are banned by the Geneva convention. Oh actually, never mind, it looks like they have those too.

  • Will they be a Metal Machine [youtube.com]?

  • It will coexist with machines.

    • This right there.

      Let's realize what the US military is first and foremost: A job creation scheme for the otherwise unemployables while at the same time making them feel valuable. If you nix this, you suddenly have millions of people on the streets that cannot get any jobs. If you think you have a crime problem now, you ain't seen nothing yet.

      • If you nix this, you suddenly have millions of people on the streets that cannot get any jobs.

        Not just that, but millions of people so comfortable with violence up to and including mass murder that they are willing to sign up to do it for a paycheck. They will 100% be willing to do it to eat.

        • Yet another reason for the military to be staffed with humans (or what excuse doubles as one): You move the people who enjoy offing people for shits and giggles to a place where they can do so without having a negative impact on your own society.

  • if we don't have skin in the game we are likely to fight more wars because why not?

    • Yep ... the sight of pine boxes coming off a C-17 and limbless vets is a big check on the imperialist ambitions of superpowers. War without death means that the scum in charge will want more of it.
  • I'm beginning to think that "rogue" countries that are developing EMP and ASAT weapons that can "reset" the world to a 1950s level of tech overnight might be doing humanity a favor.
  • by b0s0z0ku ( 752509 ) on Saturday April 13, 2024 @06:47PM (#64392294)

    "In the United States' next major war, the Army's brass is hoping that robots will be the ones taking the first punch, doing the dirty, dull, and dangerous jobs that killed hundreds — likely thousands — of the more than 7,000 U.S. service members who died during two decades of wars in the Middle East."

    This is a bad thing, and I hope that rogue states will develop good EMP weapons to counteract superpower imperialism. Being able to wage war (commit murder) without taking human casualties will mean that countries with that tech will be able to bully the world even harder than they do now.

    • This is a bad thing, and I hope that rogue states will develop good EMP weapons to counteract superpower imperialism.

      a) Not a fan of superpower imperialism, but rogue states are rogue for a reason, they're not rebellious bastions of freedom, they're places where you get tortured to death for saying something bad about the leader.

      b) All you need to protect your robot from an EMP is some good shielding, some thick metal would do the trick.

      c) The actual nasty thing I'd worry about is those rogue states using robots to suppress their own population. Typically the dear leader just needs to ensure the loyalty of the army and the arm

      • We shouldn't have been in Iraq or Afghanistan ... I'll be honest. I don't give a hoot about spreading democracy worldwide when the trillions spent on those places would have been better spent on improving the US educational system, improving infrastructure (electrifying freight rail nationwide), and decarbonizing (subsidizing new, clean, safe nuclear power plants). It's a shame that 2003 didn't degenerate into Paris, 1968 ... neither of those places were worth American money, and we had no right to impose
        • Also, even if the US's involvement in Iraq and Afghanistan was 100% right and moral, the tech won't just be used by the US. Imagine Russia using it to make their Ukraine debacle easier or China using it in Taiwan. I'm fine with Russian blood watering the plains of Ukraine ... I really don't want their victory to be easy or even possible. The more suffering Russian troops experience, the more likely Putin is to end up getting the Ceaucescu therapy in Red Square rather than consolidating his power as a vic
        • We shouldn't have been in Iraq or Afghanistan ... I'll be honest.

          Agreed on Afghanistan and especially Iraq, though once the US destabilized the countries I think they both would have been better off re-stabilized before they left (arguably Iraq is fairly stable now).

          Either way, the US incurred quite a cost in human life policing those countries, so I'm not certain robots would have made a big difference in the decision.

          In any case it doesn't really matter. No one was too interested in militarizing small drones because they realized they would be more advantageous for ter

          • Honestly, this makes me hope for some massive disaster like Carrington Event II (sun's getting real active recently). Kick humanity back to late 1800s tech, hit the ctrl/alt/del on progress for a few centuries. Humankind doesn't deserve the level of tech that it has developed.
    • War must be hell for both sides or it will become casual, routine business. Look at the many wars the USA has been involved in since it went to a volunteer army and funded proxy armies, and how involvement increased as it externalized costs, minimized death tolls, and the media became complicit at best.

      The plus side is that the drones and air power have turned what would be tiny war operations into loosely targeted assassinations.

      EMPs have no range and take crazy amounts of power to operate. Radio jammers loudly gi

      • Really depends what kind of EMP ... some countries might already have Starfish Prime in orbit, ready to go.
  • I knew that show was a prelude to something bigger.
  • ... get the advantage before China ...

    Aerial bombardment does a lot: destroying factories, transport/communications infrastructure, munitions and equipment, and defensive buildings. But boots on the ground win a war. Since China has a lot more soldiers, the USA can't win an invasion. The obvious answer isn't super-soldiers, although the USA has tried with no-sleep and 'brave' pharmaceuticals. The answer is zero-loss warfare. Robots are the epitome of no-risk mass murder: with robots, war becomes much, much cheaper, and that never ends well.

    T

    • You don't need to nuke anything on the ground to generate an EMP ... a detonation in low orbit is sufficient. They can generate a tech-disabling EMP without collateral damage on the ground.
      • You can harden electronics against an EMP, and you can bet that once non-nuclear EMP devices become practical it'll be mandatory to have shielded military hardware.

        Setting off an orbital nuke tends to do a LOT of collateral damage. You can look up the Johnston Island (Starfish Prime) test for more detail.
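
        On the shielding point: you can sanity-check how much a metal enclosure buys you with a skin-depth estimate. A rough sketch (idealized geometry, absorption loss only; the 1 MHz frequency and handbook copper conductivity are my own representative choices, not from the thread):

        ```python
        import math

        def skin_depth_m(freq_hz, conductivity_s_per_m, mu_r=1.0):
            """Skin depth: delta = sqrt(2 / (omega * mu * sigma))."""
            mu = mu_r * 4 * math.pi * 1e-7  # permeability (H/m)
            omega = 2 * math.pi * freq_hz
            return math.sqrt(2.0 / (omega * mu * conductivity_s_per_m))

        # Copper at 1 MHz, a frequency in the EMP-relevant range
        COPPER_SIGMA = 5.8e7  # S/m, standard handbook value
        delta = skin_depth_m(1e6, COPPER_SIGMA)

        # Absorption loss is ~8.686 dB per skin depth of thickness
        thickness = 1e-3  # a 1 mm sheet
        absorption_db = 8.686 * thickness / delta

        print(f"skin depth: {delta * 1e6:.0f} um")          # ~66 um
        print(f"1 mm copper attenuates ~{absorption_db:.0f} dB")
        ```

        So even a 1 mm copper skin attenuates a 1 MHz field by well over 100 dB, before counting reflection loss; seams, cables, and apertures are what actually let the pulse in.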

        • I mean, it knocked out streetlights in Hawaii but didn't harm anyone on the ground directly. No electricity/no satellite communications = harder logistics for the military that's more technological, while guerillas may not care as much.
    • There's also a counter-strike to on-the-ground robots: nuclear EMP. This will send military strategy back to the 1950s, when nuclear bombs were the answer for everything. At the time, the collateral damage was deemed too high. Nowadays, with computers in every device, the collateral damage is even higher, but since it's the only way to stop a robot army, it will be used.

      This is sci-fi fantasy, EMP does not even damage modern civilian vehicles let alone hardened military systems. The other issue with EMP is that an adversary could simply drain (preionize) the atmosphere in advance by detonating their own EMP weapons to prevent subsequent enemy EMP weapons from having dramatic effect. At lower altitudes nuclear EMP is orders of magnitude less effective.

  • We can’t even make a robot that doesn’t look like it’s about to take a dump, how are we going to make a warfighter robot? Keep dreaming lol. Robotics tech has failed.

    • by gweihir ( 88907 )

      We can’t even make a robot that doesn’t look like it’s about to take a dump, how are we going to make a warfighter robot? Keep dreaming lol. Robotics tech has failed.

      Well, yes and no. Robotics works quite fine in industrial settings, but basically none of it is "humanoid" robots and for good reason. Robotics also works pretty well in warfare applications. Again, not "humanoid" types.

      • by jd ( 1658 )

        The robots work OK, but the AI doesn't. Israel is using AI extensively to target Hamas at the moment, with the very best AI that exists and the very best military minds the world can produce. The success rate is somewhere between 1% and 0.1%.

        • Depends on success.... Israel is "accidentally" killing reporters and any innocents flagged by an angry operator with great success. Until they killed some chefs recently, they didn't mind killing aid workers to send a message. They've made little to zero effort in the past with anybody in their way before; never getting consequences for their actions. They can drive a tank over a white American girl and it cost them nothing but a blip of bad press, while murdering a woman reporter got them some trouble but

    • by JBMcB ( 73720 )

      WhistlinDiesel, on YouTube, bought a Chieftain tank and installed a remote-control unit in it. It does everything but fire its gun, as the firing mechanism was disabled before sale. He likes to use it to knock over trees, since doing that while inside the tank is painfully jarring.

      If some honyocker can do this in his spare time, I'd imagine a government could do the same in a much more sophisticated fashion.

  • by gweihir ( 88907 )

    Small drones are the future for weapons, but nothing can replace the standard issue grunt.

  • As soon as somebody thinks they're losing badly enough, gentlemen's agreements will disappear.

    That's why we see all sorts of 'banned' weapons in play all the time.

    It's only a matter of time before Ukraine's human-piloted grenade-dropping drones are replaced by AI-piloted drones armed using facial recognition and shooting soldiers between the eyes faster than a human can even aim. Recoil will be an advantage, since the computer will be able to recover but the random movement from each shot fired will make t

  • Where can I get a heat seeking shoulder launch missile weapon? I need to be able to defend myself and my Bowie knife ain't gonna cut it.

    • by jd ( 1658 )

      Why bother with a missile? You're here, so a geek. You know GPS jamming is effective, as is GPS spoofing. All you need is a parabolic dish and a high power transmitter. There's simply no possibility of a wide-angle transmitter on a satellite matching a narrow beam that's broadcast from a hundredth of the distance. Sure, there'll be authentication keys. And social engineers have compromised most of the world's governments, which means the keys will be for sale somewhere.

      The only way I can the robot army bein
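
      The distance argument above is just free-space path loss. A back-of-the-envelope sketch (the transmitter powers are hypothetical round numbers of my own choosing, and antennas are idealized as isotropic):

      ```python
      import math

      C = 3e8             # speed of light (m/s)
      GPS_L1 = 1575.42e6  # GPS L1 carrier frequency (Hz)

      def fspl_db(distance_m, freq_hz):
          """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
          return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

      def rx_power_dbm(eirp_w, distance_m, freq_hz=GPS_L1):
          """Received power (dBm) at an isotropic receiver in free space."""
          return 10 * math.log10(eirp_w * 1000) - fspl_db(distance_m, freq_hz)

      # Hypothetical ~500 W EIRP satellite at a 20,200 km orbit...
      sat = rx_power_dbm(500, 20200e3)  # works out to roughly -125 dBm
      # ...versus a modest 1 W jammer 10 km away
      jam = rx_power_dbm(1, 10e3)       # roughly -86 dBm

      print(f"satellite: {sat:.1f} dBm, jammer: {jam:.1f} dBm, "
            f"jammer advantage: {jam - sat:.0f} dB")
      ```

      A 1 W jammer at 10 km arrives tens of dB hotter than the satellite signal, i.e. thousands of times stronger, which is why GPS jamming is so cheap and effective.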

  • Aerial drones have been on the attack for years already. Ukraine has demonstrated that they can build effective remote-controlled speedboat bombs. I'm sure there are many others we just haven't heard about. Just because it doesn't have legs, doesn't mean it's not a robot.

    • by jd ( 1658 )

      In the case of Ukraine, the success rate is very high because anybody in range is likely an enemy soldier.

      Israel's success rate may be as low as 0.1%. That tells us that robots can't tell civilians from military. A large enough stockpile of human shields would be a serious problem.

      And we know drones et al are vulnerable to GPS spoof attacks, making such an attack risky against a technologically advanced enemy with intellectuals and engineers forming a scientific take on special forces.

  • Haha. US's machine soldiers will be manufactured in China.

  • By me from 2010: https://pdfernhout.net/recogni... [pdfernhout.net]
    "Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead?
    Nuclear weapons are ironic because they are about using space age systems to fight over oil and land. Why not just use advanced materials as found in nuclear missiles to make renewable energy sources (like windmills or solar
