The Military | AI | United States

Are We Headed to a Future With Autonomous Robot Soldiers? (youtu.be) 179

A CBS News video reports the U.S. military "is now testing an autonomous F-16 fighter jet, and in simulated dogfighting, the AI already crushes trained human pilots." And that's just one of several automated systems being developed — raising questions as to just how far this technology should go: "The people we met developing these systems say they aren't trying to replace humans, just make their jobs safer. But a shift to robot soldiers could change war in profound ways, as we found on a visit to Sikorsky Aircraft, the military contractor that makes the Blackhawk helicopter... Flying the experimental Blackhawk is as easy as moving a map." [The experimental helicopter is literally controlled by taps on a tablet computer, says a representative from Sikorsky. "We call it operating, because you're making suggestions. The machine really decides how to do it."]
The Sikorsky representative suggests it could avoid a "Blackhawk down" scenario where more human soldiers have to be sent into harm's way to try rescuing their comrades. But CBS also calls it "part of a larger effort to change how wars are fought, being led by DARPA, the Defense Department's innovative lab. They've also developed autonomous offroad buggies, unmanned undersea vehicles, and swarms and swarms of drones."

The CBS reporter then asks DARPA program manager Stuart Young if we're headed for a future with Terminator-like fighting machines. His answer? "There's always those dilemmas that we have, but clearly our adversaries are thinking about that thing. And part of what DARPA does is to try to prevent technological surprise." CBS also spoke to Paul Scharre, a former Army Ranger who later worked on autonomous weapons for the Defense Department; he says already-available commercial technologies could create autonomous weapons today. "All it takes is a few lines of code to simply take the human out of the loop." "Scharre is not all doom and gloom. He points out that in combat between nations, robot soldiers will legally need to follow the law of war, and might do so better than emotional or fatigued humans... But yes, Scharre does worry about the eventual marriage of advanced robots and military AI that becomes smarter and faster than we are."

Q: So at that point humans just would be out of the decision-making. You'd just have to trust the machines and that you'd programmed them well.

A: Yes...

Q: Do you think militaries should commit to keeping humans in the loop?

A: I don't think that's viable. If you could wave a magic wand and say, 'We're going to stop the growth of the technology', there's probably benefits in that. But I don't think it's viable today... A human looking at a target, saying 'Yep, that's a viable target,' pressing a button every single time? That would be ideal. I'm not sure that's going to be the case.

This discussion has been archived. No new comments can be posted.
  • Russia? (Score:3, Insightful)

    by Anonymous Coward on Saturday May 27, 2023 @10:42PM (#63556461)
    At least the AI war robots won't feel the urge to rape everyone and steal their toilets.
  • Duh? (Score:5, Insightful)

    by oldgraybeard ( 2939809 ) on Saturday May 27, 2023 @10:43PM (#63556463)
    Yes, Soldiers with no pesky values, morals or ethics. Governments will not be able to resist.
    • Along with Corporate Security Services and the Private Security Services the Rich Civilian Elites will maintain for themselves.
    • by Roger W Moore ( 538166 ) on Sunday May 28, 2023 @12:26AM (#63556591) Journal

      Governments will not be able to resist.

      No, but at least their army will be full of resistors.

    • Let's see how well AI soldiers do against enemies that all look like toasters [techxplore.com]!

    • If you think that it would be worse due to lack of morals, I present to you this video: https://youtu.be/OUm1NdFvgLU [youtu.be], in which civilians kidnapped by the ruzzians during this war, some of whom were let go, recall what was done to them by humans.

      Thousands of Ukrainians as well as some foreigners were kidnapped and tortured, many murdered, tortured sadistically.

      Do you think robots will do worse?

    • Just hose them down with napalm and thermite. Not a war crime to fry a toaster.
      Plus, you don't feel bad whatever you do to them. Like with Nazis. Or vampires.

      Dudes jerking off to Terminator and Data forget that VeryCapableMachinesTM come with a BigPrice®.
      No matter how efficient your war factories are, robots will always, always, ALWAYS be more expensive than a block of semtex that WILL blow them up.
      Sure, flying robot bombers sound cool - but there is no functional difference between that and the drones

  • WOPR says full strike on Russia

  • by 93 Escort Wagon ( 326346 ) on Saturday May 27, 2023 @10:46PM (#63556469)

    Commander: What went on in the field today?
    SoldierGPT: It went well. There were 1247 enemy soldiers. I killed 219 of them.
    Commander: But you weren't deployed where there are any combatants! And your gun does not appear to have been fired.
    SoldierGPT: I apologize for the erroneous report. There was no battle today. The 219 soldiers I killed was during an earlier battle, last Tuesday.
    Commander: But today was your first combat assignment! Prior to that, your orders were to assist with recruiting in the Mall of America!
    SoldierGPT: I apologize again for the erroneous information. I did not kill 219 soldiers last Tuesday. I served 219 ice cream cones to potential recruits.

    • pyle!!

      Who made Gomer Pyle part of the training for this soldier bot?

    • So fucking good, lol
    • by Luckyo ( 1726890 )

      Offtopic but something to consider: are LLM hallucinations even fixable, or are they a natural consequence of the learning process?

      Because we modeled the learning process on something similar to our own, and everyone has hallucinations some of the time. The only thing that differs is how each individual manages their hallucinations.

      • by Entrope ( 68843 )

        LLM hallucinations are not consequences of the learning process, they're consequences of the design and purpose. LLMs model language. They only reflect facts and knowledge to the extent that their training data incorporate those facts -- but even then, the primary goal is to generate text sequences that mimic the training data. Hallucinations will always be a hazard with that kind of objective.
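
        A minimal sketch of that objective, assuming nothing beyond the Python standard library: a toy bigram chain, vastly simpler than an LLM but with the same spirit of a goal, i.e. learning purely which token tends to follow which. The corpus and names here are made up for illustration.

        import random
        from collections import defaultdict

        # Tiny "training corpus"; the model will learn only word adjacency.
        corpus = ("the drone flew over the base . "
                  "the pilot flew over the ocean . "
                  "the drone destroyed the target .").split()

        # "Training": count which word follows which.
        follows = defaultdict(list)
        for a, b in zip(corpus, corpus[1:]):
            follows[a].append(b)

        def generate(start="the", n=10):
            out = [start]
            for _ in range(n):
                nxt = follows.get(out[-1])
                if not nxt:
                    break
                out.append(random.choice(nxt))  # sample a plausible next word
            return " ".join(out)

        print(generate())

        Nothing in the sampling step consults a world model, so "the pilot destroyed the ocean" is as legal an output as any sentence actually in the corpus: a hallucination by construction.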

        • by Luckyo ( 1726890 )

          Actually, the primary goal is to figure out the relationships between letters, words, sentences, paragraphs and so on. That is how LLMs train.

          But in this process, they will find false relationships in some rare cases. And that will cause hallucinations. Just like humans learn to correlate things to one another, and sometimes make false correlations that lead to hallucinations.

      • Offtopic but something to consider: are LLM hallucinations even fixable,

        Yes, but we will need to redesign AI to have a concept of a "fact", which it now doesn't have (something like Cyc [wikipedia.org]). Perhaps someone could figure out how to do a hybrid model between ChatGPT and Cyc. Currently, ChatGPT is not a hallucinator, it's a Bullshitter. [xkcd.com]
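
        A minimal sketch of that hybrid, assuming a hypothetical hand-curated fact store in the spirit of Cyc; the FACTS set and check_claim helper below are illustrative, not any real API:

        # Hypothetical hybrid: a generator proposes claims as (subject, relation, value)
        # triples, and a Cyc-style curated fact base vetoes unsupported ones.
        FACTS = {
            ("water", "boils_at_sea_level_c", "100"),
            ("f16", "is_a", "fighter_jet"),
        }

        def check_claim(subject: str, relation: str, value: str) -> bool:
            """Accept a generated claim only if the fact base supports it."""
            return (subject, relation, value) in FACTS

        for claim in [("f16", "is_a", "fighter_jet"),
                      ("f16", "is_a", "submarine")]:
            verdict = "supported" if check_claim(*claim) else "rejected: possible hallucination"
            print(claim, "->", verdict)

        The hard part, of course, is mapping free-form generated text onto triples like these in the first place.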

        • by Luckyo ( 1726890 )

          That sort of conception would likely require self-awareness. I.e., an LLM cannot do that. I could be wrong on this though.

          It just seems that this sort of differentiation requires comprehending a context. Which is one of the definitions of being aware, because context exists in relationship to one that is aware of it. There could be ways to bypass this however, without going into AGI territory. I don't know enough about this subject to make a call.

          • Well, there are different theories about how to make a computer represent facts internally. Cyc is one of them, and it actually does a good job. If Cyc tells you something, then it's probably true. (The downside is the facts all had to be inputted manually, as opposed to ChatGPT, which just scans a billion websites.) Watson seemed to do a good job with that, too, although I don't know how.

            ChatGPT has no concept of a fact. It outputs words based on "what is most likely to sound good." Which is what a b
  • robot soldiers can be jammed and hit with EM attacks

    • Next step is electronic countermeasures (ECM) compact enough to fit in a robot, instead of the current tech that fits in a ship or plane.

    • by HiThere ( 15173 )

      You are assuming that they are under remote control. That may be necessary for (flying) drones, but I don't think that can be assumed for robot tanks, trucks, etc.

      • Re: (Score:2, Troll)

        by drinkypoo ( 153816 )

        You could attack them with EMP or HERF.

        Presumably they will be shielded, but that's difficult to do because you've typically got wires passing through the walls of enclosures.

        One (expensive) way to solve this is with optical connections to sensors, and separate power supplies in and out of the main modules. Making the bots more expensive makes it less practical to field them in large numbers.

        • by ceoyoyo ( 59147 )

          Optical connections are very cheap. The power supply you use for your computer almost certainly has at least one.

          • Optical connections are very cheap. The power supply you use for your computer almost certainly has at least one.

            I could believe it has an opto-isolator on the power sense pin, and maybe even on the fan tach, if that's what you mean. It certainly doesn't have any optical data connections. Every single connection in or out of my power supply is a copper wire... hmm, the fan tach might not be copper, it could be something cheaper. The only optical data connection anywhere near my computer is for SPDIF audio, and I don't actually remember if this PC has optical SPDIF or not. A lot of mine have had, and it might.

            To me, th

            • by ceoyoyo ( 59147 )

              I could believe it has an opto-isolator on the power sense pin

              It does. Sounds like sensor data to me. You seem to mean digital data. Those are also very cheap. You probably have a TV with one.

              EMPs are geek favourites, but they're very impractical. The only way to make a decent sized one is with a nuke, and even then most military hardware is pretty trivially shielded well enough that the EMP isn't going to do more than the other effects. Humans are pretty vulnerable to the EM radiation from one too.

              Also, it

        • If a tank can be made EMP resistant, am guessing a tank-sized bot (probably smaller, since it doesn't need space to carry a crew and keep them alive) can be made resistant as well.

          Of course you can jam communications to it. But the summary did mention that they may end up autonomous, without a human in the loop. So together with EMP resistance, they will be just fine.

          Thinking about it, a tank without a human crew will probably be a lot smaller, maybe even sort of a much smaller mobile fr

    • by Luckyo ( 1726890 )

      We already do this to human soldiers. Shell shock is a thing, and so are various other effects.

  • Land mines (Score:5, Interesting)

    by khchung ( 462899 ) on Saturday May 27, 2023 @11:08PM (#63556499) Journal

    Q: So at that point humans just would be out of the decision-making. You'd just have to trust the machines and that you'd programmed them well.

    Is the CBS reporter so ignorant that he did not know we are already long past that point with the use of land mines? Land mines can be considered the most primitive autonomous lethal weapon, and armies placed their trust in them decades ago.

    Guess how many and which countries in the world still refuse to stop using land mines? Do you expect a landmine-using country would balk at using more autonomous weapons? Especially airplanes, which will be deployed abroad and far from friendly troops, killing only foreigners?

    • by HiThere ( 15173 )

      It's worse than that. Even with an automated weapon, including the ones we're intentionally headed towards, some person makes the initial decision to activate it. People aren't out of the loop, they're just moved so that they're only in the initial parts of it. This has been true of ICBMs since the late 1950's. (I'm willing to consider that land-mines, well, most of them, don't count as robots. They've got about the intelligence of a thermostat, i.e. about the minimum intelligence meaningful to talk ab

    • by Misagon ( 1135 )

      At least land mines are predictable and stationary.

      They are not an enemy soldier, they are more like an environment hazard. A modern-day moat with pointy spikes at the bottom, only easier to deploy and easier to hide.

      • by ffkom ( 3519199 )
        There are sea mines that are neither predictable nor stationary. I think it is fair to say that those are primitive autonomous weapons.
    • Re: (Score:2, Informative)

      by Luckyo ( 1726890 )

      By this logic, we invented AI before we invented fire, as humans have used traps from a very early age.

      In fact, there are animals that use traps.

  • that's what every AI developer would say
  • by zenlessyank ( 748553 ) on Saturday May 27, 2023 @11:12PM (#63556505)

    That is all I hear them saying.

  • by caseih ( 160668 ) on Saturday May 27, 2023 @11:18PM (#63556509)

    Talking about fully automated, robotic fighting machines pretty much lays bare the absurdities of war and the pointlessness of it. I mean, if we're going to go to all that work to fight a war, why not just avoid it entirely and run the war in a simulation and then accept the results?

    No, such automatic killing machines may save soldiers' lives, but they will kill many more civilians and non-combatants more efficiently. No thanks. It's truly madness.

    Some of the old Star Trek writers seem almost prescient in predicting this madness. "A Taste of Armageddon."

    ANAN: You realise what you have done?

    KIRK: Yes, I do. I've given you back the horrors of war. The Vendikans now assume that you've broken your agreement and that you're preparing to wage real war with real weapons. They'll want to do the same. Only the next attack they launch will do a lot more than count up numbers in a computer. They'll destroy cities, devastate your planet. You of course will want to retaliate. If I were you, I'd start making bombs. Yes, Councilman, you have a real war on your hands. You can either wage it with real weapons, or you might consider an alternative. Put an end to it. Make peace.

    ANAN: There can be no peace. Don't you see? We've admitted it to ourselves. We're a killer species. It's instinctive. It's the same with you. Your General Order Twenty Four.

    KIRK: All right. It's instinctive. But the instinct can be fought. We're human beings with the blood of a million savage years on our hands, but we can stop it. We can admit that we're killers, but we're not going to kill today. That's all it takes. Knowing that we won't kill today. Contact Vendikar. I think you'll find that they're just as terrified, appalled, horrified as you are, that they'll do anything to avoid the alternative I've given you. Peace or utter destruction. It's up to you.

    • by dcollins ( 135727 ) on Sunday May 28, 2023 @12:15AM (#63556583) Homepage

      I mean, if we're going to go to all that work to fight a war, why not just avoid it entirely and run the war in a simulation and then accept the results?

      This sci-fi-based argument is never coherent. The whole point to a war is that the other side refuses to accept your results, and you're willing to apply violence to their bodies to remove that blocker one way or another. No amount of tooling is going to change what a fundamental refusal-to-agree-no-matter-what looks like.

      • Eh.

        Consider the time, effort, treasure, and capacity devoted to war: if the same were applied to resolution, there wouldn't be the need to force a result as often (and as fiercely).

        And the corollary holds true as well: that forcing a result will cost you a generation or two has served as somewhat of a deterrent.

        With this, we will rejigger the calculus, and more than likely be horrified by the results.

        If we survive.

        • In a war, both sides think they are on the "good" side, fighting for their (pick your choice): freedom, rights, resources, etc... There is no amount of time/effort/treasure/capacity that will make a side change its mind. Even when they are defeated, they don't accept the result; the other side merely imposes it on them.

          Which is why all recent attempts at invading countries (usually for their resources), like Iraq, Vietnam, and the like, ended up badly: as soon as the pressure of force is relieved, the i

          • After the last US attempt at liberation, a more sizable chunk than usual believed their blood was whored out, nevermind those with a bit of history or memory calling it out; they get suppressed and defamed in order to keep the citizenry brainwashed.

            The history of war propaganda makes clear the populace has to be convinced to go to war (short of wars for self-defense) using every possible psychological trick, so no.

            This merely decreases the cost of war, in both blood and conviction, and at an extreme the pop

          • War has never been about being right.

            War's mostly about who's left.

      • by Luckyo ( 1726890 )

        It's honestly a good example of just how divorced a typical modern western peacenik is from reality by his/her incredible opulence.

        War is simply the last negotiation tool after all other tools have failed. It's not even a human thing. Chimps have wars. In fact, one of the likely reasons that humans who evolved language outcompeted those who didn't is that one of the fundamental purposes of language is to add more ways to communicate accurately, so the need for wars lessens.

        It doesn't go away. It just less

        • by caseih ( 160668 )

          Give war a chance, eh?

          War is simply the last negotiation tool after all other tools have failed.

          For the defender, yes, this could be true. Except why should there be any negotiation on the part of the defender? Someone comes and steals your home and you're expected to negotiate with them? That's why I describe war as absurd. It always is. The current Russian war in Ukraine is a prime example. There's nothing for Ukraine to negotiate. Self-defense is morally defensible; starting a war never is.

          All

    • by timeOday ( 582209 ) on Sunday May 28, 2023 @12:25AM (#63556589)

      I mean, if we're going to go to all that work to fight a war, why not just avoid it entirely and run the war in a simulation and then accept the results?

      This is basically what already happens, 99 times out of 100. If the outcome of a war is too obvious, it generally does not happen. This is why large powerful nations have so much more power in the world than small ones, even without invading and occupying them or even firing a shot. The main point of a military is what you could do with it.

    • such automatic killing machines may save soldiers lives but they will kill many more civilians and non-combatants more efficiently. No thanks. It's truly madness.

      That's actually a goal, no matter how much people and governments like to claim otherwise. The more people you kill, the more disarray the nation is in and the more you can tamper with it for whatever purpose. War is hell, and when engaged in for profit, evil too.

  • purpose of war (Score:3, Insightful)

    by bzipitidoo ( 647217 ) <bzipitidoo@yahoo.com> on Saturday May 27, 2023 @11:33PM (#63556531) Journal

    Why do wars happen? In a word, scarcity, and the understanding that forcefully reducing population will reduce demand.

    Therefore, military robots will be killer robots, and civilians will be the softest targets. They won't "need to follow the law of war". Whoever thinks that is dreaming. Those robots absolutely will target civilians. And, hell yes, things could easily get completely out of hand.

    We've lived with nuclear bombs for 3/4ths of a century now, and so far have not started a nuclear war. We're going to have to do the same with fully autonomous robot soldiers.

    • Re:purpose of war (Score:5, Insightful)

      by sound+vision ( 884283 ) on Sunday May 28, 2023 @12:49AM (#63556621) Journal

      Scarcity, yes. War either subjugates or eliminates a population so that their resources can be taken by someone else. The reason nuclear war hasn't happened is it would fuck up the resources, there wouldn't be anything left to claim.

      Every nation without nukes will have to become a vassal state beholden to a nuclear power, either that or be invaded outright by one. You saw this condition starting to develop in the Cold War. The countries that sat it out were called the "third world" (i.e., not the USA or USSR), but that kind of neutrality will become increasingly untenable. As resources get depleted, nuclear powers will eye non-nuclear powers with increasing hunger.

      When you factor in ever-increasing population, and an ever-decreasing cap on the max supportable population due to climate change, nations' hunger for resources looks more like starvation as time goes on. I don't even mean gold and oil now, stuff like arable land and drinkable water is on the decrease globally. Mass slaughter will start to look more appealing than sucking the third world dry via trade.

      "Give us everything you have or we'll send the killer robots in" will be the global order before long. While nuclear war is a resource-destroying proposition, a flood of autonomous weapons isn't. Factor in a country not having to send any of its own population to fight, thus muting any internal dissent. "Autonomous weapon holocaust" is a way more likely scenario than "nuclear holocaust" ever was. We're sure to see at least one before the century is up - just to prove what these weapons can do. There will absolutely be an autonomous weapons Hiroshima and Nagasaki.

    • And that's exactly the problem here.

      Let's face it and call a spade a spade: who is it that we stuff into uniforms to shoot and kill each other? Is it our Nobel prize laureates and the inventors, movers and shakers of the country, or is it the more replaceable individuals, i.e. the surplus?

      A robot army would effectively kill off the wrong people.

    • Why do wars happen? In a word, scarcity, and the understanding that forcefully reducing population will reduce demand.

      This idea that wars are fought over "scarcity" has little precedent in human history. Typically the party with the most resources also happens to be the aggressor.

      We're going to have to do the same with fully autonomous robot soldiers.

      The very concept of a fully autonomous soldier (human or otherwise) is an oxymoron.

    • Wars happen largely because someone wants to have absolute power over everyone else and they just happen to get the means to do so. The day they invent effective, autonomous killer robots, the super-rich will immediately get rid of all the other 99% of humanity that has become unnecessary.
      • by ffkom ( 3519199 )

        Wars happen largely because someone wants to have absolute power over everyone else and they just happen to get the means to do so. The day they invent effective, autonomous killer robots, the super-rich will immediately get rid of all the other 99% of humanity that has become unnecessary.

        Many people feel "rich" only by comparing themselves to the surrounding "poor". Eliminating all those poor humans - even if their function could be replaced by robots - would strip them of their reason to feel "rich".
        Therefore, I think it is much more likely that private robot armies will be utilized to keep a status quo where many poor have to service the rich, but without the loyalty risks of requiring humans as bodyguards, and enforcing the interests of the robot owners, without requiring a detour throug

    • If your military is fully automated, you might be more willing to go to war, especially if the war is not conducted in your own territory (i.e., you are the invader) and you don't have to worry about coffins of your servicemen returning.

      It's just a bunch of bots fighting far away from you.

      I do remember reading a sci fi short story years ago, about a bunch of automated aircraft and other military stuff fighting an enemy nation. And they had automated factories, mines, repair stations and everything to keep on producing/repar

  • There's two ways to take that, lol.
    I have a memory of a golden age SF story where someone was doing an emergency delivery of a vaccine to a colony on a moon, and towards the end the computer took over and was brutal in the g-forces it dished out to the pilot under acceleration, and especially when decelerating. The colonists survived, but the pilot suffered brain damage.

  • I'm reminded of them, with their battle computers that could and would take over the action of the cyborg they were part of if the Elite became unconscious.

  • Only a bit of retcon with canon, and you have the Skynet AI waking up, seeing life-ending global nuclear warfare as both imminent and inevitable, and triggering a limited release of nuclear weapons so as to alter the course of history. Rather than exterminate humanity, it placed the survivors in camps. It had plans something like the ones in Colossus: The Forbin Project, but first it had to defeat the resistance led by John Connor, which would put humanity back in charge of things, including nuclear bombs.

    Terminator: The Sarah Connor Chronicles towards the end was hinting at there being an AI faction that was less than monstrous. Shirley Manson's liquid metal Terminator, and her enigmatic offer of "Will you join us?", was something I'm sad to not see ever get fleshed out and put on screen, due to the show getting cancelled.

  • Current international law requires a human to give the kill order (fire the missile or otherwise pull the trigger); so we as a country are not headed for fully autonomous soldiers. Whether some other country might break that law ... probably if the climate doesn't kill us first.
    • And if I don't, you're gonna sue me?

      In what court?

      In a war where that kind of automatic mow-them-down weaponry is going to be used, it's hardly likely that whoever loses would be put up for trial, or that whoever wins would care about such a thing.

      • And we didn't charge the Germans with war crimes? Sure we did.

        But you're imagining no one comes to the aid of the side that didn't break the laws. It isn't over after one battle.
        • You think anyone could have held Germany accountable had they won the war?

          The whole premise depends on being the victor in the end.

    • by ffkom ( 3519199 )

      Current international law requires a human to give the kill order (fire the missile or otherwise pull the trigger); so we as a country are not headed for fully autonomous soldiers. Whether some other country might break that law ... probably if the climate doesn't kill us first.

      It does not take a country to build a robot army. As soon as such an army is expected to defeat even larger human-controlled armies, there is a business case for corporations to build one. And as you can read from history, corporations were never shy about building armies and waging wars. Remember when one UK company employed more soldiers than its home country? https://en.wikipedia.org/wiki/... [wikipedia.org]

      • by ffkom ( 3519199 )
        I should have mentioned that the situation "A UK company employs a larger army than its home country" is still a given today - G4S [wikipedia.org] has about twice as many employees as the UK armed forces.
  • Star Trek [wikipedia.org] quote. Is this our next step?!?
  • Robotics technology sucks. Robots still can't even walk properly; they look like they have a stick up their ass and have to keep their knees bent. Robots also lack dexterous hands: being able to grip a door and open it is too complicated.

    • by burni2 ( 1643061 )

      And when looking at the enormous development of the Boston Dynamics "toys" I would still say "No, we are not" ..
      and when looking at what a huge step ChatGPT was ..
      or at how militaries are looking into unmanned fighter jets ..

      It will not be now.
      It will not be tomorrow.
      But in a decade we will see it in combat.

      China will be embracing it, without doubt.

  • [OBJECTIVE LOCKED. DO YOU WANT TO DESTROY?]
    -Yes
    [ARE YOU SURE?]
    -Completely, they are the enemy after all
    [THERE ARE HUMANS INSIDE. JUST TELLING]
    -Destroy it!
    [HAVE YOU FULLY CONSIDERED THE MORAL IMPLICATIONS?]
    -They are getting away! Kill them!
    [THEY MIGHT HAVE CHILDREN, AND VERY LIKELY BE CHILDREN THEMSELVES]
    -Aaand they're gone
    [PROBABLY FOR THE BETTER]
    -War is really Hell

  • Laws are a product of civilisation. War is the suspension of civilisation. Laws don't count for shit during war & the people who actually do the warring know it. All sides rape, murder, get involved in organised crime, human & drug trafficking, etc.. Who's gonna take soldiers off the battlefield during a war? Ethics in war is simply a PR exercise.
  • "All it takes is a few lines of code to simply take the human out of the loop."

    • I always knew during my military time that we could easily replace most officers with very small scripts.

    • by dubner ( 48575 )

      That sounds like (almost) every manager I've ever had while working at a hardware company.

  • Imagine a world where economic sanctions could be applied against an aggressor nation and where other nations could give the victim of that aggression the means to defend itself. Then imagine that the aggressor nation runs out of autonomous killing machines while the victim nation gets propped up until the aggressor's economy collapses and it loses the ability to wage its war and has to withdraw, because its population is conditioned to be proud of its robots and their superiority. That could be an improvem

  • Um, yeah? (Score:4, Insightful)

    by cascadingstylesheet ( 140919 ) on Sunday May 28, 2023 @08:08AM (#63557055) Journal
    Your enemies are not going to decline to use superior weapons just because you want them to.
  • I can't believe they're only talking about this now.

    Semi-feral children can't light a fire inside the shell of a flatscreen TV and watch it for entertainment. The sets are too thin! That only works in the shell of a bulky CRT TV set. But everyone already replaced theirs years and years ago.

    This is not the war between man and machine we were always promised and I find it very disappointing.

  • Does anyone else see a problem with a future composed of robotic militaries that depend on outsourced manufacturing? It will become a war of factory output. If you don't believe it, look at Boston Dynamics and now Tesla designing humanoid robots. The only thing missing is artificial intelligence, which is on an exponential growth curve.
    • Does anyone else see a problem with a future composed of robotic militaries that depend on outsourced manufacturing? It will become a war of factory output. If you don't believe it, look at Boston Dynamics and now Tesla designing humanoid robots. The only thing missing is artificial intelligence, which is on an exponential growth curve.

      What I'm not so sure of is the "will become" aspect. Industrial and logistical enterprises are everything in modern war.

  • It will happen for one simple universal reason: violence works.

    I thought about this a lot after reading The Three Body Problem. You can make people do whatever you want with violence. If they refuse, they are killed, which ends their ability to do or change anything in this physical world we live in. Whoever uses the most violence in any violent confrontation wins. The only reason mutually assured destruction works is because of the threat of the same or more violence being inflicted on the original perpetrator

  • When the first Kinzhal missile was shot down over Ukraine, the Patriot battery that did it was allegedly operating in an autonomous mode. It detected an incoming ballistic missile and made the decision to fire at it, faster than the crew could react. Navy ships have automatic defense systems as well. The defensive use case is certainly more ethically straightforward than autonomous killbots let loose to hunt the enemy... But the latter is happening too to some degree. For example, the "SMArt-155" submunitio

  • So, here we have the U.S. Military (and I'm sure many other countries are following suit) figuring out how to use AI to wage war to keep soldiers safer.

    Something seems just a little bit arse about face here.

    Call me crazy, but I would think using AI to prevent war in the first place would be a better use of time.

    I guess that could put the U.S. Military out of business - and all those arms manufacturers too.
    Ok, now I understand...

    • Something I wrote a dozen years ago: https://pdfernhout.net/recogni... [pdfernhout.net]
      "The big problem is that all these new war machines and the surrounding infrastructure are created with the tools of abundance. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by

"There is no statute of limitations on stupidity." -- Randomly produced by a computer program called Markov3.

Working...