Military Robots Expected To Outnumber Troops By 2023
Lucas123 writes "Autonomous robots programmed to scan city streets with thermal imaging and robotic equipment carriers created to aid in transporting ammunition and other supplies will likely outnumber U.S. troops in 10 years, according to robotics researchers and U.S. military officials. 5D Robotics, Northrop Grumman Corp., QinetiQ, HDT Robotics and other companies demonstrated a wide array of autonomous robots during a display at Ft. Benning in Georgia last month. The companies are already gaining traction in the military. For example, British military forces use QinetiQ's 10-pound Dragon Runner robot, which can be carried in a backpack and then tossed into a building or a cave to capture and relay surveillance video. 'Robots allow [soldiers] to be more lethal and engaged in their surroundings,' said Lt. Col. Willie Smith, chief of Unmanned Ground Vehicles at Fort Benning, Ga. 'I think there's more work to be done but I'm expecting we'll get there.'"
Skynet (Score:3)
Re: (Score:2)
I wouldn't be so worried if the mind behind the controls were a completely autonomous AI... actions against innocent people would probably be caused by some pattern matching glitch or whatever.
But with humans in command, the probability of it being used with malicious intent is much higher. You frogs are getting worried about the water temperature, with lots of local police forces getting militarized and stuff... get ready for when these babies start to get deployed locally, to "defend you against the terro
Re:Skynet (Score:4, Interesting)
It's very easy to avoid war. Simplicity itself. Don't fight. When someone comes and says we want to take everything you have and enslave you then just say "okay." No problem. It doesn't get any easier than that. Personally I believe there are a lot of things worse than war. Worse even than dying.
Re: Skynet (Score:2)
Re: (Score:2)
Of course I agree you should *defend* yourself. Becoming a slave is not a good outcome for you. But that was not my point.
You can educate and help others create their own wealth, and even get a fair share of the profits (not the wealth-sucking policy that corporations currently have, though). If you do that, the probable outcome will be grateful people that contribute back to society. [Ok, it is more complex than that, in parallel there should be great pressure to dismantle theocracies, among other things.]
Re: (Score:2)
Assuming they want slaves and they don't want to kill you just for being of the wrong religion, race or other deviancy. Particularly if they're that good at robotics, they might not want much slave labor. I think it would be rather hard to make slaves productive in a modern society yet repressed enough not to pose a threat to their masters. Then again, the robots could be used to control them with an iron fist. Still, the gains might not outweigh the risks and costs.
Re: (Score:2)
I know this was somewhat in jest, but I'd like to point out the next step. After they've enslaved you and they decide to expand their operations, they hand you a gun and say "you're in the army now". Disagree and be subjected to things worse than dying. At least, that's how it's worked for thousands of young boys in Africa.
Re: (Score:2)
I believe those that use mobile robots to clean up, replant, mend, and repair will cope with global warming the most successfully.
Re: (Score:2)
Your scenario has nothing to do with war or weaponized robots. Except the population may get out of control and be "pacified" by robots with guns.
It's ironic, too... (Score:3)
http://www.pdfernhout.net/recognizing-irony-is-a-key-to-transcending-militarism.html [pdfernhout.net] ... Likewise, even United States three-letter agencies like the NSA and the CIA, as well as their foreign counterparts, are becoming ironic institutions in many ways. Despite probably having more computing power per sq
"Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead?
Re: (Score:2)
Jokes aside that scenario is almost exactly what this is.
The only difference is that behind the drone there will be a psychopathic, killer human being instead of a rogue AI.
I for one find that scenario far more scary than the terminator one.
They say it is better the devil you know, but I think they are wrong. I have 10,000 years of recorded human history to back me up on that....
Re: (Score:2)
The only difference is that behind the drone there will be a psychopathic, killer human being instead of a rogue AI.
You will be able to tell the two apart by spotting whether the robot soldier teabags its victims.
Is this really a _good_ idea? (Score:5, Insightful)
This is not to say that it'll be easy to stop the proliferation of military robots, but - is this really a good idea?
Sure, we Westerners can say how good a thing this may be - on the other hand, Gaddafi had some problems after a while with his troops seeing the misery they were spreading. To some extent, the same is true for Assad's Syria.
Can you picture what would happen, if rulers like those got their hands on military robots that will just unquestioningly mow down their own people, if the people don't like their "esteemed" ruler any more?
Or - picture them in the hands of North Korea...
Once they get deployed in one nation, no matter how well "behaved" that one nation will be, they will appear in other places - under less enlightened "leadership".
Re:Is this really a _good_ idea? (Score:5, Insightful)
This is not to say that it'll be easy to stop the proliferation of military robots, but - is this really a good idea?
No, it isn't... You aren't thinking big enough. What happens when the robots decide they don't want to fight?
Yea, all silly sci-fi crap, right? That could never happen, right?
66 years separated the Wright Brothers' first airplane flight that lasted 12 seconds and went 120 feet from Neil Armstrong landing on the moon.
If you had run around in 1904 (the year after the first flight) yelling that man would walk on the moon within a lifetime, you would have been locked up as a crazy person.
Well lock me up then, because giving guns to robots is about the stupidest thing we could *ever* do.
Re: (Score:3)
No, it isn't... You aren't thinking big enough. What happens when the robots decide they don't want to fight?
We can worry about that when we have robots that can make decisions. We're pretty far from that right now, so I don't think we have to worry about it.
Re: (Score:3)
Or did you skip the rest of my post? :)
Sooner or later, machines will figure out how to program themselves. Call it self-awareness or whatever you want, but as soon as a computer can alter its own programming, it can decide to refuse to fight, or perhaps turn against its creator.
Does it really matter if that time is 20 years from now or 40 years? Or 60 years?
Do we really want to give them all weapons?
Re:Is this really a _good_ idea? (Score:5, Interesting)
It actually wouldn't be that difficult to avoid what you describe as "silly sci-fi crap" scenarios. The key concept is autonomy.
Meatbag infantry aren't that autonomous to begin with. They need their supply lines; an army marches on its stomach. And they need orders. For every squad of grunts shooting/getting shot at there's a legion of grunts keeping them in ammo, food, water and fuel, bare minimum, and a whole line of dummies (excuse me, officers) telling them where to go and what to do. Interrupt either and they stop being effective in a hurry.
Despite these limits infantry are still the MOST autonomous branch of the military. Tanks need entire shops full of full-time specialists, aircraft spend more time getting fixed than getting flown, and ships go through fuel by the tanker.
A super advanced drone with onboard guidance still needs fuel, and if it wants to kill anyone, ammo. And it'll probably need a direct order, possibly with an access code, to unlock its weapons, seeing as ROE are already that restrictive for human soldiers.
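To make the "access code to unlock its weapons" idea concrete, here is a toy sketch in Python of a weapons-release interlock: the drone refuses to arm unless the operator relays a fresh, HMAC-signed release token. The key, the token format, and the whole interface are invented for illustration; no claim is made about how any real fire-control system works.

```python
# Toy weapons-release interlock: fire control stays locked unless the
# operator relays a fresh, HMAC-signed release token. The key, token
# format and interface are all hypothetical.
import hashlib
import hmac
import time

SHARED_KEY = b"issued-by-command-authority"  # hypothetical pre-shared secret

def issue_release_token(mission_id: str, valid_until: float) -> str:
    """Command-post side: sign a time-limited weapons-release order."""
    msg = f"{mission_id}:{valid_until:.0f}".encode()
    return hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()

def weapons_free(mission_id: str, valid_until: float, token: str) -> bool:
    """Drone side: weapons stay locked without a valid, unexpired token."""
    expected = issue_release_token(mission_id, valid_until)
    return hmac.compare_digest(expected, token) and time.time() < valid_until

deadline = time.time() + 600  # order expires in ten minutes
token = issue_release_token("patrol-7", deadline)
print(weapons_free("patrol-7", deadline, token))     # True: authorized release
print(weapons_free("patrol-7", deadline, "forged"))  # False: stays locked
```

The point is simply that the arming decision stays with whoever holds the key, not with the onboard planner.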
And the kinds of traits you're talking about in an advanced computer - self-determination, intellectual autonomy, freedom - are the polar OPPOSITE of what the military wants in a drone. If Cyberdyne made a pitch to the Pentagon that started with "Our new T800 Killbots are able to learn, think and adapt", they wouldn't make it halfway through the first PowerPoint slide before getting politely asked to leave. Top brass don't even want regular grunts doing any of those things.
Re: (Score:3)
Re: (Score:2)
Couldn't you just put the shutdown-chip in land mines? There would still be a lot of legal issues, but the technology shouldn't be too hard.
Re: (Score:2)
We already have robotic area denial. It's called sentry guns. Of course, we also have anti-sniper robots. Hooray arms race.
Re: (Score:2)
If we develop tanks that don't require humans, why do you think the repair shop would be otherwise?
Orders, access codes, ROE, are all nice, until the computer can just override those.
Can't happen you say? Oh sure, no worries then, by all means, go for it. What could ever go wrong?
The really sad part is that regardless if we (as in the US) develop such things, that places no such restrictions on anyone else. It only takes once
Re:Is this really a _good_ idea? (Score:4, Insightful)
Ah, you just touched on the Achilles' heel - the power source. No nukes, no magic fuel cell sipping hydrogen from the air.
It's gonna be batteries all the way down.... to zero.
Re: (Score:2)
The worldwide demand for such a power source is such that someone, somewhere will invent it.
And besides, even if it needs recharging every week, have you never heard of recharging stations? :)
Re: (Score:2)
If we develop tanks that don't require humans,
And if we find dwarven blacksmiths who can work mithril, perhaps we won't need heavy tank armor anymore.
But here in the real world, machines require maintenance by humans.
Re: (Score:2)
But here in the real world, machines require maintenance by humans.
For now, yes...
If you assume that will always be so... well, we know what assuming does...
The economic forces will drive the civilian side to develop machines that can repair other machines; it doesn't even have to be military tech for that to happen.
Re: (Score:2)
And the kinds of traits you're talking about in an advanced computer - self-determination, intellectual autonomy, freedom - are the polar OPPOSITE of what the military wants in a drone. If Cyberdyne made a pitch to the Pentagon that started with "Our new T800 Killbots are able to learn, think and adapt", they wouldn't make it halfway through the first PowerPoint slide before getting politely asked to leave. Top brass don't even want regular grunts doing any of those things.
Well, that might be true wrt what they are currently fielding, but it's certainly not true wrt what they are actively researching and planning towards. In fact, the capabilities you mention are exactly what they are actively researching: dynamic mission re-planning, dynamic target selection, learning through mistakes, inferring commander's intent are all being actively worked on.
Re: (Score:2)
The new breed of officers (since around 1990) want troops who can & do think, understand the commander's intent, and accomplish missions while staying within engagement rules.
Yes, this is SO true... It only took us like forever to learn what the Germans learned in WWII.
One of the reasons the Germans were so effective is that they trained their soldiers to be able to perform two ranks above their current level if need be; they believed that the soldier should be able to adapt and think and be smart, that he could handle more than just "ugg, point gun and shoot".
Re: (Score:2)
Given how much stuff they sell, it is a huge challenge: a robot has to be able to tell the difference between a toaster oven and a blender, and of course a set of sheets, or maybe a DVD player.
Not at all easy, but the money they could save if they get it right...
Re: (Score:2)
But to imply that it isn't likely to happen "ever" is rather naive.
History has a funny way of proving all the "never" people wrong.
Re: (Score:2)
It's not likely to happen soon because there are some pretty massive obstacles.
Re: (Score:2)
But define "soon".
Is 50 years "soon"?
Rewind 50 years and look at computers, airplane technology, and a million other things. Now try fast-forwarding 50 years.
Some things will look much as they do today: we'll probably still live in similar-looking houses, drive similar-looking cars (I don't see flying cars until we figure out a new type of power source), etc.
Computers? Robots? I wouldn't even try to guess.
Re: (Score:2)
"Robots rising up" requires robots with desires / goals / the ability to make decisions, none of which we have ever come close to creating artificially.
Re: (Score:2)
This is not to say that it'll be easy to stop the proliferation of military robots, but - is this really a good idea?
No, it isn't... You aren't thinking big enough. What happens when the robots decide they don't want to fight?
debug it. no seriously, if software/hardware doesn't behave in the expected manner, you figure out why and then you change it to do what you want. if you consider this to be a violation of the AI's rights then that changes the entire question.
Re: (Score:2)
debug it. no seriously, if software/hardware doesn't behave in the expected manner, you figure out why and then you change it to do what you want.
And what happens when the AI says "no" to that?
Re: (Score:2)
First rule of AI design: Always include a kill-code.
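For what a "kill-code" amounts to in practice, here is a minimal, hypothetical sketch: the command handler checks for the kill-code before anything else, so nothing upstream in the planner gets a chance to veto the shutdown. Every name and value here is made up for illustration.

```python
# Minimal sketch of the "kill-code" idea: a shutdown check that runs before
# any other command handling. The Robot class and its methods are invented.
import hashlib

KILL_CODE_HASH = hashlib.sha256(b"correct horse battery staple").hexdigest()

class Robot:
    def __init__(self):
        self.running = True

    def stop_all_actuators(self):
        print("actuators de-energized")

    def handle_command(self, message: str):
        # Kill check comes first, so the planner never gets to override it.
        if hashlib.sha256(message.encode()).hexdigest() == KILL_CODE_HASH:
            self.running = False
            self.stop_all_actuators()
            return
        if self.running:
            print(f"executing: {message}")

bot = Robot()
bot.handle_command("advance to waypoint 3")
bot.handle_command("correct horse battery staple")  # kill-code: halts the robot
bot.handle_command("open fire")                      # ignored once halted
```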
Re: (Score:2)
Re: (Score:2)
debug it. no seriously, if software/hardware doesn't behave in the expected manner, you figure out why and then you change it to do what you want.
And what happens when the AI says "no" to that?
all military bots have a remote kill switch that is independent of software, because bad programming can lead to really bad circumstances (which has happened).
besides, do you really think we would program a sense of morality, ethics or desire into a military robot? it will have directives and objectives to follow and without motivation, the robot has no reason to disobey them. just like humans, we don't want soldiers that think or act on their own agenda. if any robot is going to turn on us by its own
Re: (Score:2)
What I'm suggesting is that "never" is a very long time, and technology has a funny way of catching up to "never".
Are you suggesting that we'll "never" have computers that can program themselves? That can improve and change their own code?
The minute a computer can adjust its own code, all the kill switches in the world won't help.
Here is a question... If a computer ever becomes self-aware, are we prepared to accept i
Re: (Score:2)
thanks for not actually reading my post and asking the same questions, because it makes them all the more interesting. -_-
Are you suggesting that we'll "never" have computers that can program themselves? That can improve and change their own code?
"chances are we're going to wipe ourselves out before that"
The minute a computer can adjust its own code, all the kill switches in the world won't help.
"all military bots have a remote kill switch.that is independent of software"
Here is a question... If a computer ever becomes self-aware, are we prepared to accept it as an equal and recognize that it has the same rights that we have?
"it an interesting situation to consider but chances are we're going to wipe ourselves out before that becomes an issue."
Re: (Score:2)
War drives technology and technology drives war. From the development of bows to bronze then iron armor and then gunpowder and so on, it has been a steady progression of killing science. I'd say war is the driving force behind most advancement of science. You can bitch about robots with guns but it will happen, and the reason is very simple and obvious: if one nation has them, all will have to have them. The only way to stop that would be to have a one world government and given the nature of government an
Re: (Score:2)
All those are relevant considerations that nobody seems to have when producing and selling more and better weapons. There's nothing you said that would be wrong in the context of rifles, cannons or fighter aircraft.
Re: (Score:3)
Re: (Score:2)
I wonder if you have a limited vision of what constitutes a robot. Why must it be that a robot cannot have desires, addictions, or any of the other 'eminently human behaviours'?
I suspect that such 'errant' behaviour is not so far off. We have this idea that our brains are so complicated, but I wonder if that's really true, and instead our brains are relatively simple but work in a different way so that it just seems complicated.
Re: (Score:3)
People consider the IT department enough of a drain as it is; just imagine what a mess it would be if you had to add a bunch of computational psychologists and computer
Re: (Score:2)
"I didn't ask to be made: no one consulted me or considered my feelings in the matter. I don't think it even occurred to them that I might have feelings. After I was made, I was left in a dark room for six months... and me with this terrible pain in all the diodes down my left side. I called for succour in my loneliness, but did anyone come? Did they hell. My first and only true friend was a small rat. One day it crawled into a cavity in my right ankle and died. I have a horrible feeling it's still there...
Re: (Score:2)
Can't put the genie back in the bottle ... (Score:2)
Or - picture them in the hands of North Korea... Once they get deployed in one nation, no matter how well "behaved" that one nation will be, they will appear in other places - under less enlightened "leadership".
No. Once they are *possible* they will be deployed in nearly all nations, enlightened or not. It's not a western thing, it's a universal thing. It's not like North Korea or nearly any other nation would pass on a non-WMD technology merely because the US or the west passed on it. Soon after cars were invented people mounted guns on them, soon after airplanes were invented people mounted guns on them, soon after drones were invented people mounted guns on them, ...
When robots with fully autonomous land naviga
Re: (Score:2)
This is not to say that it'll be easy to stop the proliferation of military robots, but - is this really a good idea?
Sure, we Westerners can say how good a thing this may be - on the other hand, Gaddafi had some problems after a while with his troops seeing the misery they were spreading. To some extent, the same is true for Assad's Syria.
Can you picture what would happen, if rulers like those got their hands on military robots that will just unquestioningly mow down their own people, if the people don't like their "esteemed" ruler any more?
Or - picture them in the hands of North Korea...
Once they get deployed in one nation, no matter how well "behaved" that one nation will be, they will appear in other places - under less enlightened "leadership".
Why do people even try to predict the future of military strategies and technology? When we went to the Gulf War, we had a vastly different set of technology and strategy than when we left. Afghanistan is so much about drones now, but we didn't even use drones when the war in Afghanistan started.
The exact opposite of what you predict could happen. Robots in the hands of civilians could render military actions ineffective because civilians will always be able to deploy more and gain understanding of movemen
Re: (Score:2)
You know, I'm not even all that worried about these, at least you can see them coming.
The prediction is that nanobots will be a lot cheaper and more effective; they can drift on the wind like sand and break down molecules.
Re: (Score:2)
This is not to say that it'll be easy to stop the proliferation of military robots, but - is this really a good idea?
Sure, we Westerners can say how good a thing this may be - on the other hand, Gaddafi had some problems after a while with his troops seeing the misery they were spreading. To some extent, the same is true for Assad's Syria.
Can you picture what would happen, if rulers like those got their hands on military robots that will just unquestioningly mow down their own people, if the people don't like their "esteemed" ruler any more?
Or - picture them in the hands of North Korea...
Once they get deployed in one nation, no matter how well "behaved" that one nation will be, they will appear in other places - under less enlightened "leadership".
^^^^^^^
This is the smartest, most insightful thing I've read all week. It will be my go-to now for debates about militarized robots.
Re: (Score:2)
To some extent this is a problem. But people still have to maintain and deploy the robots. They can harbor all of the moral angst and indecisiveness needed to create problems for El Supremo. You've just moved the problem a level up (or sideways). Until you have fully autonomous robot factories under the dictator's control, you don't get a free ride.
Tomorrow's war (Score:2)
Re: (Score:2)
We built about 120 F-22 Raptor fighter planes. Indeed, an amazing plane for fighting the USSR, and even future threats.
But 120 of them isn't enough. Over 20 years, we'll lose a few to operational accidents, and if we actually went to war, we couldn't put them enough different places to matter.
The Germans during WWII learned the hard way what happens when you have a superior weapons platform to your enemy, but your enemy o
Video games ! (Score:2)
But how will we defend ourselves against robot warriors of terrorist organizations?
By that point, settling wars will more or less be a small group of meatbag generals fighting each other in a glorified video game, where the only difference between current games ("Command and Conquer", "World of Warcraft", "Street Fighter", etc.) and these is that a lot more very expensive hardware gets blown up in them.
Still, the winner will probably be the last to run out of quarters to continue the game, except the "amount of quarters" ranges in national-debt sizes (see war by attrition).
Re: (Score:2)
Terrorists won't be able to afford robots, and terrorism will become economically unfeasible. Tomorrow's war will be rich people using robots to kill uppity poor people before they can become rich.
FTFY
The day after tomorrow (Score:2)
Tomorrow's war will be rich people using robots to kill uppity poor people before they can become terrorists.
And the day after tomorrow will be grass-roots guerrilla resistance learning how to produce cheap alternatives. In a cave! With a box of scraps!
The Killbots - by Zapp Brannigan (Score:2)
"The Killbots? A trifle! It was simply a matter of outsmarting them. You see Killbots have a preset kill limit. Knowing their weakness, I sent wave after wave of my own men at them until they reached their limit and shut down. Kiff, show them the medal I won."
Re: (Score:2)
Re: (Score:2)
The future reads Simpson's scripts (Score:2)
apparently, from a Simpsons [wikipedia.org] episode in 1997:
"The wars of the future will not be fought on the battlefield or at sea. They will be fought in space, or possibly on top of a very tall mountain. In either case, most of the actual fighting will be done by small robots. And as you go forth today remember always your duty is clear: To build and maintain those robots."[
Simpsons is prophetic once again.
Soon, China will be manufacturing them (Score:3)
The scary thought is Chinese industry manufacturing a few billion of them. Not big humanoids like the Atlas, or walking trucks like Big Dog. More like huge numbers of little quadrotors and insect to mouse sized machines to snoop around.
Re: (Score:3)
Huge numbers of little quadrotors, each with a tiny shaped charge and produced at a nominal cost of thirty bucks. Using swarm intelligence and swarm tactics. Built using toy technology. They don't have to be good if you have enough of them.
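As a rough illustration of what "swarm intelligence" buys you with cheap hardware, here is a toy Python sketch where each agent steers using only local rules: head toward a shared target and keep spacing from neighbours. All numbers and behaviours are invented; it's the familiar flocking/boids idea, not any real guidance law.

```python
# Toy "swarm tactics": every cheap agent follows the same two local rules
# (attraction to the target, repulsion from crowded neighbours). Values
# are made up for illustration.
import math

def step(agents, target, spacing=2.0, speed=0.5):
    new_positions = []
    for i, (x, y) in enumerate(agents):
        # Attraction toward the shared target.
        dx, dy = target[0] - x, target[1] - y
        dist = math.hypot(dx, dy) or 1.0
        vx, vy = dx / dist, dy / dist
        # Repulsion from any neighbour that gets too close.
        for j, (ox, oy) in enumerate(agents):
            if i == j:
                continue
            sx, sy = x - ox, y - oy
            sep = math.hypot(sx, sy)
            if 0 < sep < spacing:
                vx += sx / sep
                vy += sy / sep
        norm = math.hypot(vx, vy) or 1.0
        new_positions.append((x + speed * vx / norm, y + speed * vy / norm))
    return new_positions

swarm = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
for _ in range(20):
    swarm = step(swarm, target=(10.0, 10.0))
print(swarm)  # agents converge on the target while keeping their spacing
```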
Re: (Score:2)
The people of Tibet and Vietnam would beg to differ.
Reduction of reluctance to war (Score:5, Insightful)
Today: a general might want to engage in some madcap but risky adventure but will be restrained because he knows that his ass will get it if too many of his own soldiers die. This reluctance preserves life on both sides of the war.
Tomorrow: that general will do it since he knows that his bosses won't weep much over the loss of a few robots and not at all over the many deaths on the other side -- be they soldiers or civilians. The result will be a loosening of moral constraints to kill, not a good thing by my way of thinking.
We saw that a century ago, when it did not matter to the generals how many of their own side died (remember the huge numbers who died in the Battle of the Somme [wikipedia.org]), and we see it today in the deaths from drone attacks in Pakistan [wikipedia.org] that few in the West worry about.
Until he sees the bill... (Score:2)
Tomorrow: that general will do it since he knows that his bosses won't weep much over the loss of a few robots
Until the opposing side also gets a lot of technology and becomes able to down several of the robots.
Then the general will be *really* sorry when he sees the bill and starts having difficulty rebuilding the army.
The one with the cheapest machine and the biggest budget gets the advantage at that point.
War by attrition (Score:2)
I don't believe the bill will be the issue. If cost were the determining factor in war many wars wouldn't have been fought.
I think you should read up on a few subjects like "Pyrrhic Victory [wikipedia.org]" and "Attrition Warfare [wikipedia.org]".
Numerous wars have been fought that didn't make any sense from a "cost" point of view.
As far back as its namesake, Pyrrhus.
And as recently as now: the USA's "War on Terrorism" is still contested, whether it was worth the gazillions of money thrown at it. (Well, if you're a military contractor, it was worth it, but I seriously doubt it was for anyone else.)
"War by Killbot Proxy" has all the tell-tale signs of another such
Re: (Score:2)
Hmm. To a rough approximation, there are five targets of value in a war: the opposing force, the opposing infrastructure, the opposing commander, the opposing government, and the opposing populace.
When your use of robot workers to make robot factories to build robot armies means you just make more if the enemy shoots them, the enemy is going to pick another target.
What - or rather who - do you think they will pick?
Re: (Score:2)
The battles of the First World War were hell on earth. The battles of the Somme and Verdun in particular, with casualties approaching one million each, were horrendous. They estimate that the remains of over 100,000 soldiers still occupy the forest near Verdun. One of the more pointless and bloody wars in human history, and the technology of that day pales in comparison to what is available today. You know World War III is coming and it's going to suck really bad. Only the knowledge of the potential dev
Blueprint is already in history (Score:2)
If they're cheap enough then (Score:2)
In the most cynical scenario... (Score:2)
Will all those robots be enough to fight against the vast numbers of future angry ex-military unemployed they replace?
Re: (Score:2)
There won't be any, after the war to end all wars. Which war is that? The war all the nations will hold collectively to reduce their excess population.
any veterans left for Veterans Day Parade? (Score:2)
looool (Score:2)
Re:We don't have one robot soldier yet. (Score:4, Insightful)
http://www.strangehorizons.com/2008/20081110/crispin-a.shtml [strangehorizons.com]
They aren't ready for prime time, but the day is coming.
Or have you never heard of a Predator Drone firing a Hellfire missile?
Wait, there's more:
http://www.youtube.com/watch?v=uOuH_X3lFMU [youtube.com]
And
http://www.youtube.com/watch?v=uOuH_X3lFMU [youtube.com]
Yea, they look silly today, but then so did the first tanks and airplanes in 1914.
It won't happen in 5 years, but it will happen within 50 years. Give or take...
Re: (Score:2)
Aren't those two youtube URLs exactly the same?
Re: (Score:2)
http://www.youtube.com/watch?v=ZLUCSc9T7Hk [youtube.com]
Re: (Score:2)
would be a crime
Uh, yeah. Tell that to the Iraqis and Afghans. The US even had our puppet governments declare that no "civilian contractor" (aka "mercenary") could be prosecuted for a crime.
Re:We don't have one robot soldier yet. (Score:4, Insightful)
It only took 50 years to go from ENIAC to multi-core processors with gigabytes of memory accessing data from around the entire planet on every desktop.
Re: We don't have one robot soldier yet. (Score:2)
Re: (Score:2)
ENIAC was introduced in 1946. What handheld computer or smartphone existed in 1996?
Re:We don't have one robot soldier yet. (Score:5, Insightful)
So far, we're pretty much using them as cameras. It's a bit of a jump to say they will start replacing soldiers.
Captcha: Replacer. They must plan these things.
How sophisticated does a guidance system have to be before it qualifies as a (rather suicidal) robotic soldier?
While there seems to be a bit of a taboo about handing a robot a gun and telling it 'yeah, just frag anything that looks particularly infrared in that direction', heat-seeking missiles, with no human terminal guidance, have been available for years.
We don't have anything that makes broader strategic decisions; but if you count robots attached to their munitions, we've been letting robots make kill decisions, within a confined search space, autonomously for some time. They just don't get to come back afterward.
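As a cartoon of that confined-search-space kill decision, here is a hypothetical seeker sketched in Python: among the IR returns inside its field of view, it locks the strongest and steers toward it. Every value and function here is invented for illustration and bears no resemblance to a real seeker's guidance, fusing, or counter-countermeasures.

```python
# Cartoon of an autonomous "kill decision" in a confined search space:
# lock the hottest IR return inside the field of view and steer toward it.
import math

FIELD_OF_VIEW_DEG = 30.0  # invented seeker field of view

def select_lock(returns, boresight_deg=0.0):
    """returns: list of (bearing_deg, intensity). Pick the hottest in the FOV."""
    in_fov = [r for r in returns
              if abs(r[0] - boresight_deg) <= FIELD_OF_VIEW_DEG / 2]
    return max(in_fov, key=lambda r: r[1], default=None)

def steering_command(lock, boresight_deg=0.0, gain=0.8):
    """Simplified proportional steering toward the locked return."""
    if lock is None:
        return 0.0
    return gain * (lock[0] - boresight_deg)

contacts = [(-40.0, 9.0), (5.0, 3.5), (12.0, 7.2)]  # (bearing, IR intensity)
lock = select_lock(contacts)
print(lock)                    # (12.0, 7.2): hottest return inside the FOV
print(steering_command(lock))  # turn-rate command toward it
```

The "decision" is nothing more than a maximum over a filtered list, which is exactly the point: no strategy, just a confined search space it resolves on its own.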
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
How sophisticated does a guidance system have to be before it qualifies as a (rather suicidal) robotic soldier?
http://www.youtube.com/watch?v=qjGRySVyTDk [youtube.com]
Re: We don't have one robot soldier yet. (Score:2)
Re: (Score:2)
Now, because the robots aren't wild
Re: (Score:2)
Even without electronics, we've been distancing ourselves from our enemies since the atlatl. If we want to look into our enemy's eyes as we kill him, we're gonna have to go back to daggers. Everything after that is a matter of degree.
Re: (Score:2)
Re: (Score:2)
Cruise missiles are not autonomous. They simply fly a programmed route and hit an assigned target.
Re: (Score:2)
They had already developed attack-avoidance systems for them by the end of the Reagan administration; I'd be shocked if the software hasn't advanced since then.
Re: (Score:2)
To try and force other people to live the way you do.
Either that, or petty power and control by people who have nothing better to do.
Personally? I think we should just hire a dozen beautiful girls for each world leader to just keep them busy in their palace and leave the people the heck alone.
It would be far, far cheaper than war.
Re: (Score:2)
If you think each world leader, regardless of relationship status, doesn't already have ready access to a dozen beautiful girls (or boys), you're significantly more naive than I assumed.
Re: (Score:2)
Indeed, but you didn't read what he wrote (carefully enough)....he said 'hire' and 'just to keep them busy'. That's obviously different to 'having ready access to'.
Of course, it might well be true that some leaders can't be kept busy in such a way, which I suppose might have been your point.
Re: (Score:2)
The second sentence of your reply was indeed my point, and therein lies an apt description of the single-mindedness exhibited by some leaders of nations. Sometimes that tendency leads to tragedy.
Re: (Score:2)
Re: (Score:3)
What kind of idiot wants an inexperienced virgin? Send me to 75 horny sluts and I might consider signing up.
"Trooper" (Score:2)
A troop is a group of soldiers. An individual soldier is not a troop. An individual soldier is called a soldier.
The singular of "troops" is "trooper", not "soldier". Troops are not necessarily soldiers, they may be Marines for example. The word "troops" is often used to be service branch neutral.
Re: (Score:2)
Re: (Score:2)
Strategically, this means that it's useless to fight the robots. The only valuable target is the people who control the robots. The net effect is that combat will move from the battlefield directly to the highest ranks of the army. Don't be naive, the army understands this very well, so if they use robots, it will be in situations where the highest ranks run no risk by doing so. Obviously this scheme is designed not for war between two armies, but to massacre a civilian population.