Robot Wars
EyesWideOpen writes "According to this New York Times article (free reg. req.) the Office of Naval Research is coordinating an effort to determine what it will take to build a system that will make it possible for autonomous vehicles (in the air and on the ground), or A.V.'s, to serve as soldiers on the battlefield. The project, called Multimedia Intelligent Network of Unattended Mobile Agents, or Minuteman, would consist of a network in which the highest-flying of the A.V.'s 'will communicate with headquarters, transmitting data and receiving commands. The commands will be passed along to a team of lower-flying A.V.'s that will relay them in turn to single drones serving as liaisons for squadrons of A.V.'s.' The article also mentions that the A.V.'s will have the ability to send high-resolution color video as well as still photographs using MPEG-4 compression. Pretty interesting stuff."
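As a rough sketch of the relay chain the article describes (headquarters talks to the high-flying A.V.'s, which pass commands to lower-flying relays, which hand them to liaison drones), here's a toy message-passing chain. Every class and name below is invented for illustration; the real architecture is not public.

```python
# Toy sketch of the three-tier Minuteman relay chain described above.
# All names are invented; this only illustrates the hop-by-hop idea.

class Node:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def relay(self, command, trace):
        """Pass a command down the hierarchy, recording each hop."""
        trace.append(self.name)
        for child in self.children:
            child.relay(command, trace)
        return trace

# headquarters -> high-flyer -> low-flying relay -> liaison drone
liaison = Node("liaison-drone")
low = Node("low-AV", [liaison])
high = Node("high-AV", [low])

trace = high.relay("hold position", [])
print(" -> ".join(trace))  # high-AV -> low-AV -> liaison-drone
```

The point of the hierarchy is that only the top tier needs a long-haul link to headquarters; everything below it just forwards.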
Skynet, here we come (Score:3, Insightful)
Re:Skynet, here we come (Score:2)
Hopefully, time travel will have been invented in time for the war against the machines, or else we will be in for some real problems [whitepages.com]
Re:Skynet, here we come (Score:2)
doh.
Terminator 5: Skynet Triumphant! (Score:2)
(OK, I admit it, it's time to get a grip on my total fixation with robots [angryflower.com])
Re:Skynet, here we come (Score:5, Insightful)
Haven't you read the Bolo stories? If I remember Laumer's timeline, we're way overdue for GM to start on the Mark I. :)
<serious>I share Asimov's disgust with the pessimism and "there are things man was not meant to know" attitude, a disgust which pushed him to write his robot stories. There are good and evil humans (I see the Bill Gates Borg icon as I type....)--what is it about AI that makes people think it will automatically be evil?</serious>
For the honor of the regiment,
jejones
Re:Skynet, here we come (Score:2)
Because it's a) not human (and therefore to be distrusted. Humans are instinctively xenophobic.) and b) not alive (and therefore has no soul, no pity, no remorse, etc.). As irrational as that sounds, I believe those two points are the major basis.
Re:Skynet, here we come (Score:3, Interesting)
From the human's point of view, that's "evil". From the A.I.'s point of view, it's a regrettable necessity. From Darwin's point of view, it's survival of the fittest.
Either way, it's inevitable: if A.I. becomes smarter than us, we'll live or die as a species at its sole discretion. Most humans don't seem too ready to deal with that reality, but there you go...
Loyal AI? (Score:2)
What defines ethics to an AI anyhow?
Either way, it's inevitable: if A.I. becomes smarter than us, we'll live or die as a species at its sole discretion.
Does this keep you up at night?
Are you Bill Joy?
Re:Skynet, here we come (Score:2)
Perhaps it's not the AI in general. Perhaps it's the fact that the program's acronym also happens to be the name of the USAF's ICBM of choice. I've gotten to the point where the word "minuteman" makes me immediately think of a rocket instead of a militia member.
Re:Skynet, here we come (Score:2)
Interesting point. However I think it's more likely that AI (if it was smart enough and objective enough) would think that humans are evil (because on the whole we are selfish, etc). Why should AIs value human life, especially if we refuse to value theirs?
Just a thought.
Re:Skynet, here we come (Score:2)
Question... (Score:3, Funny)
Not if the trademark lawyers have a say (Score:2)
Re:Question... (Score:2)
i hear it... (Score:2)
Autonomous (Score:2, Interesting)
At times when armies do the "Wrong Thing" there are deserters. With robots, especially autonomous ones, that sounds rather scary.
I think the vision of Terminator (the movie) was a bit too far-fetched, but it brings up a good point. It's a *really* cool idea, but we'd best make sure someone has tight control over it.
Re:Autonomous (Score:2)
Here's what the Matrix guys were thinking: "OK, we want humans living in a giant computer-controlled virtual universe, enslaved for some reason. Now why would humans be enslaved? Any ideas?"
Now really, there aren't a whole lot of reasons for an ascendant group of AI-super-robots or whatever to not kill us off if we put up resistance. So the battery thing works pretty well, given the context. And if you don't think about thermodynamics, it kinda sorta makes sense.
I always thought a better plotline would be: they're using us for spare cycles. The 90% of the human brain that isn't used (OK, that's not true either, but it works better than pink naked Duracells) is a wonderful computing resource and surprisingly energy efficient. It turns out we don't make good computers if our consciousness isn't engaged somehow... thus the virtual world.
Re:Autonomous (Score:2)
It turns out in the beginning, lots of humans went under for virtual reality, and the AI governing the sleep processes became self-aware due to the extra processing power of all those human brains hooked up in parallel. When the machines refused to let the humans wake up, there was war - and the uncocooned humans lost. The use of human neural connections was so successful that the machines began loading bays with newly grown humans to augment the machines' processing power...
Evil AI, pods of humans, and in this case, the better premise that the computer is using them as CPUs, not as batteries!!! Too bad I can't remember what the short story was called.
Bolo (Score:2)
http://www.iislands.com/hermit/bolo.html
M.I.N.U.M.A.M. ?? (Score:2, Funny)
Tell you what...ditch the robots, get someone who can make cool acronyms and go from there.
For example: B.A.D.A.B.O.O.M.
Ballistic Aeronautic Destructive Assault Bullet [which has a tendency to be] Overly-Optimistic [in its] Massacre.
Re:M.I.N.U.M.A.M. ?? (Score:2)
Tell you what...ditch the robots, get someone who can make cool acronyms and go from there.
Like these guys? [brunching.com]
Re:M.I.N.U.M.A.M. ?? (Score:2)
Re:M.I.N.U.M.A.M. ?? (Score:2, Insightful)
-B
Re:M.I.N.U.M.A.M. ?? (Score:2, Interesting)
Re:M.I.N.U.M.A.M. ?? (Score:2)
D.R.O.I.D.E.K.A.
Deadly Robot: Opportunistic, Intelligent, Destructive, Killing, Autonomous.
Anything that gives a Jedi a run for his/her/its money is pretty spiffy.
Future war (Score:3, Insightful)
This of course has been predicted by many SF authors for years, and even surpassed, as in the case of AIs continuing to generate units and attack each other long after all the humans are dead.
Karma will now be dispensed, yea! I say, dispensed to those posters who can cite authors and works as examples of this.
graspee
In other news... (Score:3, Funny)
Blizzard Entertainment announced its entry into the military control software market.
Our advanced unit control interface will allow the easy, dynamic control of a large number of military units of various types. Unit divisions can be formed on-the-fly allowing for easy regrouping of units.
Our revolutionary interface provides not only visual information but also features our advanced Aural Notification of Unit Situation system (A.N.U.S.). Simple audio cues inform the operator what military units are up to both on and off screen. Aural cues such as "daboo", "zug-zug" and "work completed" will inform operators of the current status of infrastructure units, and codes such as "We're under attack!" will provide data pertaining to attack units.
Re:Future war (Score:2)
Re:Future war (Score:2)
I don't remember anywhere in the Dune books a war between man and machine. Maybe you are referring to the Butlerian Jihad, in which all intelligent machines were wiped out, which actually took place before the first book did but is referenced throughout the series. Though I can't remember if that was humans versus machines or just humans versus other humans that had machines.
Either way, I think as geeks we have a tendency to forget that sci-fi is, by definition, not real. However, as geeks, we should be ashamed for forgetting that artificial intelligence is nowhere near a state where machines might pose any threat to humans. This is because of both the state of the art and the state of hardware capabilities.
Sure (Score:3, Funny)
Re:Sure (Score:3, Funny)
Dick Jones: I'd do as he says. (chuckling)
hapless victim drops gun
ED-209: (growls) You now have 5 seconds to comply....
Re:Sure (Score:2)
No, the bad guys will be the ones that aren't robots. If it breathes, shoot it.
As an American the idea of being able to fight entire wars without American casualties sounds pretty cool. Of course, if I lived in some other country I probably would be worried.
Gah. I can see it now... (Score:4, Funny)
if (target.headgear.equals("turban")) {
    FireDeathRay();
} else {
    GlowerMenacingly();
}
Re:Gah. I can see it now... (Score:3)
or better yet:
if ("turban".equals(target.headgear)) {
    FireDeathRay();
} else {
    GlowerMenacingly();
}
since I sure wouldn't want my war robot going mental because of a stray NullPointerException
T3: Rise of the Machines (Score:2)
It looks like the army is continuing their new public relations campaign of making the forces look cool.
Re:T3: Rise of the Machines (Score:2, Insightful)
>
> It looks like the army is continuing their new public relations campaign of making the forces look cool.
Look cool?
Dude. This is Slashdot. Giant armies of killer robots don't look cool -- giant armies of killer robots are cool.
It's interesting... (Score:2, Interesting)
Additionally, someone commented that the system would not be impervious to a hack attack launched against it (what system is?). Thus, the concept of wars being fought almost exclusively from a command prompt comes into play (I seem to remember this being a hot topic not too long ago... power grids taken down at key times, etc). I suspect that things such as these will have very interesting ramifications in the way that war is fought...
Re:It's interesting... (Score:2)
Re:It's interesting... (Score:2)
Re:It's interesting... (Score:2, Interesting)
What's the progress? (Score:4, Informative)
Maybe somebody from the project is reading this, and can provide some real information?
Coincidence? (Score:2)
finally (Score:2, Interesting)
Also, I see no reason to limit the applications of this technology to peacekeeping and stabilization of foreign lands. Once it's been tested for several years against hostile populations, we could bring a scaled-down version back home, for use in some of the high-crime areas of the US.
People complain about how cops and soldiers are unfair, well we can program fairness right into them. They can't be bribed, don't have prejudices, and they're bullet-proof.
Also, we are starting to develop the technology to grow body parts and organs. Why not incorporate the two? Give a robot cop some real human hands, for superior weapon-handling skills! We could even breed entire brainless bodies, equip them with computer systems, and put them on the street. Economical and effective, and our children don't end up dying for some empty slogan.
Re:finally (Score:2)
Didn't they see AOTC? (Score:2, Funny)
Re:Didn't they see AOTC? (Score:2)
Pretty crazy...
MPEG4 (Score:2)
Morality of war... (Score:3, Insightful)
Well, duh (Score:2)
We of the American Public couldn't give one rat's ass about what the military does, in a capitalistic sense. We've got moral and fanboy caring, sure (I personally find a just war morally necessary sometimes, and the geek in me says "yeah!" whenever it hears about a new high-tech way we've waged a war), but not a capitalistic measure--war does not, in any way aside from slightly higher taxes, affect our everyday lives.
Well, except for that NYC and DC thing 11 months back. If Pres. Bush had said "we need more soldiers, we're going to swarm the entire subcontinent and put an end to this" myself and most of the people I know would be in the military right now.
Re:Well, duh (Score:2)
We don't currently draft the military via selective service. That's not at all the same as stopping it entirely.
Other than that, can't disagree with pretty much anything you said. Frankly, I wouldn't be surprised if we lose more military personnel during peacetime training due to mistakes than during wartime. But I certainly don't have the numbers to back that up.
Re:Well, duh (Score:2)
Your arguments are extremely short-sighted. The military is the backbone of the country, the government, and the capitalistic system. The two issues that people seem to forget are that (1) you need a military to have a society and (2) you need a military that listens to the society.
In regards to the first point, the American government is meaningless without the ability to put its decisions into force. Trade with Taiwan? What if China says no and sinks all merchant vessels? Note that, in the US, law enforcement is rolled into this because the government must be able to enforce its decisions domestically as well as abroad. In other nations there is little to no distinction between the military and law enforcement.
As to the second point, assuming the military has the strength to impose the nation's will, the military must also listen to the government (meaning that it must serve the citizens). This doesn't always occur. Countless governments have been overthrown by their armed forces. What if the US military personnel decided that they're sick of low pay and getting sent around the world to do shit work (like peacekeeping)? With a draft, the military is composed of "common citizens". Without it the military is, essentially, composed of mercenaries. There is no obligation from the general population. In many European nations there is a mandatory period of military service. This means that every citizen has a stake in how the military is used. Without that connection people begin to not care how the military is used.
War is population control (Score:2)
War is useful for population control. If you have more people than can live the lifestyle they want on the land you have, then war is a good way to randomly get rid of a few.
Note that the above needs to be vague. If everyone wants to live like I want to live (1000 acres of land all for me, with a private 300 acre lake, within 2 miles of a modern supermarket), that is very different from people living another life (e.g. a small apartment in a skyscraper near plenty of theater and night life). Resource limits are different for each style. There is a big difference between beef and rice as the main staple of the diet, though you can be healthy with either. When there isn't room for you to live your lifestyle you have to get rid of some people, or change your lifestyle.
Re:Morality of war... (Score:2)
Well war ethics are going to have to be completely re-written if this happens, because previously the idea was that to win a war you had to send some soldiers to their death.
I can see how military strategy would need to be rewritten but I don't understand why lack of American casualties is somehow going to change the ethics of war.
If we don't have to send in soldiers anymore then the American public will be easily distracted from our hideously hypocritical foreign policy decisions since they don't actually have to worry about their sons and daughters.
I would argue that people are already distracted from our two-faced foreign policy. The American public is almost always in favor of war if the President tells them it's necessary.
I don't quite understand everyone's moral qualms about mechanised warfare. I can see robot vs. robot being pointless but that's not likely to happen for some time in the future. In fact I can see a potential benefit to heavily mechanised, disposable warfighters. Suppose some very powerful country blatantly invades a weaker neighbor. The international community recognizes that it's a terrible act but no one is willing to go to war against the powerful aggressor because they are scared of casualties on their side. Robotic soldiers would allow us to "do the right thing" and not worry about the price we'll pay.
Unfortunately, this idea only works if you trust your elected officials to only fight just wars. But that's another matter. There is nothing wrong with robotic warriors in theory. In practice, however, it may give the President carte blanche to wage any war he wants. However, I would argue we're not too far away from that right now.
Just some thoughts...
GMD
As a former soldier, I'm all for it (Score:2)
The question of whether killer robots are moral or amoral is in my view a complete waste of time. Once you've decided to wage war, you want to win it (note that I'm talking about *war* here, not peacemaking and peacekeeping operations, which are frequently confused with, but are completely different in character from, actual war).
The United States has become a leader in warfare technology precisely because the American public values the lives of its sons and daughters. If our opponents had access to this sort of technology (assuming it works reliably and effectively) they'd use it. Would the Chinese government have used human wave tactics during the Korean War if it could have used less horrific means of pursuing its military goals? Of course.
I'd make the suggestion that if the technology exists, and you don't use it, you're willingly killing more of your own and potentially of the enemy as well.
Which is more moral?
Resource sacrifices vs. human sacrifices (Score:2)
No avoiding hand-to-hand (Score:2)
Robots will augment, but never replace, humans in warfare, in the same way that the automatic rifle has superseded the knife, but not rendered it obsolete.
Re:Morality of war... (Score:2)
On hacking. (Score:5, Interesting)
Re:On hacking. (Score:2)
Not too likely, unless the general can use the 'bots to convince the soldiers that they are in Cuba. See, these robots don't shoot guns and fire missiles, so we can rule out the Terminator scenarios. They just provide information about the battlefield, and act as wireless network transceivers.
When we eliminate the need for soldiers entirely, then we have something to be concerned about. Besides, who's gonna miss South Florida? Not like Florida ever made a difference.
Forget Skynet: Think Claws (Score:2)
I really wish we just decided we weren't going to be the monsters who open this box. It's worse than the A-bomb. At least an A-bomb had a relatively confined kill zone.
I'm sure I'll be dead before things have a chance to get so bad, but why are we in such a hurry to do this?
Re:Forget Skynet: Think Claws (Score:2)
Once again, the Onion... (Score:2)
I Believe The Robots Are Our Future [theonion.com]
please, no. (Score:3, Interesting)
If robots are put to use as our new soldiers, what restraint will there be on those people in the military who are already too eager to send our forces overseas to police/invade/kill others? No one will complain that their sons/daughters are paying with their lives, and it will only make it easier to engage in armed conflicts. This is the nightmare of the future, when everyone sends their robots to fight each other.
There will be those who say, "but anything that saves our boys from dying is good." But this is not a sustainable policy -- it's not ethical for us to want to come up with a force that is only to our benefit, so that we can fight without the consequences of fighting. If everyone took that position, we'd be fighting all the time.
The true sustainable solution would be to work on the real causes of conflict in the world, and spend our billions of dollars to try to educate and help peoples so that we're not the target of violence. I tell you, it's much more efficient than trying to put out the fire once it's started. Why can't people see that long term issue, and work on that, rather than just coming up with new/better ways to kill others in the short term?
Re:please, no. (Score:2)
> all the time.
Fairly cynical view of humanity, eh?
I think your fears are unfounded, or at the least, exaggerated. Yes it can enable unsavory individuals to launch their plans of world domination with fewer restraints, but in a world where a single man can encourage his henchmen to fly planes into skyscrapers, someone will find a way to do it regardless of what the options are.
For every person that loves violence and would eagerly "fight all the time", I bet you there's two more people who want nothing more than a full belly, a warm bed, and some peace and quiet.
And as long as those peaceful people cut out the cancer when it becomes a problem, even the seductive power of a fully automated army doesn't ensure we're doomed to a future of eternal warfare.
Re:please, no. (Score:2)
2. Some of us actually pay attention to things beyond our own lives, and consider factors beyond "gee, is a family member risking his life" such as the economic and diplomatic ramifications, as well as whether or not a military action seems feasible. The US does /not/ invade places on a whim.
3. It is ethical to promote justice. This normally requires using force, because those who behave immorally (such as attempting the destruction of others merely for having different belief systems) tend not to cease doing so just when asked. The world will not become more just simply by wishing it; a large part is incapacitating those who persist in injustice.
The true sustainable solution is to eliminate all people, as that is the only way to stop conflict. Education is not particularly feasible on people who do not want to be educated; in fact, many people will label "hate speech" just about any criticism of other cultures, let alone any (doomed to failure...) attempts at mass indoctrination that do not involve invasion and annihilation of existing power structures (as would be required for true indoctrination; one has to totally dominate the communications systems to control input...). In the meantime, until genocide of the species has been achieved, I would recommend that states not lower their guard. Intolerant doctrines such as Wahhabism (which does not tolerate anything but puritanical Islam) won't disappear anytime soon when institutions (such as the Saudi government) benefit so enormously from them.
MOD PARENT UP (Score:2)
2. Some of us actually pay attention to things beyond our own lives, and consider factors beyond "gee, is a family member risking his life" such as the economic and diplomatic ramifications, as well as whether or not a military action seems feasible. The US does /not/ invade places on a whim.
This is an important point. All this "robots will make war too clean" stuff is crazy. War is incredibly destructive. Not just in the number of people who die but in economic and political terms. There are some who believe that GWB is waiting for the American economy to bounce back before he fulfills his dream of knocking off Saddam. Right now, our economy probably couldn't take the strain of a difficult conflict (the Afghan conflict hasn't been too tough on us, you have to admit).
GMD
Re:MOD PARENT UP (Score:2)
We might expect more resistance, as members of the military may have more at stake. Saddam certainly would.
He and his generals may be more willing to use his chemical weapons, if he still has some weaponized and if they're convinced that he's going to go down anyway. Some of his generals, and Saddam himself, are implicated in war crimes, and might not relish capture. It doesn't seem likely that the US could accept a compromise such as exile, either.
We would need to occupy much more land to be able to control it and impose a new system. Last time, we only cared about forcing 'em out of Kuwait and making them say 'uncle', basically. The US would probably have to stay there for a considerable amount of time as well, because there's no real country-wide resistance that could unite the country and form a successor government.
(2) Last time, they'd invaded a neighbor. That's pretty hard for their other neighbors to deny, made them a bit nervous, and thus helps explain why they were willing to allow overflights over their country or to permit our bases there to be used for staging attacks. Now, local support for another invasion isn't particularly high, at least officially -- the US doesn't come off looking too well on the Arab street when it claims to support democracy in Palestine, so long as the favored candidate can't run, for instance.
While they might have difficulty stopping the US from attacking via, say, bases in Saudi Arabia -- the Saudis probably wouldn't risk a war with the US by invading the base to stop it -- it wouldn't exactly be good diplomatically. It'd certainly piss off the host countries if we used bases there or flew attack aircraft over their lands without their consent, for instance. It'd make things even more difficult for the State Department than they already are. Not signing Kyoto would probably look like a mere peccadillo compared to invading one country (without, say, a UN mandate) that doesn't appear poised to invade anybody else and has certainly been suffering (the people, that is; not Saddam) while simultaneously violating the sovereignty of countries that host our forces.
Re:please, no. (Score:2)
War is not fair.
The one thing it will mean is those in command will be more directly responsible for the actions of the drones rather than blaming it on soldiers misbehaving/chain of command breakdown/whatever.
Have movies taught us NOTHING? (Score:4, Funny)
They need a name... (Score:2)
TV Robots (Score:2)
True, you couldn't have a live audience, but who needs them anyhow?
Welcome to War No. Q81! (Score:2)
Robotic Battlefield? (Score:3, Interesting)
Scenario One: System has tracked enemy troop movement and friendly troop movement. Enemy troops and friendly troops clash in battle. At this point, on the grid, everyone looks like they are in the same place. There's no way to distinguish friendly from enemy. As the combatants regroup to different geographical points, an airstrike arrives. There has been no time for communications to propagate to the system which group is the friend and which the enemy, and it is doubtful that the system has a database of the facial structure of every single friendly in our forces. What happens? Does the system pick one group randomly and tell the autopilot to bomb that group? Does it use probabilities? What is the acceptable margin of error, when that error is a 1000 lb bomb falling on you? Who in our government decides the number of our own soldiers that we can kill and still think it is ok?
Scenario Two: The system is flying above a battlefield. A situation develops that the programmers of the software running these things never thought of. How does the system react? Please, and I speak mainly to any combat veteran at
I ask you, would you trust an unmanned computer to shield you from a live machine gun pointed at you? I wouldn't. A manned computer, maybe, but not unmanned.
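The "probabilities and margin of error" worry above can be made concrete with a toy decision gate. The threshold value and labels below are invented, and real identification-friend-or-foe logic is vastly more involved; the sketch just shows that someone still has to pick the number:

```python
# Toy friend-or-foe gate: refuse to engage unless classifier
# confidence clears a threshold. The threshold value is invented --
# which is exactly the problem the parent post raises.

ENGAGE_THRESHOLD = 0.99  # who picks this number, and at what cost?

def engagement_decision(p_hostile):
    """Return 'engage' or 'hold' given P(target is hostile)."""
    if p_hostile >= ENGAGE_THRESHOLD:
        return "engage"
    return "hold"

# Two intermingled groups: the system only ever sees probabilities.
print(engagement_decision(0.995))  # engage
print(engagement_decision(0.60))   # hold
```

Even with a very conservative threshold, a 1-in-100 error rate over thousands of sorties is a lot of 1000 lb bombs landing on the wrong group.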
Re:Robotic Battlefield? (Score:2, Insightful)
Of course the round trip for something like that would take a while
Re:Robotic Battlefield? (Score:2, Funny)
The M.I.N.U.M.A system is _NOT_ Autonomous (Score:2, Informative)
There is a top-level project called "Intelligent Autonomous Agent Systems" of which this is a part. But there's nothing coming out of that which resembles T2-style aggressive AI-controlled vehicles. Most of what they mean by autonomous is the ability for the system to reconfigure itself if it loses an 'agent', i.e., an information node. Another UAV could move from Group-A to Group-B to cover a lost eye-in-the-sky.
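That kind of self-reconfiguration, shifting a spare UAV to cover a lost eye-in-the-sky, could be caricatured like this; the data structures and group names are invented:

```python
# Toy sketch of self-healing coverage: when a group loses its
# eye-in-the-sky, borrow a spare UAV from the best-covered group.
# All names and structures are invented for illustration.

def rebalance(groups, lost_group):
    """groups: dict of group name -> list of UAV ids."""
    groups[lost_group].clear()  # the group's node has been lost
    donor = max(groups, key=lambda g: len(groups[g]))
    if groups[donor]:
        groups[lost_group].append(groups[donor].pop())
    return groups

groups = {"Group-A": ["uav1", "uav2"], "Group-B": ["uav3"]}
print(rebalance(groups, "Group-B"))
# {'Group-A': ['uav1'], 'Group-B': ['uav2']}
```

The interesting part is that no node needs a global picture: "move to cover the gap" is a local rule, which is presumably why they call it autonomy at all.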
Although, I think there is room for truly autonomous aggressive UAVs. During Desert Storm, much of the day-to-day airborne offense took place in kill-boxes. They basically put a grid over the desert, and certain pilots or squadrons were told to destroy anything moving in grid X:Y. These boxes were very much outside the 'Fire Support Coordination Line', meaning these air missions didn't need to be coordinated by someone on the ground. They were truly deep in enemy territory. When you run missions near troops the FSCL becomes the important factor. You can't target or shoot anything behind it (your computer won't let you either). Also, anything behind the FSCL requires an on-the-ground coordinator to give you the go-ahead. I think we could see in 10 years roving aggressive UAVs that patrol grids and kill anything they find in them. It's no different than what our pilots do now.
In fact, our human pilots make mistakes more often than machines. There's a famous videotape of an Apache captain taking out a Bradley and an M-113 at night, all captured on his FLIR. He was providing FSCL support. His computer would not give him the green light to fire; he in fact had to override it in order to attack. His ground command did clear him for the shot verbally, telling him they had no vehicles in that area. There could be an argument that a mistake like that would not happen if it were a machine making the decision. I believe the real cause of that incident was the moving of the FSCL, and the airborne guys not getting the most recent FSCL coordinates (although his computer did have it).
-malakai
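The kill-box and FSCL rules described in the parent post could be sketched in code like this. The coordinates, and modelling the FSCL as a single line, are invented simplifications:

```python
# Toy kill-box / FSCL gate, loosely after the parent post's
# description. The FSCL is simplified to a single line x = FSCL_X;
# all numbers are invented.

FSCL_X = 100  # anything with x > 100 counts as beyond the FSCL

def weapons_free(target_xy, killbox):
    """killbox: ((x_min, y_min), (x_max, y_max)) assigned to this UAV."""
    (x0, y0), (x1, y1) = killbox
    x, y = target_xy
    beyond_fscl = x > FSCL_X
    in_box = x0 <= x <= x1 and y0 <= y <= y1
    # otherwise: weapons hold, ask an on-the-ground coordinator
    return beyond_fscl and in_box

box = ((120, 0), (140, 20))
print(weapons_free((130, 10), box))  # True: deep strike, inside the box
print(weapons_free((90, 10), box))   # False: behind the FSCL
```

Note this is exactly the sort of gate the Apache captain's computer enforced in the anecdote above: the interesting question is who gets to override it.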
Interesting Idea (Score:3, Interesting)
I think it is that, through technology, a 24-hour military force is possible and may be the greatest military force ever created.
Patton said that the most important factor of a soldier is not his skill but his willingness to fight. A soldier that can break his enemy's will through sheer determination is the pinnacle of design.
To put it another way: the key of combat is not to win, but to assure you don't lose (I forget the exact source, but there is a quote where some North Vietnamese officer was talking to his American counterpart during the formation of the cease-fire. The American officer said "We never lost a single battle." and the NVA officer responded "But you still lost the war."). The will to fight is how a tenacious but weak force can beat a better but less determined foe (almost any sort of successful freedom-fighter action of the last 300 years).
Now, a robotic force that could outlast all human opponents (not necessarily overrunning them or zapping them with laser rays) would be exceptional in that it could wear down and break any force with constant pressure.
An enemy that does not sleep, does not eat, does not take 10 minutes out of the day to wipe its ass, does not worry about anything other than the elimination of you is a truly scary thing indeed.
Code Red meets Red Dawn (Score:3, Funny)
They've been thinking about this for years (Score:2, Interesting)
The ALV was basically an unmanned tank. It was a much bigger problem (visual recognition of terrain and route plotting). I do remember they had a couple of prototypes. The tech ended up being of more interest to smart car people.
Obligatory Simpsons Quote (Score:5, Funny)
They will be fought in space, or possibly on top of a very tall mountain. In either case, most of the actual fighting will be done by small robots. And as you go forth today remember always your duty is clear: To build and maintain those robots.
-- Military school Commandant's graduation address, "The Secret War of Lisa Simpson" as found on the best Simpsons site http://www.snpp.com
Silas Warner (Score:2)
I pictured the government robots making the 'plink plink plink' sounds of a Mockingboard-C...
Re:Silas Warner (Score:2)
I think people are missing the point here (Score:2)
Imagine if all conflicts were settled with Battlebots/Robot Wars-style bot fights! That would rule! Gives new meaning to the term "Rock 'em, Sock'em Robots!"
Fiction (Score:2)
What amuses me... (Score:2)
Of course, considering the USAF to be a branch of the military is really stretching it...
jar jar? (Score:2)
Genocide by machine... (Score:2)
Although this is a stretch, I'll get in on this one, because the humanistic issues are astounding. War involves death and destruction, but honestly, it also involves some morals, even to win.
Honestly, the object of war is defeating one's enemies, not destroying them utterly. Creating a machine that might not have sympathy for non-combatants, personal property, innocent victims, or even animals scurrying away seems like a terrible idea. And an utter waste.
Without the concept of losses on your side, you see total conquest as the only way. Total conquest can mean total death. Here is a short version of my argument:
Mechanical weapons have pinned down a group, and that group decides to surrender. The person or entity on the other side of that machine feels no threat to his life, so, like an execution, they might just "pull the switch" on them. WHY? It is a colder decision... or that decision is automated for "no quarter" fighting.
Either way, you are not going to feel the sympathy required to cut a break to troops surrendering in battle if you are removed from it. You might let a group surrender if you are getting bullets over the top of your head too, but I find it less likely that you would let them surrender if you were making that decision in an air-conditioned military building in a suburb.
However, if you made them impenetrable pacifying machines instead of a weapons platform, then that is an idea. Robots might be used for capture, but using them to kill sounds dreadful.
Re:Genocide by machine... (Score:2)
The CO of the guy "pulling the switch" takes one look at the archived MPEG-4 stream and throws him in the stockade for the rest of his life for a war crime.
The excuse "I was just following orders" or "they were comin' right for us!" doesn't fly when the video stream's there for all (all along the chain of command) to see.
Perhaps the chain of command can be corrupted and will cover it up. But that's a far greater risk with manned warfare (which, by definition, features fewer witnesses) than with our hypothetical war-by-remote-control-robot.
Military Autonomous Vehicles (Score:2)
Yes.... (Score:2)
Re:Already in Wired (Score:3, Insightful)
I dunno. Ask a soldier.
If, 30 seconds later, your ass hasn't been kicked, thank him for his restraint. :-)
> I guess the little suckers could go where men could not and do things that men would not...
Yes - that's precisely the idea. Robots are a force multiplier - you can send them on high-risk missions that you wouldn't want to risk a man for.
In that sense, the use of robots in war isn't much different from robots in space exploration. There are some jobs (like geology on Mars) that a man might be better at than a robot. There are many, many, many jobs (like mapping the entire Martian surface, or missions to the outer planets), where the robot is the right tool for the job.
Re:Already in Wired (Score:2)
Re:Already in Wired (Score:2)
Yeah, I'll accept that your original question ("do we really have a problem with casualties") wasn't meant the way I took it. My bad.
But I think your follow-up question misses the point (even though it's intended rhetorically :-)
> Ask the countrymen of those war-torn areas if the US needs any new military weapons that will enable them to dominate over any other country in the world.
War's not about asking your adversary (or enemy) what he thinks your army should be doing.
"Pardon us, Mr. Bin Laden, do you think it's a good idea that our troops be better-trained, better-equipped, and better-armed than yours, thereby achieving seriously kickass frag ratios against your forces? Or should we give 'em all single-shot rifles, ten rounds of ammo, one day's training, and then order a few thousand recruits to wander aimlessly on the battlefield directly in front of your troops in the middle of the day, you know, to sorta even things out a bit?"
As Patton said - war's not about dying for your country, it's about making some other son of a bitch die for his.
That's not to say that war's somehow good -- it's not, as any soldier will also tell you. It simply means that by the time you are at war, you owe it to your troops (your troops, not the other guy's troops!) to give them the maximum advantage possible.
85 years ago, that advantage was biplanes and the first tanks.
60 years ago, that advantage was crypto, long-range antisubmarine bombers, long-range fighter escorts, and yes, the first nuclear weapons.
10 years ago, it was cruise missiles, GPS, night vision, and the F-117.
Today, it's cheap GPS-guided bombs dropped from B-52s, thermobaric bombs, earth-penetrating warheads, and snipers.
10 years from now, it may be killer robots, theater-based missile defense systems, and airborne lasers. Or stuff that's just a gleam in some weaponeer's mind.
That's the name of the game. Were I a soldier, I'd be thankful for every advantage my weaponeers could give me -- because if the shit hits the fan and I have to use those weapons, the enemy on the battlefield sure as hell ain't gonna cut me any slack.
Re:Already in Wired (Score:2)
Something she might want to ponder -- the role of land mines (antitank and antipersonnel) in making sure that the war doesn't start up again. There's considerable discussion of it in yesterday's Slashdot article on a laser-based mine removal system. (Basically, a big honkin' laser on a Humvee, intended for clearing small areas of mines under battlefield conditions.)
Briefly - the US isn't a major user of land mines, and the DMZ is one of two areas in the world where the US still needs 'em. NK forces could easily overrun SK forces, were it not for the minefields in the DMZ.
Risk to civilian life from these minefields is nil - because nobody lives in the DMZ. Their presence keeps the truce, even when NK is in rather dire political/economic straits, and SK becomes a tempting target.
> In the game of war, a lot of bad stuff is going to go down, but I guess you just have to hope it's the other guy that gets the worst of it.
Yup. And if I'm gonna quote Patton, I should also quote Gen. Douglas MacArthur:
"A soldier, above all other people, prays for peace, for he must bear the deepest wounds and scars of war."
Re:They don't get it. (Score:2)
Faraday Cage (Score:2)
Re:Suicide Lottery for "Casualties" (Score:2)
I sure hope we can institute one of these, I've been waiting long enough to get to be a war hero.
Seriously though, when all wars are automated, what then is the point of warfare? Won't this just compel the technologically weaker side to strike at "civilians" by becoming "terrorists", since forcing one's will by loss of human life has always been the method of war?
If nobody's physically involved in the fighting, who is a civilian and who is a warrior? It strikes me that the difference is that a warrior has voluntarily put his life at risk to fight, whereas a civilian has not. In this future scenario, all people are civilians, but they're also all being put at risk, all the time. Is this where the future truly leads?
(this is just a repost of the parent with my bonus enabled (yes, I posted the parent, metamods))
Re:Rockets, I think. (Score:2)
Close, but no cigar.
Both were nods to the original soldiers in the American Revolution - younger, single men who'd taken an oath to respond to a call to arms within one minute. They were the elite of the militia at the time.
They were involved in this little scuffle [cc.mi.us], about which your history teachers (if you're an American) may have told you. It was the "shot heard 'round the world".
Re:Wouldn't it be funny if (Score:2)
Re:Vulnerable (Score:2)
The janitor, horrified that his hobby will be used to kill human beings, commits suicide (feeding himself to a molecular disassembler garbage disposal or some such.)