
Could an Ethically-Correct AI Shut Down Gun Violence? (thenextweb.com) 513

The Next Web writes: A trio of computer scientists from the Rensselaer Polytechnic Institute in New York recently published research detailing a potential AI intervention for murder: an ethical lockout. The big idea here is to stop mass shootings and other ethically incorrect uses for firearms through the development of an AI that can recognize intent, judge whether it's ethical use, and ultimately render a firearm inert if a user tries to ready it for improper fire...

Clearly the contribution here isn't the development of a smart gun, but the creation of an ethically correct AI. If criminals won't put the AI on their guns, or they continue to use dumb weapons, the AI can still be effective when installed in other sensors. It could, hypothetically, be used to perform any number of functions once it determines violent human intent. It could lock doors, stop elevators, alert authorities, change traffic light patterns, text location-based alerts, and any number of other reactionary measures including unlocking law enforcement and security personnel's weapons for defense...

Realistically, it takes a leap of faith to assume an ethical AI can be made to understand the difference between situations such as, for example, home invasion and domestic violence, but the groundwork is already there. If you look at driverless cars, we know people have already died because they relied on an AI to protect them. But we also know that the potential to save tens of thousands of lives is too great to ignore in the face of a, so far, relatively small number of accidental fatalities...

  • Failure to consider failure modes and effects is, well, a failure. It's bad engineering.

  • by Casandro ( 751346 ) on Sunday February 21, 2021 @12:43AM (#61085174)

    I mean even if that would somehow be possible, why would anybody in the gun community buy such a gun? Besides what does "Ethically-Correct" even mean? For example would it prevent people from hunting animals?
    The realistic scenario is that you get some system that starts enacting rules given to it by some large corporation. The best case scenario is "The Forbin Project".
    https://www.youtube.com/watch?... [youtube.com]

    • Comment removed (Score:4, Insightful)

      by account_deleted ( 4530225 ) on Sunday February 21, 2021 @12:50AM (#61085196)
      Comment removed based on user account deletion
    • The summary more or less suggests that the proper response to your objection is to assemble an automated surveillance state that quite literally restricts your freedom of movement, enables others to harm you, and encourages them to do so by broadcasting what may very well be false claims about the activities you’re engaging in, should it deem you to be engaging in “unethical” behavior.

      A smart gun that locks out its own user is the least objectionable thing in the summary.

      • Yes, and since "Ethics" are not a constant and objective thing, that surveillance state will be under the control of some large organizations that dictate the "Ethics" behind the machine.

    • I mean even if that would somehow be possible, why would anybody in the gun community buy such a gun? Besides what does "Ethically-Correct" even mean? For example would it prevent people from hunting animals? The realistic scenario is that you get some system that starts enacting rules given to it by some large corporation. The best case scenario is "The Forbin Project". https://www.youtube.com/watch?... [youtube.com]

      Guns are simple technology, and these researchers want to make them more complicated. Do you know what the simplest repeating firearm to make is? An open-bolt submachine gun. Everything needed can be bought at any decent hardware store. The Uzi and Sten were both designed to be manufactured in small shops using common tools and materials. There are even videos on how to build your own rifling bench from common materials. There are active underground craft gunsmiths in the Philippines and Pakistan making firearms...

    • by phantomfive ( 622387 ) on Sunday February 21, 2021 @02:46AM (#61085396) Journal

      AI is the new magic. If you can't solve a problem, AI will solve it.

      Step 1: There is a problem.
      Step 2: AI will solve it!
      Step 3: Profit.

  • You want that making life-and-death decisions? Deciding whose gun to lock and whose to unlock? I'd love to see how the AI's racial biases work out there, too.
  • I tend to resell technical books on Amazon. Last week, Amazon's AI decided that a book about water pollution was really a pesticide, and deleted it from the sales listings until I provided an EPA registration number. With tech like this, good luck trying to divine people's intent! Most gun deaths are suicides. After that, most are gang-on-gang. Mass shootings are a very tiny sliver by comparison. You don't need fancy sensors and buggy AI tools to address those issues.

  • Garbage idea that only appeals to one political pole. Flip the politics and put this "ethically correct" AI in another context.
    Could an "ethically correct" (who writes this crap?) AI in an abortion machine shut down unnecessary abortions?
  • My first thought on this was to think back to things like facial recognition, cheating detection, etc...

    To be blunt, the first thing the AI is likely to do is be racist against black people. Seriously, it's all over.
    Facial recognition of criminals? It's over 5X as likely to say any given black man is a felon.
    Detecting cheating? Recent articles about at-home testing with automated proctoring found it was something like an order of magnitude more likely to kill the tests of black people.
    Etc...

    The dang thing will probably simply treat all black people...

  • by whodunit ( 2851793 ) on Sunday February 21, 2021 @12:58AM (#61085202)

    Lunatics looking at dystopian science fiction on the order of Psycho Pass and saying, "Boy, that's the future I want!"

  • by iamhassi ( 659463 ) on Sunday February 21, 2021 @12:58AM (#61085206) Journal
    Why do people kill? Not for fun; it’s usually for money. Gangs kill other gangs because those other gangs are encroaching on the areas where they do business. Sometime in their lives they were taught that murder was an easier way to make money than work. We’ve got unions starting teenagers at $20 an hour with benefits and frequent raises. You can be a master electrician making $100,000+ before you’re 30 without ever going to college, but the media keeps shoving $100,000 college loans down kids’ throats. We need to move away from the “everyone needs a $100,000+ education” BS and encourage more unions and trade careers.
    • by dvice ( 6309704 )

      I think the most common reasons for one individual to kill another are:
      - By accident (hey, look at this unloaded gun I found)
      - Because they were scared (there is someone in my yard with an envelope, dressed as a postman, must be dangerous)
      - Revenge (an eye for an eye)

      But I don't have any statistics to prove this.

      • Sure, but it's supposed to be for:
        - duels (defending someone's honor)
        - securing democracy
      • Well over half of gun deaths in America are suicide. Accidents are not very high on the list of gun deaths. Most of the time when someone dies after being shot, the shot was intentional. The FBI attempts to track all this and publishes statistics every year.

  • Smart guns just aren't a feasible invention. When a gun needs to go bang, it needs to work RIGHT NOW. There's no room for error, and that's why simplicity and low parts count are often seen as virtues in gun design (and why a lot of guns developed in wartime, like during WW2, basically look like cobbled-together garage projects: because simplicity works).

    Think about how often you put your thumb on your phone to unlock it and the sensor doesn't read quite right. At least 1/3 of the time I have to retry...

  • by mpoulton ( 689851 ) on Sunday February 21, 2021 @01:04AM (#61085216)

    It is inherently unethical to cripple defensive tools by burdening them with a failure-prone system which can disable the device when it's needed most - especially when it's trivial to build guns without this technology, thus ensuring that people who are unconcerned with regulatory compliance will always have access to guns they can freely use against the people with AI-equipped "ethically restricted" pistols.

  • If there is one thing I've learned from the tiny amount of reading I've done on ethics in philosophy, it's that nearly every ethical framework is rife with flaws. So figuring out the "correct" ethical framework to use -- if you could theoretically do this -- will be super difficult.

    Honestly, the ideas put forth for applying this hypothetical AI are trash. An AI like this would be better suited as an advisor system for the public to use to understand the ramifications of a policy plan that the government...

  • by Arthur, KBE ( 6444066 ) on Sunday February 21, 2021 @01:09AM (#61085230)
    You "need" meat? Go to any of the grocery stores within a five mile radius of your comfortable home.

    Factory farming not ethical? Then go into that same grocery store and revel in all the cruelty-free alternatives you have.

    If you live an aboriginal existence, or in the northern reaches of Alaska, etc., then I don't see any ethical problem with hunting (and eating meat). But it's hard for me to wrap my head around why an urban/suburbanite in a first-world country needs to consume meat when there are so many other choices available.

    But!! -- The Second Amendment says nothing about hunting! It's about self-defense -- I'm willing to concede that. But this is a self-fulfilling prophecy. You're probably *not* living an existence where you're hunting for sustenance -- certainly *NONE* of the US mass-shooters were. They had guns because they were fearful of other people with guns, and that's the crux of the whole situation. Other nations that have clamped down on firearms have seen a dramatic reduction in mass-shootings and gun crime. This is something so obvious it shouldn't even need an explanation.
    • Re: (Score:2, Insightful)

      by mamba-mamba ( 445365 )

      Stubborn and independent-minded people often want to have guns because they feel they are more likely to keep exercising their independent-minded nature if they are armed. People who are absolutely sure they are right about everything sometimes seem to come to the conclusion that nobody needs guns and that society would be better off if we "cracked down" on guns. The more people talk about cracking down, the more the stubborn people exchange knowing glances and stock up on rifles and ammunition...

    • Death by gunfire is a symptom, not a cause.

    • by nyet ( 19118 )

      But crime rates have remained the same or gotten worse. There is no correlation between violent crime rates and legal firearm ownership. If there were, the US would have 100x the crime rates, and the cities with the strongest gun control would have the least crime. It is almost as if the root cause of violent crime is socioeconomic. Shocking.

      "Gun crime" is like saying "accidents involving blue cars".

    • For the record, I don't have a gun. I have never handled a gun. I don't expect to ever handle a gun.

      You know what else people don't need? They don't need computers. People got along pretty well without them. For the first 30 years or so, about 99.9% of the population did not have access to computers and got along fine. Does that mean we should sell computers that are in some way hindered?

  • I wonder whose ethics the AI will be implementing.
  • This will never fly. But suppose it did: how about all government "users" implement it and destroy all their "dumb" guns? Then the populations of the world would be "safe", right? Then we can melt all of our guns in our new Nirvana, right?

    Except, you can't trust the government(s), which is why in the US we have the 2A in the first place.

    That, and it's not terribly hard to manufacture firearms, so the criminals and bad governments will always have a way to make more "dumb" guns that don't care about "ethics"...

  • If the title is in the form of a question then the answer is most certainly no.

    This isn't necessarily about "smart guns" as they point out other possible actions.

    It could lock doors, stop elevators, alert authorities, change traffic light patterns, text location-based alerts, and any number of other reactionary measures including unlocking law enforcement and security personnel's weapons for defense.

    Lock doors? That will slow people down for certain, but an armed person can shoot through the glass doors common to businesses. A common door on a modern home can be defeated by a healthy adult male with a few good kicks or by throwing his body into the door.

    Stop elevators? Stairs are a thing and required by fire code.

    Change light patterns? People...

  • This is just a hypothetical computer-science/philosophy thought experiment.

    Nobody can or will actually build it. What is it with Americans and guns, that they get so emotional?

  • ... ethically incorrect uses ...

    Gun violence is a culture. While an abundance of firearms is a major enabler, it is not the problem so more gun 'control' will have the same effect it already has: zero. This is the same as politicians pretending 'it's the law' means nothing bad will happen. Here, it's do-gooders pretending to control usage and consequence.

    ... lock doors, stop elevators ...

    Reminds me of Horrible Bosses (2011), where the NavGuide/OnStar operator immobilises the car, allowing a homicidal maniac to catch the car thieves. No-one died, since it's not that sort of movie...

  • Blind faith in technology seems to have no bounds. That trio of computer scientists from the Rensselaer Polytechnic Institute in New York must be so high (in more ways than one) up in their ivory tower that reality completely escapes their senses. There are *millions* of "dumb guns" in the US already. Does anybody realistically think that gun owners (law-abiding or not) will willingly retrofit their old firearms with some dodgy AI and sensor apparatus? How difficult can it be to circumvent the software or hardware...
  • That link doesn't cite sources for its information; it just makes statements.

    No (reputable) source = No reputable information.

    • by nyet ( 19118 )

      It's total trash. Just look at the first response. This notion attracts complete and total idiots.

  • They can't even stop their AIs from spewing racist b.s. when allowed to interact with the world of humans; at this point they're more likely to become Skynet than a protector of the peaceful.
    It's not that the idea doesn't have some merit, but this kind of decision-making is so far outside our current capabilities that I wonder if I'll ever see it become possible.
  • that can't go really really wrong.
  • It's not like there hasn't been any science fiction written describing what might go wrong if we give the power to make life-and-death decisions to an artificial intelligence. We have already been warned.
  • Profoundly unethical companies.

  • um, no (Score:5, Insightful)

    by cascadingstylesheet ( 140919 ) on Sunday February 21, 2021 @10:45AM (#61086270) Journal
    Humans can't even agree on this stuff. Is it "ethically correct" to shoot someone who is invading your home? You'll find no agreement among humans about this. Who gets to set the algorithms?
  • by rossz ( 67331 ) <ogre@nOspAm.geekbiker.net> on Sunday February 21, 2021 @02:07PM (#61086818) Journal

    Yesterday I got into an argument when discussing a carjacker getting shot in the head by the victim. The other person insisted that using lethal force was not justified. There are people who say lethal force is NEVER justified. So a woman is going to have to accept she's going to be raped because some mental midget asshole thinks it's wrong for her to shoot scum.
