Who's Responsible When Your Semi-Autonomous Shopping Bot Purchases Drugs Online?

Nerval's Lobster writes: Who's responsible when a bot breaks the law? A collective of Swiss artists faced that very question when they coded the Random Darknet Shopper, an online shopping bot, to purchase random items from a marketplace located on the Deep Web, an area of the World Wide Web not indexed by search engines. While many of the 16,000 items for sale on this marketplace are legal, quite a few are not; and when the bot used its $100-per-week Bitcoin allowance to purchase a handful of illegal pills and a fake Hungarian passport, the artists found themselves in one of those conundrums unique to the 21st century: is one liable when a bunch of semi-autonomous code goes off and does something bad? In a short piece in The Guardian, the artists seemed prepared to face the legal consequences of their software's actions, but nothing had happened yet, even though the gallery displaying the items is reportedly next door to a police station. In addition to the drugs and passport, the bot ordered a box set of The Lord of the Rings, a Louis Vuitton handbag, a couple of cartons of Chesterfield Blue cigarettes, sneakers, knockoff jeans, and much more.
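To make the summary concrete: the described behavior amounts to "once a week, pick one random listing you can afford and order it," with no notion of legality anywhere in the loop. The sketch below is only an illustration of that idea, not the artists' actual code; fetch_listings, place_order, and the budget figure are hypothetical placeholders.

# Rough sketch of a "random shopper" with a weekly budget. The marketplace
# client is deliberately left as placeholders; nothing here checks legality.
import random
import time

WEEKLY_BUDGET_BTC = 0.5        # assumption: roughly what $100 bought at the time
ONE_WEEK = 7 * 24 * 60 * 60    # seconds

def fetch_listings():
    # Placeholder: return (title, price_btc) tuples scraped from the marketplace.
    raise NotImplementedError

def place_order(listing):
    # Placeholder: submit the order and pay from a Bitcoin wallet.
    raise NotImplementedError

def run_forever():
    while True:
        affordable = [item for item in fetch_listings()
                      if item[1] <= WEEKLY_BUDGET_BTC]
        if affordable:
            place_order(random.choice(affordable))  # no legality check anywhere
        time.sleep(ONE_WEEK)                        # one purchase per week

Much of the discussion below turns on that single random choice: the selection is random, but the decision to point the loop at this particular marketplace was not.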
  • i cant even (Score:3, Funny)

    by ganjadude ( 952775 ) on Monday January 05, 2015 @04:58PM (#48740565) Homepage
    get my guy to call me back on time, and now there's a bot that takes care of all the dirty work for me?!
    • by Anonymous Coward

      I heard about this a few days ago. Law enforcement was alerted ahead of the exhibit so they could oversee the contraband, as some was expected to arrive.

    • by Anonymous Coward

      Don't drop the soap; the judge-and-jury bot just gives you life with no parole.

  • by TWX ( 665546 ) on Monday January 05, 2015 @05:00PM (#48740593)
    The creator of a device is responsible when the device breaks the law because the creator either negligently or intentionally set it up to do so; that creator set the conditions for the operation of the device.

    The creators knew they were designing something to make its own decisions without programming in any real concept of legality, and they set it to operate in an environment known to have served to facilitate criminal activity.

    The degree of responsibility is up for grabs; that's why things like limited liability corporations exist, to try to shield the owners from being personally liable, but the act itself is still criminal. One can even debate the line between engineering and art, since the bot is an artificial construct that actively does something in the greater world, rather than a passive display or something contained to its own small environment.
    • Re: (Score:3, Interesting)

      by Slick_W1lly ( 778565 )

      I cannot agree with this.

      The programmers just set the thing up to 'buy whatever'. At the time, 'whatever' may have simply been a bunch of knockoff handbags. It's not illegal to buy those... the seller may get slapped for violating a trademark or something, but no one's going to come rip your handbag out of your hands or put you in jail.

      And, quite honestly, the feckin' article tells the submitter 'who is responsible'. If the law says 'knowingly violated', they are not responsible. If the law says 'reck

      • by tverbeek ( 457094 ) on Monday January 05, 2015 @05:34PM (#48740903) Homepage

        Another human that you create is not a "semi-autonomous bot". It is a self-aware person, and is held responsible for its own actions. Maybe if you can demonstrate that your bot is sentient and fully autonomous, that'll get you off the hook.

        • by jopsen ( 885607 ) <jopsen@gmail.com> on Monday January 05, 2015 @05:45PM (#48740985) Homepage

          Another human that you create is not a "semi-autonomous bot". It is a self-aware person, and is held responsible for its own actions.

          Can you prove that your teenage kid is sentient and fully autonomous?
          Actually, that's an interesting question :) And at what age does this happen?

          • by CrimsonAvenger ( 580665 ) on Monday January 05, 2015 @06:02PM (#48741107)
            Note that I can't prove that YOU are sentient and fully autonomous, much less my kid....
          • The law in most nations says that at the age of majority you are an independent "person", with the law assuming such a person is sentient and aware of their actions. Again, in most jurisdictions, if you can prove you are incapable of rational thought and/or unable to assist with your defense, you therefore cannot be held culpable and will be committed to a facility until such time as you are.

            There have been some rather cool early childhood studies on sentience and self awareness. From my re

          • by Mashiki ( 184564 ) <<mashiki> <at> <gmail.com>> on Monday January 05, 2015 @07:30PM (#48741737) Homepage

            Can you prove that your teenage kid is sentient and fully autonomous?
            Actually, that's an interesting question :) And at what age does this happen?

            Well....

            Yes. It's called mens rea. [wikipedia.org] And it depends on where you live: in some places it's 9 years old; in Canada it's 12. That's the legal definition of "sentient and fully autonomous" while knowing the difference between "right and wrong."

          • Well, given that parents can be liable for the debts of their adult children [mcall.com], I would argue that in those states they will never be fully autonomous.

            So, if you live in any of these states [elderlawanswers.com]:

            Alaska, Arkansas, California, Connecticut, Delaware, Georgia, Indiana, Iowa, Kentucky, Louisiana, Maryland, Massachusetts, Mississippi, Montana, Nevada, New Jersey, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Dakota, Tennessee, Utah, Vermont, Virginia, and West Virginia

            the law says they are ne

        • by dbIII ( 701233 )
          Right after you get your Nobel Prize for creating a self-aware bot, then as its guardian you'll still have to take responsibility for whatever crimes it commits.
          Since we haven't got that far this is a simple "my dog bit someone" or "I left the handbrake off and my car rolled away and hit something" situation.

          Personally, I see this situation as a simple one of not defining tasks properly for a machine, with results that are obvious in hindsight.

          If you set a machine to do a task and it fucks up due to poor task
      • No, this may depend on the jurisdiction, but buying, acquiring, or even owning a counterfeit good may be an offence you are liable for, just like doing the same with stolen goods.

      • by prelelat ( 201821 ) on Monday January 05, 2015 @06:05PM (#48741123)

        I think you are overcomplicating this. Use the analogy of a bridge. The bridge is designed to allow people to pass over it, and if it's designed properly, it will. If the bridge collapses, it becomes a question of whether it was built properly, whether it was checked beforehand to see if it would fail, and whether it was built according to the original plan. If there is a design flaw, someone (typically the engineer who stamped the documents) can be held liable.

        I think the same could be said here. If the design was flawed in that it bought illegal things when it was supposed to be searching, say, Amazon, that's one thing. Here I would question its design, because it was designed to buy things from where it did. In the design there was a decent chance it could hit illegal materials, and it did. If they didn't build anything to compensate for that (a minimal sketch of such a filter follows below), it's on them. It's like asking: is it really assault if you close your eyes, start swinging at the air, and just happen to hit people?

        Also, where's the originality? http://xkcd.com/576/ [xkcd.com]
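        For what "building something to compensate" might look like in practice, here is a minimal sketch assuming a crude keyword blacklist checked before checkout. The keywords and the listing shape are invented for illustration, and a filter this naive would obviously miss code-named listings.

        # Crude pre-purchase filter: skip listings whose title or category
        # matches a keyword blacklist. Keywords and data shape are illustrative.
        BLACKLIST = {"passport", "id card", "mdma", "ecstasy", "cocaine", "weapon"}

        def looks_legal(title, category):
            text = (title + " " + category).lower()
            return not any(word in text for word in BLACKLIST)

        def filtered(listings):
            # Yield only the listings that pass the naive blacklist check.
            for title, category, price in listings:
                if looks_legal(title, category):
                    yield (title, category, price)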

        • I think you're over-simplifying here. Just because a machine has a design flaw does not elevate any accident it has to a crime, or an intentional act.

          If you close your eyes and throw a punch, your intent is still to throw a punch that you know might land. You haven't changed intent at all. But if you build an experimental personal conveyance device, and it breaks, and you fall off, and your hand strikes me in the face, you're responsible for any damages, but it's probably not assault. It may be some sort of re

          • by gnoshi ( 314933 )

            Having your personal conveyance device break is abnormal or unexpected function. The bot buying (at random) whatever was available was the intended function.

      • If I father a child (creator) and raise it to be... less than respectful of the law... my child then robs a bank. Do they put *me* in jail? By your definition they should...

        While scripts aren't children, in any event parents are often held responsible for the actions of their children.

        Check out this article [lawyers.com].

        • Parents are civilly responsible for whatever their kids do, potentially even including signing contracts, but in modern countries they are not criminally responsible for their children's acts. Though permitting some acts may be an additional crime.

          If you read your link, it describes laws where permitting some acts is an additional violation or crime, and then civil liability. It never claims a parent has criminal liability for the laws the child breaks.

      • by ceoyoyo ( 59147 )

        A computer program is a tool. If I toss a hammer off a scaffold and it hits someone, I'm responsible. I can't just say "the hammer did it, not me." The crime or non-crime I'm responsible for may vary depending on the circumstances: whether I threw the hammer on purpose or accidentally kicked it off, whether I took sufficient safety precautions, etc.

        These artists are clearly responsible for whatever their program did, and it purchased illegal drugs from a website. If Switzerland doesn't have a legal distinct

        • ... and if you accidentally drop the hammer, then you did "it" but "it" was only an accident, and you likely only have civil liability, depending on the details of the accident, your training, your expected level of training, etc.

          • by ceoyoyo ( 59147 )

            Perhaps in the US. In Canada, if you didn't take appropriate precautions you could be found guilty of things like criminal negligence or involuntary manslaughter, depending on what happened. I don't know about Switzerland.

            From what I've heard of US drug laws, I suspect if you ordered drugs in the US and your defense was "my computer did it!" you'd be convicted of a criminal offense.

            • Yeah, because if those words are your "defense" it means you don't have a lawyer. Once you translate the real scenario where the person's computer really did do it into legal terms, then I'm sure it will sound a lot more believable. ;)

      • by vux984 ( 928602 )

        The programmers just set the thing up to 'buy whatever'.

        Knowing full well that 'whatever' was a range of legal and illegal goods.

        If I father a child (creator) and raise it to be... less than respectful of the law... my child then robs a bank. Do they put *me* in jail? By your definition they should...

        When the child turns the age of majority he is elevated to independently being responsible for his actions regardless of how well or poorly you raised him. Perhaps you truly should be in jail, but it will be hi

      • 'Reckless' is going to be harder to prove than 'knowingly' here. It is clearly not reckless to purchase a random item in a store. For that to be the case, you have to prove that the store itself was of such disrepute that the customer should expect it to sell illegal items. But if you prove that, you've proven actual intent already, since we're already stipulating that the contraband purchase was made by the robot, that the store was selected by the human, and that the robot was under the human's control. They

      • If I father a child (creator) and raise it to be... less than respectful of the law... my child then robs a bank. Do they put *me* in jail? By your definition they should...

        Yes, if your child is a minor and under your care, then in most parts of the world you will be held largely responsible for the crimes they commit. And it is especially common for parents to be fully responsible for civil penalties (that is, when your kid does something terrible that causes them to be sued for damages).

        When they are an adult, they are now responsible.

        Of course, software is neither a child nor an adult. If you wanted to make an analogy, perhaps the legal precedent around dogs would be more app

      • by TWX ( 665546 )

        If I father a child (creator) and raise it to be... less than respectful of the law... my child then robs a bank. Do they put *me* in jail? By your definition they should...

        Honestly, while I definitely have a problem with sins-of-the-father punishment, where progeny are punished solely because they're the offspring of someone who did something wrong, I don't really have a problem with the idea that a parent, in some specific circumstances, could be held accountable for something their offspring did.

      • >The creator of a device that breaks the law because the creator either negligently or intentionally set up the device to break the law is responsible

        If I father a child (creator) and raise it to be... less than respectful of the law... my child then robs a bank. Do they put *me* in jail? By your definition they should...

        Come on, a child is a device now? Maybe if the authors had been using some kind of artificial intelligence for their bot, but from the article it sounds like they deliberately picked a grey marketplace just so they could generate some kind of contrived moral dilemma, or some social media publicity for themselves.

        In any case, $100 isn't much. Imagine if the budget had been $10,000; that could easily get you a freshly cut human head.

    • by Anonymous Coward

      When a Wall Street program loses money for the owners, they eat it.

      If I fuck up and code a program that goes out and buys or trades and buys illegal shit, then it's my fault for being stupid.

      Or let's put it this way, I code a program that looks for and downloads kiddie porn. Cops nab me and I just say, "Oopsie. The robot did it, not me!" So, I should get off...I mean let go?

      • When a Wall Street program loses money for the owners, they eat it.

        Not always...

        http://money.cnn.com/2010/05/0... [cnn.com]
        http://www.ft.com/cms/s/0/a040... [ft.com]

        Not saying this is common; Knight provides a good example in the other direction, and I honestly don't care enough about the markets to know of anything that didn't make national news. It just seems it depends on the situation.

        If I fuck up and code a program that goes out and buys or trades and buys illegal shit, then it's my fault for being stupid.

        Legally, of course, this depends on the jurisdiction; IANAL. Morally, I believe this is a very grey area and depends primarily on intent. Obviously it's sort of hard to judge intent in most cases, though i

      • The reason you're not understanding the legalities is that you're conflating results with intent. Using an intermediary to achieve the same result does not change liability, and that has nothing to do with Wall Street. Having innocent intent and accidentally getting the wrong result is already not illegal.

        If you program your robot to download pictures of kittens, and a few child porn pictures get downloaded too, your liability rests mostly on what you did in the first moments after you discovered the

    • Oblig XKCD (Score:5, Funny)

      by khasim ( 1285 ) <brandioch.conner@gmail.com> on Monday January 05, 2015 @05:25PM (#48740845)
    • that's why things like limited liability corporations exist, to attempt to shield the owners from being personally liable, but the act itself is still criminal

      Why do people keep trotting this out? It's wrong. An LLC (and other forms of business entity) doesn't shield you personally from the consequences of criminal acts.

      • No, but it certainly goes a long way. Did anyone go to jail when BP negligently cut corners on a well and Deepwater Horizon started spewing oil with no shutoff valve? The company even pled guilty to eleven counts of manslaughter, but all it got was a fine (though quite a large one). It appears that in this case, the corporate entity took all the consequences and left the individuals who actually made the decisions shielded from anything worse than a drop in their annual bonus.

        • It wasn't that a person was found to have committed manslaughter in the company's name and was then shielded. There was no single person whose actions were considered manslaughter. The actions of the company in total were accused of amounting to manslaughter, and the company admitted to that.

          Without that minimal collective liability, there just would have been nowhere to hang it at all, and no entity to even fine.

    • The creators of Google knew they were designing something to make its own decisions (about what to copy) without programming any real concept of legality in the process, and setting it to operate in an environment which is known to have served to facilitate criminal activity.

      Does that set the perspective any better?

    • by mjwx ( 966435 )

      The creator of a device that breaks the law because the creator either negligently or intentionally set up the device to break the law is responsible, as that creator set the conditions for the operation of the device.

      It's not necessarily the creator, rather the operator who uses it for illegal purposes. A general purpose script or bot that is re-purposed for illegal means does not make the creator liable.

      It's like suing Toyota because someone used a Hilux in a ram raid.

      • It's like suing Toyota because Toyota used a Hilux in a ram raid.

        FTFY

        In this case the "artist" was the creator and operator, so yes they should be liable. They created the script to buy items in an area where it's possible to purchase illegal goods. They then proceeded to run it without verifying whether each item should actually be paid for (probably because it wouldn't be art if they did, or something).

        If they had created the script, and somebody else (the operator) had decided to run it, then [like you said], the operator should be liable, not the creator.

    • In many countries there is a common-law presumption that items for sale in shops are legal, and the fault for the purchase lies with the proprietor of the shop. The fault for possession after purchase might lie with the purchaser, but in the case of an automated robotic purchase, the question would be whether the robot's owner took possession of the items once they became aware of the nature of the purchase, or whether they disposed of them immediately.

      A lot of people seem... mentally "allergic" to the idea of waitin

    • by u38cg ( 607297 )
      No, you're just saying words without knowing anything about the law or how it is interpreted.

      Cases like this turn quite precisely on the law as written and the practice of the courts in interpreting it. It's not a degree-of-responsibility thing; it's a 'have you broken the letter of the law' thing, and a 'will the courts just shit on you anyway' thing.

  • by Irate Engineer ( 2814313 ) on Monday January 05, 2015 @05:03PM (#48740619)
    I got this mental picture of a robot wearing jeans, high heels, clutching a Louis Vuitton handbag and a credit card strutting by and saying "I'm Shoppppppinnng!!!!" Stupid brain.
  • by Anonymous Coward

    I welcome our new semi-autonomous shopping bot overlord to our country.

  • First, there are different levels of responsibility: financial and legal. Most of the time they are the same, but not always. Some laws take intent into consideration (the obvious example is murder in the first degree vs. manslaughter/negligent homicide). But there are some rather simple solutions:

    1) If it was unmodified software sold to someone, and you were using it in the approved fashion, then the corporation that sold it is responsible. If you modified it or ignored instructions that relate to

    • So I basically agree: the writer is responsible. That said, in this PARTICULAR case, I think it's an excellent example of when prosecutorial discretion should apply, as they notified the police immediately, etc.
  • I assure you that the Louis Vuitton handbag was a knock-off.

    • by Anonymous Coward

      I assure you that the Louis Vuitton handbag was a knock-off.

      How can you tell? Is it because the stitching in the corners and zipper don't come loose on the fake?

  • Simple answer is: (Score:5, Insightful)

    by bobbied ( 2522392 ) on Monday January 05, 2015 @05:31PM (#48740877)

    You...

    If YOUR bot buys something, it does so on your behalf, so YOU are responsible. The question here really is: can your lack of understanding of what the bot was going to do provide a defense if your program buys something illegal? I'm guessing the answer to THAT question is "NO", but this is a question for the courts to decide.

  • Likewise... (Score:2, Insightful)

    by Richy_T ( 111409 )

    Who's responsible when I point my car, traveling at speed, at a bunch of pedestrians and jump out? There's just no way to know.

  • If it were autonomous, you'd be free and clear. But this "semi-autonomous" bit leads me to believe you were semi-involved.
  • Since this is playing in darknet waters, illegality is only to be expected. However, I have been playing with the idea of an app that buys random $1 stuff off eBay on either a daily, weekly, or ad-hoc basis (hey, that's less than people spend on cable). I hadn't really considered any issues with legality. The main thing stopping me was that I wasn't sure people would feel comfortable entering their PayPal details.
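    The scheduling half of that idea is the easy part; the sketch below assumes hypothetical search_dollar_listings and buy placeholders rather than real eBay or PayPal calls, which are the actual sticking points.

    # Sketch of a "random $1 purchase" scheduler; the marketplace and payment
    # calls are hypothetical placeholders, not real eBay or PayPal APIs.
    import random
    import sched
    import time

    def search_dollar_listings():
        # Placeholder: return listings priced at $1 or less.
        raise NotImplementedError

    def buy(listing):
        # Placeholder: place the order with a pre-authorized payment method.
        raise NotImplementedError

    def buy_one_random(scheduler, interval_seconds):
        listings = search_dollar_listings()
        if listings:
            buy(random.choice(listings))
        # Re-schedule the next purchase (daily, weekly, or ad hoc).
        scheduler.enter(interval_seconds, 1, buy_one_random,
                        (scheduler, interval_seconds))

    s = sched.scheduler(time.time, time.sleep)
    s.enter(0, 1, buy_one_random, (s, 24 * 60 * 60))  # daily, as one option
    # s.run()  # would start the loop once the placeholders are filled in

    The hard parts are exactly the ones named in the comment above: marketplace access and whether users will trust the app with their payment details.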

  • They are legally responsible. If they didn't want to have legal problems, they should have pointed it at Amazon.com or Walmart. Just because they are "artists" doesn't make it art, and it doesn't absolve them of legal responsibility. Maybe they were too stupid to anticipate illegal purchases. They are still responsible.

    If they had made a gun that randomly shoots moving objects in front of it expecting to shoot birds and squirrels, but it ends up shooting people, would they be legally responsible? Is it art

    • by Qzukk ( 229616 )

      If they didn't want to have legal problems they should have pointed it at Amazon.com or Walmart

      So they point it at amazon.com and it buys some drugs from some seller that managed to get them listed, using a code name that the bot just happens to hit at random. What now? (Actually, since magic mushroom spores are legal in large portions of the US I'm a little curious if they can be found on amazon.com, but not curious enough to actually search for them and screw up the already terrible suggestions)

      If they

  • Not a straw man, honest... just highlighting what should be obvious responsibility.

    1. You get a gun for your "art project"...

    2. You program a robot to randomly fire 100 bullets per week in random directions.

    3. You deploy your robot in an area KNOWN to contain humans.

    4. Inevitably... a human is eventually killed given enough time.

    Q: Are you responsible for being a fucking moron?

    A: yes

    • I think the question really is whether you can claim a viable defense against a first-degree murder charge by saying you didn't intend to kill anybody. Surely you are guilty of manslaughter, or perhaps something more, but the question here is how much they can convict you of.

      Being Stupid? Surely.

      Manslaughter (killing w/o intent)? Most likely.

      ....

      Premeditated murder in the first degree? Unlikely

      • by tomxor ( 2379126 )
        Yes, but within reason... how stupid are you required to be not to see the blatantly obvious? This isn't something easily quantifiable; it's more likely something a judge will have to defer to "common sense" on.
      • I think the question really is can you claim a viable defense from a first degree murder charge by saying you didn't intend to kill anybody.

        I would imagine you'd find yourself dealing with second-degree murder: "a killing that results from an act that demonstrates the perpetrator's depraved indifference to human life". That's what they get you for if you fire into a crowd without intending that anybody actually die, but someone does.

  • If my browser sends an order to buy drugs, based on me clicking things like "Submit Order", then I used my computer and browser to make the order. Clearly I'm responsible. Whether I place the order using cash, a telephone, or a browser, the person placing it made the purchase.

    If my bot infects your computer, based on me typing code like:
    for ip in $network; do
          try_to_infect "$ip"
    done

    I used a Word macro to infect you. Clearly, I am responsible. It doesn't matter if I use a Word macro, a boot-sector virus, or a hammer to destroy your computer: I did it; the hammer or macro is just the tool I used.

    If I use my computer to submit an order for illegal drugs by typing:

    while true; do
        buy_random_item piratebay
    done

    Then, once again, I bought drugs using a program I wrote as the tool. I'd be the one who chose to order random stuff from someone selling illegal stuff. The bot I wrote is just the tool I used to place the order.

  • by rossdee ( 243626 ) on Monday January 05, 2015 @06:58PM (#48741503)

    My hovercraft is full of eels

  • So next they take a pipe bomb and place it in a bathroom stall behind the toilet, with a fuse that will detonate at some random point in the next 24 hours. Maybe there's someone in the stall when it goes off, maybe there's not. How can they be held responsible?

  • by TarPitt ( 217247 ) on Monday January 05, 2015 @10:09PM (#48742755)

    Then one night shows up at your back door with your neighbor's heroin stash.

    Did you break the law or did the cat?

    Is the cat effectively your bot?

    And can Schrodinger's cat pass the Turing test?

  • If these people had created anything resembling an artificial mind with free will, there might be a question here. But they haven't. All they have created is a mechanical device that randomly pushes buttons; the creators of mechanical devices are responsible for what their creations do.

  • Why is this even a question? The people running the bot are responsible. If a carpenter nails my hand to the door with his nail gun, is the nail gun responsible?

  • Not sure if anyone agrees, but it seems simple enough to me.

  • If I programmed my own self-driving car to, for instance, not bother detecting pedestrians (they shouldn't be on the road!) and went off for a drive, knocking over a kid or two, I don't think any jury would buy the defense that it wasn't my fault because the people were illegally in the roadway.
  • At least the bot didn't pay a hitman to have random people killed.
