Software

How Responsible Are App Developers For Decisions Their Users Make?

itwbennett writes: In a blog post, Rado Kotorov, Chief Innovation Officer at Information Builders asserts that the creators of enterprise apps implicitly assume some of the responsibility for other people's decision making. He says it's not just developers, but anyone who is involved, from defining the concept, to requirements gathering, to final implementation. Thus, the creators of the app have an ethical obligation to ensure that people can reach the right conclusions from the facts and the way they are presented in the app.
  • The creators of the app have an ethical obligation to ensure that people reach the same conclusions as the developers.

    • by hsmith ( 818216 ) on Wednesday May 13, 2015 @09:40AM (#49681687)
      Good luck with that. They keep building better idiots.
      • by cayenne8 ( 626475 ) on Wednesday May 13, 2015 @11:42AM (#49682785) Homepage Journal

        Good luck with that. They keep building better idiots.

        Yep, to this day, I'm still amazed that we have to have warning tags on hair blow dryers that not only say it in print, but also include cartoon-like diagrams warning you NOT to use the blow dryer while in a bathtub filled with water.

        Seriously, I wonder if the depth of our litigious society has started to interfere with natural selection, by keeping idiot genes in the pool when they should have weeded themselves out years ago.

        • You do realize it doesn't work that way.
        • Yep, to this day, I'm still amazed that we have to have warning tags on hair blow dryers that not only say it in print, but also include cartoon-like diagrams warning you NOT to use the blow dryer while in a bathtub filled with water.

          I'm not amazed at all. People who put their pets into the microwave to dry them will complain to the microwave manufacturer. People who use a hair dryer in the bathtub filled with water don't complain afterwards.

      • Who? Rado Kotorov is just trolling; who cares what he says? It's not like he's a judge or senator who can make a difference. So tired of "OMG, look, this guy who doesn't matter said this thing on a blog, we should all care!"
      • That's the dilemma, really.

        The developers don't try to make the right choice for the _user_; they try to make the user choose the right choice for the _company_.

        Like Windows 8/8.1: it tries to make the choice for you to create a Microsoft account to use the computer. You can create a local account, but the UI is designed so that you won't even consider it an option unless you already know it's possible.

        Also consider tickboxes for optional things. Are they already ticked for you or not? Most ofte
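
        A minimal sketch in Python of the tickbox point (the checkbox and labels here are invented for illustration, not taken from any real installer): whichever state the box ships in is the decision most users effectively make, because most users never touch the default.

        from dataclasses import dataclass

        @dataclass
        class Checkbox:
            label: str
            checked: bool  # the shipped default is the real decision for most users

        # Pre-ticked: favors the company; users who click straight through are subscribed.
        opt_out = Checkbox("Send me marketing email", checked=True)
        # Unticked: favors the user; only a deliberate click subscribes.
        opt_in = Checkbox("Send me marketing email", checked=False)

        for box in (opt_out, opt_in):
            # Simulate a user who clicks "Next" without reading.
            print(box.label, "->", "subscribed" if box.checked else "not subscribed")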

    • by Jamu ( 852752 ) on Wednesday May 13, 2015 @10:06AM (#49681913)
      Users are responsible for their own decisions. If developers have an ethical obligation, then it's to inform and train users, so that they can make better decisions.
      • by Anonymous Coward

        Users are responsible for their own decisions. If developers have an ethical obligation, then it's to inform and train users, so that they can make better decisions.

        It's just the ongoing effort to destroy and bury the concept of personal responsibility. That's all. The rest is just window dressing.

        It's amazing how much happier and more satisfied someone is when they're not perpetually someone else's victim, when victimhood status is reserved for those who suffer through no fault of their own.

      • Wait... developers inform and train users??? Are you fucking kidding me? You can't even get them to write coherent documentation. Inform and train users, my ass.
    • Or, alternatively, the author of the article has a similar ethical obligation that the readers reach the same conclusion as the author.

  • I can sue Ford if I get run down by one?

    • Re:So then (Score:5, Insightful)

      by Anonymous Coward on Wednesday May 13, 2015 @09:54AM (#49681821)

      No, but if Ford put a turn signal in a non-standard place or labeled the signal 'apparatus for signifying intention to create a curve' it would definitely confuse and possibly lead to accidents. Or if the factory tires weren't tested and fell off after 1 day, then your analogy would work.

      • No, but if Ford put a turn signal in a non-standard place or labeled the signal 'apparatus for signifying intention to create a curve' it would definitely confuse and possibly lead to accidents.

        Well, yes, that is assuming previous experience with a 'standard' car. A neophyte wouldn't know the difference. And besides, I don't believe we are talking about 'accidents' here. This is somebody doing something intentional, and somebody out there is trying to blame the tools and their makers just because they facil

        • Well, yes, that is assuming previous experience with a 'standard' car. A neophyte wouldn't know the difference.

          In most civilized countries you have to take a driver's education class in a normal car before you can get your driver's license, so these people you speak of won't have driver's licenses.

            • Key word: normal car. They are reasonably standardized. If that is all that is being asked of the developers, fine. But if the user does something 'bad' with the program, it is not the developer's fault. The original question is: how responsible are app developers for the decisions the user makes? (Easy, short answer: 0, zilch, not at all.) It is not asking whether the apps fail, crash your computer, or erase your drive by accident.

            • Have you noticed that, in any Slashdot article that mentions C++, there will be several arguments (not all well-informed) about whether the language is instrumental in causing bugs, or whether they can be avoided by proper use. Do you tend towards the camp that says that C++ can be used safely?

  • by royallthefourth ( 1564389 ) <royallthefourth@gmail.com> on Wednesday May 13, 2015 @09:32AM (#49681621)

    In the real world "an ethical obligation" is no obligation at all. Nice circlejerk of an article, though.

    • In the real world "an ethical obligation" is no obligation at all.

      That is not necessarily true. If you sell a product to the public, and it harms someone, you can still be held responsible even if you followed all legal requirements, if a plaintiff can show that you failed to follow common best practices, or made a design decision that was different than what most conscientious engineers would do.

      • In the real world "an ethical obligation" is no obligation at all.

        That is not necessarily true. If you sell a product to the public, and it harms someone, you can still be held responsible even if you followed all legal requirements, if a plaintiff can show that you failed to follow common best practices, or made a design decision that was different than what most conscientious engineers would do.

        my guess is that he's a cop

    • So you're saying the developer isn't responsible for making sure the data presented to the user is "accurate" and "complete." Nice.
  • Usability 101 (Score:5, Insightful)

    by Anonymous Coward on Wednesday May 13, 2015 @09:33AM (#49681631)

    Jeesh. Why is this even a question?

    Anyone who has ever done any reading on usability knows that we need to craft the interface to the user.

    That usually means different interfaces for different cultures.

    For example, Japan and Germany have general populations that are far more used to multi-choice, complex UIs than the US and UK. They tend to prefer their UX to be a bit more technical than other cultures.

    Engineers tend to design for themselves; not for others.

    Read The Design of Everyday Things [jnd.org]. It's quite life-changing.

    • That must be why every Japanese videogame is on-rails and looks like every other Japanese videogame, because the Japanese are so used to choices.

      • by Anonymous Coward

        Who cares about videogames? I suspect they use templates.

        Try some of the apps they use for other stuff.

        There's some very neat, very geeky stuff out there that I (as an engineer) like, but most folks don't.

        However, it ain't just those cultures. That was just a strawman.

        Here's a very cool app [pairsite.com], designed by a brilliant chap [pairsite.com].

        It's a well-done, awesome app that most folks wouldn't even be able to begin to use.

        It's not designed for them. That's fine. However, there's lots of apps with similar complexity that are de

    • Engineers tend to design for themselves; not for others

      Some engineers design for Satan. Ever used SAP?

    • by WarJolt ( 990309 )

      Engineers tend to design for themselves; not for others.

      Engineers tend to design to the specification, which ideally should be designed by an HCI expert. They design for themselves because the specification is sorely lacking.

      • That's what I thought about the article too. It's not an issue if there are good requirements; unfortunately, we are usually trying to cobble together something that makes sense by channeling a customer's insane wishes.

  • by Anonymous Coward on Wednesday May 13, 2015 @09:38AM (#49681669)

    "In a blog post, Rado Kotorov, Chief Innovation Officer at Information Builders asserts that the creators of enterprise apps implicitly assume some of the responsibility for other people's decision making. He says it's not just developers, but anyone who is involved, from defining the concept, to requirements gathering, to final implementation. Thus, the creators of the app have an ethical obligation to ensure that people can reach the right conclusions from the facts and the way they are presented in the app."

    I call bullshit. This is simply another step down a slippery slope that removes more personal responsibility.

    This is the very definition of the nanny State.

    • by Shoten ( 260439 ) on Wednesday May 13, 2015 @11:01AM (#49682381)

      "In a blog post, Rado Kotorov, Chief Innovation Officer at Information Builders asserts that the creators of enterprise apps implicitly assume some of the responsibility for other people's decision making. He says it's not just developers, but anyone who is involved, from defining the concept, to requirements gathering, to final implementation. Thus, the creators of the app have an ethical obligation to ensure that people can reach the right conclusions from the facts and the way they are presented in the app."

      I call bullshit. This is simply another step down a slippery slope that removes more personal responsibility.

      This is the very definition of the nanny State.

      RTFA.

      If you look at the article, you'll see just how blatantly Slashdot has misled us with their summary of the article. The article isn't about "apps" or even just "enterprise apps." It's specifically and only about business intelligence (BI) applications, which are intended to lead their users to decisions and conclusions. What he's saying, fundamentally, is that "as the makers of business intelligence applications, we have a responsibility to actually not make apps that suck, since the conclusions our users will come to have major ramifications." I agree with him, in that context.

      Take it and apply it to a specific situation like cancer research, and the difference between meeting his ethical standard and failing it is the difference between saving lives or losing them. And this is actually a real example; recent cancer research has largely focused upon big-data mining and BI around specific characteristics of various forms of cancer, and matching up with an incredible degree of precision which combinations of treatments work best on certain kinds of cancer. They go so far as to actually examine the genome of tumors...it's fucking cool. This is the kind of use that a BI system can fulfill, if it works. But if it doesn't work, everyone can go down a bunch of rabbit holes and it takes years to figure out that they've been chasing the wrong approaches all along.

      • But if it doesn't work, everyone can go down a bunch of rabbit holes and it takes years to figure out that they've been chasing the wrong approaches all along.

        1. Come up with a thousand approaches to a problem
        2. Crowdsource incorrect approaches
        3. Simultaneously discover 999 things that don't work
        4. Success!
        • by Shoten ( 260439 )

          But if it doesn't work, everyone can go down a bunch of rabbit holes and it takes years to figure out that they've been chasing the wrong approaches all along.

          1. Come up with a thousand approaches to a problem
          2. Crowdsource incorrect approaches
          3. Simultaneously discover 999 things that don't work
          4. Success!

          1. Come up with a thousand approaches to a problem.
          2. Try all of them at once.
          3. Discover that you just broke your statistically-valid group that has the problem into a thousand groups so small that you can no longer detect the difference between success and failure for any one group.
          4. Also realize that you just doomed 99.9% of the total test population to failure...and in my example, these are cancer patients, so you also just got yourself barred from practicing medicine. What are you, Dr. Mengele?
          5. Fail!

      • If that's what he's saying then it doesn't need to be said, so why is he saying it?

        Coming up after the break, how inaccurate rulers mean that shit won't fit.

        • by Shoten ( 260439 )

          If that's what he's saying then it doesn't need to be said, so why is he saying it?

          Coming up after the break, how inaccurate rulers mean that shit won't fit.

          I asked myself that same question, but I think there's an answer. If you write a version of Angry Birds that sucks, then meh...some people waste a buck each on a crappy game, give it a bad review, and life goes on. If (as actually happened) you radically change the UI on a ubiquitous application *cough*Microsoft Word*cough* then it frustrates a lot of people and wastes a lot of time, but still not necessarily the end of the world. But BI apps drive decision making at a scale that boggles the mind. Thing

          • I get what you're saying. But is BI different to stuff that controls whizzy bangy things? Or is it really the general case where the bigger the consequences of it being wrong, the less likely to be wrong it should be; that's common sense, isn't it?

    • by Kjella ( 173770 ) on Wednesday May 13, 2015 @12:15PM (#49683095) Homepage

      I call bullshit. This is simply another step down a slippery slope that removes more personal responsibility. This is the very definition of the nanny State.

      Well, the article is just a fluff piece saying that how you build the interface affects the results and that this can have consequences. Which is actually not such an unreasonable thing to say, as long as you don't take the concept too far. For one concrete example I know of from a hospital system, the software said pretty much "Nothing more to register so closing healthcare contact" when it actually meant to say "Warning, hospital visit registered but no further patient follow-ups scheduled. If you proceed the patient's treatment will end and case will be closed."

      This was in production code, found in a review trying to work out how the hospital could "lose" patients. The message was technically correct, but it was also extremely misleading when the nurse had forgotten to register a follow-up. One seemingly harmless confirmation and the patient could end up not getting chemo for their cancer unless the doctor noticed the patient was missing or the patient followed up himself. So the developers of the system should absolutely take some responsibility for making sure the system makes it easy to do the right thing and very hard to do the wrong thing, not just technically correct.

      Another hotly debated topic is defaults, because people have a tendency to overuse them. The problem is when 99.9% of cases match the default but the 0.1% is actually important to register. Did you skip past the allergies question when the patient is actually hyperallergic to peanuts? Ouch. People are not machines; they hate doing things that are 99.9% unnecessary, even if you tell them that checking the box is their proof that they remembered to ask the patient, and a default won't provide that. Like security, completeness and correctness often come at a cost in usability. It all depends on what matters more.
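
      A rough Python sketch of both points (all names here are hypothetical, not the actual hospital system): the close-contact dialog states the consequence instead of the neutral "nothing more to register," and the allergy question has no default at all, so the 0.1% case cannot be skipped silently.

      from dataclasses import dataclass, field
      from typing import Optional

      @dataclass
      class Contact:
          patient: str
          follow_ups: list = field(default_factory=list)
          allergies_reviewed: Optional[bool] = None  # deliberately no default answer

      def confirm(message: str) -> bool:
          # Stand-in for a real dialog; prints the text and defaults to "no".
          print(message)
          return input("Proceed? [y/N] ").strip().lower() == "y"

      def close_contact(c: Contact) -> bool:
          if c.allergies_reviewed is None:
              raise ValueError("Allergy review must be answered explicitly; there is no default.")
          if not c.follow_ups:
              # State the consequence instead of "Nothing more to register".
              return confirm(f"Warning: {c.patient} has a registered visit but NO follow-up "
                             "scheduled. Proceeding ends treatment and closes the case. Close anyway?")
          return confirm(f"All follow-ups are scheduled for {c.patient}. Close this contact?")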

      • ... the system makes it easy to do the right thing and very hard to do the wrong thing, not just technically correct.

        This is an obvious thing to strive for, and something most developers aim for regardless of legal or ethical frameworks.
        But hindsight is 20/20; I bet that in your example it did not even occur to the developer that the message could be misleading. Any software is going to have stuff like that, and you just have to do usability testing from time to time to find these usability bugs (I would call this a bug, but you can call it whatever you like).

        There is a slippery slope in the "take some responsibili

    • by tomhath ( 637240 )

      Suppose your application says "20% of Americans can't find the United States on a map of the world."

      You really should help the user understand what that means:
      20% of Americans are too stupid to find the US on a map
      20% didn't speak English well enough to understand the question
      20% of the sample were infants, blind or otherwise had no chance of reading a map
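
      A small Python sketch of that idea (numbers and field names invented): render the headline figure together with the sample details a user would need in order to choose between those readings.

      from dataclasses import dataclass

      @dataclass
      class Statistic:
          claim: str
          numerator: int
          denominator: int
          excluded: str
          method: str

          def render(self) -> str:
              pct = 100 * self.numerator / self.denominator
              return (f"{self.claim}: {pct:.0f}% "
                      f"({self.numerator} of {self.denominator} respondents; "
                      f"excluded: {self.excluded}; method: {self.method})")

      print(Statistic(
          claim="Could not locate the United States on a world map",
          numerator=200, denominator=1000,
          excluded="children under 10, non-English speakers",
          method="in-person survey, unlabeled map",
      ).render())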

  • Information Builders is not getting their money's worth out of this idiot.

    Thus, the creators of the app have an ethical obligation to ensure that people can reach the right conclusions from the facts and the way they are presented in the app.

    Who decides what the right conclusion is? Why waste time and money creating an app if you already know the "right" conclusion? Just send it to all employees via a one-time email.

  • by Translation Error ( 1176675 ) on Wednesday May 13, 2015 @09:47AM (#49681759)
    From a philosophical or pure cause and effect approach, sure, the makers of an app have some responsibility for the effect it has on people and what they do as a result. From a legal/liability standpoint, generally not.

    The article writer is just saying to keep in mind that how your app behaves, how it looks, how it presents data, can have a real effect on its users, so you should consider the implications of your decisions.
    • Yet the only way I can imagine meeting that goal is to have a self-aware AI that can in real time (actually, it might have to be prescient) determine the user's thought process so that it can morph the UI into a state that will present the information to the user in a manner that will lead them to the right conclusion. We can now move on to a discussion of perception and truth.
      • Apparently software developers are stupid dolts incapable of actually doing their jobs, so we need AI systems to do their work for them.

        Who programs these AI systems again?

    • The more I think about the article, the less sense it makes.

      Data accuracy and completeness are of course important issues, but if you're going to add measures to improve data quality, that should be done when acquiring and processing the data, not when presenting it. So an app sounds like the wrong place to start worrying about the integrity of the data.

      Doing statistics properly is not easy: it's not an intuitive subject and it requires a good mathematical background, a lot of care and some common sense. He

      • If you think UI doesn't matter, you've been huffing glue. And the information presented to the user matters, in that you'd better damn well be sure you are presenting the correct information that the user requested, something that is within the control of the developer. The actual data, no, but the information requested and its presentation, yes.

        It's not rocket science, but it's clear people failed basic reading comprehension, focused on a single word, "ethical," and got their panties in a
  • If the description of the app were "this is buggy adware that crashes all the time and steals all your personal info and can barely fulfill its nominal function" then more people would be able to reach the right conclusions.

    • so you assert that the old adage "let the buyer beware" doesn't apply to software?

      You sound like the police captain in Casablanca: "I am shocked, shocked to see anything less than perfect, bug-free products."

      • The old adage "let the buyer beware" doesn't apply in lots of areas. Lying about a product in order to sell it is normally illegal. Mislabeling a product is normally illegal. There are often warranties, either legally required or used as selling points, and those have to be honored.

        The legal system doesn't care what you pay for something or what you buy, as long as you're a competent adult. It does care that you have the ability to know what you're buying.

  • They're the conclusions that *I* have come to!

  • by captnjohnny1618 ( 3954863 ) on Wednesday May 13, 2015 @09:55AM (#49681827)
    First off, I think what the author is claiming is bullshit. He's just reiterating stuff that we already have laws and protections for. We don't need a new bank of BS intellectual property laws, or CEO protection laws, to throw at developers the first time something doesn't go right.

    Now on to my main point:

    The summary makes it sound like the author was talking about ALL apps and app developers, however after reading the article, it's clear that he's referring to business analytics and applications that people would use to gather data and make business decisions. There is a little bit of language that makes it sound like he might secretly wish that it applied to all app developers, but that's not really the takeaway from the article.

    His claims are still completely moronic: if an app pretends to offer a service and then can't deliver, or provides data that leads to bad decisions, then (1) people will stop using it once this is discovered, and (2) we have consumer protection laws that, if it is found that the developers did this intentionally and deliberately misrepresented what they had to offer, would protect the people they screwed.

    This isn't a question of "apps" or "applications" or "data," this is an old idea that has been around for literally ages and someone wrote an article while masturbating to the words "big data," "analytics," and "apps."

    What scares me is that idiot politicians and business majors will see this and think "hey, yeah! I don't have to be responsible for bad business decisions in a new way!" Fucking idiots. How much lower can we go on the idiot totem pole?
    • by Qzukk ( 229616 )

      if an app pretends to offer a service and then can't deliver, or provides data that leads to bad decisions

      It's more than that. How about an app that offers the ability for a doctor to purge the record of a certain bad decision? How about a financial app that allows double bookkeeping, if someone were so inclined to hide their embezzlement? How about a default password of 12345? After all, it's the user's responsibility to fix it.

      There are plenty of ways to make apps that do the wrong thing, correctly.
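
      A minimal Python sketch of that last point (hypothetical device model, not any real product): refuse to operate until the factory default is replaced, rather than leaving the fix to the user.

      WEAK_DEFAULTS = {"12345", "password", "admin"}

      def activate(device: dict, new_password: str) -> None:
          # Block first use until the shipped default is replaced with something sane.
          if device["password"] in WEAK_DEFAULTS:
              if new_password in WEAK_DEFAULTS or len(new_password) < 12:
                  raise ValueError("Choose a new password (12+ characters, not a known default).")
              device["password"] = new_password
          device["active"] = True

      device = {"password": "12345", "active": False}
      activate(device, new_password="plum-otter-band-37")
      print(device["active"])  # True, and the weak default is gone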

  • App developers aren't some special magical people that have extraordinary powers to influence users.
    They have a moral responsibility no less and no more than that of a store clerk, a fashion model, a garbage collector, or pretty much anyone else.

    • And the author isn't saying that. He's saying that the developer should know what the fuck he is doing and not be some drooling moron who doesn't give a flying fig whether the information presented to the user is accurate and complete.
  • Visual effects (Score:5, Insightful)

    by Fortran IV ( 737299 ) on Wednesday May 13, 2015 @10:00AM (#49681877) Journal
    From TFA:

    The goal is absolute clarity and lack of ambiguity so that decisions can be made quickly. Visual effects can obscure the facts and misrepresent proportions and ratios, thus leading to incorrect conclusions.

    Of course! That's why every damn application I use these days has its own "skins" and its own custom layout. Using a standard, familiar window layout would allow me to actually get some work done without having to search for the menus and buttons. Can't have that, can we?
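
    A quick illustration of the quoted claim about visual effects, as a Python/matplotlib sketch (invented data): the same two values plotted twice, once with a truncated y-axis that exaggerates the gap and once zero-based.

    import matplotlib.pyplot as plt

    labels = ["Region A", "Region B"]
    values = [98, 100]

    fig, (misleading, honest) = plt.subplots(1, 2, figsize=(8, 3))

    misleading.bar(labels, values)
    misleading.set_ylim(97, 101)   # axis starts at 97: a ~2% gap looks enormous
    misleading.set_title("Truncated axis (misleading)")

    honest.bar(labels, values)
    honest.set_ylim(0, 110)        # zero-based axis keeps the proportions honest
    honest.set_title("Zero-based axis")

    plt.tight_layout()
    plt.show()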

    • Using a standard, familiar window layout would allow me to actually get some work done without having to search for

      Microsoft Office added PDF-generating functionality, which is good, BUT they put it under File -> Export instead of File -> Save As, which is where everybody goes looking for it. I've seen this trip up several users already.

      (If they had put it under both, that would be acceptable. Unnecessary redundancy is usually a smaller UI sin than misplacement.)

      WTF were you thinking, Microsoft? I'd reall
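
      A hedged sketch of the parenthetical above, in Python (a toy menu registry, not any real Office API): one export routine registered under both of the places users actually look, so misplacement versus redundancy stops being a trade-off.

      menu = {}

      def register(path, action):
          menu[path] = action

      def export_pdf(document):
          # Toy stand-in for the real export routine.
          return f"{document}.pdf"

      # The same callable is reachable from both discoverable paths.
      register("File/Export/Create PDF", export_pdf)
      register("File/Save As/PDF", export_pdf)

      assert menu["File/Export/Create PDF"] is menu["File/Save As/PDF"]
      print(menu["File/Save As/PDF"]("quarterly_report"))  # quarterly_report.pdf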

  • ... opinion.

    Just walk away everyone. Nothing to see here. (Taken from: http://www.informationbuilders... [informationbuilders.com])

    "Dr. Rado Kotorov is vice president of Product Marketing for Information Builders and works both with the Business Intelligence and the iWay product divisions to provide thought leadership, analyze market and technology trends, aid in the development of innovative product roadmaps, and create rich programs to drive adoption of BI, analytics, data integrity, and integration technologies. He strive
  • If you are a Decision Maker, you are the sole person with any ethical responsibility regarding the decisions you make. You, as a user, are the sole party responsible for ensuring the completeness and accuracy of the Data Sets provided and for ensuring that you understand how the software you use for your job works. Anyone who does not act on their own to proof their Data Sets and to completely understand how their Software works is acting negligently and is the sole party responsible for any issues that arise. I
    • The decision maker has responsibility for the decision, and this includes selecting people and software to gather information and project the future. This doesn't mean that nobody else has responsibility. The decision maker isn't going to proof the data sets, audit the reports, and debug the software, but has to trust other people to do their jobs. Everybody who has input into the process has responsibilities involving their own job.

  • This is another way of saying that "everyone's responsible" (and therefore no-one's responsible.)

    Ensuring that a tool (app) suits a business process (and vice versa) can be a non-trivial task, but it is one that the business itself is ultimately responsible for.

  • ... but put in a EULA that you are not.

    I think application developers should try to design things as if they are driving the final user's decisions, and in their own minds should feel responsible for bad decisions. I have seen way too many apps slapped together by code monkeys who fail to understand the importance of clarity (such as units and legends on a graph: "Put in a feature request, and we'll see if we can get it into the next sprint; it is not a critical bug."), or designs

    • People get what they pay for: if you pay crappy developers to write crappy code... if you sign a development contract that doesn't stipulate a working solution...

      If users are incapable of communicating their desires to developers, then they can't really complain about what they get.

      The onus of responsibility for a quality product lies with the consumer. There will always be snake oil salesmen selling snake oil, and it will always be up to the consumer to figure that out.

  • Can I then say that an app led me to make bad decisions and I am no longer responsible?

    I'm sorry that I drank 2 bottles of whiskey and ran over that family of 4, judge. That free app I downloaded said it would be ok.

    • The bad decision was deciding to use the app in the first place; everything after that is ripple effect.

  • Just another reason to GPL everything you create.
  • Nobody can just own anything any more, can they? Nor can they accept that we live in an imperfect world where mistakes happen.

    An app developer should do their best to provide users with concise, but complete, accurate, and timely information, to the extent the technology allows. Perhaps developers/vendors have some responsibility to set realistic expectations about the quality of the information, but that is as far as it can possibly go.

    Beyond that people/users just have to make decisions and bear the responsibility.

  • This is imaginary thoughtcrime again. Good luck prosecuting (sorry, "holding responsible") someone who's been long dead.

    The culprit obtained, kept, and operated the tool, device, machination, etc. that's pertinent. MAYBE the operator caused the events unknowingly, in cases where they were unaware of a strange or faulty capacity (the latter is pre-t-t-y hard to predict the first time), but that affects the measure of intent, which some areas of law are rational enough to outright forgive.

    What it doesn'
  • OK, IANAL... However, we have all heard about:

    A) The definition of insanity is performing the exact same action multiple times but expecting different results, and
    B) The courts hold that if proven insane, they are not legally responsible for their actions.

    I manage several enterprise applications. I have personally seen users incorrectly do something that makes a mess of the data, then attempt to fix it using the exact same method, then do it again, and again, and again, etc...

    Clearly if the user c

    • I have personally seen users incorrectly do something that makes a mess of the data, then attempt to fix it using the exact same method, then do it again, and again, and again, etc...

      So when the instructions don't work, should the employees strike out on their own and start inventing things to do?

      If the server is down and the user tries five times to enter the data, is it insanity when they try the same thing a sixth time once the server is back up?

  • No, in the general case you're not responsible for making sure your users make the right decisions. Imagine doing that for a dating app. Should you date this person? How should I know? All I can do is present you with information.

    The article, though, is about software that specifically exists to help businesses make better decisions. So yeah, if you're writing software that's supposed to help people make better decisions, you do have some ethical duty to write software that leads people to make better

  • 1.How responsible is Craftsman if someone uses a hammer to murder someone?
    2.How responsible is Smith & Wesson?
    3.How responsible is the tool creator when a tool is used by someone for a purpose that it wasn't designed for?

    Now, I know people will come in on different sides for all of those questions. My own $.02 is that the tool creator would only be responsible for the tool failing...exploding Pintos for example. I'm guessing that many of you aren't old enough to remember Ford's issue with that model t

    • I doubt that there has ever been a single case in recorded history where a customer has been able to actually write a proper specification for the software that they want.

      So therefore there has never been a single case in recorded history where software developers have known what they needed to know to do their jobs. How can you POSSIBLY expect them to do the job right?

      • I doubt that there has ever been a single case in recorded history where a customer has been able to actually write a proper specification for the software that they want.

        So therefore there has never been a single case in recorded history where software developers have known what they needed to know to do their jobs. How can you POSSIBLY expect them to do the job right?

        In a reasonable world, you pick the right person to do the right job. Gathering requirements and turning them into a proper specification takes a lot of talent, practice, and hard work. Your customer is very unlikely to have the talent or the practice. It's the job of the company producing the software to supply the person who can do it.

        The job of the software developer is to recognise rubbish specs, push back when the specs are rubbish, and otherwise implement the spec. It's not that difficult.

      • Some users know enough to write a proper specification, particularly in conceptually simple fields like accounting. My wife does business software, and always liked working with Accounting.

  • I don't write apps, so perhaps I'm missing something, but it seems patently unfair to hold someone writing an app responsible for the moral choices made by someone using that tool?

    I mean, are we going to hold hammer-makers responsible if someone murders someone else with a hammer?
    I even think it's ridiculous to hold gun manufacturers responsible for the misuse of their products.

    • It seems patently unfair to hold someone writing an app responsible for the moral choices made by someone using that tool?

      If the patient monitoring equipment gives faulty information to the doctor, it's the doctor's fault for trusting the equipment? The doctor makes moral choices depending on what the equipment tells him.

      • That's hardly a moral choice, and that's simply an error. If a developer programs an app and there's a MISTAKE in it, then of course he/she's liable.

        If a medical device tells the doctor that a patient's heart has stopped, accurately, the doctor has to make a moral choice about whether to restart it or not. Delivering that news to the doctor accurately CANNOT be implicated logically if the doctor decides then to kill the patient.

        Nevertheless: if you expect to live in a world that works 100% of the time, I

  • Have not RTFA yet, but this sounds a lot like Matt Gemmell's talk on "Making Mistakes Impossible", which is really good and which you can view here: https://vimeo.com/84322659 [vimeo.com]

    -- Nathan

  • Next question?
