Taking On Software Liability - Again

An anonymous reader writes "You may remember an article in which a BBC correspondent criticised current software licenses. In answer to the huge discussion that this brought about, he has written another article defending his views. From the article: 'It is possible to make error-free code, or at least to get a lot closer to it than we do at the moment, but it takes time and effort. Doing it will probably mean that commercially-available code is more expensive and cause major problems for free and open source software developers. But I still believe that the current situation is unsustainable, and that we should be working harder to improve the quality of the code out there.'"
  • yeah (Score:2, Informative)

    by jomynow ( 552972 )
    not gonna happen, it's like asymptotic or something. you keep spending money developing and finding bugs and keep going, yet you get less return out of it.
  • by BoomerSooner ( 308737 ) on Sunday October 09, 2005 @08:41PM (#13753441) Homepage Journal
    I've got an idea for non-software developers with great ideas: you program some piece of software for 5 years and then warrant it against any bugs or failures. Oh, btw, it must be priced competitively with current offerings. This guy can go wank himself in a corner somewhere. Perfect software doesn't exist. If you want something done right, your best bet is to do it internally in your company instead of outsourcing. Walmart is a perfect example. Do it right with people who feel they have ownership in the software they are creating and you'll get a better product. Plus, Arkansas (and my state too) is like Bangladesh anyway in the wages paid to software developers.
    • Bullshit (Score:4, Insightful)

      by EmbeddedJanitor ( 597831 ) on Sunday October 09, 2005 @08:49PM (#13753477)
      You have this attitude because you're a programmer. If civil engineers said "so what, bridges fall down" everyone would be up in arms.

      Bug free software is possible, so long as it is done right and people are prepared to pay for it. Right now, software is mainly "good enough" and "cheap enough". What is "good enough" and what is "cheap enough" will depend on what is being done.

      • Re:Bullshit (Score:2, Insightful)

        by Anonymous Coward
        You have this attitude because you're a programmer. If civil engineers said "so what, bridges fall down" everyone would be up in arms.

        If a bridge falls, people die.

        If an order entry system fails, it gets rebooted/patched/datafixed and it's back within minutes/hours, good as new. Some time is lost, but no lives.

        For software that's life-critical, the quality bar is set much, much higher.

        Having non-programmers tell programmers that they expect all software to be as reliable as a bridge is ridiculous, particularly since they don't appreciate the cost of what they're asking for. Those programmers silly enough to try and meet those requirements will quickly find themselves out of business when they first ask for $300 million to develop an order entry system.
        • Re:Bullshit (Score:4, Insightful)

          by Anonymous Coward on Sunday October 09, 2005 @10:23PM (#13753865)
          If a bridge falls, people die.

          If an order entry system fails, it gets rebooted/patched/datafixed and it's back within minutes/hours, good as new. Some time is lost, but no lives.


          Okay, forget bridges. Think appliances.
          I heard about a case against Hamilton-Beach because a nut was falling off their blenders. To paraphrase you, "spin the nut back on, it's back within seconds/minutes". People don't take that kind of crap from things they understand, so why should they take it from software simply because they don't understand it?

          For software that's life-critical, the quality bar is set much, much higher.

          One would hope so, but where are the programmers and managers going to learn to work that way when the other 99% of software is made shit-poorly? I heard about a $20,000 accounting package that was done in VB. I have nothing in particular against VB, but it's not an appropriate tool for a large, serious, mission-critical system like that. Yet they get away with it because nobody holds them accountable.

          Having non-programmers tell programmers that they expect all software to be as reliable as a bridge is ridiculous, particularly since they don't appreciate the cost of what they're asking for. Those programmers silly enough to try and meet those requirements will quickly find themselves out of business when they first ask for $300 million to develop an order entry system.

          How about programmers doing it?
          Not all software needs to be as reliable as a bridge. Mission-critical or life-safety software does. Software sold in high volume should be reliable, because the cost can be amortized, and small defects that cost each user only a minute or two are multiplied by millions of users into big problems. That's what class actions are all about. Simple stuff like an order entry system should be done simply, and therefore not have problems.

          If I buy a product that doesn't work, or that has obvious defects, I have a right as a consumer to compensation from the company that sold a shoddy product. That's part of how we keep companies from knowingly selling crap and pretending it's good. Now, the libertarian view is that if a company is selling crap then the consumers will stop buying from it, but when the whole industry is selling crap and the average consumer doesn't understand the situation well enough to recognize that, what is a consumer to do?

          Analogy: picture the auto industry in the 70s. American cars weren't terrible, but the quality control was bad enough that the cars were totally inconsistent. The big three would tell you that making defect-free cars would raise prices to the point that nobody could afford a car. People accepted this, because they didn't know better. Then the Japanese showed up. They delivered cars that, while not perfect, blew away the big three in terms of quality, and at very reasonable prices. It can be done.

          will quickly find themselves out of business when they first ask for $300 million to develop an order entry system.

          Now, at the risk of being a Slashbot(tm), I can think of a major software company which has historically been known for low-quality, high-volume consumer software. I seem to recall that they have something like $40bn in cash on hand. Seems to me that they could afford an extra $300m on each and every product they have ever put out without jeopardizing their company financials. As an industry leader, perhaps that would force other companies to put out better software.

          Then again, it's always nice to have the easy excuse when my software crashes.
          "It's a Windows bug, what do you want me to do about it?"
        • More reasons (Score:4, Informative)

          by alan_dershowitz ( 586542 ) on Sunday October 09, 2005 @11:32PM (#13754113)
          The huge-ass elephant in the room here is that for centuries builders have been relying on reusable components and clear standards, while massive numbers of programmers shun these despite their availability and constantly reinvent the wheel. I'm looking directly at every dink on Slashdot who bitches that XML is too complicated and trashes (ha!) automatic garbage collection. (If someone has some obscure exception, keep it to yourself. The exception isn't the rule.)

          Another difference is, typically if an engineer says something is unsafe, people actually fucking listen to her.

          Oh yeah, and you can't hide how a bridge works. Proprietary code encourages cut corners.

          I believe that good software is attainable. But that won't necessarily come from legislation, it'll come from the industry growing up.
          • On the other hand, getting from the first log-over-the-river kind of bridge to the bridge building standards you speak of took thousands of years. Digital data formats / algorithms / standards are a few decades old at most.

      • no, the problem is opening the floodgates of litigation. software firms mostly don't have enough money to defend against this kind of shit.

        and also, for the most part bugs AREN'T costly. for 99% of software, no one dies if it crashes. and software that IS that critical does get that kind of treatment and never does fail. so your analogy with the bridge is the only bullshit here, mate.

        this guy's problem is he expects complex software to never crash. he also has no idea about just how much that extra testing he

        • Re:Bullshit (Score:3, Insightful)

          by servognome ( 738846 )
          and also, for the most part bugs AREN'T costly. for 99% of software, no one dies if it crashes. and software that IS that critical does get that kind of treatment and never does fail.

          Exactly, it's the customer's responsibility to demand a certain level of quality they feel comfortable with and pay accordingly. Just as you don't use the same cheap metal for a skyscraper that you do for a back yard fence. There are markets for high quality programs as well as low quality programs, it's up to the customer to fi
      • Re:Bullshit (Score:5, Insightful)

        by interiot ( 50685 ) on Sunday October 09, 2005 @09:37PM (#13753681) Homepage
        Bug free software is possible, so long as it is done right and people are prepared to pay for it.

        BINGO. Why not let the market decide?

        If it's like earthquake-prone apartment buildings in Tokyo, then it's reasonable to step in and mandate that everyone, no matter how poor, should pay for software designed to a government-mandated quality standard. Until then, why not let buyers and sellers decide on their own?

      • Re:Bullshit (Score:3, Insightful)

        by Anonymous Coward
        Civil engineers don't warranty their bridges against hostile attacks (DDOS, worms, trojans), or for multiple planets and gravities/atmospheres (Win XP, 2K, ME, 98, GNU/Linux, FreeBSD, OS X, i386, x86-64, PPC, Abit, ASUS, generic), or make them do anything but sit there, not having to interact in any way except to hold things up. What's the software equivalent of a bridge? cp? Let me know when civil engineers make anything as complex as Firefox. The only engineering equivalent of modern software is the Space prog
        • Re:Bullshit (Score:3, Insightful)

          by Anonymous Coward
          Let me know when civil engineers make anything as complex as Firefox.

          Okay, take your bridge. A few thousand rivets. A few thousand cables. A few hundred major steel members. Lots of concrete. These things come from different quarries and foundries where they are heavily processed to make them pretty close to what they are supposed to be. A couple dozen different welding machines run by a couple dozen different welders. Thousands of welding rods, each with a slightly different chemistry.

          The bridge sits on a p
        • Re:Bullshit (Score:3, Insightful)

          by ZenShadow ( 101870 )
          Every time this discussion comes up, all I hear is "think of the poor programmers!". If you want to cry like a baby every time someone suggests that you can do better or that you should be held accountable for your work, then IMNSHO you don't belong in this business.

          The fact is, this industry is built on the ability to ship crap-quality software specifically because they can get away with it. Reliable, high-quality software and hardware (from operating systems to major enterprise-class databases to whatev
      • Re:Bullshit (Score:5, Insightful)

        by narrowhouse ( 1949 ) on Sunday October 09, 2005 @10:23PM (#13753866) Homepage
        Large software companies are now getting to the point where they would LOVE this. Current software companies have had 35+ years to build market share with EULAs that say their products are not guaranteed usable for any particular purpose. The opportunity to change the rules now gives a huge advantage to current market leaders by creating an enormous, artificial barrier to entry into the market. This would be the best way to kill growth and competition in the software market. Look at all the other businesses that are encumbered with huge legal liability requirements and you will find business sectors that contain huge, multinational, 50-100 year old companies.

        If a company wants to shop around and find a guarantee, fine. Requiring legal liability of all software vendors will just create another mess of government regulatory groups, certification boards and happy insurance salesmen.
      • by Midnight Thunder ( 17205 ) on Sunday October 09, 2005 @10:34PM (#13753908) Homepage Journal
        I thought about this the other day, asking myself why we can't have the same approach in software development as in bridge building or other engineering disciplines. The difference seems to be that of prototypes. When you build a bridge you create a prototype, test it as much as possible, tweak it where necessary and let the cycle continue until there is a working solution. Once that is done you are ready to build the bridge, based on specifications that are, in a certain sense, easier to follow than those of software.

        Look at software and ask yourself where that prototype is, the one that can be tweaked and reworked until all obvious and not-so-obvious issues have been tested for. You will end up noticing that the prototype and the final product are the same thing. While a bridge can be tested against a number of complex mathematical formulas, I am not so sure that software can be tested in the same way. Software is designed and developed based on a number of philosophies, and sometimes these even have to interface with other programs based on other philosophies. Over time the complexity grows to a point where testing it 100% is like trying to predict what the stock market is going to do next week. I would like to give a figure for what we are able to predict, but I will leave that for someone else, since I am not sure I am qualified to do so.

        At the same time I will say that there are a good number of things for which you can create unit tests, and these help avoid the most obvious issues. The non-obvious issues, based on difficult-to-reproduce scenarios and variable dependencies, are a little trickier.
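
        For illustration, a minimal sketch of such a unit test (the clamp() function and its cases are made up for the example, not taken from any real product):

        #include <assert.h>

        /* Hypothetical function under test: clamp a value into [lo, hi]. */
        static int clamp(int value, int lo, int hi)
        {
            if (value < lo) return lo;
            if (value > hi) return hi;
            return value;
        }

        /* The test pins down the obvious behaviours so a later change that
         * breaks them fails immediately instead of in the field. */
        int main(void)
        {
            assert(clamp(5, 0, 10) == 5);    /* in range: unchanged */
            assert(clamp(-3, 0, 10) == 0);   /* below range: clamped to lower bound */
            assert(clamp(42, 0, 10) == 10);  /* above range: clamped to upper bound */
            return 0;
        }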

        Things are also improving thanks to libraries that provide a lot of reusable code, but here too there is an issue. Imagine that you designed your program to depend on libraries x, y and z, and then the user adds libraries that affect the libraries you depend on; how can you predict what is going to happen?

        You will notice that most mission-critical systems are designed to have only the most essential features (compared to desktop software) and are often coded with very precise memory management, sometimes even avoiding pointer types and using only primitives. Trying to develop most applications this way would be long and laborious, and your users would complain that their complex office software doesn't do what they want (remember, they can't agree on what they want), even if it is 99.999% stable.

        I am not saying it is impossible, it's just that I have yet to see an approach that is 100% effective for 100% of cases. Yes, I am a software developer, so I do have a certain bias.
        • What you say may be true, but I don't think it's the use of prototypes and up-front planning that separate true engineering fields from software "engineering". Those are merely the processes that have been found to work effectively in other disciplines, and we know many processes that work and many that don't for software development, too.

          I think what really separates engineering from most of today's software development is that in real engineering, you have an engineer. This is a highly trained, experien

          • You raise an interesting point, however, let's look at how a bridge is built versus how software is built.

            When you build a bridge, an architect designs every detail of that bridge. An engineer ensures that the bridge is structurally sound, and develops the methods used to build it.

            The people who actually BUILD the bridge are, for all intents and purposes, monkeys. Skilled monkeys, to be sure, but monkeys no less. They do what they're told, and have no "creative input" into the building of the bridge.
            • I agree with much of what you say as things stand today, but I think you're making an unstated assumption that this is the only way things can work.

              A lot of programming is donkey work, and requires little more than joining the relevant library code together in the appropriate pattern. IME, the key to getting this right is that you usually need:

              • a small number of very good people at the top of this process, co-ordinating the design;
              • a small number of very good people at the bottom of this process, writin
              • While I agree with you that things will not always be this way (I did lay out the criteria I believe will solve the problem), I don't agree that it's possible today.

                When you build a bridge, you need a human to make decisions about various things, but those decisions are about how to build the bridge, not how the bridge will operate once built. Programmers make decisions every day that affect how the software runs even after it is built.

                A bridge builder might have to decide whether to use a shovel or a b
      • Bug free software is possible, so long as it is done right and people are prepared to pay for it.

        It is impossible to guarantee the reliability of complex algorithmic software. This is something that Frederick P. Brooks has shown in his famous "No Silver Bullet" paper. However, Brooks' arguments fall apart in one important area. Although Brooks' conclusion is correct as far as the unreliability of complex algorithmic software is concerned, it is correct for the wrong reason. Software programs are unreliable
      • Re:Bullshit (Score:5, Interesting)

        by Maxo-Texas ( 864189 ) on Monday October 10, 2005 @12:08AM (#13754247)
        You are a civil engineer.

        I want you to build a bridge.

        I won't say where- or what the end conditions are on each end- because this bridge needs to work in about 2 million different places.

        Now- as to what will cross the bridge. I won't tell you that either. It might be a car- it might be a convoy of tanks.

        Now... as to the basic laws of the universe (the operating system). I can't tell you much about them either. For example, gravity may change at any time to be higher or lower. The tensile strength of various materials may change unpredictably with various patches to reality.

        Your work force will be available to work 2 to 16 hour days and may or may not comprehend instructions written in English.

        The bridge needs to be built from scratch from materials using new refining methods so you cannot use any reference materials to analyze how strong it has been historically.

        Finally, this bridge must be made of at least 9 million different pieces (opcodes). The subunits will be assembled by a robot of some kind (Compiler) so you will not know the details of how the units work- only how they are supposed to work as units.

        ---

        I'm sorry but you really do not understand what you are talking about.

    • While you are correct that perfection does not exist, it is also true that the way most software is developed, with arbitrary deadlines, poor testing and death-march coding, is responsible for much of the bugginess of modern software.

      If software companies spent the time and money quality takes, then they would produce software that is less buggy. Not bug-free, but much less buggy.
      • Yeah, you can point the finger at management issues, but I say competency is another. Letting anyone and his cousin's brother develop software is another major part of the problem.

        Unlike almost every other branch of engineering, software has no accreditation standards or process. Totally unlike, say, those civil engineers who designed and built the bridges we're using as a comparison. You'll notice that the vast majority of those don't fall down after a day's use.

    • Perfect software doesn't exist.

      Avoiding liability isn't about producing a perfect product. There are no perfect products. A company can avoid liability (in cases where liability laws haven't been modified to create strict liability schemes) when that company shows that it took efficient measures to prevent harm arising from the use of its product. If $1 of effort prevents $5 of damage and you fail to make your product safer, you will be held liable for damages suffered by your users. If $5
  • by Namronorman ( 901664 ) on Sunday October 09, 2005 @08:42PM (#13753448)
    This guy sounds like he's just full of hot air because of a bad Norton AV installation. If one program causes something "devastating" to happen, who is to decide that it's not the user's fault, the compiler's fault, the programmer's fault, the OS creator's fault (and if it's OSS, who's package etc?), or the hardware's fault?

    The computer world is full of many variables and I don't see this happening anytime soon, though with recent laws you never know.
    • Lawyers and Judges would decide.
    • by Anonymous Coward
      but that is not the issue. He is pointing out that companies' EULAs exclude liability even if the fault is their own. You also seem to be getting hung up on who's to blame instead of who is liable.

      As most commercial software is shipped precompiled, it isn't an issue for the end user whether the compiler buggered it up or not. Standard contract law means you sue the company you bought the faulty product from, and they then sue the people who created the fault and exposed them to the liability. This is as le
    • by kannibal_klown ( 531544 ) on Sunday October 09, 2005 @11:01PM (#13754006)
      If one program causes something "devastating" to happen, who is to decide that it's not the user's fault, the compiler's fault, the programmer's fault, the OS creator's fault (and if it's OSS, who's package etc?), or the hardware's fault?

      Let's not forget "another piece of software's fault." Installing software package B might overwrite a registry setting or DLL needed by software package A. On top of that, software package B might leave something running in memory as a service that conflicts with something software package A does.

      You are correct, there are WAY too many variables when dealing with software failures. And if this guy were actually a software developer he'd know that it's pretty much impossible to make something completely bug-free. The most you can hope for is something that rarely has a bug, or that recovers if it encounters one without losing its place/data.

  • by hummassa ( 157160 ) on Sunday October 09, 2005 @08:44PM (#13753452) Homepage Journal
    is stale software. Bit rot guarantees that all users will migrate from error-free, really stable software to new-full-of-bells-and-whistles but error-ridden software in 0 time.
    • by Concerned Onlooker ( 473481 ) on Sunday October 09, 2005 @09:27PM (#13753643) Homepage Journal
      A couple of quarters ago I was taking a software engineering course. Our instructor told the story of a debugging competition which used a mature piece of software that was known to be error-free for the test case. A fixed number of bugs was then introduced into the code and the teams all had a crack at it. At least one of the teams found bugs in the code that were not the ones intentionally introduced. I'm paraphrasing here, but in other words, they took a piece of software that they knew to be bug-free because it had been intensely examined by many programmers, yet another bug or two was found.

      Truly error free is not a likely state for software.

      • by fbjon ( 692006 ) on Sunday October 09, 2005 @09:58PM (#13753762) Homepage Journal
        There was an analogy with a bridge earlier. Bridges are designed with redundant safety margins: you can (usually) put a lot more weight on them than what they are rated for.

        In the same vein, instead of trying to make every part of the code perfect, how about designing some redundancy into the code?

        I leave it as an exercise for the reader to figure out what the hell that means.
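
        One possible reading (an assumption, not necessarily what was meant): keep a critical value twice and cross-check the copies before using them, so silent corruption is detected instead of quietly propagating. A minimal C sketch:

        #include <assert.h>
        #include <stdio.h>

        /* A critical value stored redundantly; the two copies must always agree. */
        struct guarded_counter {
            long value;
            long shadow;   /* redundant copy of value */
        };

        static void guarded_set(struct guarded_counter *c, long v)
        {
            c->value = v;
            c->shadow = v;
        }

        static long guarded_get(const struct guarded_counter *c)
        {
            /* Fail loudly on disagreement rather than return possibly-bad data. */
            assert(c->value == c->shadow);
            return c->value;
        }

        int main(void)
        {
            struct guarded_counter c;
            guarded_set(&c, 100);
            printf("%ld\n", guarded_get(&c));
            return 0;
        }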

  • The Market Decides (Score:5, Insightful)

    by the eric conspiracy ( 20178 ) on Sunday October 09, 2005 @08:46PM (#13753462)
    The fact is that the market has already decided the answer to this. People buy the least expensive software they can get away with. If an application is unreliable enough to regularly lose data, it gets flushed out of the market. If it works well enough and is for the desktop, it becomes popular. If it is used in critical applications where data loss is not tolerated, then you have stuff like Oracle, which people pay $50,000 per CPU for.

    • by Husgaard ( 858362 ) on Sunday October 09, 2005 @09:06PM (#13753539)
      The fact is that the market has already decided the answer to this.
      And the problem with this guy is that he doesn't like what the free market has decided.

      He wants laws to be passed that would make some (or all?) kinds of disclaimers on warranty and fitness for a particular purpose illegal for software.

      He wants it in the name of "consumer protection", but he does not realize that the consumers are not interested in paying the higher price tags this would put on software.

      The only ones whom this would really protect would be corporations big enough to buy costly insurance against claims. They would be protected against competition from Open Source software and from smaller companies that would drop out of the software market because of the risk of liability.

      • And the problem with this guy is that he doesn't like what the free market has decided.
        Wouldn't you say it is a perfectly valid position, though? "Decision" of the free market is essentially the decision of the majority, but there's always an (unhappy) minority too.

        For the record, I do believe that he is right, to an extent. Software should be less buggy and there are ways to improve the situation. And yes, I am a programmer.

          • The decision of the majority will also affect the laws to be implemented. Every industry, once it has matured, always ends up being forced to adhere to a set of legal standards. Thanks to an extreme amount of lobbying by a greedy few in the software industry they have managed to hold off the inevitable legislation (I mean get real, billions and billions of dollars in profit because they won't spend it making sure the code is as fault-free as possible, together with warranties that in any other industry would l
    • The fact is that the market has already decided the answer to this. People buy the least expensive software they can get away with.

      That's because quality and security are properties of software that are difficult to evaluate for most buyers; people end up with worse software than they actually need. This is a standard example where markets fail to reach the overall optimal outcome.
    • This is exactly right.

      If you look beyond the x86 desktop market, there's a LOT of software that's close to bug-free. And the companies that pay for things like high-performance Oracle solutions, massively parallel Solaris on SPARC systems, "continuous computing" (ULTRA high availability with high levels of disaster tolerance) OpenVMS on Alpha or Itanium...

      Companies that will pay more than $250,000 USD for a single system demand the highest quality of code, and the vendors DO deliver it.

      OpenVMS is renoun
    • People buy the least expensive software they can get away with.

      No, people don't buy the least expensive software, otherwise more people would be using FOSS, not proprietary software. They'd also keep using the same software instead of upgrading both the software and the new hardware the software requires.

      If the application is unreliable enough to regularly lose data it gets flushed out of the market.

      I disagree here too. I don't know how many times people lose data because Windows crashes, yet it's the

  • by Lehk228 ( 705449 ) on Sunday October 09, 2005 @08:47PM (#13753463) Journal
    There is also a big difference between consumer software like word processors and web browsers, and the massive information systems used internally in large companies.

    The companies writing the large systems usually have contracts which mean they are liable for damages, and this increases both the cost and the reliability of the resulting programs.



    I must assume he doesn't work with internal apps much.
  • by twitter ( 104583 ) on Sunday October 09, 2005 @08:48PM (#13753468) Homepage Journal
    it will probably mean that commercially-available code is more expensive and cause major problems for free and open source software developers.

    Everyone knows that most free software, by virtue of peer review, has fewer bugs and errors than commercial code does. If what he means is that you have to be licensed, bonded and "protected" by a corporate staff of 800-pound gorillas to write code, then free software will have problems. Such a misallocation of resources still won't buy him better code.

    This whole issue is a troll the non-free-software companies come up with every few years. It's a mistake for them, however, and will blow up in their faces. Free software will overcome such nonsense the same way Good Samaritans do. Worse, what kind of society would outlaw exchanging advice on how to do something? That's what sharing source code is. Why not outlaw engineering texts instead?

    • Everyone knew that the Earth was the center of the universe.
  • by Anonymous Coward on Sunday October 09, 2005 @08:49PM (#13753476)
    I said this years ago: software liability should apply to programs you pay for but for which you don't get the source. If the money you pay goes to make something you don't have source-level control over, then that implies the vendor thinks it's of sufficient quality that you, the end user, should not have to fix it. If you get the source then there is no guarantee and the distributor should have no liability. This doesn't mean you have to have the right to redistribute the source -- but you have to have the right to rebuild it using commonly available tools, so liability can't be limited to one "magic" library.
  • that must be the article with the least content in my entire slashdot "career".

    no thesis, no argument, no concrete examples of HOW to make software better or HOW to implement such liability.

    i do understand this is a follow-up, but why exactly should ANYONE care about this mindless piece of crackpot-tery?
  • by Captain Perspicuous ( 899892 ) on Sunday October 09, 2005 @08:54PM (#13753496)
    [ ] vendor guarantees that software works as advertised
    could be another checkbox that all software companies are trying to reach.

    "What? You don't guarantee works-as-advertised? Well, then I'm looking for a different product."

    If computing magazines updated their testing methods and added this one checkbox, Microsoft just might say "oh, hey, we haven't covered that checkbox yet. We need to have every checkbox. Let's quickly drop by the legal department and get this in order..."
  • Great (Score:5, Insightful)

    by LWATCDR ( 28044 ) on Sunday October 09, 2005 @08:57PM (#13753505) Homepage Journal
    The lawyers will love it. They will launch massive class action lawsuits and will make millions. If you are part of that class action you will get one dollar.
    The software vendors will not fix bugs, because to fix them they would have to admit they have them, and they would get the daylights sued out of them.
    • I think you're spot on.

      This is a lawyer's wet dream. They've sued the living daylights out of car companies, tobacco companies, and drug companies... now they're after new blood. If robots ever get really popular, they'll be suing them next.

      Now, don't get me wrong. There are plenty of good reasons to hold car companies, tobacco companies and drug companies accountable for things they've done. It's the lawsuits that happen when those companies did NOTHING wrong... that ticks me off. (Well wait a sec.... I find
  • The keys are:

    * Tell users to stop asking for tons of new features in unrealistic timeframes.
    * Tell software managers to actually give individual developers time to develop software the right way instead of insisting that they slam code out.
    * Get competent testers who can help catch any egregious problems before the product goes to market.
    * Stop hiring assholes who just have certificates and get some degree-holding professionals who actually know what the f*ck they are doing.
    * Stop outsourcing to India where most pr
  • by MerlynDavis ( 637066 ) on Sunday October 09, 2005 @09:02PM (#13753526)
    The author has a point here. We accept a lot more ... "bugginess" in software than we do in any other product (cars, banks, tools, etc.), and it's pretty much become the norm that if there are problems, folks just shrug, claim it's just software and move on. But if the folks building bank vaults left as many holes in their products as software does, people would be screaming bloody murder. I've done software development as a hobby myself, and don't release my code to the public, because I know it's not even up to my own standards of stability, reliability and security. Programmers/developers need to take more time with their products, and think about security & reliability from the start of a project, not as an afterthought. With as many products requiring patches within the first couple of weeks of release, consumers do need to start getting angry about this stuff. Or, at the very least, start challenging software companies when the products they do release require more MB in patches than the software was originally....
    • What do you mean? (Score:4, Insightful)

      by Sycraft-fu ( 314770 ) on Monday October 10, 2005 @01:28AM (#13754528)
      My car is way buggier than my software. My car is horrible at dealing with unexpected situations and abuse. If someone attacks it, say by breaking a window, the window is broken and I have to pay to have it fixed. With software, I get mad and demand that they fix the bug so the attack CAN'T break it. Likewise, the car is not forgiving of unexpected operation. If I floor the gas in neutral, the engine will seize up. However, I expect that software can deal with unexpected input and not have any ill effects. Also, my car costs money for maintenance. I have to regularly pay for things like oil to keep it working, whereas with software I expect updates at no charge.

      So all in all it seems I expect MORE out of my software than my car.

      They are different things, you really can't compare them.
    • We accept a lot more ... "bugginess" in software than we do in any other product (Cars, Banks, Tools, etc.)

      In exchange for much more rapid development than other products. Cars today aren't hugely different than they were 20 years ago, when we were using DOS.
  • by bbk ( 33798 ) on Sunday October 09, 2005 @09:03PM (#13753529) Homepage
    Ah, so he wants people who write software to guarantee their work?

    Things will then just never make it out of beta, for fear of the law. If the software breaks "Tough luck, it's still in beta, what were you doing using it for mission critical work anyway?"

    This "eternal beta" is also used to avoid other sorts of legal wrangling. The most obvious example is Google News - it's still "beta" because Google is worried about capitalizing on other people's news content. While unrelated to software quality, because it's an "unfinished beta", it doesn't get sued out of existence.

    So, welcome to using software version 0.9.9 forever... I can't wait.
    • This "eternal beta" is also used to avoid other sorts of legal wrangling. The most obvious example is Google News - it's still "beta" because Google is worried about capitalizing on other people's news content. While unrelated to software quality, because it's an "unfinished beta", it doesn't get sued out of existence.

      Ah but some French news sites were suing Google.

      Falcon
    • You know that just because someone sticks the word "beta" next to a product, that doesn't actually remove any of the ethical or legal consequences for producing that product, right?

  • by G4from128k ( 686170 ) on Sunday October 09, 2005 @09:05PM (#13753535)
    What people want is:
    1. The latest whiz bang feature to impress their friends
    2. The latest feature copied from a competitor's software
    3. The latest feature to be compatible with everyone else
    4. The most feature checkmarks for the PHB to authorize the purchase or selection of a software application

    None of these demands fosters reliability. They foster a frantic race to add features and ship stuff ASAP. Everyone seems caught in a massive vicious cycle of upgrades, so that nothing ever stabilizes or matures.

    Perhaps if/when people stop finding new uses, new formats, new file types, and new applications, then the industry will mature and people will turn their attention to stability and reliability.

  • by NZheretic ( 23872 ) on Sunday October 09, 2005 @09:06PM (#13753541) Homepage Journal
    By myself [slashdot.org] from June 14 2002 [google.com]

    However relatively bad the security of Microsoft's products is in comparison to what the free-licensed and open source communities (as well as practically every other vendor on the planet) provide, Microsoft is not alone in the presence of vulnerabilities; this is a major issue for Linux/BSD and Unix as well as every other OS and vendor.

    From the Plimsoll Club history [plimsoll.com]

    Samuel Plimsoll brought about one of the greatest shipping revolutions ever known by shocking the British nation into making reforms which have saved the lives of countless seamen. By the mid-1800's, the overloading of English ships had become a national problem. Plimsoll took up as a crusade the plan of James Hall to require that vessels bear a load line marking indicating when they were overloaded, hence ensuring the safety of crew and cargo. His violent speeches aroused the House of Commons; his book, Our Seamen, shocked the people at large into clamorous indignation. His book also earned him the hatred of many ship owners who set in train a series of legal battles against Plimsoll. Through this adversity and personal loss, Plimsoll clung doggedly to his facts. He fought to the point of utter exhaustion until finally, in 1876, Parliament was forced to pass the Unseaworthy Ships Bill into law, requiring that vessels bear the load line freeboard marking. It was soon known as the "Plimsoll Mark" and was eventually adopted by all maritime nations of the world.

    The risks, issues and solutions for providing a more secure operating and application environment have been known for decades.

    Those who do not already comprehend the issues and are willing to learn should take some time out to listen to some of the speeches at Dr. Dobb's Journal's Technetcast security archives [ddj.com], starting with Meeting Future Security Challenges [ddj.com] by Dr. Blaine Burnham, Director, Georgia Tech Information Security Center (GTISC) and previously with the National Security Agency (NSA).

    The design and implementation of some applications and servers are just too unsafe to use in the "open ocean" of the internet.

    Numerous security experts have railed against Microsoft's lack of security, best summed up by Bruce Schneier, Founder and CTO of Counterpane Internet Security, Inc., who rightly said: [schneier.com]

    Honestly, security experts don't pick on Microsoft because we have some fundamental dislike for the company. Indeed, Microsoft's poor products are one of the reasons we're in business. We pick on them because they've done more to harm Internet security than anyone else, because they repeatedly lie to the public about their products' security, and because they do everything they can to convince people that the problems lie anywhere but inside Microsoft. Microsoft treats security vulnerabilities as public relations problems. Until that changes, expect more of this kind of nonsense from Microsoft and its products. (Note to Gartner: The vulnerabilities will come, a couple of them a week, for years and years...until people stop looking for them. Waiting six months isn't going to make this OS safer.)

    However Microsoft's products are not alone in the presence of vulnerabilities, this is a major issue for Linux/BSD and Unix as well as any other OS and vendor.

    In a recent speech, "Fixing Network Security by Hacking the Business Climate", also now on Technetcast [ddj.com], Bruce Schneier claimed that for change to occur, the software industry must become liable for damages from insecure software.

    • The abstract notion of a "Plimsoll line" for apps is very appealing, but the problem is that we really don't even know what the analogous standard would look like, much less where it should be drawn and how it should be enforced. Software isn't like boats or cars or bridges -- many small variations on a well-defined solution. There are commonalities between pieces of software, but the differences are huge. A payroll system is so different from an embedded RTOS as to make any kind of consistent standards

  • The reason you can't use critical systems development techniques to develop applications software is because the cost/benefit analysis is still unbalanced heavily on the side of cost. If you're a company that does critical systems development you have a greater chance of success if you find a client that requires critical systems as the benefit (often, "people don't die") far outweighs the costs. But Open Source turns cost/benefit analysis on its head. When developers volunteer their time the costs can't
  • I do not think we should automatically exclude free/open source software from our analysis simply because it is produced by teams of programmers working for nothing, and the fact that it is given away does not, of itself, provide legal immunity.

    I do, at least to the full extent of the law.

    Expecting anything from someone who gave you free/free software isn't reasonable. The fact is, the licenses are there not only to save the developers' necks but also to serve as a warning. When something says "AS IS"
  • amazing ignorance (Score:2, Informative)

    by youngjohn14 ( 921664 )
    "The companies writing the large systems usually have contracts which mean they are liable for damages, and this increases both the cost and the reliability of the resulting programs." As an IP attorney working in the industry for the last 14 years, this statement is just so....amazingly....stupid I would have thought the editors of the BBC would have caught it. It is wrong on so many levels. No non-on-the-ropes software developer will bet the company on error-free code. At the most, a developer will ag
  • I haven't taken the time to read the prior article carefully, but whatever point he was originally trying to make has been completely lost in his attempt to shift the goalposts of the argument. (As far as I can tell, his original article said we should be allowed to sue programmers for bugs.)

    This second article says "people should write better code". Well, um, I disagree! Wait, no. Of course not. Yes, the quality of code should improve, and should always be improving.

    The analogy to automobiles seems
  • by idlake ( 850372 ) on Sunday October 09, 2005 @09:17PM (#13753595)
    It doesn't make economic sense to create some kind of liability for the authors of software; there is no single level of quality that everybody needs.

    The best thing we can do to increase software quality is to hold the people responsible who can actually do something about it: the people who buy software.

    If your Windows PC crashes and you lose data, that's your responsibility; you could have gotten something different.

    If the bank's Microsoft-based database server has a serious security hole and someone breaks in and defrauds customers, then the bank should be held fully responsible for that; they shouldn't be able to shift responsibility to either Microsoft or the person who broke in. That will force institutions like banks to negotiate contracts with software vendors that ensure an appropriately high level of correctness. And there is no need to burden our courts with "hackers"--you won't be able to find and lock them all up, so locking up some of them is not a rational strategy for making computers secure.

    In any case, if one wanted to, one could easily make legal distinctions between FOSS and Microsoft/Apple when it comes to liability. First, expert users generally have to accept a higher level of responsibility than non-experts. Arguably, FOSS users are, by definition, expert users. Also, for-pay software involves an actual sale, which can easily and sensibly be regulated differently from non-sale distribution when it comes to liability.
    • by Fastolfe ( 1470 ) on Sunday October 09, 2005 @10:36PM (#13753917)
      I agree, to an extent. It makes no economic sense to shoot for as-perfect-as-possible for all software. The reason we have minimum standards for other industries, such as automobiles, is because a defect in an automobile can kill people.

      But what we have today is practically anarchy. There's no way of telling if a product will work properly, or will work at all, and software vendors are allowed to get away with that.

      A middle ground here might be forced labeling. Require software vendors to place a label that, in a standard fashion, describes how safe the software is, whether it is guaranteed to work as labeled and advertised, and maybe something about the known defects it has, or estimated failure rate. Don't let the vendor hide this in the fine print. And then hold them to it with legal measures.

      That way, if a piece of software is targeted for home use, the labeling should make it clear that it's going to have significant defects, and will fail at a high rate. You might have a more expensive variant for office use, with fewer defects. And then you might have a stripped down, very expensive version intended for critical applications, in hospitals or infrastructure. The end user can then choose which one they want to buy, and instead of feeding a market where the customer buys the cheapest product because they think all products are buggy, they can buy the product that meets their needs, with the assurance that they will have legal recourse if the product fails to meet the expectations indicated by labeling.
  • Not entirely new... (Score:5, Interesting)

    by cperciva ( 102828 ) on Sunday October 09, 2005 @09:24PM (#13753629) Homepage
    Dan Bernstein has offered a guarantee for many years that djbdns and qmail are secure. Now, this is a rather vague guarantee, since the task of deciding if a reported problem is a security flaw lies with Dan Bernstein himself; but it's a start.

    I'm currently writing some cryptographic code, and I intend to go considerably further: I intend to offer a guarantee not only that my code operates as specified, but also that it is not vulnerable to any side channel attacks within certain classes.

    As the time-to-exploit of security flaws continually decreases, I see only one solution: writing code which is correct in the first place. If you can do that, you can offer a guarantee. And hopefully, once security becomes a larger issue for consumers, people will start looking for guarantees.
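
    For instance, one of the simpler side-channel classes is timing: a naive comparison of a secret value returns as soon as it finds a mismatching byte, so an attacker can measure how long the check takes and recover the secret byte by byte. A minimal sketch of the usual countermeasure (a generic illustration, not cperciva's actual code):

    #include <stdio.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Constant-time equality check: the loop always touches every byte,
     * so the running time does not depend on where the first mismatch is. */
    static int ct_equal(const uint8_t *a, const uint8_t *b, size_t len)
    {
        uint8_t diff = 0;
        for (size_t i = 0; i < len; i++)
            diff |= a[i] ^ b[i];
        return diff == 0;   /* 1 if equal, 0 otherwise */
    }

    int main(void)
    {
        const uint8_t expected[4] = { 1, 2, 3, 4 };
        const uint8_t provided[4] = { 1, 2, 9, 9 };
        printf("match: %d\n", ct_equal(expected, provided, sizeof expected));
        return 0;
    }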
  • Remember IEFBR14 (Score:5, Informative)

    by sk999 ( 846068 ) on Sunday October 09, 2005 @09:24PM (#13753631)
    Making bug-free software is much harder than anyone can imagine.

    Let us not forget the very modest program IEFBR14 - arguably the shortest program ever written for use in a production environment. It ran on IBM's System/360. (I ran it many times myself.) Its sole function was to exit - nothing else. It was a whopping one machine instruction long - 2 bytes. It was even Open Source (BR14 is the assembly language version of the instruction, which is the standard way programs exited). It was the simplest possible program that one could write. If ever there was a program that was going to be bug-free, this was it!

    It had a bug.

    When a program exits on OS/360, it is expected to have set some bits to indicate any errors. When a program is called, those bits are in an unpredictable state. IEFBR14 had to be modified (doubling its length) to clear the bits first.

    Sigh...
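
    A rough C analogue of the same class of bug (an analogy only, not the actual System/360 code): the return code the caller inspects is whatever value happens to be lying around unless the program explicitly clears it.

    #include <stdio.h>

    /* Deliberately buggy "do nothing" routine: rc is never set, so the
     * caller sees an unpredictable value (undefined behaviour in C). */
    int do_nothing_buggy(void)
    {
        int rc;
        return rc;
    }

    /* The fix, like the second IEFBR14, adds one statement to clear it. */
    int do_nothing_fixed(void)
    {
        int rc = 0;   /* explicitly report "no error" */
        return rc;
    }

    int main(void)
    {
        printf("fixed return code: %d\n", do_nothing_fixed());
        return 0;
    }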
  • by autopr0n ( 534291 ) on Sunday October 09, 2005 @09:24PM (#13753633) Homepage Journal
    When I ran Autopr0n, hooo... that code was awful. But there really was never any kind of economic incentive to fix it; I could just keep restarting my JVM (the thing was coded in Java).

    Or, look at metafilter.com. That site goes down like a $2 hooker, yet it's so successful that the maintainer was able to quit his day job and support himself based on the site. People don't care.

    Even for a desktop OS back in the '90s, quality just wasn't that important. Would you rather pay $10,000 for an OS, or $90 and lose work once in a while?

    If the cost of the work lost due to software errors is less than the cost of writing the code so that it works perfectly, then it's not worth doing. Sure, for some programmers there's no tradeoff, but those programmers probably cost a lot more to pay than 90% of the coders out there (who are idiots, IMO; just look at the existence and popularity of Visual Basic).

    When the cost of the error increases, you'll find much more stable software (like on medical equipment, airplanes, and so on).

    The secretary's spreadsheet just ain't mission-critical.

    Of course, now that all computers are connected together, they need to be at least secure and not targets for worms and trojans, etc. I predict that as we move towards web services, software quality will get worse and worse, but people will just pay a sysadmin to sit there and reboot the machine whenever it goes down, so people won't notice anything...
  • by The Famous Brett Wat ( 12688 ) on Sunday October 09, 2005 @09:32PM (#13753665) Homepage Journal
    unsustainable - (adj.) 1. Following a pattern which can not continue indefinitely due to the inherent limitations of the system. "Present growth is unsustainable in the long term." 2. A term expressing distaste, annoyance, and a personal desire to change things. "The current situation is unsustainable."
  • by Anonymous Coward
    There has been a lot of discussion about my call for software liability in a column entitled Whose fault is it anyway?, and it shows that this is an issue which needs some serious attention.

    "it" is an unclear variable reference. Does the pronoun "it" refer to the call for software liabilty or the column itself? Also, the title of the column should be italicized, underlined, or capitalized for clarity. Finally, the phrase "a lot" is depreciated.

    There is also a big difference between consumer software like
  • by cicho ( 45472 ) on Sunday October 09, 2005 @09:51PM (#13753731) Homepage
    Okay, so we've had the predictable responses about how building software is different from building bridges, and then others point to the respective difference in cost. All true. But if bridges and buildings are so much more reliable than software, it's not only because they cost more. It's also because when they are designed and built, all procedures must conform to known standards (and not a few regulations). The specs are open and auditable, and architects actually have their work inspected all the way.

    Should every word processor be built in this way, with open specifications, norms and audits? I don't know. Now how about vote-tallying software?
  • Good software costs (Score:5, Interesting)

    by Angst Badger ( 8636 ) on Sunday October 09, 2005 @09:56PM (#13753752)
    First off, I should issue a disclaimer that I'm an oldbie. I started programming in assembly language on punch cards, but no, this isn't going to be a rant about youngsters and their newfangled languages. (At least it better not be; my current job has me living, breathing, and eating PHP.)

    The problem with bad software today -- just like it was thirty years ago -- is bad engineering. It's not because of the methodology du jour (or its absence), licensing, choice of language, or toolsets. You can write brilliant, bug-free, efficient software in COBOL using the basic procedural structured programming paradigm. You can write awful, buggy, resource-hungry software in object-oriented Java using XP. None of that shit matters.

    Good engineering requires, among other things, a detailed understanding of the problem, thorough planning, the sheer experience required to distinguish between the clever and overcomplicated on one hand, and the lucid and elegant on the other, excellent communication between developers, foresight (also borne of experience), and rigorous debugging. All of these things, including the many other prerequisites not mentioned, require lots of time and effort. Too much time and effort, in fact, for most commercial software outfits to invest and still turn a profit.

    That's the rub, really. All the methodology and language fads aside, the basic principles of good software engineering were worked out decades ago, and sometimes further -- good generic engineering practices in the abstract were worked out long before we harnessed electricity. It all comes down to this: the more time, effort, and care you put into a product, all other things being equal, the better the product will be. It's easy (and well-deserved) to mock Microsoft for the shoddiness of their major products, but that very shoddiness is why you can buy MS Word for less than ten grand. If MS built word processors the way engineers built the Golden Gate Bridge, the prices would be comparable.

    The market does not reward that kind of quality. In the first place, no one is willing to pay thousands of dollars for a supremely excellent product when one that is good enough can be had for a couple hundred. Most folks couldn't afford that kind of software engineering even if they wanted it. In the second place, once you have the perfect all-in-one software package, why would you ever buy another one? Microsoft is in this position already with its good-enough products. No one needs an upgrade, so remaining profitable requires MS to churn out new versions of its increasingly resource-intensive operating system so that you at least have to buy new copies as you replace your older machines.

    FOSS is at least theoretically invulnerable to these pressures. In theory, there will eventually be all-singing all-dancing FOSS packages covering all of the major software categories, and the age of commercial mass-market software will be at an end. I've been waiting for this day to come since well before the first release of Linux. I'm surprised that it hasn't come yet. I'm surprised that the majority of FOSS software is still as buggy, poorly designed, and -- almost without exception -- undocumented as its commercial equivalents.

    I suppose I shouldn't be surprised. Excellence in software engineering is like excellence in any other field: it's really fucking hard. It's even harder when you have a day job; time constraints aside, after 8-12 hours coding at work, the last thing many developers want to look at when they get home is compiler output. Many of the remainder are either amateurs or students -- not to diss either category, but often the necessary experience is lacking, and the lone hacker often lacks the knowledge or the inclination to produce code that's easy for other developers to work with. I remain confident that we'll get there, though. (I am less confident that I will still care by then, but it will still be a boon to those who live to see that day.) I am equally certain, for the reasons
    • This is not true. It's not really fucking hard. You just have
      to have sufficient time and not be so lazy. It
      may take four times as long as you normally think, and be boring as
      all hell, but it's totally possible.

      Let this be a lesson to all of you free marketeers... you know,
      the invisible hand. Here is a whole population of lazy, whiny
      bastards who provide almost no intrinsic value to anyone and
      get paid more than most.
    • People won't pay to develop good code. Period. There is no demand for perfection. I was part of a three-man team that wrote a prototype media viewer for early-release movie content. We provided the backend encrypter application and ancillary libraries under license. Our "proof of concept" was finished in 8 weeks and was so successful that we had our code in live in-air airline flight tests with real customers. Awesome work. Very stable. The encrypter application we wrote had a few issues with some non-stand
    • by stretch0611 ( 603238 ) on Monday October 10, 2005 @02:59AM (#13754781) Journal
      Excellent, I agree with you. I also consider myself an oldbie. (20 years of programming, 12 years being paid for it) Fortunately, in the early years I had a teacher who actually emphasized design and comments.

      Unfortunately the environment in the business world today prevents truly bug-free programming. A lot needs to change:

      1 - Fire all the programmers and developers that can't program. We all know which ones in the group fit into this category; unfortunately, our bosses don't. Those are the ones that cause the majority of the bugs. They came into the industry just for the money (pre-2000 bust), they have no real feel for programming, yet they know how to email the boss. Keep the ones that are naturals. The real code warriors. The good ones know when to code new source, when to copy old source, and how to clean up old source when they copy it into their new modules.

      2 - Get rid of the bosses that don't know their tech people (i.e. the ones that can't tell the difference described in #1 above). The boss doesn't need to know tech (though it helps), but they do need to know their people. They also need to know how to keep office politics and bureaucracy away from their people.

      3 - Get rid of separate New Development and Maintenance groups. People will code better when they know they will have to fix their own code when it goes into production. They will care more about stability instead of features. Also, a programmer learns the difference between good and bad coding techniques when they are forced to maintain both.

      4 - After the requirements are gathered and the specs/design is created, don't let users change them. I can't change everything just because a user changes their mind. If I have to make the change, the release date gets pushed back as if I had just started the design today. I can't finish a program until you have finished deciding what you want it to do.

      5 - Procedural vs. Object Oriented programming. The huge development debate. I admit I am biased toward procedural programming. However, you should use whatever works better for your project. A GUI works better when you design using OOP, but when you need to crunch numbers on 10 million records, procedural will work a lot better. I know a lot has been said about the poor code quality of OOP in particular, but if you get rid of the idiots in #1, the logic should be easy to follow.

      6 - KISS - Keep It Simple, Stupid - I used to work with someone very intelligent, but his code was terrible. He would program elaborate functions just to add two numbers together. My honest belief is that he was trying to impress us with his "coding ability." If someone needs a simple program, give them a simple program; don't reinvent the wheel.

      7 - Shoot and KILL everyone that sponsors or participates in an unreadable source code competition. (Sorry, personal peeve.) We need to promote legible code with indenting and good, clear, relevant variable naming.

      8 - Quality. CMM, ISO, TQI. These are nothing more than BULLSH!T. While there are occasional insights coming from these "Quality" initiatives, I disagree with most of the methods. Commenting and documenting your code is a good thing. Unfortunately, most of these initiatives are nothing more than feel-good BS for clueless management.

      9 - Admins and Tech Writers. Hire all the good ones back. They improve our ability to code by taking on the less technical aspects of our jobs. Their hourly cost is less than ours, and by offloading some of our work to them we have more time to develop the system that management wants done yesterday. This creates more cost-effective development even though it raises headcount.

      10 - Pay. Simple answer: you get what you pay for. If you offer good pay for good programmers, you will get good code in return, provided your managers know their programmers (see #2 above).

      11 - Overtime. Don't do it. An overworked, stressed developer is a poor-quality developer. A little OT before a release isn't terrible, but 50+ hour weeks for months on end will produce poor code. When that pre-release OT does happen, compensate the developer with pay or comp time to keep them happy.

      12 - TEST TEST TEST TEST TEST TEST. Then test some more. Make sure your users test also. This is the most important step.
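
      For item 12, here is a minimal sketch of what "test, then test some more" can
      look like in practice: a small automated check that runs on every build. The
      function and the values are hypothetical, chosen only for illustration.

          #include <assert.h>

          /* stand-in for whatever routine was just written or changed */
          static int order_total(int unit_price_cents, int quantity) {
              return unit_price_cents * quantity;
          }

          int main(void) {
              assert(order_total(250, 4) == 1000);  /* the ordinary case      */
              assert(order_total(250, 0) == 0);     /* a boundary case        */
              assert(order_total(1, 1) == 1);       /* the smallest real case */
              return 0;  /* a failed assert aborts with a non-zero exit,
                            which the build script can treat as a broken build */
          }

      None of this replaces user testing, but it catches regressions before the
      users ever see them.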
  • I don't think that we can simply say that the market has decided against software vendors accepting liability. Part of the problem is that so much software comes from Microsoft, which has refused to assume liability for its software. A company that tried to distinguish itself by selling a product that competed with Microsoft at a higher price in return for better quality assurance and a real warranty would probably not survive, not because people didn't like less buggy software with a warranty but because

  • by Jaime2 ( 824950 ) on Sunday October 09, 2005 @10:02PM (#13753782)
    people demand that it sucks.

    Seriously. For nearly every case, if there are two available pieces of software (OSS or not), most people will choose the one that is more feature rich. Sure, those in a mission critical situation or the poor people that get to install and support the software long-term will demand quality and maintainability. But, those people are far outnumbered by the masses that use software casually.

    So, given a limited set of resources, quality will always be just barely up to what people will tolerate. Yes, even in open source software. Example: Mozilla Thunderbird -- they have a feature schedule out right now, and about half of the planned features are in the current build. Do you think they'll wait until the code is 99.99999% error-free in all situations before committing time to add features? They have no deadlines, no financial burdens, no one telling them to ship the software. Yet they will ship it. If they don't, their user base will desert them entirely and switch to a horrible, buggy alternative (probably Outlook Express). This is simply because people demand cool crap. That's why they buy half the crap they buy; that's why the US has a $250 billion trade deficit with China. We collectively love crap.
  • by quentin_quayle ( 868719 ) <quentin_quayle&yahoo,com> on Sunday October 09, 2005 @10:09PM (#13753812)

    Sure, let's have liability. The software must perform substantially as advertised - counting all advertisements, press releases, interviews given by the publisher's officers, etc. But make the amount of damages simply equal the price paid.

    This would keep free-as-in-beer software in the clear. It would also have the side benefit of forcing Microsoft to reveal its OEM prices. :D

    I like the source-code-as-a-condition-of-immunity suggestion above too, but it would be futile without a licence like those the FSF approves, one that would actually allow you to fix problems without violating copyrights and patents.

  • Bugs and security holes can be as simple as a typo - e.g. if (uid = 0) { instead of if (uid == 0) {.

    Now imagine that the BBC could get sued for every typo that made its way into their news articles. Sounds unappealing? That's essentially the standard this clown is holding software developers to.
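
    A minimal sketch of how that one-character typo plays out in C (the names,
    such as uid and grant_admin, are made up for illustration):

        #include <stdio.h>

        static void grant_admin(void) { puts("admin access granted"); }

        int main(void) {
            int uid = 1000;        /* an ordinary, non-root user             */

            if (uid = 0) {         /* BUG: assigns 0 to uid; the expression  */
                grant_admin();     /* evaluates to 0, so this branch is      */
            }                      /* never taken...                         */

            if (uid == 0) {        /* ...but uid has silently been clobbered */
                grant_admin();     /* to root's id, so every later root      */
            }                      /* check now succeeds                     */
            return 0;
        }

    Most compilers will flag the assignment-in-condition when warnings are enabled
    (GCC and Clang do under -Wall), but nothing forces anyone to heed them.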

  • Software insurance (Score:2, Interesting)

    by click2005 ( 921437 )
    It makes me wonder why no insurance companies offer insurance against losses caused by bad software. House insurance is dependent on having suitable locks and security. Software insurance could be made available on the condition that suitable AV/spyware/firewall software was installed and patched.
  • by Arandir ( 19206 ) on Sunday October 09, 2005 @11:15PM (#13754051) Homepage Journal
    But I still believe that the current situation is unsustainable, and that we should be working harder to improve the quality of the code out there.

    This is a very different thing than legislating mandatory guarantees on software. Yes, we SHOULD be working harder to improve the quality of our code. But not at the price of authoritarian government.

    There are few things in life that are truly a free market, but software comes close. It's no surprise, then, that spoilsports want to come in and regulate it. That happens wherever freedom begins to bloom. Let me clue you in: the marketplace has decided on a low (as in almost non-existent) demand for guarantees and warranties on consumer software. It's not developers doing this, it's the users.
    • Let me clue you in: the marketplace has decided on a low (as in almost non-existent) demand for guarantees and warranties on consumer software. It's not developers doing this, it's the users.

      Which is precisely where regulatory practices are born. I can understand the general point you are making; however, the statement "But not at the price of authoritarian government" is a little over the top. Name one regulatory control that seeks to govern quality rates that has not come about as a result of consumer

  • by Julian Morrison ( 5575 ) on Monday October 10, 2005 @12:26AM (#13754321)
    Computer software has been mostly unregulated. This has allowed us to watch the "invisible hand" of the market in its purest form. Commodity programs have disclaimers; buy bespoke and you get guarantees; pay yet more and you get formally certified code. The cost of risk and the cost of the program are in effect two separate purchases - product and insurance.

    If you force programmers to carry the risk cost, you don't magically get bug-free code. You just delete the no-guarantees market. In effect you're forcing programmers to bundle insurance with every installation. "Free" disappears. "Libre" might survive in an attenuated form - edit open source code and you become the liability carrier. You might do it in-house, but few could afford to publish.

    The guy points out that other industry sectors have this sort of law. Yup, they do, and I contend we're all worse off as a result. Amateurs are frozen out because they can't afford to jump through the insurance hoops. Innovation is stifled. Saleable skills are wasted. Personal self-expression is denied. Even though all parties are willing, the law stands in between saying "no". This is nothing to emulate!

    Nanny liberals would contend they are protecting buyers from risk. As an adult you have to accept that the universe has dangers. You can't wish it safe, and the utopia of your childhood was an illusion. Who, then, is best placed to decide when you should gamble and when you should hedge? Philosophically, no action can be said to be "better" or "worse" without reference to a person whose goals it serves or thwarts. No person can know another's mind. Therefore, you alone are properly placed to weigh the options and decide on your own behalf. At best, a law commands you to take what would have been your best choice anyway. At worst, it bans it. Neutral or harmful, and (given diversity) certain to be harmful to some. This is why regulation is never better than a free market, even in risk.
