
Barbara Liskov Wins Turing Award

jonniee writes "MIT Professor Barbara Liskov has been granted the ACM's Turing Award. Liskov, the first US woman to earn a PhD in computer science, was recognized for helping make software more reliable, consistent and resistant to errors and hacking. She is only the second woman to receive the honor, which carries a $250,000 purse and is often described as the 'Nobel Prize in computing.'"
  • Turing test (Score:5, Funny)

    by ignishin ( 1334571 ) on Tuesday March 10, 2009 @02:32PM (#27139395)
    Does this mean she passed the Turing test?
  • by Em Emalb ( 452530 ) <ememalbNO@SPAMgmail.com> on Tuesday March 10, 2009 @02:32PM (#27139401) Homepage Journal

    I bet she has some stories from "the old days" of being about the only female geek around.

    Good for her.

  • by interkin3tic ( 1469267 ) on Tuesday March 10, 2009 @02:32PM (#27139413)

    She is only the second woman to receive the honor, which carries a $250,000 purse and is often described as the 'Nobel Prize in computing.'

    Did they give $250,000 wallets to the men who won previously?

  • by Chris Mattern ( 191822 ) on Tuesday March 10, 2009 @02:38PM (#27139499)

    ...we can't tell her apart from a computer over a teletype link?

    No, wait...

  • by Baldrson ( 78598 ) * on Tuesday March 10, 2009 @02:41PM (#27139539) Homepage Journal
    Liskov says: "Today the field is on a very sound foundation."

    If only it were true.

    I recall, in fact, the point in time when I first ran across Liskov's CLU while working on one of the first commercial distributed computing environments for the mass market, VIEWTRON, and determining that the real problem with distributed programming was finding an appropriate relational formalism.

    We're still struggling with the object-relational impedance mismatch today. The closest we have come to a "solid basis" for computer science is a general field of philosophy called "structural realism [wordpress.com]", which attempts to find the proper roles of relations vs. relata in creating our models of the world.

    If anything, our descriptions should be "relations all the way down" unless we can find a good way, as some are attempting, to finally unify the two concepts as conjugates of one another.

  • Coincidentally (Score:5, Informative)

    by counterplex ( 765033 ) on Tuesday March 10, 2009 @02:43PM (#27139577) Journal
    I happen to have a printout of an article on "The Liskov Substitution Principle" and was wondering just yesterday how it is that as programmers we use these principles in everyday life yet don't know their names or the stories of how they came about. As the first US woman to earn a PhD in CS, she surely has some interesting stories to tell.

    For those who might not have her original text handy, the Liskov Substitution Principle states (rather obviously):

    If for each object o1 of type S there is an object o2 of type T such that for all programs P defined in terms of T, the behavior of P is unchanged when o1 is substituted for o2, then S is a subtype of T.

    which, when stated in the words of Robert "Uncle Bob" Martin as something we probably all intuitively understand from our daily work, is:

    Functions that use pointers or references to base classes must be able to use objects of derived classes without knowing it.
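
    A classic way to see the principle in action is the Rectangle/Square trap. The following is only an illustrative sketch (the class names and the little test are mine, not from Liskov's or Martin's papers):

    // Hypothetical illustration of an LSP violation.
    class Rectangle {
        protected int width, height;
        public void setWidth(int w) { width = w; }
        public void setHeight(int h) { height = h; }
        public int area() { return width * height; }
    }

    // Square keeps its sides equal, and thereby changes the behavior
    // that clients were promised by Rectangle.
    class Square extends Rectangle {
        @Override public void setWidth(int w) { width = w; height = w; }
        @Override public void setHeight(int h) { width = h; height = h; }
    }

    class Client {
        // Written against Rectangle's contract: after the two setters,
        // area() should be 5 * 4 = 20.
        static int resize(Rectangle r) {
            r.setWidth(5);
            r.setHeight(4);
            return r.area();
        }

        public static void main(String[] args) {
            System.out.println(resize(new Rectangle())); // 20
            System.out.println(resize(new Square()));    // 16
        }
    }

    Substituting a Square where a Rectangle is expected changes the behavior of resize(), so by the definition above Square is not a subtype of Rectangle, whatever the class hierarchy says.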

    • Sadly, too many people still think it's a guideline, not a rule. Sorry: if your code violates the LSP, you've got a bug; it just hasn't bitten you yet.

      She deserves recognition for the vast number of latent defects she has effectively removed from the world's software with the LSP alone. I'm glad she got the award.

      • by geekoid ( 135745 )

        "if your code violates the LSP, you've got a bug, it just hasn't bitten you yet. ..."

        False.

        • Re: (Score:3, Interesting)

          by Catiline ( 186878 )

          "if your code violates the LSP, you've got a bug, it just hasn't bitten you yet. ..."

          False.

          Proof, please; you are contesting an award-winning theory, and I for one side with prevailing theory until further evidence is provided.

          • Re: (Score:3, Interesting)

            by Workaphobia ( 931620 )

            What are you talking about? It's not a theory, it's a definition. One specific definition out of several, I would imagine. Moreover, nothing in it says "OBEY ME OR YOU'RE BUGGED"; it's a very good guideline for sensible program design, but it has nothing to do with the completely independent definition of a bug in the general sense. Of course, you can always *make* the LSP part of your design criteria so that violations of it do in fact constitute bugs for your project, but that's a different matter.

            If for each object o1 of type S there is an object o2 of type T such that for all programs P defined in terms of T, the behavior of P is unchanged when o1 is substituted for o2, then S is a subtype of T.

            If you

          • How about a billion dollars for violating LSP? http://developers.slashdot.org/article.pl?sid=09/03/03/1459209 [slashdot.org] If you think about it, that's exactly what Tony Hoare's mistake was: a violation of LSP. Sometimes the thing pointed to by "T *" does not behave as an instance of T. When it doesn't (as all too often it doesn't), Bad Shit happens. Sorry, a type-checking compiler can't help you, thanks to Hoare's mistake.
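
            To make that concrete, here is a minimal Java sketch (the lookup method is hypothetical) of a value that satisfies the type checker without behaving as an instance of its declared type:

            class NullAsLspViolation {
                // Hypothetical lookup that may return null; the compiler
                // still types the result as String.
                static String lookup(boolean found) { return found ? "hit" : null; }

                public static void main(String[] args) {
                    String s = lookup(false);
                    // Compiles fine, but throws NullPointerException at
                    // runtime: the value typed as String does not act
                    // like a String.
                    System.out.println(s.length());
                }
            }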
      • by vishbar ( 862440 )

        Okay, so, stupid question. Wouldn't the very existence of virtual methods violate this principle? For example, if I have a method:

        void notifyUsers(Publisher pub) { pub.publish(); }

        where class Publisher contains a void method publish(). You have two subclasses of Publisher: EmailPublisher and SmsPublisher (functionality should be obvious, let's just say that it sends a message to users via email/sms). Would the behavior not be different based on whether I passed either of those two subtypes (assuming

        • Re: (Score:2, Informative)

          My understanding is that OP's formulation of the LSP just won't work, because the term "behavior" is too broad. (For reference, here's the formulation I'm referring to: "If for each object o1 of type S there is an object o2 of type T such that for all programs P defined in terms of T, the behavior of P is unchanged when o1 is substituted for o2, then S is a subtype of T.")

          The Wikipedia article [slashdot.org] has a better formulation, in terms of properties provable of objects of the types in question (which I suspect
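
          To see how this resolves the parent's Publisher question, here is a sketch in contract terms (the informal contract wording is an assumption, not a quote from any formulation): LSP constrains the promised behavior, not the concrete side effects. Both subclasses below honor Publisher's promise that a notification gets delivered, so no property a caller can rely on is lost by the substitution:

          // Informal contract: publish() delivers the notification to users.
          abstract class Publisher {
              abstract void publish();
          }

          class EmailPublisher extends Publisher {
              // Different channel, same fulfilled promise.
              @Override void publish() { System.out.println("email sent"); }
          }

          class SmsPublisher extends Publisher {
              @Override void publish() { System.out.println("SMS sent"); }
          }

          class Notifier {
              // Depends only on the contract, not the delivery channel.
              static void notifyUsers(Publisher pub) { pub.publish(); }

              public static void main(String[] args) {
                  notifyUsers(new EmailPublisher()); // email sent
                  notifyUsers(new SmsPublisher());   // SMS sent
              }
          }

          A subclass whose publish() silently did nothing, or threw for inputs the base class accepts, would weaken that promise and violate the principle.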

    • Re:Coincidentally (Score:4, Insightful)

      by shutdown -p now ( 807394 ) on Tuesday March 10, 2009 @04:55PM (#27141641) Journal

      I happen to have a printout of an article on "The Liskov Substitution Principle" and was wondering just yesterday how it is that as programmers we use these principles in everyday life yet don't know their names or the stories of how they came about.

      To be honest, I would consider anyone who does not know what LSP is to be OO-ignorant, even if (s)he does code in an OO language. It is a very fundamental rule, much more important than all the fancy design patterns. I guess it's possible to "invent" it on your own, just as it's possible to normalize databases without remembering, or even knowing about, the strict NF definitions, but in either case chances are high you'll get it wrong eventually.

      • Really, the LSP is the kernel of wisdom behind most design patterns. If you already know OO design, DPs form a nice augmenting toolbox. But you could always craft your own design for your specific situation. It may very well be that this overlaps with an existing design pattern - just further proof that you didn't need the DP to begin with.

        • Re: (Score:3, Insightful)

          On a side note... I remember a lot of people complaining about how Java 5 generics were oh-so-unobvious and hard when it came to lower and upper bounds etc. Meanwhile, the topic "why can't I cast List<string> to List<object>? This must be broken!" is recurring on microsoft.public.dotnet.languages.csharp probably about every two weeks. Both cases are absolutely trivial when one understands and carefully applies LSP to the problem.

          Which, I guess, just shows that many Java and .NET programmers don't really understand the
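
          For reference, a minimal Java sketch of why the compiler refuses the cast, and how an upper-bounded wildcard applies LSP correctly (the variable names are illustrative):

          import java.util.ArrayList;
          import java.util.List;

          class VarianceDemo {
              public static void main(String[] args) {
                  List<String> strings = new ArrayList<>();

                  // List<Object> objects = strings;   // rejected by the compiler:
                  // if that line were allowed, the next one would put an
                  // Integer into a List<String> with no cast in sight.
                  // objects.add(Integer.valueOf(42));

                  // The LSP-consistent alternative: an upper-bounded wildcard
                  // gives a covariant, effectively read-only view.
                  List<? extends Object> readOnly = strings;
                  // readOnly.add("x");   // rejected; writes would be unsound
                  System.out.println(readOnly.size()); // 0
              }
          }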

      • by Raenex ( 947668 )

        (s)he

        This just sucks. How do you even pronounce that when reading aloud? "Singular they" [wikipedia.org] sounds so much more natural.

      • Re: (Score:3, Interesting)

        I survived 8 years as an OO programmer without hearing this term. Even then, only as a question on a recruitment agency test.

        So either I went to the wrong university, and consistently the wrong employers, or it's one of those self-evident principles that just didn't have a name before Barbara turned up.

        • Re:Coincidentally (Score:5, Insightful)

          by shutdown -p now ( 807394 ) on Wednesday March 11, 2009 @03:37AM (#27147391) Journal

          So either I went to the wrong university, and consistently the wrong employers

          Employers aren't there to teach you these things, and a lot don't care so long as you crank out code that mostly works (and, let's face it, it's not always easy for them to tell, and the concepts of "formal correctness", "readability" and "maintainability" are often not even on their radar, unless a developer brings them up). And as for university, it may well be. If they had an OOD course and never mentioned it, or at least described it in a formal way, then they wasted your time.

          Of course, it has somehow become an established norm that the "object-oriented design" course at a university is basically just applied Java programming; from what I've seen, the best you can expect from a typical graduate when it comes to OOD theory is to be able to recite "encapsulation, inheritance, polymorphism" when asked what OO even is. It's especially ironic as only one of those three is actually a required ingredient, and even that one is wrongly named. The very notion of MI gets people educated that way thoroughly confused when they first meet it, and you can forget about multimethods...

          or it's one of those self-evident principles that just didn't have a name before Barbara turned up.

          It was formulated [acm.org] by her in 1987. Unless your 8 years were mostly in Simula or Smalltalk, it did have a name by the time you started working with OO. Definitely so if you've been doing Java or .NET.

          As for self-evidence... I sort of wish it was, and it really is very simple conceptually, but the fact that so many people still get it wrong over and over again shows that, apparently, it's not all that self-evident for your typical coder.

          • The point is that any developer over 38 or thereabouts would not have heard about this while at university back then (I know I didn't).

            It is like demanding that all physicists become conversant with relativity theory in the 1920s...

    • I love the substitution principle. I don't understand the extent of her contributions to the field: particularly, how novel they were and what the world of computer science would be like today if she never existed. Anyone care to enlighten me?

  • 1968 (Score:5, Informative)

    by MoellerPlesset2 ( 1419023 ) on Tuesday March 10, 2009 @02:50PM (#27139695)

    Since it's not in the article, I looked it up. She got her PhD in 1968.

    I initially thought that kind of sucked (Cambridge's 'Diploma in Computer Science' has been awarded since 1954), but apparently the first US PhD in CS named as such was in 1965 (University of Pennsylvania).

    The field could still use more women though.

    • Re: (Score:2, Insightful)

      > The field could still use more women though.

      Why?

      Do you complain that we need more pregnant men also?

      • Re:1968 (Score:5, Insightful)

        by MoellerPlesset2 ( 1419023 ) on Tuesday March 10, 2009 @03:51PM (#27140631)

        Why? Do you complain that we need more pregnant men also?

        Men aren't capable of becoming pregnant. I, however, happen to believe women are just as capable of being good computer scientists as men are.
        The fact that only a small minority of computer scientists are women means that upwards of half our best CS talent is going to waste.

        I think that's a pity.

        • Re: (Score:2, Troll)

          The fact that only a small minority of computer scientists are women means that upwards of half our best CS talent is going to waste.

          Either that, or most of our women are much too sensible to waste time in a field like CS.

          In other words, just because you think a field is important doesn't mean that everyone agrees with you.

        The fact that only a small minority of computer scientists are women means that upwards of half our best CS talent is going to waste.

          I think that's a pity.

          Is it a pity that upwards of half of our best nail salon talent is going to waste as well?

      • Re: (Score:2, Insightful)

        by Kerrigann ( 1401847 )

        I don't know why in recent times there is such a disparity of men and women in C.S., but I imagine your attitude might have something to do with it...

        I really don't know how to respond to this... you're either trying to be funny (but got modded +3 insightful), or are seriously trying to imply that a woman who's good at C.S. is as much of a freak as a pregnant man.

        All of a sudden I feel very alone in this field.

        • Re: (Score:2, Interesting)

          > you're either trying to be funny (but got modded +3 insightful), or are seriously trying to imply that a woman who's good at C.S. is as much of a freak as a pregnant man.

          Try neither. I had no emoticon, and I had no implications. I just asked a simple question, in order to find out where this assumption is coming from.

          I have seen this slippery-slope type of political censorship before, and all it does is lead to reverse discrimination. Replace "women" with the ethnicity / religion / X of your choice.

          This c

      • Re: (Score:3, Interesting)

        by Lunzo ( 1065904 )

        The field could still use more women though.

        Better rhetorical questions:
        * Do you complain we need more male nurses?
        * Do you complain we need more male teachers?
        * Do you complain we need more female garbage collectors?

        Gender equality is not the same as having a 50/50 male/female split in every field.

    • by pjt33 ( 739471 )

      Cambridge's 'Diploma in Computer Science' has been awarded since 1954

      It would be more accurate to say "was first awarded in..." because they recently shut it down after years of trying without much success to keep up the numbers.

    • Re: (Score:3, Insightful)

      by geekoid ( 135745 )

      "The field could still use more women though."

      Why?
      Not that there shouldn't be, but you are blindly stating something without any argument.
      What does a woman bring that a man doesn't? Or vice versa?

      When we can determine that, then maybe we can find out why the field continues to attract so few women. Even in the presence of programs that push very hard to get doors open and to give women priority in education, there just aren't a lot of women.

      I am genuinely interested in why.
      The more I think about it, the more

      • Re:1968 (Score:5, Funny)

        by turing_m ( 1030530 ) on Tuesday March 10, 2009 @06:39PM (#27142899)

        Why?

        Just a guess, but maybe his tastes don't lean towards guys with beards and questionable personal hygiene.

      • Re: (Score:2, Insightful)

        by Charan ( 563851 )

        What does a woman bring that a man doesn't? Or vice versa?

        A different perspective. And maybe a less-confrontational attitude.

      • Your reply is a perfect example of why we need more of them.

        Only they know all the reticence and outright discrimination suffered by women in the workplace, sometimes disguised as "curiosity", funnily enough.

  • by erroneus ( 253617 ) on Tuesday March 10, 2009 @02:54PM (#27139743) Homepage

    Software is ALWAYS reliable. It is the code that people write that sucks.

    I don't know how many people come from the "old school" of programming, but when I started, we didn't have all these libraries to link to. When we wanted a function to happen, we wrote it. And when we wrote it, we checked for overflows, underflows, error status and illegal input. We didn't rely on what few functions that already existed.

    Most fatal program flaws are ridiculously easy to prevent, but bad programming habits prevail, and short of creating some human-language interpreter that writes code as it should be written, nothing will fix the problem of lazy programmers who trust library functions too much. And yes, I know about deadlines and not having time to waste and all that stuff. But there is something most people are also missing: pride! I know that when I do something, I am putting my name on it, whether directly or otherwise. And if my name gets associated with something, I make damned sure that it works and is of good quality. With the stuff that goes out these days (especially SQL injection?! PLEASE! What could be more fundamental than screening acquired text data for illegal characters and lengths?!) it is clear that pride in one's own work is not something that commonly exists.
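
    To be concrete about the SQL injection point: beyond screening input, the standard fix is to bind user data as parameters so it can never change the shape of the statement. A minimal JDBC sketch (the table and column names here are hypothetical):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    class UserLookup {
        // The user-supplied name is bound as a parameter and never spliced
        // into the SQL text, so input like "'; DROP TABLE users; --" stays
        // inert data instead of becoming part of the statement.
        static ResultSet findUser(Connection conn, String name) throws SQLException {
            PreparedStatement ps =
                conn.prepareStatement("SELECT id, name FROM users WHERE name = ?");
            ps.setString(1, name);
            return ps.executeQuery();
        }
    }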

    For those of you out there who agree with me, it probably doesn't apply to you. For those that disagree, tell me why? Why is a programming error FIXABLE but not PREVENTABLE?

    • by Anonymous Coward on Tuesday March 10, 2009 @02:58PM (#27139813)

      Software is ALWAYS reliable. It is the code that people write that sucks.

      No, computers are reliable. They'll do exactly what you tell them to do. Software, however, sucks, since it is simply a representation of the code that people write, which also sucks.

    • Re: (Score:2, Insightful)

      but when I started, we didn't have all these libraries to link to. When we wanted a function to happen, we wrote it.

      Functions? Back when I started, we didn't have functions. We had jump instructions.

      You kids and your newfangled 'functions' and 'libraries'. Now get off my lawn!

      • We just had dirt. Young whipper-snapper!

      • Re: (Score:3, Interesting)

        by Ardeaem ( 625311 )

        Functions? Back when I started, we didn't have functions. We had jump instructions.

        When I first learned to program as a kid, I taught myself how to write pseudo-functions using goto. I looked back at those programs a few years later, and they were completely unreadable. Now, my wife does a little programming on the side (we're both researchers) and she loves goto. I keep trying to tell her NOT TO USE GOTO, but she never listens. It's painful to read her code. I think we might need counseling for this...

      • Re: (Score:3, Funny)

        by spacefiddle ( 620205 )

        Now get off my lawn!

        10 PRINT LAWN
        20 GOTO CURB

    • by n3tcat ( 664243 )
    As with all things, pride comes at a cost. If it's worth it to you to "finish" fewer projects at the expense of fewer dollars, good for you. I'm with you on that one. But I totally respect anyone else's decision to go balls to the wall and ignore all proper coding technique so they can Get The Job Done and get paid. Especially with the current economic climate.
    • "And when we wrote it, we checked for overflows, underflows, error status and illegal input."

      And you also wrote bugs that you didn't immediately realize you were creating.

      Not all bugs are due to laziness. Sometimes people make mistakes, or misunderstand requirements, or the operating parameters change, or, or, or...

    • Re: (Score:3, Insightful)

      I don't know how many people come from the "old school" of programming, but when I started, we didn't have all these libraries to link to. When we wanted a function to happen, we wrote it. And when we wrote it, we checked for overflows, underflows, error status and illegal input. We didn't rely on what few functions that already existed.

      That's great. Now that you guys built up the roads, bridges, and traffic lights... The rest of us are interested in actually using them to GET SOMEWHERE.

      Rewriting OpenGL, a

    • by oGMo ( 379 )

      Software is ALWAYS reliable. It is the code that people write that sucks.

      Yeah right. Even a simple ADD instruction will give the wrong result when the hardware fails. And hardware will fail.

      Software isn't "reliable" by nature; it's only as reliable as it can be. "Those damn kids and their fancy functions" isn't the problem. The problem is fundamental complexity; no magic wand will make that go away.

      For those that disagree, tell me why? Why is a programming error FIXABLE but not PREVENTABLE?

      Sure, you can w

      • Re: (Score:2, Informative)

        by jc42 ( 318812 )

        Even a simple ADD instruction will give the wrong result when the hardware fails.

        True, but the reality is much worse than that. A simple ADD instruction will also give a wrong result, on all current "popular" CPUs, when the hardware is working exactly as designed.

        To the people who design CPUs, adding two positive integers and getting a negative result is exactly what the hardware should do in some cases, depending on the values of the integers. This wreaks havoc with software designs that assume the mathe

        • The difference between the hardware failing and a two's complement number overflowing is that in the former case the software can do absolutely nothing to guard against the non-zero possibility of undefined behavior, while the latter behavior is completely predictable in principle. Whether it's easy is another matter (I would argue "Yes: stop being a pansy" ;) ).
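
          In Java, for instance, the predictable check even ships in the standard library (Math.addExact, since Java 8); a small sketch:

          class OverflowDemo {
              public static void main(String[] args) {
                  int a = Integer.MAX_VALUE;
                  int b = 1;
                  // Two's complement wraps silently: prints -2147483648.
                  System.out.println(a + b);
                  try {
                      // Math.addExact throws instead of wrapping.
                      System.out.println(Math.addExact(a, b));
                  } catch (ArithmeticException e) {
                      System.out.println("overflow detected: " + e.getMessage());
                  }
              }
          }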

      • Even a simple ADD instruction

        10 POKE RITALIN
        20 GOTO WHEEE

        ... i'll get me coat.

      • Sure, you can write provably error-free code... but you have to solve the halting problem [wikipedia.org] first.

        No, all the Halting Problem does is prevent you from writing an algorithm to prove that programs written in a Turing-complete language are error-free. This is due to Rice's Theorem, which casts a similar shadow on any other problem that tries to determine a non-trivial property of a Turing machine's language.

        You can still write specialized programs in weaker non-Turing-complete languages if you badly need automated proofs of correctness. Of course, you could also eliminate the automation requirement and jus

    • by geekoid ( 135745 )

      Because the spec wasn't written well.

      I'm talking about millions of lines here; software that effectively gets the programmers compartmentalized, because no human can track everything everyone is doing at the same time.
      So a poorly defined document dictating the layers is a bug waiting to happen.

      Now some bugs are inexcusable. Divide by 0 crashes spring to mind as the most glaringly obvious inexcusable bug.

      • Re: (Score:3, Insightful)

        by Workaphobia ( 931620 )

        Now some bugs are inexcusable. Divide by 0 crashes spring to mind as the most glaringly obvious inexcusable bug.

        Why is everyone talking about all these obvious elementary bugs? Division by zero, integer overflow, illegal input... I haven't even heard anyone mention anything as sophisticated as a NULL pointer dereference, much less a memory deallocation problem or God forbid a race condition.

        It's just that everyone around this thread is trying to have this profound discussion of fundamental software errors, yet talks like they're in a high school programming class.

    • when I started, we didn't have all these libraries to link to. When we wanted a function to happen, we wrote it. And when we wrote it, we checked for overflows, underflows, error status and illegal input. We didn't rely on what few functions that already existed.

      Have fun coding a modern operating system from scratch. Monolithic all-original code might work fine when you're working in tens of kilobytes, but we're in a different world now. And a different economy - why reinvent when the future (present too I guess) is about linking components together?

      For those of you out there who agree with me, it probably doesn't apply to you. For those that disagree, tell me why? Why is a programming error FIXABLE but not PREVENTABLE?

      It's both fixable and preventable. But at what cost? You mention SQL injection errors - that's the kind of thing that I'd be pissed off about, and maybe we can expect more from vendors on that front. But I have higher g

  • ...which carries a $250,000 purse.

    A woman's purse!! [youtube.com]
  • by Eil ( 82413 )

    Liskov, the first US woman to earn a PhD in computer science, was recognized for helping make software more reliable, consistent and resistant to errors and hacking.

    Clearly, she's never worked for Microsoft.

    Zing!

  • Liskov is a horrible author, and given my experience with her thoughts from "Program Development in Java," I would guess she is a horrible coder as well. Don't be conned into buying books based on an award; her works are conflicting where they aren't simply wrong.
    • Re: (Score:2, Funny)

      by geekoid ( 135745 )

      HAHAHahahhaha...
      Someone with your sig has the gall to write that about a book?

      Irony is rich today.

      Oh, and how about an example of where she is wrong? I don't think I have ever read her stuff, but I would like to see an example of what you are talking about.

    • I'm surprised this comment didn't cause mysql, or whatever back-end /. uses, to die in a huge embarrassment overload.

      I look forward to next week on /.: Quantum Electrodynamics: Feynman's work is "conflicting where not simply wrong" and other inspiring comments.

  • by addininja ( 1496577 ) on Tuesday March 10, 2009 @05:01PM (#27141701)
    Frances E. Allen got the Turing Award in 2006: http://en.wikipedia.org/wiki/Frances_E._Allen [wikipedia.org] http://en.wikipedia.org/wiki/Turing_Award [wikipedia.org]
  • by dpigott ( 793386 ) on Wednesday March 11, 2009 @01:20AM (#27146609)

    CLU drew on the lessons learned with both Alphard and VERS. Alphard was from CMU, written by Wulf and Shaw (Wulf also wrote the famous BLISS). VERS was made by Jay Earley, whose parser was hugely important in all the compilers of the time. The language itself (and its V-graphs) was heavily influenced by the Mem-theory of Anatol Holt, who was on the Algol committee and was a principal in designing the astonishing GP and GPX systems for the UNIVAC, the first languages to explicitly feature ADTs per se. That became ACT, the adaptable programming system for the Army's Fieldata portable computers (portable in a completely different sense to the modern usage). He also hated Unicode, but that was a rival programming system back then, so reading the reports of the time can be misleading: "don't use Unicode on Portable Computers!" Holt's ideas permeate computing; the notion of making any system of data representation as abstract as possible goes back to him.

    CLU was written using MDL (pronounced "muddle"), which was a proto-replacement for LISP that featured ADTs. MDL was co-written by Sussman of LISP fame as a basis for PLANNER, which became Scheme. Perhaps more geekily interesting is that it was also used for writing ZIL (and if you don't know about ZIL, you shouldn't be reading Slashdot).

    CLU evolved into Argus, but the ideas were also used in the Theta programming system for the Thor OO database, and also in PolyJ, which was (as the name suggests) a polymorphic Java.

    Another fascinating development of the CLU ideas was the SPIL system that Liskov co-wrote at the USAF-sponsored MITRE Corp., which was in turn used for writing the VENUS operating system.

    Liskov has pioneered the notion of abstraction per se in language design for 40 years, and this generics-based approach is now taken for granted. She fully deserved the award for her insights, as well as for her determination in fighting the reductionism represented by previous recipients (e.g. Dijkstra) although opposed by others (e.g. Iverson).

    I have extracts from reports for all of these languages on the HOPL website, HOPL.murdoch.edu.au (too many URLs to paste in individually). Find CLU at http://hopl.murdoch.edu.au/showlanguage.prx?exp=637&name=CLU [murdoch.edu.au] and follow the genealogy links. And if you haven't yet seen my 4000-strong programming-language family tree, it is worth printing out as wallpaper if you have an A1 plotter.
