Barbara Liskov Wins Turing Award 187
jonniee writes "MIT Professor Barbara Liskov has been granted the ACM's Turing Award. Liskov, the first US woman to earn a PhD in computer science, was recognized for helping make software more reliable, consistent and resistant to errors and hacking. She is only the second woman to receive the honor, which carries a $250,000 purse and is often described as the 'Nobel Prize in computing.'"
Turing test (Score:5, Funny)
Re:Turing test (Score:5, Funny)
I hope not. MIT professors are not human.
Re: (Score:3, Interesting)
While there's no doubting her accomplishments, I will say that my enjoyment of 6.170 was in spite of her.
Not surprised.
I had no use for her in 1978. She actively assisted in flunking a good friend of mine out of the PhD program. She turned down a thesis idea I had, called it "totally the wrong direction" -- and three years later a guy got a PhD and an award with the same idea at Waterloo.
Maybe she mellowed with age, but given your comment, I guess not.
Re: (Score:2, Funny)
At MIT, they give the test to the professors and the award to the machines. Yeeeaaahhh [instantrimshot.com]
Re:Turing test (Score:4, Funny)
http://xkcd.com/329/ [xkcd.com]
Re:Turing test (Score:4, Funny)
Re: (Score:2)
I guess that means she can halt her research now. But it's hard to say for sure. If she kept going and won it again she'd be a Turing machine...
Good for her... (Score:5, Funny)
I bet she has some stories from "the old days" of being about the only female geek around.
Good for her.
More women in the old days (Score:5, Interesting)
Apparently there were far more women in computing in "the old days". The dominance of the male geeks is a relatively recent phenomenon.
Re:Good for her... (Score:4, Funny)
Yeah, there are like twice as many now.
Re: (Score:2)
There weren't any geeks in those days, just nerds.
In technology, there were certainly circus geeks.
And that 'comic book' guy? He was a dork.
But there wasn't the market to have a geek in the way we think of it now.
Only the women (Score:2)
We judge all of them on their appearance. Deny it all you want but it's true. Women do it as well.
There are only two kinds of women...
Re: (Score:2)
...the kind with PhDs in computer science and the kind without.
Purses and wallets? (Score:5, Funny)
She is only the second woman to receive the honor, which carries a $250,000 purse and is often described as the 'Nobel Prize in computing'
Did they give $250,000 wallets to the men who won previously?
Re:Purses and wallets? (Score:5, Funny)
Re: (Score:3, Insightful)
So does that mean... (Score:3, Funny)
...we can't tell her apart from a computer over a teletype link?
No, wait...
Relations all the way down (Score:4, Informative)
If only it were true.
I recall, in fact, the point when I first ran across Liskov's CLU, in the context of working on one of the first commercial distributed computing environments for the mass market, VIEWTRON, and determining that the real problem with distributed programming was finding an appropriate relational formalism.
We're still struggling with the object-relational impedance mismatch today. The closest we have come to a "solid basis" for computer science is a general field of philosophy called "structural realism [wordpress.com]", which attempts to find the proper roles of relations vs relata in creating our models of the world.
If anything, our descriptions should be "relations all the way down" unless we can find a good way, as some are attempting, to finally unify the two concepts as conjugates of one another.
Yes (Score:3, Informative)
Re: (Score:2)
Coincidentally (Score:5, Informative)
For those who might not have her original text handy, the Liskov Substitution Principle states (rather obviously):
which, when stated in the words of Robert "Uncle Bob" Martin as something we probably all intuitively understand from our daily work, is:
LSP: it's not a guideline, it's a rule. (Score:5, Interesting)
She deserves recognition for the vast number of latent defects she has effectively removed from the world's software with the LSP alone. I'm glad she got the award.
Re: (Score:2)
"if your code violates the LSP, you've got a bug, it just hasn't bitten you yet. ..."
False.
Re: (Score:3, Interesting)
Proof, please; you are contesting an award-winning theory, and I for one side with prevailing theory until further evidence is provided.
Re: (Score:3, Interesting)
What are you talking about? It's not a theory, it's a definition. One specific definition out of several, I would imagine. Moreover, nothing in it says "OBEY ME OR YOU'RE BUGGED"; it's a very good guideline for sensible program design, but it has nothing to do with the completely independent definition of a bug in the general sense. Of course, you can always *make* the LSP part of your design criteria so that violations of it do in fact constitute bugs for your project, but that's a different matter.
If for each object o1 of type S there is an object o2 of type T such that for all programs P defined in terms of T, the behavior of P is unchanged when o1 is substituted for o2 then S is a subtype of T
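As a hypothetical illustration of the definition quoted above (not from the thread): the classic Rectangle/Square pair satisfies the compiler's notion of subtyping yet fails the substitution property, because the override changes behavior observable through the parent's interface. All class names here are made up for the sketch.

```java
// Hypothetical sketch: a subtype that the type checker accepts
// but that violates the substitution property quoted above.
class Rectangle {
    protected int w, h;
    Rectangle(int w, int h) { this.w = w; this.h = h; }
    void setWidth(int w)  { this.w = w; }
    void setHeight(int h) { this.h = h; }
    int area() { return w * h; }
}

// Square "is a" Rectangle geometrically, but preserving its invariant
// (w == h) alters behavior visible through Rectangle's interface.
class Square extends Rectangle {
    Square(int s) { super(s, s); }
    @Override void setWidth(int w)  { this.w = w; this.h = w; }
    @Override void setHeight(int h) { this.w = h; this.h = h; }
}

public class LspDemo {
    // A program "defined in terms of" Rectangle, per the definition above.
    static int stretch(Rectangle r) {
        r.setWidth(5);
        r.setHeight(4);
        return r.area(); // Rectangle's contract suggests 20
    }
    public static void main(String[] args) {
        System.out.println(stretch(new Rectangle(1, 1))); // prints 20
        System.out.println(stretch(new Square(1)));       // prints 16: behavior changed
    }
}
```

Substituting a Square changes the observable behavior of `stretch`, so under the definition quoted above, Square is not a behavioral subtype of Rectangle even though Java accepts the inheritance.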
If you
Proof, hmm, would you like a price tag on that? (Score:4, Insightful)
Re: (Score:2)
Okay, so, stupid question. Wouldn't the very existence of virtual methods violate this principle? For example, if I have a method:
void notifyUsers(Publisher pub) { pub.publish(); }
where class Publisher contains a void method publish(). You have two subclasses of Publisher: EmailPublisher and SmsPublisher (functionality should be obvious, let's just say that it sends a message to users via email/sms). Would the behavior not be different based on whether I passed either of those two subtypes (assuming
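No, under the usual reading of the LSP: a subclass only has to preserve the contract the caller relies on, not produce bit-identical effects. A minimal sketch (the class names come from the comment above; the method bodies and the "receipt" contract are hypothetical stand-ins):

```java
abstract class Publisher {
    // Hypothetical contract callers rely on: deliver msg, return a non-null receipt.
    abstract String publish(String msg);
}

class EmailPublisher extends Publisher {
    String publish(String msg) { return "email:" + msg; }
}

class SmsPublisher extends Publisher {
    String publish(String msg) { return "sms:" + msg; }
}

public class NotifyDemo {
    // Defined purely in terms of Publisher's contract. Any subtype that
    // honors the contract is substitutable: the delivery channel differs,
    // but nothing this code depends on changes.
    static String notifyUsers(Publisher pub) {
        String receipt = pub.publish("hello");
        if (receipt == null) throw new IllegalStateException("contract broken");
        return receipt;
    }

    public static void main(String[] args) {
        System.out.println(notifyUsers(new EmailPublisher())); // email:hello
        System.out.println(notifyUsers(new SmsPublisher()));   // sms:hello
    }
}
```

A violation would be a subclass whose publish() returned null or threw for inputs the base contract accepts — that would change behavior the caller can observe through the base type.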
Re: (Score:2, Informative)
My understanding is that OP's formulation of the LSP just won't work, because the term "behavior" is too broad. (For reference, here's the formulation I'm referring to: "If for each object o1 of type S there is an object o2 of type T such that for all programs P defined in terms of T, the behavior of P is unchanged when o1 is substituted for o2 then S is a subtype of T.")
The Wikipedia article [slashdot.org] has a better formulation, in terms of properties provable of objects of the types in question (which I suspect
Re: (Score:2)
But without it you'll end up with situations where a function will not accept a pointer to a triangle, because it will only take pointers to data type "polygon" from which triangle is derived.
If the rule is correctly applied, the relationships between base and derived classes should also be a lot clearer. If it doesn't make sense for a derived type to be passed as a base type, you've probably made a wrong d
I have nothing against LSP (Score:2)
"But without it you'll end up with situations where a function will not accept a pointer to a triangle, because it will only take pointers to data type "polygon" from which triangle is derived."
Yes, I understand the implications.
"If the rule is correctly applied, the relationships between base and derived classes should also be a lot clearer. If it doesn't make sense for a derived type to be passed as a base type, you've probably made a wrong decision in your code design."
There are no "wrong decisions" unle
Re: (Score:2)
That's what protected and private inheritance are for, or even more often just plain aggregation.
Object Oriented programming has become a vague religion rather than a specific tool. People are equating the mechanics of it with actual good practice. I just tutored someone who had their first OO class and it's all about the mechanics without any of the reasoning. And all that leads to me earning my p
Re: (Score:2)
The point the parent was making is that it's conceivable to have a design spec in which *there shouldn't be any triangles* and thus the fault (bug) is not in the code that doesn't work with triangles.
That is, enumerating subtypes may be bad design style but doesn't constitute a tangible bug in and of itself.
Re: (Score:2)
Contrariwise. The requirement is clear. When you specify a parameter type, you are specifying the requirement that the parameter passed IS one of those types.
Inheritance means "IS A", if you specify that something uses the parent class, then you have specified the requirement that it works with the subclass since an instance of the subclass "IS (also) A" instance of the parent class.
If it doesn't, then you have violated
Re: (Score:2)
The mere fact that you have a class hierarchy does not a requirement make.
Polymorphism is a capability that most if not all OO languages support, but like any capability it's entirely optional.
Re:Coincidentally (Score:4, Insightful)
I happen to have a printout of an article on "The Liskov Substitution Principle" and was wondering just yesterday how it is that as programmers we use these principles in everyday life yet don't know their names or the stories of how they came about.
To be honest, I would consider anyone who does not know what LSP is, to be OO-ignorant, even if (s)he does code in an OO language. It is a very fundamental rule, much more important than all the fancy design patterns. I guess it's possible to "invent" it on your own, just as it's possible to normalize databases without remembering, or even knowing about, the strict NF definitions, but in either case, chances are high you'll get it wrong eventually.
Re: (Score:2)
Really, the LSP is the kernel of wisdom behind most design patterns. If you already know OO design, DPs form a nice augmenting toolbox. But you could always craft your own design for your specific situation. It may very well be that this overlaps with an existing design pattern - just further proof that you didn't need the DP to begin with.
Re: (Score:3, Insightful)
On a side note... I remember a lot of people complaining about how Java 5 generics were oh-so-unobvious and hard when it came to lower and upper bounds etc. Meanwhile, the topic "why can't I cast List&lt;Derived&gt; to List&lt;Base&gt; - this must be broken!" is recurring on microsoft.public.dotnet.languages.csharp probably about every two weeks. Both cases are absolutely trivial when one understands and carefully applies LSP to the problem.
Which, I guess, just shows that many Java and .NET programmers don't really understand the the
Re: (Score:3, Informative)
Yes, of course... I hate this thing. It should have been List<Derived> to List<Base>, of course.
The even sadder part of the story is that Java 5 allows the cast because of type erasure (with a warning, but still...).
Re: (Score:2)
Casting from a container of Derived to a container of Base should be safe. In fact I believe that's one of the forms of covariance. Did you mean the other way around, or that the language just didn't happen to support that form of covariance?
Re: (Score:2, Informative)
Not if your container supports an operation to add elements to the collection. If you could do that cast, you could cast the Container of Derived to a Container of Base and then add an element that is a Base but not a Derived.
More generally, if X(Y) is a type parametrized over Y, and Z is a subtype of Y, then asserting that X(Z) is a subtype of X(Y) fails, because X might have an operation that takes an argument of the param
Re: (Score:3, Informative)
The correct way to handle this sort of thing is covariance (and contravariance) at point of use, not at point of type declaration. So you declare a method as taking "a List&lt;T&gt; where T is derived from Base" -
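A sketch of that use-site variance with Java wildcards (Base and Derived are hypothetical placeholder classes, as in the rest of the thread):

```java
import java.util.ArrayList;
import java.util.List;

class Base {}
class Derived extends Base {}

public class VarianceDemo {
    // Covariant position: this method only reads elements as Base,
    // so any list whose element type extends Base is safe to pass.
    static int countAll(List<? extends Base> xs) { return xs.size(); }

    // Contravariant position: this method only writes Derived values,
    // so any list that can hold a Derived (List<Derived>, List<Base>,
    // List<Object>) is safe to pass.
    static void addDerived(List<? super Derived> xs) { xs.add(new Derived()); }

    public static void main(String[] args) {
        List<Derived> ds = new ArrayList<>();
        addDerived(ds);                   // List<Derived> fits List<? super Derived>
        System.out.println(countAll(ds)); // List<Derived> fits List<? extends Base>; prints 1
        // A parameter declared as plain List<Base> would reject ds entirely,
        // while an unchecked cast to List<Base> would let a caller smuggle
        // a plain Base into a list of Derived -- the hazard described upthread.
    }
}
```

Note there is no `xs.add(...)` in `countAll` and no `xs.get(...)` used as Derived in `addDerived`: each wildcard permits exactly the direction of use it declares, which is why the cast the GP asked about is unsound but these parameter types are not.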
Re: (Score:2)
(s)he
This just sucks. How do you even pronounce that when reading aloud? "Singular they" [wikipedia.org] sounds so much more natural.
Re: (Score:3, Interesting)
I survived 8 years as an OO programmer without hearing this term. Even then, only as a question on a recruitment agency test.
So either I went to the wrong university, and consistently the wrong employers, or it's one of those self-evident principles that just didn't have a name before Barbara turned up.
Re:Coincidentally (Score:5, Insightful)
So either I went to the wrong university, and consistently the wrong employers
Employers aren't there to teach you these things, and a lot don't care so long as you crank out code that mostly works (and, let's face it, it's not always easy to tell for them, and the concepts of "formal correctness" and "readability" and "maintainability" are often not even on their radar, unless a developer brings that up). And as for university - it may well be. If they had an OOD design course and never mentioned it, or at least described it in a formal way, then they wasted your time.
Of course, it had somehow become an established norm that "object-oriented design" course in the uni is basically just applied Java programming; from what I've seen, the best you can expect from a typical graduate when it comes to OOD theory is to be able to recite "encapsulation, inheritance, polymorphism" when asked what OO even is. It's especially ironic as only one of those three is actually a required ingredient, and even that is wrongly named. The very notion of MI gets people educated that way thoroughly confused when they first meet it, and you can forget about multimethods...
or it's one of those self-evident principles that just didn't have a name before Barbara turned up.
It was formulated [acm.org] by her in 1987. Unless your 8 years were mostly in Simula or Smalltalk, it did have a name by the time you started working with OO. Definitely so if you've been doing Java or .NET.
As for self-evidence... I sort of wish it was, and it really is very simple conceptually, but the fact that so many people still get it wrong over and over again shows that, apparently, it's not all that self-evident for your typical coder.
Do you realize.... (Score:2)
That any developer over 38 or thereabouts would not have heard about this while in university then (I know I didn't).
It is like demanding that all physicists become conversant with relativity theory in the 1920s...
Re: (Score:2)
I love the substitution principle. I don't understand the extent of her contributions to the field: particularly, how novel they were and what the world of computer science would be like today if she never existed. Anyone care to enlighten me?
1968 (Score:5, Informative)
Since it's not in the article, I looked it up. She got her PhD in 1968.
I initially thought that kind of sucked (Cambridge's 'Diploma in Computer Science' has been awarded since 1954), but apparently the first US PhD in CS named as such was in 1965 (University of Pennsylvania).
The field could still use more women though.
Re: (Score:2, Insightful)
> The field could still use more women though.
Why?
Do you complain that we need more pregnant men also?
Re:1968 (Score:5, Insightful)
Men aren't capable of becoming pregnant. I, however, happen to believe women are just as capable of being good computer scientists as men are.
The fact that only a small minority of computer scientists are women, means that upwards of half our best CS talent is going to waste.
I think that's a pity.
Re: (Score:2, Troll)
Either that, or most of our women are much too sensible to waste time in a field like CS.
In other words, just because you think a field is important, doesn't imply that everyone agrees with you.
Re: (Score:2)
The fact that only a small minority of computer scientists are women, means that upwards of half our best CS talent is going to waste.
I think that's a pity.
Is it a pity that upwards of half of our best nail salon talent is going to waste as well?
Re: (Score:2)
(Disclaimer: I am willing to be open to the possibility that the above links may be hoaxes...)
*cough cough* [wikipedia.org]
Re: (Score:2)
And just to add a note:
Girls are in general not worse at math than men. If you look at math grades after 7 years in school, there is no real difference between the genders.
Re: (Score:2)
You are correct as far as you take it.
There is no difference in math ability between the genders until puberty.
Put another way, girls are just as good as boys at math.
Women, on the other hand, are generally worse than men at math.
I see three possible explanations: 1. Testosterone improves math ability. 2. Estrogen decreases math ability. 3. Some pretty girls figure out how to bounce their tits and get boys to do the hard work for them, so they don't learn tough subjects after puberty.
The difference isn't
Re: (Score:2, Insightful)
I don't know why in recent times there is such a disparity of men and women in C.S., but I imagine your attitude might have something to do with it...
I really don't know how to respond to this... you're either trying to be funny (but got modded +3 insightful), or are seriously trying to imply that a woman who's good at C.S. is as much of a freak as a pregnant man.
All of a sudden I feel very alone in this field.
Re: (Score:2, Interesting)
> you're either trying to be funny (but got modded +3 insightful), or are seriously trying to imply that a woman who's good at C.S. is as much of a freak as a pregnant man.
Try neither. I had no emoticon, and I had no implications -- I just asked a simple question, in order to find out where this assumption is coming from.
I have seen this slippery-slope type of Political Censorship before and all it does is lead to reverse discrimination. Replace Women with Ethnic / Religion / X of your choice.
This c
Re: (Score:3, Interesting)
Better rhetorical questions:
* Do you complain we need more male nurses?
* Do you complain we need more male teachers?
* Do you complain we need more female garbage collectors?
Gender equality is not the same as having a 50/50 male/female split in every field.
Re: (Score:2)
Cambridge's 'Diploma in Computer Science' has been awarded since 1954
It would be more accurate to say "was first awarded in..." because they recently shut it down after years of trying without much success to keep up the numbers.
Re: (Score:3, Insightful)
"The field could still use more women though."
Why?
Not that there shouldn't be, but you are blindly stating something without any argument.
What does a woman bring that a man doesn't? Or vice versa?
When we can determine that, then maybe we can find out why the field continues to attract so few women. Even in the presence of programs that push very hard to get the door open and to give women priority in education, there just aren't a lot of women.
I am genuinely interested in why.
The more I think about it, the more
Re:1968 (Score:5, Funny)
Just a guess, but maybe his tastes don't lean towards guys with beards and questionable personal hygiene.
Re: (Score:2, Insightful)
What does a woman bring that a man doesn't? Or vice versa?
A different perspective. And maybe a less-confrontational attitude.
Oh the irony. (Score:2)
Your reply is a perfect example of why we need more of them.
Only they know all the reticence and outright discrimination suffered by women in the workplace, sometimes disguised as "curiosity" funnily enough.
making software more reliable? (Score:4, Interesting)
Software is ALWAYS reliable. It is the code that people write that sucks.
I don't know how many people come from the "old school" of programming, but when I started, we didn't have all these libraries to link to. When we wanted a function to happen, we wrote it. And when we wrote it, we checked for overflows, underflows, error status and illegal input. We didn't rely on what few functions that already existed.
Most fatal program flaws are ridiculously easy to prevent, but bad programming habits prevail and short of creating some human language interpreter that writes code as it should be written, nothing will replace lazy programmers who trust library functions too much. And yes, I know about deadlines and not having time to waste and all that stuff. But there is something most people are also missing -- pride! I know that when I do something, I am putting my name on it whether it is directly or otherwise. And if my name gets associated with something, I make damned sure that it works and is of good quality. With the stuff that goes out these days (especially SQL injection?! PLEASE! What could be more fundamental than screening out acquired text data for illegal characters and lengths?!) it is clear that pride in one's own work is not something that commonly exists.
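The kind of screening the parent describes can be tiny. A hypothetical sketch (the whitelist and length limit are made up for illustration; for SQL specifically, parameterized queries such as JDBC's PreparedStatement are the standard fix rather than character filtering):

```java
public class InputCheck {
    // Reject input that is absent, too long, or contains characters
    // outside a conservative whitelist -- the "screening acquired text
    // data for illegal characters and lengths" the comment calls for.
    // Both the whitelist and the limit are illustrative, not prescriptive.
    static boolean isSafe(String s, int maxLen) {
        if (s == null || s.isEmpty() || s.length() > maxLen) return false;
        return s.matches("[A-Za-z0-9 _.-]+");
    }

    public static void main(String[] args) {
        System.out.println(isSafe("alice_01", 32));                  // true
        System.out.println(isSafe("x'; DROP TABLE users; --", 32)); // false
    }
}
```

The quote and semicolon in the second string fail the whitelist, so the classic injection payload is rejected before it ever reaches a query.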
For those of you out there who agree with me, it probably doesn't apply to you. For those that disagree, tell me why? Why is a programming error FIXABLE but not PREVENTABLE?
Re:making software more reliable? (Score:5, Insightful)
Software is ALWAYS reliable. It is the code that people write that sucks.
No, computers are reliable. They'll do exactly what you tell them to do. Software, however, sucks, since it is simply a representation of the code that people write, which also sucks.
Re:making software more reliable? (Score:5, Funny)
No, electrons are reliable. They'll do what you tell them to do. Hardware engineers however design crappy hardware.
Re:making software more reliable? (Score:4, Interesting)
No, quantum mechanics is reliable. It defines physical uncertainties in a robust way. Electrons suffer from crappy positional and momentum certainty.
Re: (Score:3, Funny)
No, quantum mechanics is reliable. It defines physical uncertainties in a robust way.
Only when you're watching. Behind your back it's complete chaos.
Re: (Score:3, Funny)
http://xkcd.com/485/ [xkcd.com]
No, Brian Greene is reliable. He's been knitting furiously since the beginning of the universe, and isn't likely to quit anytime soon.
Quantum mechanics suffers from being far more difficult to understand than a tiny man controlling reality.
Re:making software more reliable? (Score:5, Funny)
No, electrons are reliable. They'll do what you tell them to do.
I, for one, am never sure quite what my electrons are doing. After that Heisenberg guy, they've been a bit flaky...
Re: (Score:2)
Eh, I know exactly where my electrons are. I just have no clue where they're going.
Re: (Score:2)
That is the reason I don't have a cat. (Score:2)
It scares me shitless to think about a cat that can annoy me either dead or alive.
Re: (Score:2, Insightful)
but when I started, we didn't have all these libraries to link to. When we wanted a function to happen, we wrote it.
Functions? Back when I started, we didn't have functions. We had jump instructions.
You kids and your newfangled 'functions' and 'libraries'. Now get off my lawn!
You had a lawn! (Score:2)
We just had dirt. Young whipper-snapper!
Re: (Score:3, Interesting)
Functions? Back when I started, we didn't have functions. We had jump instructions.
When I first learned to program as a kid, I taught myself how to write pseudo-functions using goto. I looked back at those programs a few years later, and they were completely unreadable. Now, my wife does a little programming on the side (we're both researchers) and she loves goto. I keep trying to tell her NOT TO USE GOTO, but she never listens. It's painful to read her code. I think we might need counseling for this...
Re: (Score:2)
Make her debug her own code, keep your hands off.
Re: (Score:3, Funny)
Now get off my lawn!
10 PRINT LAWN
20 GOTO CURB
Re: (Score:2)
Re: (Score:2)
"And when we wrote it, we checked for overflows, underflows, error status and illegal input."
And you also wrote bugs that you didn't immediately realize you were creating.
Not all bugs are due to laziness. Sometimes people make mistakes, or misunderstand requirements, or the operating parameters change, or, or, or...
Re: (Score:3, Insightful)
That's great. Now that you guys built up the roads, bridges, and traffic lights... The rest of us are interested in actually using them to GET SOMEWHERE.
Rewriting OpenGL, a
Re: (Score:2)
Yeah right. Even a simple ADD instruction will give the wrong result when the hardware fails. And hardware will fail.
Software isn't "reliable, but." It's only as reliable as it can be. "Those damn kids and their fancy functions" isn't the problem. The problem is fundamental complexity; no magic wand will make that go away.
Sure, you can w
Re: (Score:2, Informative)
Even a simple ADD instruction will give the wrong result when the hardware fails.
True, but the reality is much worse than that. A simple ADD instruction will also give a wrong result, on all current "popular" CPUs, when the hardware is working exactly as designed.
To the people who design CPUs, adding two positive integers and getting a negative result is exactly what the hardware should do in some cases, depending on the values of the integers. This wreaks havoc with software designs that assume the mathe
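The wraparound the parent describes is easy to demonstrate, and modern Java at least offers a checked alternative (Math.addExact, available since Java 8) for code that would rather fail loudly than wrap silently:

```java
public class OverflowDemo {
    public static void main(String[] args) {
        int big = Integer.MAX_VALUE;  // 2147483647
        int sum = big + 1;            // two's complement wraparound, as designed
        System.out.println(sum);      // prints -2147483648: positive + positive = negative
        // Math.addExact performs the same addition but throws on overflow,
        // turning the silent wraparound into a detectable error.
        try {
            Math.addExact(big, 1);
        } catch (ArithmeticException e) {
            System.out.println("overflow detected");
        }
    }
}
```

So the hardware (and the language) really is "working exactly as designed" here; whether that design matches the mathematician's integers is the software's problem to check.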
Re: (Score:2)
The difference between the hardware failing and a two's complement number overflowing is that in the former case the software can do absolutely nothing to guard against the non-zero possibility of undefined behavior, while the latter behavior is completely predictable in principle. Whether it's easy is another matter (I would argue "Yes: Stop being a pansy" ;) ).
Re: (Score:2)
Even a simple ADD instruction
10 POKE RITALIN
20 GOTO WHEEE
... i'll get me coat.
Re: (Score:2)
Sure, you can write provably error-free code... but you have to solve the halting problem [wikipedia.org] first.
No, all the Halting Problem does is prevent you from writing an algorithm to prove that programs written in a Turing-complete language are error-free. This is due to Rice's Theorem, which casts a similar shadow on any other problem that tries to determine a non-trivial property of a Turing machine's language.
You can still write specialized programs in weaker non-Turing-complete languages if you badly need automated proofs of correctness. Of course, you could also eliminate the automation requirement and jus
Re: (Score:2)
Because the spec wasn't written well.
I'm talking about millions of lines here. Software that effectively gets the programmers compartmentalized because no human can track everything everyone is doing at the same time.
So a poorly defined document dictating the layers is a bug waiting to happen.
Now some bugs are inexcusable. Divide by 0 crashes spring to mind as the most glaringly obvious inexcusable bug.
Re: (Score:3, Insightful)
Now some bugs are inexcusable. Divide by 0 crashes spring to mind as the most glaringly obvious inexcusable bug.
Why is everyone talking about all these obvious elementary bugs? Division by zero, integer overflow, illegal input... I haven't even heard anyone mention anything as sophisticated as a NULL pointer dereference, much less a memory deallocation problem or God forbid a race condition.
It's just that everyone around this thread is trying to have this profound discussion of fundamental software errors, yet talks like they're in a high school programming class.
Re: (Score:2)
when I started, we didn't have all these libraries to link to. When we wanted a function to happen, we wrote it. And when we wrote it, we checked for overflows, underflows, error status and illegal input. We didn't rely on what few functions that already existed.
Have fun coding a modern operating system from scratch. Monolithic all-original code might work fine when you're working in tens of kilobytes, but we're in a different world now. And a different economy - why reinvent when the future (present too I guess) is about linking components together?
For those of you out there who agree with me, it probably doesn't apply to you. For those that disagree, tell me why? Why is a programming error FIXABLE but not PREVENTABLE?
It's both fixable and preventable. But at what cost? You mention SQL injection errors - that's the kind of thing that I'd be pissed off about, and maybe we can expect more from vendors on that front. But I have higher g
Re: (Score:2)
You must not be terribly familiar with some of the nasty exploits that affected numerous programs because they all used the same faulty library eh? This goes for Linux/Unix as well as Windows.
Ceremony for prize was like this... (Score:2)
A woman's purse!! [youtube.com]
oblig. (Score:2)
Liskov, the first US woman to earn a PhD in computer science, was recognized for helping make software more reliable, consistent and resistant to errors and hacking.
Clearly, she's never worked for Microsoft.
Zing!
Don't let the award fool you. (Score:2)
Re: (Score:2, Funny)
HAHAHahahhaha...
Someone with your sig has the gall to write that about a book?
Irony is rich today.
Oh, and how about an example of where she is wrong? I don't think I have ever read her stuff, but I would like to see an example of what you are talking about.
Re: (Score:2)
I didn't read his sig. I just saw the Bible citation and assumed it was something witty and sarcastic.
Re: (Score:2)
I'm surprised this comment didn't cause mysql, or whatever back-end /. uses, to die in a huge embarrassment overload.
I look forward to next week on /.: Quantum Electrodynamics: Feynman's work is "conflicting where not simply wrong" and other inspiring comments.
She was not the first woman to get the Turing award (Score:3, Informative)
In context of history of languages (Score:3, Interesting)
CLU drew on the lessons learned with both Alphard and Vers. Alphard was from CMU, written by Wulf and Shaw (Wulf also wrote the famous BLISS). Vers was made by Jay Earley, whose parser was hugely important in all the compilers of the time. The language itself (and its V-graphs) was heavily influenced by the Mem-theory of Anatol Holt, who was on the Algol Committee and was a principal in designing the astonishing GP and GPX systems for the UNIVAC - the first languages to explicitly feature ADTs per se. That became ACT, the adaptable programming system for the Army's Fieldata portable computers (portable in a completely different sense to the modern usage). He also hated Unicode - but that was a rival programming system back then, so reading the reports of the time can be misleading: "don't use Unicode on Portable Computers!". Holt's ideas permeate computing; the notion of making any system of data representation as abstract as possible goes back to him.
CLU was written using MDL (pronounced "muddle"), which was a proto-replacement for LISP that featured ADTs. MDL was co-written by Sussman of LISP fame as a basis for PLANNER, which became Scheme, and perhaps more geekly interesting is that it was also used for writing ZIL (and if you don't know about ZIL, you shouldn't be reading Slashdot).
CLU evolved into Argus, but the ideas were also used in the Theta programming system for the Thor OO database, and also in PolyJ, which was (as the name suggests) a polymorphic Java.
Another fascinating development of the CLU ideas is the SPIL system that Liskov co-wrote at the USAF-sponsored MITRE Corp, which was in turn used for writing the VENUS operating system.
Liskov has pioneered the notion of abstraction per se in language design for 40 years, and this generics-based approach is now taken for granted. She fully deserved the award for her insights, as well as for her determination in fighting the reductionism represented by previous recipients (eg Dijkstra) though opposed by others (eg Iverson).
I have extracts from the reports for all of these languages on the HOPL website, HOPL.murdoch.edu.au (too many URLs to paste in individually). Find CLU at http://hopl.murdoch.edu.au/showlanguage.prx?exp=637&name=CLU [murdoch.edu.au] and follow the genealogy links. And if you haven't yet seen my 4000-strong programming-language family tree, it is worth printing out for wallpaper if you have an A1 plotter.
Re: (Score:3, Insightful)
tangents that were largely unrelated to software development.
Tangents are related to geometry, not software development. Besides, professors write textbooks so they can make their students buy them, and the professors get some of the students' money; not because they're any good at it. I thought everyone knew that.
Re: (Score:2)
Then how do you explain Michael Sipser?
As long as we're talking about MIT people, I also like Scott Aaronson's informal writings, particularly his essay on large numbers. It's a must read for anyone with a remote interest in math, just as Feynman's "Personal Observations on the Reliability of the Shuttle" is required for engineers. (http://www.scottaaronson.com/writings/bignumbers.html)
Re:Translation for Americans (Score:4, Informative)
We don't call it that. We call it toilet paper like normal people. Makers of toilet paper call it bathroom tissue, I guess because they want a name that's a little more distant from "ass wipe" or less evocative of a porcelain bowl filled with crap or something, though they'll talk about their "bathroom tissue" in advertisements while showing cartoon bears (chosen because, as everyone knows, bears shit in the woods) with little scraps of toilet paper all over their fat bear asses, which makes me wonder who the fuck has this problem and why, but I'm afraid of the answer, and apparently the right brand of ass wipe will solve it, so let's just try to forget about that, okay?
What were we talking about? Oh right. It's called "pop". "Soda" is okay too I guess.