Slamtilt writes
"Dennis Ritchie has posted source for 2 ancient C compilers here. They date from '72-'73, and he says he feels "an element of embarrassment about displaying it", but also that they may be of some historical interest... "
Re:Wait a second (Score:1)
Re:interesting (Score:1)
Isn't that what Torvalds does? He uploads so people will mirror it, so that he doesn't have to back up. ;-)
---
Have a sloppy night.
Re:Quines (Score:2)
The world's very first C compiler was written in C; it was then bootstrapped by running it through a C interpreter.
Cute, and effective. K&R were gods in their day.
Re:Time travel (Score:1)
Another link of interest (Score:2)
http://plan9.bell-labs.com/7thEdMan/index.html
I was gratified to find it there, since my old (printed, Bell Labs original) V7 manuals are yellowed and cracking and on the verge of dying.
-E
Re:nothing to be ashamed of... (Score:1)
If we wrote software as tight as they did back then, a 486 running Linux would make an 8-processor PIII-550 machine running NT look slow.
I can't wait for a grass-roots movement to start tightening up what we use today... it's only gonna get faster and better!
Login trojans ... ah, the memories (Score:1)
Heh. I remember back in high school, I put some trojans in the login program for RSTS/E running on a PDP11-34.
Normally, we would run "power programs" (programs that let non-privileged users do privileged things) by putting trojans in one of the shared directories (account [1,2]) with the privilege protection bit set. (Analogous to putting a setuid-root program under /bin in Unix.) Then we'd just hope that the sysadmin never noticed. Or sometimes we would create privileged accounts. But we usually got caught.
Then one day, I found out that logged-out jobs were always privileged. So I put a trojan in the login program that would make it run another program that was in my directory. My program, although it didn't have the priv bit set in the protection code, would be running as a privileged job. Then I'd have it log in as me. When the sysadmin looked at me with SYSTAT, he'd see me logged in on a non-priv account running a non-priv program, so he was never the wiser...
Until he found my copy of the modified login source code! ;-)
Ah, the memories...
---
Have a sloppy night.
Interesting! (Score:1)
PDP 11 emulators / UNIX V[567] binaries (Score:2)
Boy haven't we learned a lot... (Score:2)
An idea for those with disk space, and some unf.. a software repository for old code?
Re:Perspective and Teaching (Score:1)
Actually, Linux (the kernel) frees the memory used for initialization code/data even today, so this is not all history. Anybody know of userspace code doing this ?
The login trojan (Score:2)
He then modified the compiler to recognize when it was compiling the compiler itself, so that it inserted the login trojan _and_ the code that re-inserted both modifications on every future recompile of the compiler.
Was it this version of the compiler?
http://www.cs.umsl.edu/~sanjiv/sys_sec/security
Re:Interesting! (Score:1)
Lord, lord, I remember those days... (Score:2)
Later, someone came out with a PDP-11/40 C compiler that used 11/40 floating point instructions.
It was a lot of fun trying to match up C compilers with machines that would "sort of" run them as long as you didn't declare any floating point variables. The compiler itself was "float safe" in that if you didn't declare floating point, it wouldn't use it. You had to use a "-f" flag to compile floating point, though, so that the right library stuff would get loaded to save & restore the floating registers.
C programmers had to know assembler in those days. Believe it.
ambrosia from heaven (Score:1)
I think it's wonderful too... I hope that somewhere, they are able to find code from original Unix(es). _That_ would be interesting. I'd think that surely, somewhere, there's still an ancient box that has that stuff on it. Still chugging along, attesting to the power and glory of it all.
This made me giggle. (Score:2)
{
waste(waste(waste),waste(waste),waste(waste));
waste(waste(waste),waste(waste),waste(waste));
waste(waste(waste),waste(waste),waste(waste));
waste(waste(waste),waste(waste),waste(waste));
waste(waste(waste),waste(waste),waste(waste));
waste(waste(waste),waste(waste),waste(waste));
waste(waste(waste),waste(waste),waste(waste));
waste(waste(waste),waste(waste),waste(waste));
}
At least someone does care about History (Score:2)
Anyway, the move is fundamental. We have been quite careless about keeping bits of code and source for the future. I keep some old stuff around and probably many people do, but it ends up much like the stuff I forget in the attic.
It is fundamentally important to keep these things for future generations. After all, it was such things that gave birth to the world we have now. Frankly, our frenetic "go forward" mood, and the backward-incompatibility games M$ plays, may lead us to losing the roots we all stand on. What will my grandchildren find when they try to see my life? A few broken pieces of a CD labelled "Windows 95"? Who was Torvalds? A mad Finnish hacker making UNIX fit in 3 inches?
I wonder if anyone notices this problem. We are losing some good pieces of History. Software is highly volatile in terms of preservation. We may have lost 80% of it already. Maybe it is mostly worthless stuff, but we should all keep in mind that no one has a Future without knowing their Past.
I think it is time to call for such things. To give rebirth to old software. To build museums where one can look at the old days. To let someone smile at the thought that 15 years ago "I used THAT thing".
Let's greet Ritchie's move. And also Borland. As
Perspective and Teaching (Score:2)
A second, less noticeable, but astonishing peculiarity is the space allocation: temporary storage is allocated that deliberately overwrites the beginning of the program, smashing its initialization code to save space. The two compilers differ in the details in how they cope with this. In the earlier one, the start is found by naming a function; in the later, the start is simply taken to be 0. This indicates that the first compiler was written before we had a machine with memory mapping, so the origin of the program was not at location 0, whereas by the time of the second, we had a PDP-11 that did provide mapping.
I find this passage one of the more interesting comments made on the two compilers. Many of the people reading Slashdot (myself included) have probably heard in class about the days when programmers would write code in such a way as to conserve memory and, of course, be as efficient as possible.
I was once shown a board with a Motorola 6800 (if memory serves) series microprocessor on it, with a hexadecimal keypad and an LED display. It had no internal clock, which would have made keeping time quite a challenge for someone who is used to simply grabbing it with predefined functions.
This old code has many a use, primarily as a teaching tool: have the students look at the code, and give them specifics on how the compilers worked. In many ways it's like looking at an antique car; you seem to be able to relate more to what is being taught, since you have something concrete to apply your knowledge to.
Personally, it's hard to imagine a computer functioning without mapped memory... Seeing that early C compilers functioned without it brings, to me, a whole new perspective on computing. My $0.02.
Wonderful news humor (Score:2)
Re:interesting (Score:2)
I don't feel so bad about my lack of comments (Score:1)
-russ
Re:The login trojan (Score:1)
> recognize when it was compiling login -
> it added a trojan.
As far as I know this was described in Ken Thompson's 1984 Turing Award lecture, "Reflections on Trusting Trust".
I'm not sure whether the actual code was written. It may not have been published back then, but only released recently.
Roger.
Re: disappearance of History (Score:1)
... but what was the quote?
Microsoft paid me three cents for accepting Encarta '99. It was priced at $24.97 and it had a $25 rebate coupon inside the box.... and you paid the post office 33 cents to mail the rebate back
-Brent
Re:Lord, lord, I remember those days... (Score:2)
My personal feeling is that they should know assembler TODAY, as well, so as to have some frigging idea as to the consequences of their coding. Doesn't hurt to solder up a micro from scratch, using only one's wits and a data book or two, either. I did that in '83 w/a 8085 (my first computer!) and it simply rocks!
Not to mention the manly confidence that understanding a system from the metal up gives one!
8-}
Helps when writing fastloaders for C64's, lemme tellya!
Found a nifty use for that XOR gate on the 1541's ATN line, for sure!
(if you understood what I just said, you really scare me)
I've counseled a number of newbies to take this exact tack in their own educations, contrary to popular wisdom, and each and every one thanked me profusely for suggesting it. They eventually ran rings around their classmates, and actually GOT something out of (Acck!) DeVry!
Re:Lord, lord, I remember those days... (Score:2)
Moving ahead 10 years, in one of my CS classes, the teacher was trying to explain why you can't return a pointer to a local variable. I was dumbfounded, thinking, "duh! It's on the stack, dummy!" But nobody else in the class except for the teacher had ever really done anything with assembly language, and they were pretty much clueless about what actually goes on behind the scenes. In another class, where part of the grade was the speed of the executable, again few people had used ASM, so it wasn't too hard to stay in the top 6 for most of the programs. Nobody else even considered how long it takes to divide, or how to organize the arrays (or use pointers) so that a multiply isn't needed on array access (or at least keep it to one multiply, not two or three).
On the other hand, the optimizing compilers are making this a confusing issue. Some tricks still work, but I am no longer sure of the fastest way to do things.
Nowadays, hardware is getting cheaper while programmer time is getting more and more scarce. Often, it seems that it is more important to make the code readable and logical than to make it fast and efficient. Or is that just an excuse...
Re:Quines (Score:1)
write-only, CaSe sEnSiTiVe 'language'
that is popular only because 'if it's hard
to understand, it must be powerful'
they are boobs compared to
Chuck Moore - inventor of Forth.
Re:This made me giggle. (Score:1)
Re:Perspective and Teaching (Score:1)
this is based on my knowledge of the OS, and may be different from reality. there are other possible memory-management behaviours for differently-optimized systems.
matju(@)sodom.dhs.org
interesting (Score:2)
Will the internet be a valid backup device,
in the sense that letting people have copies of the source would be enough to make sure the code is safe for all time? My gut says NO, but my head keeps muttering about nodes and the US DoD inventing it to survive the BIG one.
What do you think?
a poor student @ Bournemouth Uni in the UK (and dyslexic, so please don't moan about the spelling, just the content)
Re:At least someone does care about History (Score:1)
When sites have existed for 10-20 years or more, it would be interesting to take a 'tour' of the site through time, and see how it evolved.
Re:Wonderful news humor (Score:1)
Re:This made me giggle. (Score:4)
a software repository for old code? (Score:2)
Re: disappearance of History (Score:3)
"How did this happen?" he asked. Even behind his glasses, his eyes were icy cold.
"Well, it is an astronomical damn trick that is creating havoc. It seems that gravitational lenses also do a good deal of reflecting radio waves..."
"So?"
"Well, the deal is that we are getting back radio waves dropped into the cosmos at the end of the 20th century!"
"Damn! That's the worst problem we've had since Reno IV banned paper for the danger of its use by criminals and terrorists... We have to do something about it."
"Well, your Holy Highness, we have very little time. As far as I know some people are already aware of the existence of the Linux source code. We keep claiming that Linus was absolutely mad and the code meaningless. But the worst is about to come. You see, the data coming down is nearing the time when the US commemorated 30 years of the Moon landing."
"No one has ever landed on the Moon!" Gates cried, his face looking as if his internal processor had overheated.
"Well, you know as perfectly as I do that this is not exactly the case. It was your grandfather who wiped out all records of the Moon before 2100, when Surveyor I landed with Windows9999 on it. He thought it would be a great marketing move to sell the new OS with the label: "A small step for an OS, a big leap for Mankind..."
"Well, well, well - OK. I probably let my nerves get the better of me. Anyway, that's not too critical. We can explain all that away as another Hollywood blockbuster made in those times. People would hardly believe that anyone could rise from a Microsoft(TM) ChairMouse to take even three steps to the fridge... So what's really worrying you?"
"Well... uh... Ritchie..."
"WHAT!!!?"
"You see, the founder of C published the underlying code he used to create UNIX somewhere near this time. This could have terrible consequences for us. People will know that there were other languages beyond QuickBasic. They will know about UNIX. They may then link all that with the cryptic meaning of the Linux source code. And then they will know that Linus Torvalds was not mad at all..."
"We are in deadly danger..."
"What shall we do?"
"Well, pick up the M$ Windows Central MegaServer and blow it up with a GPF. Meanwhile we will tell people that this is due to the CdC and the "Richard Stallman" Front guerrillas trying to undermine our society once again. Besides, gather the GUIDs of everyone who managed to see the Linux code and track them. We need to isolate them from everyone else. As for me, I'll try to divert the public by announcing the new M$ HyperOffice Application Server."
Some years later, the M$ MegaServer downtime was noted in M$ Encarta2000 as: "It was not a bug, just a feature"
Re: disappearance of History (Score:1)
"You see, the founder of C published the underlying code he used to create UNIX somewhere near this time. This could have terrible consequences for us. People will know that there were other languages beyond QuickBasic. They will know about UNIX. They may then link all that with the cryptic meaning of the Linux source code. And then they will know that Linus Torvalds was not mad at all..."
micro$haft was (and is) a major player in making C as popular as it is. Most of Windoze is C. Remember that next time you wanna write a mission-critical application in C.
Re:interesting (Score:1)
My friend was given the club's "Hack of the Year" award that year for this exploration of distributed backup schemes... *grin*
Wait a second (Score:1)
Re:A bit of perspective (Score:1)
"Long bygone days" when we were still landing men on the moon. (Interesting that this came up so soon after the 30th anniversary of Apollo 11's landing, eh?)
The thing to remember is that computing's pace of change is an aberration compared to most other technologies (like the space program). We don't have hyperspace travel. We don't have manned interplanetary travel. We don't even have the capability to do a lunar mission these days! The computing equivalent would be having dumped Unix and its progeny in the dustbin, leaving the last two Unix machines ever built as museum pieces displayed at two of the national Computing Centers where they now use some really neat looking (and "reusable") minicomputers that can communicate with each other as fast as 9600bps (the bandwidth equivalent of low Earth orbit) or maybe even 57600bps (geosynch) with the right boosters.
Depressed yet?
Don't be. At least computing is that exception; it gives us an interesting, useful field that changes daily (just read /. to see how much it changes). Yes, too much change can be overwhelming--but the lack of change can be stifling. There may not be a middle ground, either (this isn't Goldilocks and the Three Bears).
Re:Lord, lord, I remember those days... (Score:2)
My first computer was a C64, but I didn't know enough back then to tinker with it on that level.
I don't solder, but I do read the assembler output from compilers. It's instructive, but no programming teacher in the world would pass me if I wrote code like that... (well, I just knew where the memory was allocated, so I used it, and I knew what the offset was, what's wrong with that? Oh, and those jumps and increments work just like a for loop, what's the problem, really?)
Re:At least someone does care about History (Score:1)
Recovering old data isn't trivial. About a year ago I talked with a fellow who was involved in a project to analyse old Landsat data, in an attempt to get a baseline for climate-change info. The data was stored on magnetic tape that had been warehoused for something like 25 years. They had to get tape drives from scrap merchants, and go through a laborious process to restore the tapes - slowly baking the tape to drive out the moisture it had absorbed over the years, and scraping it with sapphire-edged blades to remove accumulated gunk.
On a more personal scale, I've got several 5.25" floppies (360k format) worth of old BBS philes and posts that may never get read again, just because of the hassle.
Re: disappearance of History (Score:1)
I don't know where you were during the DOS times and the first OS wars, but let me remind you of a few facts:
A: Windows was originally written in Pascal. Later, for several reasons, it turned into C. Unfortunately the conversion was made in such a way that a lot of M$ Windows C code is a hellish half-hybrid mess. I know because I worked with it.
B: I don't know what the prices for Visual C are now, but when I cared about it, it cost a fortune. Even Borland C++ was much cheaper, at only $250. Is this what you mean by popularity?
C: I have seen several shows put on by Microsoft people around here. They advertise everything they can. But one thing I clearly noted is that C, C++, the SDK, the DDK and similar stuff are poor relatives in these shows. Note that I am talking about developers.
D: The above doesn't apply to VB. At least back then, when I still had some interest in Windows. No matter that most of the people around here were C and/or Pascal fans, these guys were always trying to convince us that VB was the "real thing".
E: A mission-critical application in C? Well, I agree - if you don't presume Windows.
Re:Lord, lord, I remember those days... (Score:1)
If I recall correctly, it worked in concert with CBM's IEEE-488-like protocol that communicated with the drive. There were three lines - CLK, DATA, and ATN - and a rather baroque but effective dance done twixt the C64 and 1541 that allowed a device on the serial chain to take either a master or slave role. It was pretty sophisticated stuff, really. IMHO CBM's firmware was some of the slickest code I've ever hacked.
Anyway, fastloaders got around the 'may I have a byte, please?' 'Yes you can!' 'OK, give it to me.' 'Here it comes!' >get the byte 'I have it, thank you.' 'You're welcome.' cycle that made the 1541 so slow by running code in the drive (yes, you could do that! Gives me a woody just thinking about it!) that would read the disk directly, bypassing the OS entirely, decode the group codes on the disk with your own optimised code, and do serial I/O manually, splitting the data bytes into bit pairs and using two of the I/O lines to jam that data as fast as you could into the computer. Many programs doing this relied on knowing the clock speed and skew between the drive and computer CPUs: syncing up, asynchronously stuffing a bunch of bit pairs in a row, resyncing, and doing it again, and again.
This is fine for a US machine, but European machines have different clock speeds because of different video standards. I was stuck writing a fastloader that had to be portable to Europe.
It had to be synchronous.
I found that the XOR gate on the ATN line complemented the data on the data line. This allowed me to use two translation tables (one for CLK=1, one for CLK=0) and transfer data on the CLK *edge*, doubling my speed. I was faster than Apple's ProDOS on a C64!
Incidentally, for those who know the ins and outs of this stuff, the client didn't care about other devices being active on the serial bus. My previous code took that into account, but was US-only. These guys (Westwood) wanted portability. For it to work, only one drive and no printers could be on, or bus confusion resulted.
The lore involved in c64 fastdos stuff is really neat. Wish I had time and space to get into it more here.
Re:Perspective and Teaching (Score:2)
I have a hard time imagining handing off memory mapping to some external function. Probably that's why I still live in an assembly-language world. But I like being in control of what the pins on that sixteen-pin processor I embed code into are gonna do.
I admit it's anachronistic, of course. But remember that a sizeable proportion of the processors being fabbed are still 4- and 8-bit ones. 64-bit processors barely even make it onto the same chart.
Re:ambrosia from heaven (Score:1)
But aren't the legendary source notes from Unix V6 in published form now? (I can't remember the cute title they go by, but I've seen them everywhere.) They include the kernel source and comments.
It finally became clear to me (after browsing the source files of Ritchie's compiler) why the implicit int existed in the language. I never bought the "lazy programmer" excuse, but looking at the bottom of the first source file, the reason is obvious. Without a preprocessor, an implicit int declaration was the most readable way to define manifest constants (or magic numbers).
Neat =)
Re: disappearance of History (Score:1)
Microsoft paid me three cents for accepting Encarta '99. It was priced $24.97 and it had a $25 rebate coupon inside the box.
Re:interesting (Score:1)
Is Slashdot being archived anywhere? Do these "the thread dies by design after 48 hours" discussions get saved anywhere where they are searchable?
Re:Wait a second (Score:2)
Try getting & looking at the sources to gcc sometime. It's pretty interesting, although monstrous at this point.
--bdj
Re:a software repository for old code? (Score:1)