Ritchie Releases Early Compilers

Slamtilt writes "Dennis Ritchie has posted the source for two ancient C compilers here. They date from '72-'73, and he says he feels "an element of embarrassment about displaying it", but also that they may be of some historical interest... "
  • I would hope so... continuously modifying assembler is such a pain. Think of the C coding like dynamic macros.
  • will the internet be a valid backup device ?

    Isn't that what Torvalds does? He uploads so people will mirror it, so that he doesn't have to back up. ;-)



    ---
    Have a sloppy night.
  • by Anonymous Coward
    Actually, it is very simple.

    The world's very first 'C' compiler was written in 'C'; it was then 'bootstrapped' by running it through a 'C' interpreter.

    Cute, and effective. K&R were gods in their day.

  • Yeah, Time travel and temporal telepathy... Those are viable archiving formats. Say, do you think my DAT is compatible with those?
  • Another interesting link is the full two-volume Unix V7 manual, online at

    http://plan9.bell-labs.com/7thEdMan/index.html

    I was gratified to find it there, since my old (printed, Bell Labs original) V7 manuals are yellowed and cracking and on the verge of dying.

    -E
  • This is very true... and it's a stark contrast to today's code. Back then, when 64K meant you were really rich and you probably had 10 megs' worth of platter drives lying around along with the tape machines, code had to be written elegantly and as efficiently as all get out. I.e., it was nice tight code. Today? Everything is bloated beyond belief, written very sloppily, and as slow as slow can get. There are many things in Linux/GNU that are the exception to this: kernel 2.2 is starting to get tight again, X servers are getting better, and WindowMaker is just plain scary (98% faster than E or KDE... WOW!)

    If we wrote software as tight as they did back then, a 486 running Linux would make an 8-processor PIII-550 machine running NT look slowwww.

    I can't wait for the grass-roots movement to start tightening up what we use today... it's only gonna get faster and better!
  • Heh. I remember back in high school, I put some trojans in the login program for RSTS/E running on a PDP11-34.

    • The master password: AINLEY (named after Anthony Ainley, who played "The Master" on Dr. Who. ;-)
    • Normally, we would run "power programs" (programs that let non-privileged users do privileged things) by putting trojans in one of the shared directories (account [1,2]) with the privilege protection bit set. (Analogous to putting a setuid-root program under /bin in Unix.) Then we'd just hope that the sysadmin never noticed. Or sometimes we would create privileged accounts. But we usually got caught.

      Then one day, I found out that logged-out jobs were always privileged. So I put a trojan in the login program that would make it run another program that was in my directory. My program, although it didn't have the priv bit set in the protection code, would be running as a privileged job. Then I'd have it log in as me. When the sysadmin looked at me with SYSTAT, he'd see me logged in on a non-priv account running a non-priv program, so he was never the wiser...

      Until he found my copy of the modified login source code! ;-)

    Ah, the memories...



    ---
    Have a sloppy night.
  • Cool. Useful if I ever decide to type in the printout from my "Lions Commentary" (though of course I'd have to locate some really old hardware, which I don't have, to actually do that.)

  • can be found at ftp://gatekeeper.dec.com/pub/digital/sim/ [dec.com]. The UNIX binaries you can find there actually include C compiler source code so you could see what changed.
  • No, not being sarcastic. I browsed through the code, and it is kinda impressive. I've never written a compiler before, but I must admit, I see some of the errors I used to make in programming, plus some of the ingenuity that leads up to today.

    An idea for those with disk space, and some unf.. a software repository for old code?

  • Many of the people reading Slashdot (myself included) have probably heard in class of the days when programmers would write code in such a way as to conserve memory, and of course, be as efficient as possible.
    Actually, Linux (the kernel) frees the memory used for initialization code/data even today, so this is not all history. Anybody know of userspace code doing this ?
  • Ken Thompson modified the C compiler to recognize when it was compiling login - it added a trojan.

    He then modified the compiler to recognize when it was compiling the compiler so it inserted the login trojan _and_ the code to modify the compiler when it was compiling the compiler.

    Was it this version of the compiler?

    http://www.cs.umsl.edu/~sanjiv/sys_sec/security/thompson/hack.html
  • Yikes! That is much older than the Lions code.
  • The community was smaller in those days. Also, there was no such thing as IEEE floating point. These two facts combined to put me on the phone with Dennis for about an hour one day, trying to get a PDP-11/45&70 C compiler to compile floating point code on a PDP-11/40 via floating point emulation. It did work, eventually.

    Later, someone came out with a PDP-11/40 C compiler that used 11/40 floating point instructions.

    It was a lot of fun trying to match up C compilers with machines that would "sort of" run them as long as you didn't declare any floating point variables. The compiler itself was "float safe" in that if you didn't declare floating point, it wouldn't use it. You had to use a "-f" flag to compile floating point, though, so that the right library stuff would get loaded to save & restore the floating registers.

    C programmers had to know assembler in those days. Believe it.
  • I think it's great that stuff like this is being found and preserved.
    I think it's wonderful too... I hope that somewhere, they are able to find code from original Unix(es). _That_ would be interesting. I'd think that surely, somewhere, there's still an ancient box that has that stuff on it. Still chugging along, attesting to the power and glory of it all.
  • waste() /* waste space */
    {
    waste(waste(waste),waste(waste),waste(waste));
    waste(waste(waste),waste(waste),waste(waste));
    waste(waste(waste),waste(waste),waste(waste));
    waste(waste(waste),waste(waste),waste(waste));
    waste(waste(waste),waste(waste),waste(waste));
    waste(waste(waste),waste(waste),waste(waste));
    waste(waste(waste),waste(waste),waste(waste));
    waste(waste(waste),waste(waste),waste(waste));
    }
  • For some reason I can't reach the site. Is it the /. effect playing its tricks again?

    Anyway, the move is fundamental. We have been quite careless about keeping bits of code and source for the future. I do keep some ol' stuff around, and probably many people do too. But it looks much like the stuff I forget in the attic.

    It is fundamentally important to keep these things for future generations. After all, it was such things that gave birth to the world we have now. Frankly, our frenetic "go forward" mood, and the backward-incompatibility games M$ plays, may lead us to losing the roots on which we all stand. What will my grandchildren face when they try to look at my life? A few broken pieces of a CD labelled "Windows 95"? Who was Torvalds? A mad Finnish hacker who made UNIX fit in 3 inches?

    I wonder if anyone notices this problem. We are losing some good pieces of History. Software is highly volatile in terms of preservation. We might have lost 80% of it already. Maybe it is mostly worthless stuff, but we all have to keep in mind that no one has a Future without knowing their Past.

    I think it is time to call for such things. To give rebirth to old software. To build museums where one can look at the ol' days. To let someone smile at the thought that 15 years ago "I used THAT thing".

    Let's greet Ritchie's move. And also Borland's; as /. remarked, they have done such a thing too.
  • A second, less noticeable, but astonishing peculiarity is the space allocation: temporary storage is allocated that deliberately overwrites the beginning of the program, smashing its initialization code to save space. The two compilers differ in the details in how they cope with this. In the earlier one, the start is found by naming a function; in the later, the start is simply taken to be 0. This indicates that the first compiler was written before we had a machine with memory mapping, so the origin of the program was not at location 0, whereas by the time of the second, we had a PDP-11 that did provide mapping.

    I find that this passage is one of the more interesting comments made on the two compilers. Many of the people reading Slashdot (myself included) have probably heard in class of the days when programmers would write code in such a way as to conserve memory, and of course, be as efficient as possible.

    I was once shown a board with a Motorola 6800 (if memory serves) series microprocessor on it, with a hexadecimal keypad and an LED display. It had no internal clock, which would have made keeping time quite a challenge for someone who is used to simply grabbing it using predefined functions.

    This old code has many a use. Primarily as a teaching tool: make the students look at the code, and give them specifics on how the compilers worked. In many ways, it's like looking at an antique car; you seem to be able to relate more to what is being taught to you, since you have something concrete to apply your knowledge to.

    Personally, it's hard to imagine a computer functioning without mapped memory... Being able to see that early C compilers functioned without it brings, to me, a whole new perspective on computing. My 0.02c

  • Dennis announced this over in comp.lang.c, mentioning how he got the compilers off of old DECtapes, and a group regular immediately flamed him (very tongue-in-cheek) for being off-topic. ;^) The entire thread is simply hilarious, if you're into this sort of humor. Check out Dennis' original posting at Deja (News)!!
  • Yes, slashdot archives itself. After a certain period of time it converts the entire page to a threadless static HTML page (exactly like if you go to preferences and change your view mode to "flat"). It's nearly impossible to follow the discussion in the old pages, but at least all the comments are still there.
  • Gee, I don't feel so bad about my lack of comments, now.
    -russ
  • > Ken Thompson modified the C compiler to
    > recognize when it was compiling login -
    > it added a trojan.

    As far as I know this was talked about in a paper published in the early to mid eighties.

    I'm not sure whether the actual code was ever written. It may not have been published back then, but only released recently.

    Roger.

  • I just checked, and Linux is mentioned one single time on the Encarta '99 CD-ROM.

    ... but what was the quote?

    Microsoft paid me three cents for accepting Encarta '99. It was priced $24.97 and it had a $25 rebate coupon inside the box.

    ... and you paid the post office $0.33 to mail the rebate back

    -Brent
  • "C programmers had to know assembler in those days. Believe it."
    My personal feeling is that they should know assembler TODAY, as well, so as to have some frigging idea of the consequences of their coding. Doesn't hurt to solder up a micro from scratch, using only one's wits and a data book or two, either. I did that in '83 with an 8085 (my first computer!) and it simply rocks!
    Not to mention the manly confidence that understanding a system from the metal up gives one!
    8-}
    Helps when writing fastloaders for C64's, lemme tellya!
    Found a nifty use for that XOR gate on the 1541's ATN line, for sure!
    (if you understood what I just said, you really scare me)
    I've counseled a number of newbies to take this exact tack in their own educations, contrary to popular wisdom, and each and every one thanked me profusely for suggesting it. They eventually ran rings around their classmates, and actually GOT something out of (Acck!) DeVry!
  • I agree. I learned 8086 assembly while I was in 5th grade. I grew up with a PC/XT, and I learned about counting clock cycles, using registers, and basically how to milk my 4.77 MHz for all it was worth. The first time I ever earned money for programming was when I wrote a TSR program for some guy... I've never regretted the time I spent in DEBUG.

    Moving ahead 10 years, in one of my CS classes, the teacher was trying to explain why you can't return a pointer to a local variable. I was dumbfounded, thinking, "duh! It's on the stack, dummy!" But nobody else in the class except for the teacher had ever really done anything with assembly language, and they were pretty much clueless about what actually goes on behind the scenes. In another class, where part of the grade was the speed of the executable, again few people had used ASM, so it wasn't too hard to stay in the top 6 for most of the programs. Nobody else even considered how long it takes to divide, or how to organize the arrays (or use pointers) so that a multiply isn't needed on array access (or at least keep it to one multiply, not two or three).

    On the other hand, the optimizing compilers are making this a confusing issue. Some tricks still work, but I am no longer sure of the fastest way to do things.

    Nowadays, hardware is getting cheaper while programmer time is getting more and more scarce. Often, it seems that it is more important to make the code readable and logical than to make it fast and efficient. Or is that just an excuse...
  • k & r were no gods. they made a verbose,
    write-only, CaSe sEnSeTiVe 'language'
    that is popular only because 'if its hard
    to understand, it must be powerful'
    they are boobs compared to
    chuck moore - inventor of forth.


  • all c code looks like that to me.
  • by Anonymous Coward
    in Linux, userspace programs are mapped into the process space as Read-Only segments by exec(), normally. this means one can't overwrite startup code. however, Linux pages-out unused pages. a page is a 4k block (on i386; may vary for other processors). if a page is in a read-only segment mapped from a file, Linux optimizes by not writing it to swap (considers the file as the swap). therefore, if the init code is packed together, it will be automatically freed whenever RAM is needed.

    this is based on my knowledge of the OS, and may be different from reality. there are other possible memory-management behaviours for differently-optimized systems.

    matju(@)sodom.dhs.org
  • he noted that he got the source off old tapes which he had carefully archived.

    will the internet be a valid backup device ?

    In the sense of letting people have copies of the source? Will that be enough to make sure the code is safe for all time? My gut is saying NO, but my head keeps muttering about nodes, and the US DoD inventing it to survive the BIG one.

    What do you think?



    a poor student @ Bournemouth Uni in the UK (a dyslexic, so please don't moan about the spelling, but about the content)
  • Web sites change so fast, but some of these should be archived as well. Even now, sites designed only 2-3 years ago look primitive.
    When sites have existed for 10-20 years or more, it would be interesting to take a 'tour' of a site through time, and see how it evolved.
  • The top of the thread at Deja is at A primeval C compiler [deja.com]
  • by scrytch ( 9198 ) <chuck@myrealbox.com> on Saturday July 31, 1999 @05:00AM (#1773372)
    Hm. Something else Microsoft embraced and extended, it would seem :^)
  • by Ektanoor ( 9949 ) on Saturday July 31, 1999 @06:16AM (#1773374) Journal
    Things were clearly not going well that day over Redmond's Holy See. Supreme Cardinal Gates IX was already up since dawn. While his face kept the pale and emotionless look, anyone could note how nervously he hit the small squares on Minesweeper 4D. When Inquisitor Ballmer IV came in he ordered everyone to leave them alone.

    "How did this happen?" - he asked. Even behind glasses his eyes were icy cold.

    "Well, it is a damn astronomical trick that is creating havoc. It seems that gravitational lenses also do a good deal of reflecting radio waves..."

    "So?"

    "Well, the deal is that we are getting back radio waves beamed into the Cosmos at the end of the XXth century!"

    "Damn! That's the worst problem we've had since Reno IV banned paper for the danger of its use by criminals and terrorists... We have to do something about it."

    "Well, your Holy Highness, we have very little time. As far as I know, some people are already aware of the existence of the Linux source code. We keep claiming that Linus was absolutely mad and the code meaningless. But the worst is about to come. You see, the data that's coming down is nearing the time when the US commemorated 30 years of the Moon landing."

    "No one has ever landed on the Moon!" - Gates cried, while his face looked as if his internal processor had overheated.

    "Well, you know as perfectly as I do that this is not exactly the case. It was your grandfather who wiped out all records about the Moon before 2100, when Surveyor I landed with Windows9999 on it. He thought it would be a great marketing move to sell the new OS with the label: "A small step for an OS, a big leap for Mankind..."

    "Well, well, well - OK. I probably let my nerves get the better of me. Anyway, that's not too critical. We can explain all that away as another Hollywood blockbuster made in those times. Anyway, people would hardly believe that anyone could rise from a Microsoft(TM) ChairMouse to take even three steps to the fridge... So what's really worrying you?"

    "Well... uh... Ritchie..."

    "WHAT!!!?"

    "You see the founder of C had published the underlying code he used to create UNIX, somewhere near this time. This can have terrible consequences for us. People will know that there were other languages beyond QuickBasic. They will know about UNIX. They may then link all that with the cryptic meaning of Linux source code. And then they will know that Linus Torvalds was not mad at all..."

    "We are in deadly danger..."

    "What shall we do?"

    "Well, pick up the M$ Windows Central MegaServer and blow it up with a GPF. Meanwhile, we will explain to people that this is due to the CdC and "Richard Stallman" Front guerrillas trying to undermine our society once again. Besides, gather the GUIs of everyone who managed to see the Linux code, and track them. We need to isolate them from everyone else. As for me, I'll try to divert the public from this by announcing the new M$ HyperOffice Application Server."

    Some years later, the M$ MegaServer downtime was noted on M$ Encarta2000 as : "It was not a bug, just a feature"

  • "You see the founder of C had published the underlying code he used to create UNIX, somewhere near this time. This can have terrible consequences for us. People will know that there were other languages beyond QuickBasic. They will know about UNIX. They may then link all that with the cryptic meaning of Linux source code. And then they will know that Linus Torvalds was not mad at all..."

    micro$haft was (is) a major player in making C as popular as it is. Most of Windoze is C. Remember that next time you wanna write a mission-critical application in C.
  • A friend of mine was webmaster for a computer club at the university when the hard disk with the web pages on it died horribly. This was perhaps three or four years ago, when Altavista was relatively new, and my friend got the idea that maybe they had it archived. A short mail to Altavista, and a few days later the computer club received their entire site as a mail attachment from the friendly guys over at Digital. Now, that's service!

    My friend was given the club's "Hack of the Year" award that year for this exploration of distributed backup schemes... *grin*
  • How is a C compiler written in C??? Something about that does not make sense.
  • Folks, THIS CODE IS NOT ANCIENT HISTORY! It's what, a few decades old, at best. And yet, it's treated as an ancient artifact. A relic of long-bygone days.

    "Long bygone days" when we were still landing men on the moon. (Interesting that this came up so soon after the 30th anniversary of Apollo 11's landing, eh?)

    The thing to remember is that computing's pace of change is an aberration compared to most other technologies (like the space program). We don't have hyperspace travel. We don't have manned interplanetary travel. We don't even have the capability to do a lunar mission these days! The computing equivalent would be having dumped Unix and its progeny in the dustbin, leaving the last two Unix machines ever built as museum pieces displayed at two of the national Computing Centers where they now use some really neat looking (and "reusable") minicomputers that can communicate with each other as fast as 9600bps (the bandwidth equivalent of low Earth orbit) or maybe even 57600bps (geosynch) with the right boosters.

    Depressed yet?

    Don't be. At least computing is that exception; it gives us an interesting, useful field that changes daily (just read /. to see how much it changes). Yes, too much change can be overwhelming--but the lack of change can be stifling. There may not be a middle ground, either (this isn't Goldilocks and the Three Bears).

  • Um... you badly encrypted a disk, what?

    My first computer was a C64, but I didn't know enough back then to tinker with it on that level.

    I don't solder, but I do read the assembler output from compilers. It's instructive, but no programming teacher in the world would pass me if I wrote code like that... (well, I just knew where the memory was allocated, so I used it, and I knew what the offset was, what's wrong with that? Oh, and those jumps and increments work just like a for loop, what's the problem, really?)
  • Post-CD, of course, the problem is moot. Every version of the Linux that made it onto a CD will be around for 100 years or more, somewhere, for instance.
    I believe that CDs are turning out to be not nearly as durable as expected - many are becoming unreadable after only a few years. And finding a CD reader may be very difficult 100 years from now when everyone's using data crystals or DNA storage or who knows what.

    Recovering old data isn't trivial. About a year ago I talked with a fellow who was involved in a project to analyse old Landsat data, in an attempt to get a baseline for climate-change info. The data was stored on magnetic tape that had been warehoused for something like 25 years. They had to get tape drives from scrap merchants, and go through a laborious process to restore the tapes - slowly baking the tape to drive out the moisture it had absorbed over the years, and scraping it with sapphire-edged blades to remove accumulated gunk.

    On a more personal scale, I've got several 5.25 floppies (360k format) worth of old BBS philes and posts that may never get read again, just because of the hassle.

  • Are you joking, yeah?
    I don't know where you were during the DOS times and the first OS wars, but let me remind you of a few facts:

    A: Windows was originally written in Pascal. Later, for several reasons, it turned into C. Unfortunately, the conversion was made in such a way that a lot of M$ Windowz C code is a hell of a half-hybrid mess. I know because I worked with it.

    B: I don't know what the prices for Visual C are now. But when I cared about it, it cost a hell of a lot. Even Borland C++ was much cheaper, and it cost only $250. Is this what you mean by popularity?

    C: I have seen several shows put on by Microsoft people around here. They advertise everything they can. But one thing that I clearly noted is that C, C++, the SDK, the DDK and similar stuff are poor relatives in these shows. Note that I am talking about developers.

    D: The above doesn't apply to VB. At least back then, when I still had some interest in Windows. No matter that most of the people around here were C and/or Pascal fans, these guys were always trying to convince us that VB was the "real thing".

    E: A mission-critical application in C? Well, I agree, as long as you don't presume Windows.
  • My post is rather old by now, so no one is likely to read this, but for the sake of completeness, I'll explain, cursorily, what that XOR gate did, to the extent that I can remember. (Incidentally, I worked this out by *tracing circuit board foils* to figure out what was going on. How many of you HLL weenies have ever done THAT, eh?)
    If I recall correctly, it worked in concert with CBM's IEEE-488-like protocol that communicated with the drive. There were three lines - clk, data, and atn - and a rather baroque but effective dance done twixt the C64 and 1541 that allowed a device on the serial chain to take either a master or slave role. It was pretty sophisticated stuff, really. IMHO CBM's firmware was some of the slickest code I've ever hacked.
    Anyway, fastloaders got around the 'may I have a byte, please?' 'Yes you can!' 'OK, give it to me.' 'Here it comes!' >get the byte< 'I have it, thank you.' 'You're welcome.' cycle that made the 1541 so slow, by running code in the drive (yes, you could do that! Gives me a woody just thinking about it!) that would read the disk directly, bypassing the OS entirely, decode the group codes on the disk with your own optimised code, and do serial I/O manually, splitting the data bytes into bit pairs and using two of the I/O lines to jam that data as fast as you could into the computer. Many programs doing this relied on knowing the clock speed and skew between the drive and computer CPUs: just syncing up, asynchronously stuffing a bunch of bit pairs in a row, resyncing, and doing it again, and again.
    This is fine for a US machine, but European machines have differing clock speeds, because of different video standards. I was stuck doing a fastloader that had to be portable to Europe.
    It had to be synchronous.
    I found that the XOR gate on the ATN line complemented the data on the data line. This allowed me to use two translation tables (one for clk=1, one for clk=0) and transfer data on the clk *edge*, doubling my speed. I was faster than Apple's ProDOS on a C64!
    Incidentally, for those who know the ins and outs of this stuff: the client didn't care about other devices being active on the serial bus. My previous code took that into account, but was US-only. These guys (Westwood) wanted portability. For it to work, only one drive and no printers could be on, or bus confusion resulted.
    The lore involved in C64 fast-DOS stuff is really neat. Wish I had time and space to get into it more here.
  • Bare-hardware programming is quite different from the kind of thing they teach in most schools. I've had to pick up embedded projects started by people used to coding in an operating-system environment (where, as you mention, the time can be determined by grabbing a function). Their code on bare-bones hardware usually lacks a robust initialization section at startup to properly initialize timers and I/O. When there are NO startup services, and NO OS to call on for anything, you learn pretty fast that every initial condition (i.e. handling the I/O ports on the processor itself, plus startup conditions for any external peripherals) is very important. That status LED will burn up pretty damn fast if you're using low-duty-cycle, high-current pulses to drive it (an important way to get optimal efficiency from an LED in battery operation) and you forget to turn it off as one of the first operations out of the reset vector.

    I have a hard time imagining handing off memory mapping to some external function. Probably that's why I still live in an assembly-language world. But I like being in control of what the pins on that sixteen-pin processor I embed code into are gonna do.

    I admit it's anachronistic, of course. But remember that a sizeable proportion of the processors being fabbed are still 4- and 8-bit ones. 64-bit processors barely even make it onto the same chart.


  • I just woke up, so my mind isn't entirely engaged, but aren't the legendary source notes from Unix V6 in published form now? (I can't remember the cute title they go by, but I've seen them everywhere.) They include the kernel source and comments.

    It finally became clear to me (after browsing the source files of Ritchie's compiler) why the implicit int existed in the language. I never bought the "lazy programmer" excuse, but looking at the bottom of the first source file, the reason is obvious. Without a preprocessor, an implicit int declaration was the most readable way to define manifest constants (or magic numbers).

    Neat =)
  • I just checked, and Linux is mentioned one single time on the Encarta '99 CD-ROM.

    Microsoft paid me three cents for accepting Encarta '99. It was priced $24.97 and it had a $25 rebate coupon inside the box.
  • Nope. The Internet will not be a valid backup device. It fails dismally at the task, as the primary function of the Internet still appears to be to scatter knowledge as widely as possible. It's not so bad as the disconnected BBSes of the 80's, but it's definitely not being archived anywhere in particular. Plus, we have things like DejaNews deciding it's better to become a 'portal.' Not that there's a heck of a lot worth archiving on most of Usenet. It frightens me to think that Usenet posts could be viewed as one of the significant historical records of our time at some point in the future.

    Is Slashdot being archived anywhere? Do these "the thread dies by design after 48 hours" discussions get saved anywhere where they are searchable?
  • Each new C compiler is written in C and compiled using an older compiler. Obviously, somewhere along the line a compiler was once written in assembly.

    Try getting & looking at the sources to gcc sometime. It's pretty interesting, although monstrous at this point.

    --bdj

  • I mean something like Apache 1.0, Linux 0.8.1 and whatnot. It's too much of a scavenger hunt to find some of the old software. They are so much like historic works of art that we can learn from.
