The Strange Birth and Long Life of Unix 293

riverat1 writes "After AT&T dropped the Multics project in March of 1969, Ken Thompson and Dennis Ritchie of Bell Labs continued to work on the project, through a combination of discarded equipment and subterfuge, eventually writing the first programming manual for System I in November 1971. A paper published in 1974 in the Communications of the ACM on Unix brought a flurry of requests for copies. Since AT&T was restricted from selling products not directly related to telephones or telecommunications, they released it to anyone who asked for a nominal license fee. At conferences they displayed the policy on a slide saying, 'No advertising, no support, no bug fixes, payment in advance.' From that grew an ecosystem of users supporting users much like the Linux community. The rest is history."
  • Future (Score:5, Insightful)

    by masternerdguy ( 2468142 ) on Friday December 02, 2011 @01:48PM (#38240990)
    I can see some form of UNIX making it to the 22nd century and beyond.
  • since the old versions were known as Version 5, Version 7, and so on.

    • System I, I think System II got the "versions", then they "jumped" to System III, although many people had already side-jumped to Berkeley.
      But indeed it started with System, not versions (but the "versions" made it popular :-))
      I found the "Programmer's Workbench" particularly interesting http://en.wikipedia.org/wiki/PWB/UNIX [wikipedia.org] which had all kinds of cool programming and text-processing tools :-)

    • by antdude ( 79039 )

      ... and don't call me Shirley. ... and stop calling me Shirley! :)

  • UNIX family tree (Score:5, Informative)

    by HockeyPuck ( 141947 ) on Friday December 02, 2011 @01:53PM (#38241072)

    Image from wikimedia of the UNIX Family Tree [wikimedia.org]

    • by the linux geek ( 799780 ) on Friday December 02, 2011 @02:02PM (#38241196)
      Top500 is basically irrelevant as a model of the server industry as a whole. UNIX is still kickin' on scale-up commercial servers and doing pretty well at it.
    • by sunderland56 ( 621843 ) on Friday December 02, 2011 @02:40PM (#38241752)
      Do you really consider Unix and Linux to be two separate things?

      If lawyers didn't exist, Linux would not have been needed.
      • by BitZtream ( 692029 ) on Friday December 02, 2011 @02:54PM (#38241946)

        Yes, they are different things.

        UNIX implies a specific API and several other things. Several OSes are UNIX, including Mac OS and Solaris.

        Linux is an OS that is not UNIX: it intentionally does not implement the requirements for being called UNIX, and as such it has never been, and will likely never be, certified as a UNIX.

        Just because you don't know what the words you use MEAN doesn't mean no one else does.

        • Re: (Score:3, Funny)

          Several OSes are UNIX, including Mac OS and Solaris.

          Right, of course, I had *totally* forgotten that MacOS and Solaris were binary compatible. My bad.

          Good thing I didn't confuse the terms "Unix" and "POSIX compliant". That would have been embarrassing.

        • Re: (Score:2, Insightful)

          by jedidiah ( 1196 )

          You sound like a Mac user that's content to use a very limited definition of what constitutes Unix in order to brag and create misleading ads.

          Those of us that actually use multiple Unixen on a daily basis and have done so for decades have a more complete picture of what a Unix is.

          The "several other things" are kind of important on a day to day basis.

          Linux is much more compliant as a Unix than MacOS is in this regard.

          Go play with your pretty pictures and stop trying to lecture real Unix users.

          • by Guy Harris ( 3803 ) <guy@alum.mit.edu> on Friday December 02, 2011 @03:41PM (#38242748)

            Those of us that actually use multiple Unixen on a daily basis and have done so for decades have a more complete picture of what a Unix is.

            And those of us who actually develop for multiple Unixes have to deal with all their quirks, and don't always find Mac OS X the quirkiest.

      • by jedidiah ( 1196 )

        No. Linux was needed because the FSF takes too long to finish anything. Even without the lawsuits, GNU would have had broader mass-market appeal than the BSDs.

      • Do you really consider Unix and Linux to be two separate things?

        "Unix", as in "the registered trademark "Unix"", is separate from "Linux", meaning either the Linux kernel or the set (equivalence class?) of Linux distributions. To be legally eligible to be called a "Unix", an OS has to pass the Single UNIX Specification test suite; as far as I know, nobody's run any Linux distributions through that test suite.

        However, Linux is most definitely a "Un*x", in that its API is a UNIX-derived API, even if it might not be able to check every single checkbox for the Single UNIX Specification.

    • by JWSmythe ( 446288 ) <jwsmythe.jwsmythe@com> on Friday December 02, 2011 @02:46PM (#38241860) Homepage Journal

      [smacks G3ckoG33k with a wrench, and drags him into another room]

      Look here, we have something to explain to you.. Unix spawned many variations.

      http://upload.wikimedia.org/wikipedia/commons/7/77/Unix_history-simple.svg [wikimedia.org]

      All were similar in concept, but had their own ways of doing things. As this branched away from a common path, most groups agreed on a common set of rules, known as POSIX.

      Once you've learned how one Unix-like environment works, you can use them all. You will find that a Linux server, an Android phone, a TiVo DVR, and even an Apple desktop, all operate in very similar ways, although each has its quirks.

      The outstanding rogue operating system now is Windows. They too have recognized that they are missing out by remaining completely non-compliant, and have begun incorporating various aspects of POSIX as add-ons (SFU or SUA) and 3rd-party (Cygwin) packages.

      The chart you displayed should have had the "Unix" name divided between major and minor groups: major being operating systems such as Linux, with minor elements combined as "Other Unix" and "Other OS". In that scheme, "Windows", having such a minor share, should have only been labeled "Other OS".

      In November 1993, Cray, Inc accounted for 40% of all systems in the graph, and the largest share of the "Unix" segment. It would have been a mixture of UNICOS, COS, and Solaris. "Unix" as a specific OS only accounted for 15%. Even those were simply the OS name provided for the list, as an indication of a Unix-like operating system, not that it was actually "Unix".

      Now get back out there, and don't make me hit you with a wrench again.
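The "common set of rules" the comment above describes is exactly what makes scripts portable across all those systems. As a minimal illustration (not from the original discussion; the path and strings are made up), here is a sketch that uses only POSIX-specified shell constructs and utilities, so it should behave the same on Linux, macOS, the BSDs, or Solaris:

```shell
#!/bin/sh
# Deliberately portable: every construct here is specified by POSIX,
# so it runs unchanged under any POSIX-conforming /bin/sh.

# POSIX parameter expansion instead of external basename calls
path="/usr/local/bin/example.txt"
file=${path##*/}        # strip directory part: example.txt
stem=${file%.*}         # strip extension: example

# POSIX test(1) and printf(1)
if [ "$stem" = "example" ]; then
    printf 'stem is: %s\n' "$stem"
fi

# POSIX utilities only: tr(1)
printf 'unix\n' | tr 'a-z' 'A-Z'    # prints UNIX
```

Nothing above relies on GNU or BSD extensions, which is the practical payoff of the standardization story told in the parent comment.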

    • The heydays ended ten years ago:

      http://en.wikipedia.org/wiki/File:Operating_systems_used_on_top_500_supercomputers.svg [wikipedia.org]

      The culprit? Linux.

      ...which is a UNIX-compatible OS.

      I'm curious how much recognizably-AT&T-derived code is in the current commercial UNIXes; probably more than in Linux distributions, but it might not be as much more than people think. UNIX's legacy is more the APIs and command-line interface than the actual code, and Linux has that stuff.

    • by DesScorp ( 410532 ) on Friday December 02, 2011 @02:59PM (#38242034) Journal

      The heydays ended ten years ago:

      http://en.wikipedia.org/wiki/File:Operating_systems_used_on_top_500_supercomputers.svg [wikipedia.org]

      The culprit? Linux.

      Linux is Unix. Even if it's not certified as such. If it walks like a duck, quacks like a duck, etc. People started using Linux in the first place because they wanted "a Unix" for personal use. Linux is just a clone of Unix. In the end, it's not really any more different from "Unix proper" than the various flavors of licensed Unix are from each other. I'd argue that most Linux systems are a good deal closer to, say, Solaris, than OS X is... an officially certified Unix.

      • by MightyMartian ( 840721 ) on Friday December 02, 2011 @03:49PM (#38242854) Journal

        No kidding. The whole "*nix" descriptor came about because there were operating systems that were actually licensed variants of Unix, and other systems that were Unix-like, but legally could not call themselves Unix. Unix vs. Unix-like was not a technical description, but rather a legal one. Since Linux supports pretty much all the major features found in actual Unix-based systems, for all intents and purposes it is a Unix variant, even if it is a rewrite.

    • The top500 is only an indicator of a specific use. And I suspect that Linux is used because of specific advantages like cost and flexibility. The ability to run Intel/AMD in combination with GPUs is a lot cheaper than hardware from a single maker. Also the Linux kernel can be tweaked for HPC. It is harder with commercial Unix. For the most part Unix is still being used for things like big iron.
  • I remember ... (Score:5, Interesting)

    by versimilidude ( 39954 ) on Friday December 02, 2011 @01:57PM (#38241112)

    I remember the first time I saw Unix, in 1976. The first step in installing it was to compile the C compiler (supplied IIRC in PDP-11 assembler) and then compile the kernel, and then the shell and all the utilities. You had an option as to whether you wanted to put the man pages online since they took up a significant (in those days) amount of disk space. Make was not yet released by AT&T so this was all done either by typing at the command line or (once the shell was running) from shell scripts.

    • Re:I remember ... (Score:5, Interesting)

      by Guy Harris ( 3803 ) <guy@alum.mit.edu> on Friday December 02, 2011 @02:48PM (#38241886)

      I remember the first time I saw Unix, in 1976. The first step in installing it was to compile the C compiler (supplied IIRC in PDP-11 assembler)

      As I remember, and as the "SETTING UP UNIX - Sixth Edition" document says (see the start *roff document in this V6 documentation tarball [tuhs.org] - yes, I know, tarballs are an anachronism here :-)), V6 came in a binary distribution that you read from a 9-track tape onto a disk:

      If you are set up to do it, it might be a good idea immediately to make a copy of the disk or tape to guard against disaster. The tape contains 12100 512-byte records followed by a single file mark; only the first 4000 512-byte blocks on the disk are significant.

      The system as distributed corresponds to three fairly full RK packs. The first contains the binary version of all programs, and the source for the operating system itself; the second contains all remaining source programs; the third contains manuals intended to be printed using the formatting programs roff or nroff. The `binary' disk is enough to run the system, but you will almost certainly want to modify some source programs.

      You didn't have to recompile anything (at least not if you had more than 64KB; I had to do some hackery with the assembler to get it to run on a 64KB machine, as there wasn't enough memory to run the C compiler - I had to stub out the pipe code with an assembler-language replacement for pipe.c, and then recompile the kernel with a smaller buffer cache and the regular pipe code). Most users probably either had to or had good reasons to recompile the kernel (different peripherals, more memory for the buffer cache - or less memory in my case, so I had to shrink it from 8 whole disk blocks to 6 - etc.), and if you weren't in the US eastern time zone or didn't have daylight savings time you had to change ctime.c, or whatever it was called, in the C library for your time zone, recompile the C library, and then rebuild all utilities with the new C library (no Olson code and database, no shared libraries, no environment variables so no TZ environment variable).
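The tape-to-disk copy step quoted above (12100 512-byte records on tape, with only the first 4000 512-byte blocks of the disk significant) can be re-created with dd. A sketch, with a scratch file standing in for the 9-track drive (on real hardware the input device would be something like a raw tape device; all file names here are illustrative):

```shell
#!/bin/sh
# Simulate the V6 distribution tape: 12100 records of 512 bytes each,
# per the "SETTING UP UNIX - Sixth Edition" figures quoted above.
dd if=/dev/zero of=tape.img bs=512 count=12100 2>/dev/null

# Copy to the "disk". Only the first 4000 512-byte blocks matter, so a
# safety copy of just those blocks is enough to guard against disaster.
dd if=tape.img of=disk.img bs=512 count=4000 2>/dev/null

ls -l tape.img disk.img
```

The arithmetic matches the document: 12100 x 512 = 6,195,200 bytes of tape image, of which 4000 x 512 = 2,048,000 bytes land on the significant part of the disk.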

    • by Thud457 ( 234763 )
      do I smell a whiff of Gentoo in here?
  • by cashman73 ( 855518 ) on Friday December 02, 2011 @02:01PM (#38241180) Journal
    Since AT&T was restricted from selling products not directly related to telephones or telecommunications, they released it to anyone who asked for a nominal license fee.

    It's interesting how AT&T couldn't support it for this reason, because today, UNIX is at the heart of both iOS and Android, which run some of today's most popular telephones.

    • by ackthpt ( 218170 ) on Friday December 02, 2011 @02:07PM (#38241264) Homepage Journal

      Since AT&T was restricted from selling products not directly related to telephones or telecommunications, they released it to anyone who asked for a nominal license fee.

      It's interesting how AT&T couldn't support it for this reason, because today, UNIX is at the heart of both iOS and Android, which run some of today's most popular telephones.

      Also at the heart of OS X. One of the smartest moves by Apple and Jobs, replacing the hideous old Mac OS with something built on Mach and borrowing heavily from BSD. Apple made the painful leap and it paid off handsomely.

      • I thought it was a Mach kernel and a BSD userland. How exactly that's quintessentially different from me installing Cygwin on my Windows machine and calling it a Unix machine is beyond me.

        • I thought it was a Mach kernel and a BSD userland. How exactly that's quintessentially different from me installing Cygwin on my Windows machine and calling it a Unix machine is beyond me.

          http://en.wikipedia.org/wiki/XNU [wikipedia.org]

          http://osxbook.com/book/bonus/ancient/whatismacosx//arch.html [osxbook.com]

        • I thought it was a Mach kernel and a BSD userland.

          It's a kernel that consists of Mach plus BSD code plus IOKit, with the BSD code modified to let Mach handle platform-specific stuff, the lower levels of process/thread management, and paging and let IOKit handle talking to hardware. Except when doing stuff such as sending Mach messages, userland talks to the BSD code for system calls - process management, address-space management, file system operations, and network operations all involve system calls to the BSD layer even if the system call in question ma

    • by AB3A ( 192265 ) on Friday December 02, 2011 @02:23PM (#38241518) Homepage Journal

      Yes, but imagine trying to argue this to Judge Greene during the breakup of AT&T in the early 1980s.

      AT&T stayed out of that fray because there was no way in hell that they could have argued that this was a possible outcome based upon what was going on with the state of the art at the time.

  • by Myself ( 57572 ) on Friday December 02, 2011 @02:03PM (#38241204) Journal

    Several issues of the Bell System Technical Journal tell the story of UNIX [alcatel-lucent.com], in their own words. This one [alcatel-lucent.com] in particular is interesting.

    • by Myself ( 57572 ) on Friday December 02, 2011 @02:07PM (#38241262) Journal

      Here's the index of the July-August 1978 issue [alcatel-lucent.com] where the whole series of articles appears. Better format than the search above.

    • Thanks! Fascinating reading. Like this snippet:

      UNIX systems generally have a good, though not impeccable, record for software reliability. The typical period between software crashes (depending somewhat on how much tinkering with the system has been going on recently) is well over a fortnight of continuous operation.

      (The term "fortnight" is not widely used in the U.S., so I'll clarify that a fortnight is two weeks.)

      • Cardinal sin of replying to myself, but this one is too good not to post (in the spirit of the apocryphal "640K should be enough for anyone"). From page 1962:

        ...most installations do not use groups at all (all users are in the same group), and even those that do would be happy to have more possible user IDs and fewer group-IDs. (Older versions of the system had only 256 of each; the current system has 65536, however, which should be enough.)

    • by freality ( 324306 ) on Friday December 02, 2011 @03:15PM (#38242326) Homepage Journal

      There's a talk from 1986 by Richard Hamming at Bellcore about how to do great research, which also ends with a short discussion of the conditions there that led to UNIX:

      http://www.paulgraham.com/hamming.html [paulgraham.com]

      The whole talk is really excellent, and there's this theme in it that the really great things come from some unexpected places, by the compounding of seemingly unrelated character traits, work habits and organization dynamics.

      At the end in the Q&A, Hamming gets into a short discussion with the host Alan Chynoweth about the origins of UNIX, eliciting from Alan a favorite quote:

      "UNIX was never a deliverable!"

      expanded:

      "Hamming: First let me respond to Alan Chynoweth about computing. I [was in charge of] computing in research and for 10 years I kept telling my management, ``Get that !&@#% machine out of research. We are being forced to run problems all the time. We can't do research because we're too busy operating and running the computing machines.'' Finally the message got through. They were going to move computing out of research to someplace else. I was persona non grata to say the least and I was surprised that people didn't kick my shins because everybody was having their toy taken away from them. I went in to Ed David's office and said, ``Look Ed, you've got to give your researchers a machine. If you give them a great big machine, we'll be back in the same trouble we were before, so busy keeping it going we can't think. Give them the smallest machine you can because they are very able people. They will learn how to do things on a small machine instead of mass computing.'' As far as I'm concerned, that's how UNIX arose. We gave them a moderately small machine and they decided to make it do great things. They had to come up with a system to do it on. It is called UNIX!

      A. G. Chynoweth: I just have to pick up on that one. In our present environment, Dick, while we wrestle with some of the red tape attributed to, or required by, the regulators, there is one quote that one exasperated AVP came up with and I've used it over and over again. He growled that, ``UNIX was never a deliverable!''"

  • Makes me wonder whether or not we'd be using as many Windows machines had the government allowed AT&T to sell and market Unix.

    • Makes me wonder whether or not we'd be using as many Windows machines had the government allowed AT&T to sell and market Unix.

      No. Windows got ahead because it was designed primarily as a platform for running high level applications, such as word processors and spreadsheets, by single users on microcomputers rather than being designed as a multi-user, general purpose platform for programmers and other users who could invest a little more time in learning their way around the operating system. Also, Windows was backwards compatible with an operating system (DOS) which ran on older computers that did not have the hardware resources

      • by jedidiah ( 1196 )

        Windows was born ahead.

        It was born ahead, because it was the successor to DOS.

        DOS was successful because of its association with the previous computing monopoly, namely IBM.

    • Windows? Windows didn't even exist back then. The competitor on the low end was CP/M. A few years later MS introduced their CP/M clone, DOS. Windows came about a decade later. No one used UNIX on personal computers because it was only lightweight by mainframe/minicomputer standards. Most personal computers didn't have protected memory, and multitasking was a completely pointless operating system feature on a system that barely had enough RAM for one program.

      Windows got ahead because DOS was already

      • by swb ( 14022 )

        Most personal computers didn't have protected memory, and multitasking was a completely pointless operating system feature on a system that barely had enough RAM for one program.

        To read the descriptions of the early systems UNIX ran on, 640K sounds like a lot of memory, especially when combined with one of those beefy 10 MB early hard disks.

        Or was the 8086 really that crippled? I seem to remember some kind of UNIX-alike OS running on it, but maybe it was on better 80286 CPUs or even 386s.

    • Ironically, Microsoft's first saleable OS was a flavor of UNIX called Xenix. But Xenix on 80286s was really lame compared to UNIX on a PDP-11 or VAX. UNIX wasn't really that efficient on a PC until the 80486s in the mid-1990s. That was fortunately about the same time Linus started his version. Microsoft sold Xenix to SCO after it developed MS-DOS. SCO patent-trolled it unsuccessfully for many years.
  • According to a friend of mine (who had a single-digit Unix license #), AT&T originally refused to release UNIX on the advice of their lawyers because the anti-trust agreement prevented them from getting into non-phone markets. The universities who wanted access to the then-fledgling OS then sued them over a clause that prevented AT&T from suppressing technology. The universities won that battle.

    So (after probably sticking their tongue out at the lawyers who originally nixed the release) they released UNIX ... and were then sued by other computer companies for violating the "phones only" clause of the anti-trust agreement. AT&T also lost that battle.

    So now it was law. They couldn't suppress the technology, but they couldn't market or support it because it wasn't directly phone-related. That's where they came up with the rather convoluted system where, for a nominal price ($1 for universities; $20K, I think, for companies) and the signing of a non-disclosure agreement, anybody could get a mag tape with a working system, source code, a pat on the back and a 'good luck'.

    ALL support was done by users (who, pretty early on, got better at it than any company would have been) -- but the non-disclosure agreement meant that you couldn't just post a file with the fixed code in it... so that's where diff(1) patches came into play -- they exposed the fix without exposing too much of the source code. In some cases where patches were extensive, the originator of the patch would simply announce it and require people to fax a copy of the first page of their license before being emailed the fix.

    AT&T was also rather pedantic about protecting their trademark, which resulted in people often using the UN*X moniker rather than include the trademark footnote at the end of their postings.
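The diff-and-patch workflow described above still works the same way today. A sketch (file names and contents are invented for illustration, and the modern unified format is used for readability; early diff(1) produced other formats):

```shell
#!/bin/sh
# The users-supporting-users fix flow: ship only a diff(1) delta, which
# exposes the fix without exposing the bulk of the licensed source.
mkdir -p orig fixed
printf 'line one\nline twoo\nline three\n' > orig/util.c
printf 'line one\nline two\nline three\n'  > fixed/util.c

# The originator distributes only the delta (diff exits 1 when the
# files differ, which is expected here)...
diff -u orig/util.c fixed/util.c > util.patch || true

# ...and a licensee applies it to their own copy of the source.
cp orig/util.c mycopy.c
patch mycopy.c < util.patch
cat mycopy.c
```

The patch file reveals only the changed lines plus a little surrounding context, which is exactly why it satisfied the non-disclosure terms the comment describes.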

  • Seems like this sort of story always brings out the low-number /.'ers. I remember one post in the last few years where each reply was by a lower user ID until someone showed up with a number under 1000. (If I remember right, lol. Memory is not my strong suit now. And the older I get, the less I care about that. lol) While this was all happening, I was changing vacuum tubes in military crypto boxes. lol Hell, I remember my dad testing our TV's vacuum tubes at the A&P grocery store.
    • I remember using NCR SVR4 MP-RAS, which was NCR's flavor of UNIX after they were split off from AT&T in the late '80s.

      Okay, now it's someone's turn from the upper five digit cadre. ;-)

    • by Myself ( 57572 )

      Okay, I've got a five-digit...

  • by Kamiza Ikioi ( 893310 ) on Friday December 02, 2011 @03:03PM (#38242130)

    'No advertising, no support, no bug fixes, payment in advance.'

  • The article is well written but I am not sure they checked their facts... here is a direct quote from the article: "It even runs some supercomputers." Now just head over to the TOP500 page (http://i.top500.org/stats) and sort by OS... I wouldn't call >80% just 'some supercomputers'.
  • The "first brood" of high level languages- COBOL, LISP and FORTRAN- are well into their 2nd half century. I would not be surprised if they last a century along with UNIX and C.
