
Dennis Ritchie Interview

A reader wrote to us with the news that LinuxWorld is currently running an interview with Dennis Ritchie, Unix guy, C author, and Plan 9 proponent.
  • by Anonymous Coward
    Dennis M. Ritchie heads the system software research department at Bell Laboratories' Computing Science Research Center.

    Ritchie joined Bell Laboratories in 1968 after obtaining his graduate and undergraduate degrees from Harvard University. He assisted Ken Thompson in creating Unix, and was the primary designer of the C language. He helped foster Plan 9 and Inferno.

    He is a member of the US National Academy of Engineering and is a Bell Laboratories Fellow, and has received several honors, including the ACM Turing Award, the IEEE Piore, Hamming, and Pioneer awards, the NEC C&C Foundation award, and the US National Medal of Technology.

    Ritchie out loud

    Check out our audio file at http://mithras.itworld.com/media/001021kalev_ritchie.ram [itworld.com] to hear further conversations between Dennis Ritchie and Danny Kalev.

    LinuxWorld.com: Can you introduce us to Plan 9 (see Resources [slashdot.org] for a link), the project in which you're currently involved, and describe some of its novel features?

    Dennis Ritchie: A new release of Plan 9 happened in June, and at about the same time a new release of the Inferno system, which began here, was announced by Vita Nuova. Most of the system ideas from Plan 9 are in Inferno, but Inferno also exploits the exceptional portability of a virtual machine that can be implemented either standalone as the OS on a small device, or as an application on a conventional machine.

    As for Plan 9, it combines three big ideas. First, system resources and services are represented as files in a directory hierarchy. This comes from Unix, and it was worked even better in Linux, but Plan 9 pushes it hardest. Not only devices, but things like Internet domain name servers look like files. Second, remote file systems -- likewise not a new or unique idea. But if all system resources are files, grabbing bits of another machine's resources is easy, provided the permission gods permit. Third, and unusual, is that the namespace -- the hierarchy -- of files seen by a particular process group is private to it, not machine-wide.
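    (An aside to make the third idea concrete: in Plan 9 a process can graft a remote machine's resources into its own private namespace with a couple of system calls. Below is a minimal sketch in Plan 9 C; the dial string and the choice of /net are illustrative, not from the interview.)

    #include <u.h>
    #include <libc.h>

    void
    main(void)
    {
        int fd;

        /* connect to a file server exported by another machine */
        fd = dial("tcp!otherhost!exportfs", 0, 0, 0);
        if(fd < 0)
            sysfatal("dial: %r");

        /* graft the remote /net over ours; only this process
           group's namespace changes, not the whole machine's */
        if(mount(fd, -1, "/net", MREPL, "") < 0)
            sysfatal("mount: %r");
    }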

    LinuxWorld.com: C and Unix have exhibited remarkable stability, popularity, and longevity in the past three decades. How do you explain that unusual phenomenon?

    Dennis Ritchie: Somehow, both hit some sweet spots. The longevity is a bit remarkable -- I began to observe a while ago that both have been around, in not astonishingly changed form, for well more than half the lifetime of commercial computers. This must have to do with finding the right point of abstraction of computer hardware for implementation of the applications.

    The basic Unix idea -- a hierarchical file system with simple operations on it (create/open/read/write/delete with I/O operations based on just descriptor/buffer/count) -- wasn't new even in 1970, but has proved to be amazingly adaptable in many ways. Likewise, C managed to escape its original close ties with Unix as a useful tool for writing applications in different environments. Even more than Unix, it is a pragmatic tool that seems to have flown at the right height.
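    (That descriptor/buffer/count shape is still exactly what the calls look like today. A self-contained C sketch of the whole model, with invented file names:)

    #include <fcntl.h>
    #include <unistd.h>

    /* copy src to dst using nothing but the primitives above */
    int copy(const char *src, const char *dst)
    {
        char buf[8192];
        ssize_t n;
        int in  = open(src, O_RDONLY);
        int out = open(dst, O_WRONLY | O_CREAT | O_TRUNC, 0644);

        if (in < 0 || out < 0)
            return -1;
        while ((n = read(in, buf, sizeof buf)) > 0)
            write(out, buf, n);    /* descriptor, buffer, count */
        close(in);
        close(out);
        return 0;
    }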

    Both Unix and C gained from accidents of history. We picked the very popular PDP-11 [industrial computer] during the 1970s, then the VAX during the early 1980s. [See Resources [slashdot.org] for links to both.] And AT&T and Bell Labs maintained policies about software distribution that were, in retrospect, pretty liberal. It wasn't today's notion of open software by any means, but it was close enough to help get both the language and the operating system accepted in many places, including universities, the government, and in growing companies.

    LinuxWorld.com: Five or ten years from now, will C still be as popular and indispensable as it is today, especially in system programming, networking, and embedded systems, or will newer programming languages take its place?

    Dennis Ritchie: I really don't know the answer to this, except to observe that software is much harder to change en masse than hardware. C++ and Java, say, are presumably growing faster than plain C, but I bet C will still be around. For infrastructure technology, C will be hard to displace. The same could be said, of course, of other languages (Pascal versions, Ada for example). But the ecological niches you mention are well occupied.

    What is changing is that higher-level languages are becoming much more important as the number of computer-involved people increases. Things that began as neat but small tools, like Perl or Python, say, are suddenly more central in the whole scheme of things. The kind of programming that C provides will probably remain similar absolutely or slowly decline in usage, but relatively, JavaScript or its variants, or XML, will continue to become more central. For that matter, it may be that Visual Basic is the most heavily used language around the world. I'm not picking a winner here, but higher-level ways of instructing machines will continue to occupy more of the center of the stage.

    LinuxWorld.com: What is your advice to designers of new programming languages?

    Dennis Ritchie: At least for the people who send me mail about a new language that they're designing, the general advice is: do it to learn about how to write a compiler. Don't have any expectations that anyone will use it, unless you hook up with some sort of organization in a position to push it hard. It's a lottery, and some can buy a lot of the tickets. There are plenty of beautiful languages (more beautiful than C) that didn't catch on. But someone does win the lottery, and doing a language at least teaches you something.

    Oh, by the way, if your new language does begin to grow in usage, it can become really hard to fix early mistakes.

    LinuxWorld.com: C99, the recently ratified ANSI/ISO C standard, contains several new features, such as restricted pointers, variadic macros, bool, and new libraries for complex and type-generic arithmetic. Are you satisfied with C99?

    Dennis Ritchie: I was satisfied with the 1989/1990 ANSI/ISO standard. The new C99 standard is much bulkier, and though the committee has signaled that much of their time was spent in resisting feature-suggestions, there are still plenty of accepted ones to digest. I certainly don't desire additional ones, and the most obvious reaction is that I wish they had resisted more firmly.

    Of the new things, restricted pointers probably are a help; variadic macros and bool are just adornment. I've heard the argument for complex arithmetic for a long time, and maybe it was inevitable, but it does somewhat increase the cross-product of the type rules and inflate the library. One issue the question didn't mention is the introduction of the "long long" type and its implications, which is one of the more contentious issues in discussion groups about the language -- and it also makes the type-promotion rules much more complicated. But of course, 64-bit machines and storage are here, and it had to be faced.
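    (For a taste of the promotion subtleties he mentions, here is a small example; it assumes a 32-bit int, the common case today:)

    #include <stdio.h>

    int main(void)
    {
        unsigned int a = 0xFFFFFFFFu;    /* 4294967295 with 32-bit int */
        long long b = a + 1;             /* wraps to 0 BEFORE widening */
        long long c = (long long)a + 1;  /* widens first: 4294967296 */

        printf("%lld %lld\n", b, c);     /* prints: 0 4294967296 */
        return 0;
    }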

    I'm less ecstatic about the C99 standard, but don't denounce it. They did a pretty good job; C does have to evolve. I was not involved with its work, but was given opportunities to snipe or contribute earlier. So I won't do much second-guessing after the fact.

    LinuxWorld.com: Considering proprietary languages such as Java and C#, was the decision to make C free deliberate? C users sometimes complain that standardization bodies have no teeth and cannot force vendors to provide standard-compliant implementations. What is your preferred model of language development and standardization?

    Dennis Ritchie: I can't recall any difficulty in making the C language definition completely open -- any discussion on the matter tended to mention languages whose inventors tried to keep tight control, and consequent ill fate.

    I'm just an observer of Java, and where Microsoft wants to go with C# is too early to tell. Although Sun doubtless has spent more on Java as a strategic tool than would be justified simply by garnering some publicity for neat research work by Gosling and company, they've been quite open about the language specification as such. But of course they have been regarding the whole Java package (with libraries) as strategic versus Microsoft and other competitors.

    True enough that standards bodies themselves have weak teeth, but they do have influence and importance when a language begins to be widely used. Partly this is simply because it does allow public comment, partly because it adds a certain gravitas to the project. If there is an ISO or ANSI standard, and you distribute a product that claims to conform, your customer has at least a hook for arguing to you when it doesn't.

    On the other hand, the "open evolution" idea has its own drawbacks, whether in official standards bodies or more informally, say over the Web or mailing lists. When I read commentary about suggestions for where C should go, I often think back and give thanks that it wasn't developed under the advice of a worldwide crowd. C is peculiar in a lot of ways, but it, like many other successful things, has a certain unity of approach that stems from development in a small group. To tell the truth, I don't know how Linus and his merry band manage so well -- I couldn't have stood it with C.

    This whole area is complicated and there is no single lesson to be drawn from its history, except that early and extreme attempts at close control are likely to be detrimental.

    LinuxWorld.com: When will we have a C99-compliant edition of The C Programming Language? (See Resources [slashdot.org] for a link.)

    Dennis Ritchie: This is a question about which Brian [Kernighan] and I have thought hard and long, with considerable advice and assistance via email, Usenet, visits from our publisher, and interviews like this one. And we're still thinking. We are prepared to announce that we have not committed ourselves either way.

  • by Anonymous Coward
    For years I worked with DEC VMS (and IBM mainframes using VM/CMS). Both are rock solid, reliable as granite. Most desktop PeeCees don't have that same kind of reliability where you can beat on them like a bitch and they keep on humping. Great stuff. I would venture to say that most Slashdot readers have had little experience beyond the PeeCee.
  • Other synonyms (taken from the Vandals song "I've Got an Ape Drape" of their 1998 album "Hitler Bad, Vandals Good"): ape drape, mullet, normal neckwarmer, hockey hair, achy breaky hair, forbidden hair, shong.


    "It's short in front and long in back"

  • Perhaps that's why the Linux source is a convoluted mess of hacks on top of hacks. When somebody adds a new feature it breaks half the other features in the kernel. That's what OO programming is designed to prevent - you can replace an object without breaking the rest of the program as long as the public interfaces are kept consistent. Apparently Linus et al haven't read anything on good programming habits since around 1970.
  • I especially like his comparison of a new language becoming widely used to hitting the lottery.

    Although I wonder what the odds are that I would ever write a script with a lottery ticket?

  • First of all, parameterized types come from C++, not C, so you'd be introducing part of something foreign to the language without introducing all of it (e.g. mystruct<long>).

    Second, the whole idea of short, int, and long is source-level (not binary) portability -- so there is no standard size for any of them. All that you can assume is that sizeof(short) <= sizeof(int) <= sizeof(long). I'm not saying this was necessarily the best idea (hence all the typedefs for INT32, etc.), but it is pretty neat if you have to write a C compiler on a platform with, say, 29-bit words. So specifying the size of the type as you suggest would run counter to the C philosophy.

    Other than that, it's a great idea.<grin>

  • by dvdeug ( 5033 )
    >> Ada was beautiful...

    > Rubbish - the type system is flawed and the syntax is ugly.

    Why is the type system flawed? Are you complaining about specific features, or the whole concept of a highly rigid non-inferring type system?

    Syntax is in the eye of the beholder. Personally, I like block style (if .. then .. end if) better than C's (if (...) statement;). A true for loop that does everything a for loop should do without being the generic control structure is nice.

    I looked at your link to Limbo, and I'm not impressed with Limbo's syntax. Too quiet. I prefer a language to loudly tell me what's going on, rather than putting a lot of meaning into punctuation.
  • For cases where you have lots of little things packed together, you have bitfield structure members. (Not that those have nice packing guarantees, but they are implementation-defined and thankfully the implementations I've worked with are sane.)

    Ideally, your code shouldn't rely on data types being exactly some width, but rather rely on them being at least some width. After all, signed integer overflow is actually undefined by ANSI, though every platform I've run into says it behaves as you'd expect two's complement arithmetic to behave. Of course, none of that stops people (including myself) from writing code that relies on the specific width of the data types.

    What we really need are the "unspecified width integral types" such as we have today in int and friends, and a new set of types (or type aliases) for "exact width integral types". The latter might not get complete support from a conforming compiler, but at the same time, could make life for a bit-fiddler like me much nicer.

    --Joe
    --
    Program Intellivision! [schells.com]
  • And don't forget, randomly switching from night to day to night. "You stupid, stupid people!"

    --Joe
    --
    Program Intellivision! [schells.com]
  • Cool! I had missed that in C99. I knew about some of the other features (some of the initializer stuff, restrict pointers, the new complex type, etc.) but I had missed that one.

    Thanks for the pointer!

    --Joe
    --
    Program Intellivision! [schells.com]
  • Yes, but the standard didn't need extending at all.

    Strictly, no. Pragmatically, yes. There is (unfortunately) too much code which uses long where int would suffice (at least on modern machines).

    long could just have been made 64-bit (or 128 or 256 or whatever).

    Agreed. Indeed, that's what TI's compiler does for the C6000-family DSPs. The long type is 40 bits wide, since the DSP supports a 40-bit type in hardware (for high-precision filters). Unfortunately, it breaks code which assumes long is exactly 32 bits, and it causes code which only needed 32 bits (but which otherwise doesn't break) to run much less efficiently. It's very annoying when a customer compiles their code and says "the output is big and slow, your compiler sucks", when really the problem is that their variables are declared as longs rather than ints.

    That said, you could argue the case for needing a new type to hold integers larger than the machine's word length

    Personally, I'd be happy if there was a portable way to get at something like a carry bit, so that arbitrary precision arithmetic isn't so painful. Right now, if I want to do arithmetic at a size larger than the largest available integer type, I either have to jump through hoops to figure out what the carry was supposed to be and do math at the maximum machine word size, or do math at some smaller size and use upper bits to represent the carry. Annoying, annoying, annoying.
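    (For what it's worth, plain C can at least recover the carry after the fact, because unsigned addition is defined to wrap mod 2^N. A sketch, with made-up names:)

    #include <stdint.h>

    /* 64-bit add built from 32-bit halves; the carry out of the
       low word is simply "sum < addend" after the wrap */
    typedef struct { uint32_t lo, hi; } u64pair;

    u64pair add64(u64pair a, u64pair b)
    {
        u64pair r;
        r.lo = a.lo + b.lo;
        r.hi = a.hi + b.hi + (r.lo < a.lo);  /* 1 iff the low add wrapped */
        return r;
    }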

    and retain a traditional unsigned long as the word length (so that you can cast to and from pointers).

    Ick! Ick! Ick! Actually, on the platforms I'm interested in, sizeof(int) == sizeof(void *), but not necessarily sizeof(long) == sizeof(void *). Assuming you can typecast a pointer to an integer type and back is asking for trouble. (Although historically (and for old K&R C compatibility), typecasting between a pointer and an int is usually possible and fairly reliable across 32-bit environments.)
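    (C99 does add an escape hatch here: the optional uintptr_t from <stdint.h>, an integer type guaranteed -- where it exists -- to hold a void * losslessly:)

    #include <stdint.h>

    void *roundtrip(void *p)
    {
        uintptr_t bits = (uintptr_t)p;  /* pointer -> integer */
        return (void *)bits;            /* ...and back, unchanged */
    }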

    --Joe
    --
    Program Intellivision! [schells.com]
  • ... didn't get the joke. The mother language of B and (later) C was BCPL. So it stands to reason that the next two languages should be P, and then L. Of course, some in the perl community have claimed they're the PL...

    ObPun: Back before C was called C, it was called New B. I guess even Ritchie and Thompson aren't immune from l33t sp33k.

    --Joe
    --
    Program Intellivision! [schells.com]
  • That's amazing, considering the plastic bricks came about 30 years [ocregister.com] after WWI was over.
  • When was the last time you saw a story on /. about VMS that wasn't FUD? NEVER? CmdrTaco must have some unholy pact with AT&T / Bell Labs to market UNIX System V release 4. Does Bell Labs UNIX run on Digital's superior VAX architecture? NO! (But before I get flamed, UC Berkeley's "BSD" UNIX does) UNIX is a toy operating System!
  • yes, the comp.os.plan9 newsgroup is getting more and more popular, there are many people who use it daily (and prefer it)... the code is easy to understand and play around with.

    the general fact is that whoever managed to install plan9 has a very high opinion of it.
  • It's been a while, but wouldn't it be Octopoi? I remember my Greek teacher saying that the plural of Hippopotamus should really be Hippopotamoi. Of course this was Ancient (Attic) Greek so maybe it's different anyways.

    In case you can't tell, I don't know what I'm talking about.
  • 30 years!? and we complain about 2.4.0 being late...
  • I was curious if anyone had started working on a successor to C++ yet?

    ...anyone other than Sun and Microsoft?

    -c

  • Digital/Compaq end-of-lifed the VAX architecture earlier this year. You can still run VMS (OpenVMS, really) on Alphas.
  • Take your pick. LISP. ADA. Smalltalk.
  • Presumably this post is a joke, since you can't buy a new VAX any longer, but yes, Bell Labs UNIX does run on the VAX. 32V is a port of Seventh Edition UNIX to the VAX. The Eighth Edition of Bell Labs research UNIX runs on the VAX only (and I think the same is true of Ninth and Tenth), though that is derived from BSD UNIX. I doubt that System V Release 4 has ever run on the VAX, but I could be wrong.

  • by volpe ( 58112 )
    Yeah, I loved Ada. Especially how I had to write external C functions to do bitwise operations because Ada has no facilities for this.
  • The "long long" hack in C99 is just plain stupid. How is C/C++ going to be patched *cough hacked cough* to support 128-bit integers? "long long long"?

    You obviously didn't read much about the new standard. C99 now provides typedefs which specify the type's size. If it's important for you that you have an unsigned integer which is EXACTLY 64 bits, you can use the type uint64_t (and write constants of that type with the UINT64_C() macro). This is all contained in <inttypes.h>; we now have typedefs specifying integer types of exactly n bits and at least n bits. If all your code is type-size dependent, simply use the new types and you'll never have to change your code when 128-bit types become available.

    You can get more information at http://web.onetelnet.ch/~twolf/tw/c/c9x_changes.html [onetelnet.ch].


    Regards,

  • The "long long" hack in C99 is just plain stupid. How is C/C++ going to be patched *cough hacked cough* to support 128-bit integers? "long long long"?

    I made a proposal a while back to add the keyword "extra", and define long and short as 32 bits and 16 bits, respectively.

    64 bit data would be "extra long", 128 bit data would be "extra extra long", etc.

    This would add a true integral (not char) 8 bit type, the "extra short". Bool could go away, since it's redundant with "extra extra extra extra short".

  • > you obviously don't know C!
    Whoa there Nelly! That's incorrectly inferring A LOT from just ONE post!

    > integer has no fixed size.
    True. Only a RELATIVE and MINIMUM number of bits is specified. This is guaranteed by ANSI:
    i.e.
    sizeof( long ) >= sizeof( int )
    sizeof( double ) >= sizeof( float )
    and
    sizeof( char ) >= 1 byte
    sizeof( short ) >= 2 bytes
    sizeof( int ) >= 2 bytes
    sizeof( long ) >= 4 bytes

    See the ARM [att.com] 3.6.1 and 3.2c Numerical Limits, which states "This section defines the minimum numerical limits that a C++ implementation consistent with the ANSI C standard will provide in the header files <limits.h> and <float.h>."
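    (The minimums are easy to inspect on any given platform; a tiny C program, with the guaranteed floors noted in comments:)

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* actual values may be larger; the standard only
           promises these minimums */
        printf("CHAR_BIT = %d\n", CHAR_BIT);    /* >= 8 */
        printf("INT_MAX  = %d\n", INT_MAX);     /* >= 32767 */
        printf("LONG_MAX = %ld\n", LONG_MAX);   /* >= 2147483647 */
        return 0;
    }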

  • > How about reusing parameterized type syntax, e.g. long<64>, long<128>.

    Yes, it's a real pity that C99 didn't adopt that syntax.
  • Plan 9's been around for a while, but it's catching on very well now. this is largely due to the recent new release and the open source license it's released under. the new release also includes substantial reworking of the internals and numerous other technical improvements. i highly recommend giving it a shot.
    i've been using Plan 9 for a few years. my progression (outside of work) basically went Micro$oft, Linux, FreeBSD, Plan 9. i've tried various Linux distros, *BSD, Solaris, and a few other smaller projects. Plan 9 blows them all away, hands down, in terms of everything but hardware support and application base.

    Linux has a good goal - make a free (or open, or whatever) Unix kernel. Unix is good, free is good. but it's about cloning Unix. it's not about advancing the state of the art; it's not about innovation. and adding new support for a really fast networking card or getting a file system to run 15% faster isn't innovation. Linux doesn't advance the art of the OS any. if you're interested in that, look to Plan 9 and Inferno.
  • Degrees are not for everybody, but would benefit more people than the /. population would have you believe. The value is not in what computer skills the degree teaches specifically, but it does teach you about a whole lot of ideas people have already had.
    All that background reading may be a drag at times, but it will stop you spending time reinventing a whole lot of wheels and give you more time for coming up with those neat new things.
    It will also probably make you better at developing and improving your ideas.
    Depending on your school, you may have the opportunity to sit face to face with a large number of very good professors and grad students and discuss stuff. Online communication is fine but doesn't compare to face-to-face brainstorming sessions; an awful lot of good stuff has been invented in the coffee room.
  • It was all downhill after E
    Rules? My language don't need your stinkin rules.
  • The "short long" is a popular haircut amongst professional wrestlers, primadonna soccer players, and residents of Indiana.
    (buzzed on the top, but long in the back)

    As such, it should not be used in C/C++
  • In general computer languages become less readable and explainable in direct proportion to the amount of punctuation in them. This is why anyone can read gwbasic but perl is hard to read.

    On the other hand perl is more concise.

    DCL has the worst of both worlds. It has tons of punctuation and is less concise. Users prefer bash.

    In general VMS is ok if you want to set up a mail server or something that doesn't require any attention after the initial setup. But no one really gets excited about these types of applications.

    Unix is more loved than VMS because it is user friendly. Imagine that Linux was LinVMS for a second...

    What a horrid thought.

  • You could make long long 256-bit, ...

    The question isn't so much "how to support an X-bit type" (where X is 64, 128, 256, whatever).

    The *real* question is how to support ALL of the X-bit types: 8,16,32,64,128,... at the same time -- either that or we'll just change all the network (etc) protocols such that there are no more 8 or 16 bit-sized fields.

    Yes, there are ways to work around this, but they are all reasonably gross (IMO).

  • Having used ML in the past I'd agree with this; except for I/O, which is horribly bolted onto the rest of the language. I guess I won't be using it to write device drivers any time soon.
  • I'm really surprised how so many free software advocates really rag on C++ continuously. I've been reading rant after rant about how C++ is really a bad solution to just about any problem.

    I know and use several languages (C, perl, python, tcl, FORTRAN, pascal, lisp, C++, java, BASIC) and of all of them, C++ is my runaway favorite. Why: it's the only one of the languages I know that's really about what you can do, while the others are all about limiting you.

    C++ is the sharper blade; it is full of features that you can use if you want to and know how - but you don't have to. It gets its bad rep because you can also use it if you don't know how. (It's also suffering from a moving ABI, and g++ is still maturing, but those issues are easy enough to work around.)

    btw java is my least favorite language - mostly because of all the marketing and hype associated with it. To me it's just an unremarkable lisp subset with C++-ish syntax and oft times a virtual machine backend.

  • No, it's an operating system. It's true it has a virtual machine, but it's an operating system.
  • I'd rather have a short black ...

  • What we really need are the "unspecified width integral types" such as we have today in int and friends, and a new set of types (or type aliases) for "exact width integral types".
    It's funny you should say that... because the C99 standard provides precisely that with its intX_t types in stdint.h. Details in this slightly out of date summary [onetelnet.ch], or section 7.18 of the C99 standard.
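    A quick taste (any C99 compiler; note the exact-width types are optional on unusual hardware, while the least-width ones are always present):

    #include <inttypes.h>
    #include <stdio.h>

    int main(void)
    {
        uint32_t exact = 0xDEADBEEF;     /* exactly 32 bits, where available */
        int_least16_t atleast = -5;      /* at least 16 bits, always there */
        uint64_t big = UINT64_C(18446744073709551615);  /* 64-bit constant */

        /* PRIu64 expands to the right printf conversion for uint64_t */
        printf("%" PRIu64 " %u %d\n", big, (unsigned)exact, (int)atleast);
        return 0;
    }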
  • The whole thing was about various languages. Plan 9 has its own language - Alef. Why haven't they mentioned it? It's definitely (at least) interesting. And why has everything about it disappeared from the Internet? Censored ;)?

    I have an alter-ego at Red Dwarf. Don't remind me that coward.

  • ... does it support Beowulf clustering?

    ha ha.
    but to give a serious answer to a jokey question, beowulf clustering under plan 9 would be a doddle. it's a naturally highly distributed system.

    you'd hardly need any glue code at all.

  • Bell Labs isn't part of AT&T, it is part of Lucent. Lucent does not sell UNIX. Lucent doesn't really sell Plan 9 either.
  • We still have two MicroVAXen running in our lab. They do the data acquisition and control of a nuclear physics experiment. The only thing we had to replace in 10 years was the SCSI disks - they were too small (only 20MB) - now they are 1GB. Most of the programs are written in FORTRAN. The best thing about the OS is its backup handling of files: if you overwrite a file, the system automatically makes a backup copy (filename with appended number), and you can have as many backup copies as you want (very useful in environments where several users fiddle with the system *g*). One can also access files on other VAXen over DECnet without using special networking tools - very handy.
  • Even worse than sizeof(long) == 4 are the people who insist that Microsoft uses typedefs like BYTE, WORD, and DWORD in an attempt to proprietarize C instead of doing it to make the definition of a file format (let's say .BMP) platform independent.
  • This story reminds me of the time microsoft was a part of linux back in the day. Bill Gates was an expert genius at microcoding the hardware of today in most unix machines. And thanks to the wonderful work of most of today's leaders such as Steve Jobs we can all sleep peacefully thanks to the invention of the century that will be embedded within everything we buy by the year 2002. Thank you Bill. You're the best!
  • and then you posted a comment so your moderation was undone, so it's still at +2.
  • Different endings in Greek form their plurals differently, much as in English. So, yes, the plural of -os is -oi, but at least in modern Greek (I don't know ancient Greek) the plural of -i (Octopodi is octopus) is -ia (Octopodia).
  • You could make long long 256-bit, long 128-bit, int 64-bit, and short 32-bit if you really, really needed to. The standard certainly permits that.

    Yes, but the standard didn't need extending at all. long could just have been made 64-bit (or 128 or 256 or whatever). There was no need to add a new "long long" type. That said, you could argue the case for needing a new type to hold integers larger than the machine's word length, and retain a traditional unsigned long as the word length (so that you can cast to and from pointers).

  • VMS FUD? Whatchya talkin about Willis? How can you FUD a great machine like the VAX? Slashdot has had a few great articles on this legend. [doa.org] VMS, although different than UNIX in terms of syntax, was also a very reliable system used for much the same scientific purposes. In the three years I was at my university, the only two times the VAX failed were when a garbage truck backed into the substation transformer or the air conditioner failed. Despite its arcane syntax (cd == SET DEFAULT .[DIRECTORY]) it and the people who used it were great.
  • Plan9 != VMS
    wow, the lameness filter said no to the above line.
    ------------
    a funny comment: 1 karma
    an insightful comment: 1 karma
    a good old-fashioned flame: priceless
  • I bet he could be talking about Modula-2 which is beautiful in many ways. (but silly in some other ways too...)

    I guess the heyday of M2 was in the late 1980's when some good cheap compilers were available for Atari ST's and Amigas. The C compilers that I could get for my Atari ST either didn't have any floating point support, or they were too expensive for a poor student. M2 was $30 I think, so I learned it.

  • To tell the truth, I don't know how Linus and his merry band manage so well -- I couldn't have stood it with C.
    This is a good question. How do they manage? I can't even get my group of friends to pick a restaurant.
    Also, "Linus and his merry band" would be a great name for a rock group.
  • Shouldn't that be P and not D?
  • How is C/C++ going to be patched *cough hacked cough* to support 128-bit integers? "long long long"?

    How about reusing parameterized type syntax, e.g. long<64>, long<128>. For that matter, long would be identical to int<2> or even char<4>.

  • The "long long" hack in C99 is just plain stupid. How is C/C++ going to be patched *cough hacked cough* to support 128-bit integers? "long long long"?

    The only obvious solution - 128 bit integers will be represented by either "loooong" or "damn long", and "long long" will be deprecated in favor of "looong" or "rather long".

    The new language hybrid thus produced will be called "(C)obol" - the (C) means I have the copyright, and will demand royalties for each line of code produced to encourage terseness in the industry.
  • I'm not sure if you're asking about an undergrad or graduate degree - for me the undergrad degree in computer science was a great idea but I'm sure going to grad school would have been a huge mistake for me - I was not ready for that at the time.

    On the general question of degrees, I don't know about other degrees, but I've found most of my computer science degree to be incredibly helpful - and of all the people I've worked with, most of those I've respected the most and who produced the best work seemed to have CS degrees. Of course there are exceptions both ways, but a CS degree is a great way to spend a few years thinking about CS in general and then spend a while applying all of the cool abstract things you learned.

    I think probably going back to school for a year or so every ten years would be the best possible idea, but I'm not sure if I'll listen to my own advice.

    Related to your question is my opinion on why so many of these cool things seem to come from people who have graduate degrees - that's because people who have graduate degrees are very used to publishing ideas, and also able to document ideas in a clear and readable fashion (for the most part). Even if you have an undergraduate degree you are probably not used to writing at the level a grad student is required to, and it will not be in your nature to do so if you have a cool idea.

  • A lot of us have come to believe that a degree is not worth the cost, but I have noticed that many people who come up with very neat stuff have degrees. I am beginning to ponder if my choice of skipping school to work is the right one. Anyone wanna help me compile a list of smart people and their inventions, and what degrees they have, etc.?

    "Ritchie joined Bell Laboratories in 1968 after obtaining his graduate and undergraduate degrees from Harvard University."

    segmond
  • Hmm, you better tell Vita Nuova (the Inferno people). They mistakenly believe it's an OS. Here's an excerpt from their announcement:

    Some seem to think that browsers should become operating systems, with sprawling functionality and clunky system calls. We thought it was much more stylish and productive to embed a proper operating system inside a browser. So we did.
  • Imagine that, putting an OS inside a browser. What does that do to Microsoft's antitrust case argument that the browser should be part of the OS?
  • by rjh ( 40933 )
    Unless I miss my guess, the Standard doesn't specify a size for char, either. It does guarantee that char <= short int <= int <= long int, but nothing more than that.

    Note that it's perfectly valid to have a C compiler where all the integer data types are of the exact same size.
  • anyone had started working on a successor to C++ yet?

    One well-known person in the C++ community was asked to speculate on what the next ISO C++ Standard would include (this was about a year after the standard had been released). He answered, "I love speculation. Work on [the next library] will start in March, 2012, shortly after the new ISO C++ 2011 was adopted. It will support 256 bit integer types, and a new library header <voicerecognition>."

    There have been plenty of direct "successors" to C++ in the last decade. None have caught on.

    "long long long"

    If you have hardware to support 128-bit stuff, then the system headers will have a datatype already. Personally, I think exponents are the way to go: "long^6 foo;" for a one-kilobit signed integer.

  • Is Plan 9 taking off?? I would really like to ditch this Linux crap and use something a little more current!!

    even though plan 9 itself is 9 years old

  • Linus does it just like the rest of us, ..., oh, er, nevermind.

    Seriously, Linus does it with strong leadership attained by actively (but not obviously) pursuing his apparent Godhood status.

    If people will die for their Gods, certainly they will not whine when some of their code hits the cutting floor.

    --
  • He helped write the best programming language, best operating system, and my favourite book (K&R)

    --
  • Ada was beautiful...

    Rubbish - the type system is flawed and the syntax is ugly.

    and before you ask - yes I DO know what I'm on about. I worked on a Validated Ada compiler. It was the most compact (lines of code) validated compiler of its day. I worked on several areas of validation suite compliance and an Ada Debugger.

    I then went on to work on much more interesting things and a much more interesting and elegant language - Limbo [vitanuova.com]

  • Alef was dropped from the latest version of Plan9 (3rd Edition). It was proving too irksome to port "yet another compiler" to each Plan9 platform, so a C threads library was written and many Alef programs ported to use it.

    Many Plan 9ers have lamented the loss of Alef. If you are interested in Alef-like languages you should check out Limbo, the Inferno [vitanuova.com] programming language. Inferno runs hosted under Plan 9 and many other operating systems. There's even a browser plug-in so that you can run Limbo apps in an Internet Explorer web page!

  • I prefer a language to loudly tell me what's going on, rather than putting a lot of meaning into punctuation.

    I have found limbo to be the most readable programming language I have encountered.

    Of course, any language can be used to write poorly structured or obfuscated code. But some languages make it difficult to write clear, concise, and coherent code.

    Contrary to your preference, I find that excessive syntactic sugar gets in the way of quickly determining the intent of a piece of code. (viz. source code with excessive comment lines - you soon lose the plot, especially since comments are almost always out of date w.r.t. the code!)

    I agree that such sugar can make poor code more readable by emphasising syntactic structures. This is a poor substitute for good coding.

  • Didn't Digital ditch the VAX architecture? Or was it just the VMS OS?
  • After a brief and unsuccessful first attempt at uni I joined the workforce convinced a degree wouldn't make much difference. To an extent it didn't; I earn more than most of my degreed friends, for example.

    But now I'm doing comp sci by correspondence. I mainly started it because my employer would pay for it, but now I wish I'd done it earlier. The big difference is not only being trained in how to think and how to problem-solve, but having "useless theory" to fall back on when learning new things.

    I'd definitely recommend going back to school; in fact I think correspondence is the way to go if you want to do comp sci, since you do it at your own pace when you want. For me that's doing a huge amount of work to get ahead, then slackening off and falling behind, then getting ahead again; repeat.

  • by Anonymous Coward on Tuesday December 05, 2000 @10:20AM (#580401)
    How is C/C++ going to be patched *cough hacked cough* to support 128-bit integers? "long long long"?

    no, the proposed change is:

    char char char char char char char char char char char char char char char char

  • by Mr Z ( 6791 ) on Tuesday December 05, 2000 @11:32AM (#580402) Homepage Journal
    i.e. The "long long" hack in C99 is just plain stupid. How is C/C++ going to be patched *cough hacked cough* to support 128-bit integers? "long long long"?

    You could make long long 256-bit, long 128-bit, int 64-bit, and short 32-bit if you really, really needed to. The standard certainly permits that.

    What really grinds me is that so many people assume sizeof(long) == 4 or worse sizeof(long) == sizeof(int) == 4. On the C6000-family DSPs, long is actually 40 bits long whereas int is 32 bits. You'd be surprised how many people this trips up.
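    One way to sidestep the whole question (my sketch, not part of the parent post) is to keep type widths out of your data layout entirely and serialize with shifts and masks, which behaves identically whether long is 32, 40, or 64 bits:

    /* emit the low 32 bits of v as exactly 4 big-endian bytes */
    void put32(unsigned char out[4], unsigned long v)
    {
        out[0] = (unsigned char)(v >> 24);
        out[1] = (unsigned char)(v >> 16);
        out[2] = (unsigned char)(v >> 8);
        out[3] = (unsigned char)(v >> 0);
    }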

    --Joe
    --
    Program Intellivision! [schells.com]
  • by cpeterso ( 19082 ) on Tuesday December 05, 2000 @11:48AM (#580403) Homepage
    The "long long" hack in C99 is just plain stupid. How is C/C++ going to be patched *cough hacked cough* to support 128-bit integers? "long long long"?

    Just make "long long" 128 bits. Make "short long long" 64 bits. Though, I'm not sure what "short long" or "long short" would be... maybe 16 bits?

  • by Azog ( 20907 ) on Tuesday December 05, 2000 @12:12PM (#580404) Homepage
    I think Linus does it by occasionally smacking people. For instance, on the Linux Kernel mailing list today, in the middle of a very technical discussion of how to fix a problem that was causing file system corruption, he posted:
    ...
    Are you all on drugs?
    ...
    Get your acts together, guys. Stop blathering and frothing at the mouth.
    ...
    This may sound really harsh taken out of context - in context I get the impression he was a little annoyed but still smiling.

    I'm not sure how much he does it on purpose and how much is just his personality, but he keeps a pretty tight grip on the overall direction of the kernel, mostly by understanding the code better than anyone else.

    Torrey Hoffman (Azog)
  • by nakaduct ( 43954 ) on Tuesday December 05, 2000 @11:42AM (#580405)
    Shouldn't that be P and not D?
    Whyzat?
    From the Jargon File [astrian.net]:

    C n. ... 3. The name of a programming language ... so called because many features derived from an earlier compiler named `B' in commemoration of its parent, BCPL. [Before C++] there was a humorous debate over whether C's successor should be named `D' or `P'.

    cheers,
    mike

  • by Junks Jerzey ( 54586 ) on Tuesday December 05, 2000 @11:15AM (#580406)
    In a past interview, he specifically mentioned Standard ML as a beautiful and practical language that he was surprised didn't catch on.
  • by UnknownSoldier ( 67820 ) on Tuesday December 05, 2000 @10:02AM (#580407)
    Dennis Ritchie made an interesting comment on new languages
    "There are plenty of beautiful languages (more beautiful than C) that didn't catch on."


    Does anyone know what [programming] languages he is specifically talking about?

    I was curious if anyone had started working on a successor to C++ yet?

    i.e. The "long long" hack in C99 is just plain stupid. How is C/C++ going to be patched *cough hacked cough* to support 128-bit integers? "long long long"?

    Don't get me wrong, I love C, but it needs to be cleaned up, and morphed into D.
  • by Dannon ( 142147 ) on Tuesday December 05, 2000 @07:48PM (#580408) Journal
    no, the proposed change is: char char char char char char char char char char char char char char char char

    Any syntax mistake with this data structure will be referred to as a 'char wreck'.

    ---
  • by photozz ( 168291 ) <photozz AT gmail DOT com> on Tuesday December 05, 2000 @09:48AM (#580409) Homepage
    Dennis Ritchie, Unix guy, C author, and Plan 9 proponent.

    Great, cheap sets, dentists for monsters, cardboard headstones and rubber octopuses (octopi?) That's all I need.........

  • by skoda ( 211470 ) on Tuesday December 05, 2000 @11:28AM (#580410) Homepage
    Nearing the end of my Ph.D. graduate career, having come in straight from undergrad, I suppose I'm qualified to comment on why a degree might be useful :)

    1) Like anything else (sports, programming, driving) "thinking" is an activity that can be made more productive & efficient through training, practice, and guidance. College can help you develop strong analytical thinking skills.

    2) In certain fields it is very helpful, if not necessary. It would be very difficult to go into, say, chemistry, chem. engineering, optics, EE, etc. without the focused study required of a college degree. One could teach oneself, say, optical science, but it would be much more challenging than learning it with other students, taught by professors who already understand the field.

    3) Networking. networking. networking. (networking) It's an early chance to shmooze. Even if you're a socially awkward, introspective nerd (somewhat like me :), you will make friends who may be able to help you professionally later (and vice versa). Because college is so social, this is, perhaps, an easier way to start those skills compared to starting at work, where it may be more difficult initially to develop strong friendships with coworkers.

    4) Credibility. The job market for scientists & engineers is great right now. But the US economy *will* slow down at some point. When that happens, you (or I :( ) may lose our jobs. Anecdotal evidence suggests that if two middle-aged people are applying for a job, all other things being equal, the one with a degree will be hired over the one with no degree but four extra years of experience. (YMMV)

    5) Further training and/or changing fields. Getting a degree later in life can be an effective way to switch careers, or move to a different field within your general profession. For example, an EE might get an M.S. in optics, so he can more easily get a job in the fiber communications field.

    Those are just some ideas. There is no right or wrong choice here -- it's a matter of what's the best choice for someone given their life, desires, etc.

    Assuming you are in your 20's (post typical undergrad age), then perhaps an M.S. could be a good fit (and just skip the whole undergrad thing). There are some excellent nine-month, course-only Masters Degree programs. With these you can take a year off from work, get an M.S., and then get a new job. Or you can go part time (on your company's dime :) and get the M.S. in a few years.

    In general, people who return to school after working are more focused and have a much clearer idea of what they want to make happen after finishing the degree. If you need to get a degree, or just more coursework, use that to your advantage.

    One final thought: summer school. Departments often have two-week summer school programs which broadly cover some field. This can be a good way to: brush up on old material, schmooze, test the waters if considering changing careers.

    Hope that helped. College/post-grad degrees are certainly the norm today, and generally helpful, but not required it seems. And while more difficult sometimes, people can always return to school later in life. It's not an all-or-nothing choice at age 18.
    -----
    D. Fischer
  • by rjh ( 40933 ) <rjh@sixdemonbag.org> on Tuesday December 05, 2000 @01:37PM (#580411)
    First, I think you could successfully persuade Bjarne to agree that C++ is not C's successor; it is now its own totally distinct language from C, which supports a very large subset of the ISO C90 specification.

    C has no successor, because it doesn't need one. C is meant to be a portable assembly language, and it does that remarkably well. It will continue to do it remarkably well for years to come. The problem set C was originally meant to address is still around, and C still addresses that problem set very well.

    C++ did not "do it pretty badly". People who condemn C++ so broadly generally don't know the first thing about the language (free hint: there's a lot more to it than the "class" keyword). Is the language spec large? Yes. The Jargon File is dead accurate when it says that the language spec is just at the limit of memory. The language spec is large because C++, more so than any language other than Perl, is a Swiss Army chainsaw.

    You want generic programming? It's in there. You want an OO language? It's in there. You want a pure OO language? You can write pure OO in C++ (need a few libraries). You want a procedural language? It's in there. C++ can be put to good use in a staggering variety of problem sets, but only if the programmer understands that there's more than one way to solve things.

    C++ gets its bad reputation more from lousy programmers than from flaws in the language itself.

    My own C++ code winds up looking like Perl by the time I'm done with it. Something as simple as:


    #include <exception>
    #include <fstream>
    #include <iostream>
    #include <string>

    int main(int argc, const char *argv[])
    {
        try
        {
            /* Variable and object decls */
            std::string key(argv[1]);
            fishtank::blowfish cipher(key, fishtank::ENCRYPT);
            std::ifstream infile("myfile.txt");
            std::ofstream outfile("myfile.txt.encrypted");

            /* Three lines for all the functionality */
            cipher << infile;
            cipher.process();
            cipher >> outfile;

            return cipher.result();  /* inside the try, where cipher is in scope */
        }
        catch (std::exception &e)
        {
            std::cerr << "Exception condition caught" << std::endl;
            std::cerr << e.what() << std::endl;
            return 1;
        }
    }


    ... Presto. You get the encryption functionality, you get error handling, you get secure memory management facilities, you get versatile file and network I/O, all without needing to bat an eyelash.

    Sometime, take a look at Bjarne Stroustrup's homepages. He's got a great comparison of C versus C++ for a trivial enter-your-name program.

    C has no successor because it doesn't need one. The problem set C was meant to address is still with us, and C is still a great way to solve those problems.

    C++ is not C's successor. It was not meant to be. It addresses a much larger, much different problem set.

    Smart hackers will know when a C++ approach is called for (more accurately, which C++ approach is called for--there are many to choose from), and when a C approach is called for, and when a LISP or Standard ML approach is called for.

    Specialization is for weenies. :)
  • by sconeu ( 64226 ) on Tuesday December 05, 2000 @10:32AM (#580412) Homepage Journal
    How is C/C++ going to be patched *cough hacked cough* to support 128-bit integers? "long long long"?

    maybe "really long long"?

    There goes my karma!
  • by rpeppe ( 198035 ) on Tuesday December 05, 2000 @11:19AM (#580413)
    Is Plan 9 taking off?? I would really like to ditch this Linux crap and use something a little more current!!

    plan 9 is cool (it's the OS that i use for development), but due to the usual difficulty of developing PC drivers (in particular graphics cards) it probably won't work with your existing h/w configuration.

    however, as dennis says in the interview, most of plan 9's features are in Inferno [vitanuova.com]. in fact, Inferno is basically a slimmed-down Plan 9 with a virtual machine and a new language [vitanuova.com] (Limbo [vitanuova.com]) on which Ritchie has had a strong influence.

    in lots of ways, Inferno is considerably more sleek than plan 9 - it is a real OS, but it's also a "virtual OS" that will run hosted under plan 9 or Windows or Linux or BSD or... the same programs run identically on all Inferno platforms.

    there's even a version [vitanuova.com] of Inferno that runs as a plug-in inside Internet Explorer on Windows! if you want to get a feel for it, there's a shell prompt [vitanuova.com] to play with for command line addicts, not to mention a few other little demos [vitanuova.com] to get a feel for the performance of the thing. i'm afraid the plugin doesn't currently run under Netscape or on platforms other than Windows, but the full download [vitanuova.com] does.

    Inferno and Plan 9 are both OSs "done right", maintaining a healthy balance between performance-related pragmatism and theoretical purity. compared to the tangled morass that is Java or any of the more recent Unix variants (and i'm afraid i don't exclude Linux), they're a breath of fresh air.

    it was plan 9 which John Carmack once described as "achingly beautiful" and he's not wrong.

  • by rpeppe ( 198035 ) on Tuesday December 05, 2000 @11:39AM (#580414)
    (Really pushes it hardest??? Oh right, as opposed to Linux where so many of my devices aren't represented as files. Let's face it, a friend of mine cats vi to /dev/audio for an alarm clock. How much harder can you push the concept? )

    lots harder.
    in plan 9, any old program can present a filesystem, and it can then interpret operations on that filesystem at will. basically, you can mount one end of a pipe. filesystem requests on any file or directory below the mountpoint turn into RPC messages down the pipe. so MIME mailboxes are presented as a filesystem [bell-labs.com], the editor cum window system acme [bell-labs.com] allows program interaction through a filesystem [bell-labs.com], access to ftp is provided through a filesystem [bell-labs.com], etc, etc.

    plan 9 doesn't have an ioctl call, which means that an enormous amount of functionality is available via straight shell commands (echo, cat, et al).

    ok, so the ideas might not be completely new, but the implementation works really well in practice. and it means that a sophisticated system can be built out of small chunks of code, which in turn means that the whole system is more understandable and more reliable.

    i can create windows with echo, look back through history with cd and extract parts of cpio archives with cat - and all of this functionality can be transparently exported and imported securely across the net.

    tell me that's not pushing it further!
