Phantom OS, the 21st Century OS?

jonr writes "Phantom OS doesn't have files. Well, there are no files in the sense that a developer opens a file handle, writes to it, and closes the file handle. From the user's perspective, things still look familiar — a desktop, directories, and file icons. But a file in Phantom is simply an object whose state is persisted. You don't have to explicitly open it. As long as your program has some kind of reference to that object, all you need to do is call methods on it, and the data is there as you would expect."
This discussion has been archived. No new comments can be posted.

  • Doubt it. (Score:5, Insightful)

    by SatanicPuppy ( 611928 ) * <SatanicpuppyNO@SPAMgmail.com> on Friday February 06, 2009 @05:34PM (#26758017) Journal

    Yes, yes, very interesting.

    Is it volatile? If it is, then no thanks. If it isn't then it must be written to disk, in which case it's simply a regular file with a spiffy interface. Does that interface take up memory? How does it handle locking conflicts? How does it handle paging?

    FTFA it's more like a virtualization system that takes constant snapshots of the system states, and reverts to them if there is a power loss or a shutdown or whatever. Fine. Cool.

    But TFA skips over (in true Register style) any possible downsides to that. I'm a typical geek. I have 20 things running at any given time. Over time, with a traditional software system, there are enough page faults that when I roll back around to something I opened yesterday, the performance is extremely slow while all the states are being loaded back into active memory (and the states of something I'll need in 5 more hours are being written to disk).

    If I'm persisting my whole filesystem in that fashion, there are quickly going to be issues. If I'm not, then there is some bullshit in there somewhere. They may have a fancy file allocation table, they may have some fancy I/O tricks, but their stated abilities are frankly contradictory, because the state is not being maintained, it is simply being preserved, and the difference is only subtle linguistically.

    In short, the Phantom OS sounds more like the Phantom game console than anything I'd want to run on my computer.

  • Oh really? (Score:5, Insightful)

    by vux984 ( 928602 ) on Friday February 06, 2009 @05:45PM (#26758175)

    But a file in Phantom is simply an object whose state is persisted.

    Persisted to a file?

    You don't have to explicitly open it. As long as your program has some kind of reference to that object, all you need to do is call methods on it, and the data is there as you would expect.

    I've written countless classes that work the same way. When I want to read my app's settings file, for example, I just instantiate my settings object and start reading settings; the object handles actually opening the file (creating it if necessary), and so on. If I set new settings, the object handles persisting them.
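    For the curious, the pattern I mean looks roughly like this in Python (a toy sketch -- the class name, default path, and JSON format are all just for illustration):

```python
import json
import os

class Settings:
    """Settings object that hides its backing file behind get/set calls.
    Toy sketch: the default path and JSON format are arbitrary choices."""

    def __init__(self, path="settings.json"):
        self._path = path
        if not os.path.exists(path):
            # Create the file if necessary, just like the object should.
            with open(path, "w") as f:
                json.dump({}, f)
        with open(path) as f:
            self._data = json.load(f)

    def get(self, key, default=None):
        return self._data.get(key, default)

    def set(self, key, value):
        self._data[key] = value
        # Persist immediately; the caller never sees a file handle.
        with open(self._path, "w") as f:
            json.dump(self._data, f)
```

    Callers just do Settings().get("theme") and never touch a file handle -- which is the point: the "no files" model is an object wrapper you can write today.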

    So all they've done is taken my (and anyone else who does any OO programming) model, and moved it into the OS API?

    I'm not usually one to say, "no big deal, this has been done before" but seriously... this time it really is no big deal, it's been done before. Hell, lots of APIs for this sort of stuff already exist; some of them even come with OSes.

    The only thing that might be novel is if this Phantom OS goes whole hog, forces you to use that API, and actually denies you any direct access to files via more traditional methods. But I have my doubts... that would make it needlessly incompatible with a lot of existing software.

  • by halivar ( 535827 ) <bfelger@gmai l . c om> on Friday February 06, 2009 @05:45PM (#26758181)

    M'thinks it shares much in common with its gaming namesake, the Phantom Console.

  • by jandrese ( 485 ) <kensama@vt.edu> on Friday February 06, 2009 @05:46PM (#26758191) Homepage Journal
    Oh boy, I can't wait for every application to have to invent its own directory system to store saved state in, since it can't just save a file to the filesystem like in the old days. I bet it will be all kinds of fun to try to get your data from one application into another, especially a competitor's application. Not to mention the pure joy that making an incremental backup on this system must be.

    This seems like a throwback to old IBM mainframes and PalmOS. It's fine if your users don't mind being more or less locked into their applications and don't want to move data around very much, but it's crappy when they want to do more sophisticated things like compressing and emailing the document they're working on.

    In short: This is a compatibility nightmare. There is a good reason full fledged systems don't use it.
  • Sounds lucrative.. (Score:5, Insightful)

    by mr_stinky_britches ( 926212 ) on Friday February 06, 2009 @05:48PM (#26758221) Homepage Journal

    Sounds lucrative.. not!

    At first, when I read the OP's post, I thought he was being harsh. Then I actually read TFA, and here are some highlights:

    Q: Is Phantom a POSIX-compliant system?
    A: No. It is possible to layer POSIX subsystem above the Phantom native environment, but it is not an idea per se.

    Q: OS is based on VM - does it mean that not all the possible programming languages will be supported?
    A: Yes. Say goodbye to C and Assembler. On the other side, everything is in Java or C# now, or even in some even more dynamic language, such as Javascript or even PHP. All these languages will be supported.

    Then it also has a special ASM language called "Phantasm". Looking over the example code, the question "Why?" kept flashing in my brain.

    Ah, then we come to Why a new os? [www.dz.ru]:

    The most obvious questions: why new operating system? Isn't Linux enough? Of course, Linux is not enough. Being a clone of Unix, Linux conceptually is a dinosaur. Don't be happy, Windows guys, Windows is not really far away. Lets see, what is wrong with today's popular operating systems.
        >> OO-Friendly? No!
        >> Network friendly? No!
        >> Simple? No.
        >> Communication friendly? No!
        >> Future friendly? No!

    Okay, so according to the guy who created it, OSes should be simple, OO-friendly (my mom always says, "Hey, stinky, why isn't my computer more object oriented?" (wtf? no)), and future friendly? The guy must be just another cracked out developer..

    Thanks but no.

  • by SatanicPuppy ( 611928 ) * <SatanicpuppyNO@SPAMgmail.com> on Friday February 06, 2009 @05:53PM (#26758275) Journal

    Yea, I'm there with you. Power failures are a problem for one reason and one reason alone: RAM I/O is faster than disk I/O. If disk I/O was faster, we wouldn't even need RAM...RAM would be useless because it has a huge disadvantage: its volatility.

    Now Phantom wipes that problem out by "...storing its complete state on disk". Either this is bullshit, or this OS will have serious performance issues.

    Then it starts talking about C vs Java. WTF is that about? Regardless of how cool the OS' underpinnings are, you could write C for it with an OS-specific compiler. That's no different from the output of Java's intermediate compiler.

    It's not like Java is outputting some sort of magical instructions that are different from the output of compiled C. The difference is that C doesn't abstract the hardware layer in the user code like Java does, and that Java is compiled to be interpreted on the fly by an intermediate virtual runtime environment. Get right down to the hardware and there isn't a lot of difference.

    I'd want to see some real specifics that they could deliver anything resembling what they're promising, and frankly, I think that'll never happen.

  • by SerpentMage ( 13390 ) on Friday February 06, 2009 @05:53PM (#26758293)

    >Memory in all computers is mapped to address space. I get the idea that these guys are programmers who don't really understand how the hardware works.

    No, I think they know what they are talking about. What they are saying is that if you look at the VM concept (e.g. .NET with AppDomains), you can run everything in a single address space.

    Of course underneath there is an address space, but remember that each process has its own address space that the CPU has to maintain. There is quite a bit of legwork that the CPU does that he thinks is probably not necessary.

    >Nobody needs files? How, exactly, can I retrieve a document then? This FA is damned short on details.

    Have you read About Face by Alan Cooper? He explains there that the concept of a file is horrible from a user's perspective. Files were added as a concept because they are a hack that makes things easier for the programmer. A user in fact does not want to have to say, "oh, I have to save this?"

    Thus the idea is that you have an entity that you can manipulate. And whatever changes you make are immediately persisted. This is what users expect.
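    A toy sketch of such an entity in Python (class and filename invented; pickle stands in for the OS's transparent persistence): every attribute assignment goes straight to disk, so there is no save step at all.

```python
import pickle

class Persistent:
    """Toy 'no save button' object: every attribute assignment is
    written straight to disk, so the latest state survives a restart."""

    def __init__(self, path):
        # Bypass our own __setattr__ while wiring up internals.
        object.__setattr__(self, "_path", path)
        try:
            with open(path, "rb") as f:
                object.__setattr__(self, "_state", pickle.load(f))
        except FileNotFoundError:
            object.__setattr__(self, "_state", {})

    def __getattr__(self, name):
        try:
            return self._state[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        self._state[name] = value
        # Persist on every change -- no explicit save step.
        with open(self._path, "wb") as f:
            pickle.dump(self._state, f)
```

    note = Persistent("note.bin"); note.text = "draft" -- kill the process, reopen the same path, and note.text is still there.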

    >I really don't think I'm interested in this OS.

    I am extremely interested in this OS because he is simplifying things. Remember, one thing we learned with JITing is that "slower" apps can actually be very fast. C++ is not the fastest game in town. And that should make us all think.

  • by mangu ( 126918 ) on Friday February 06, 2009 @05:55PM (#26758325)

    From what I read, these "objects" are nothing but a fancy new name for files. For instance, if you are writing a program in Python you don't save a file, you pickle an object. Oh, wait, that's exactly what Python is able to do right now, in any OS that implements Python! Doh....
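    To be concrete, here's the whole trick in stock Python (the class and path are made up, but the mechanism is the standard library's):

```python
import os
import pickle
import tempfile

class Document:
    def __init__(self, text):
        self.text = text

path = os.path.join(tempfile.mkdtemp(), "doc.pkl")

# "Persisting the object" is one call...
with open(path, "wb") as f:
    pickle.dump(Document("hello, phantom"), f)

# ...and getting it back, state intact, is one more.
with open(path, "rb") as f:
    restored = pickle.load(f)

print(restored.text)  # hello, phantom
```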

    FTFA:

    does it mean that not all the possible programming languages will be supported?
    A: Yes. Say goodbye to C and Assembler. On the other side, everything is in Java or C# now, or even in some even more dynamic language, such as Javascript or even PHP. All these languages will be supported.

    Think of that: you cannot program in C, but you can write programs in PHP or Javascript. How cute! I suppose it supports Logo, right?

  • by hal9000(jr) ( 316943 ) on Friday February 06, 2009 @06:00PM (#26758393)
    I am not affiliated with these guys, but from the faq and the site, here is what I get.

    Memory in all computers is mapped to address space.

    Right, but you, the programmer, don't worry about memory allocation or de-allocation in the same way. You don't do pointer math or any of that shit. The OS does it for you (which is what an OS should do). Think about how differently Java manages memory than C does. Hopefully, the OS manages memory well.

    Nobody needs files? How, exactly, can I retrieve a document then? This FA is damned short on details.

    Well, yes, there are "files" managed by the OS, but not directly reachable by a program. You treat a file like an object and just use it. No open, no close, no worrying about the proggie crashing and losing the unwritten data. The OS handles it.

    Same with processes. It seems cool. Not sure it has legs, but seems cool indeed.
  • by Rary ( 566291 ) on Friday February 06, 2009 @06:01PM (#26758411)

    I think he's talking about programmer-land, not user-land here.

    That's the problem. Everything about this appears to be designed for developers, not users. There's absolutely nothing that indicates anything that would make a user want to use this OS.

    So, basically, if you're a developer, and want an OS that makes it cool, easy, and fun to develop applications that no one will use, then this is for you.

  • Re:Doubt it. (Score:5, Insightful)

    by SatanicPuppy ( 611928 ) * <SatanicpuppyNO@SPAMgmail.com> on Friday February 06, 2009 @06:04PM (#26758441) Journal

    I still don't buy it. They're throwing an abstraction layer on top of a regular system and calling it something different, but all the underlying structures are the same.

    Except they're not because you're basically forbidden direct access to any system resources! Any gains that you would traditionally expect to be able to make through use of C or assembly are right out the window, and that is acknowledged right up front.

    Hardware abstraction is going to have a cost. All virtualization has a cost, and I'm not sure that this is the way to handle the problem. It seems more like a pipe dream than a practical application.

  • by dazedNconfuzed ( 154242 ) on Friday February 06, 2009 @06:14PM (#26758551)

    I'm inclined to agree.

    Linux is, indeed, based on what is now a very old paradigm - approaching half a century. Concepts have advanced since, and much of what we do is just to retain that backwards compatibility.

    Windows is, well, Windows. This being /., no more need be said of that.

    Grokking object-oriented programming, and users' mindsets as well, I agree that it would be worth at least examining the concept of a "file-less OS", one that simply keeps a live OO system persistent. I'd like to write software knowing that when an object is instantiated, it persists until explicitly deleted - without having to awkwardly save state to something as non-orthogonal as a file. I want to be able to manipulate & transport objects as such, not as files. Obviously the prime issues are performance (storage vs. RAM consistency) and recovering from shutdown; resolving these is simply a geeky engineering challenge, not an impossibility. The concept of "files" is archaic. Storing/transferring what we call a "file" would be better served by persistence & portability of objects.

    A prime example is the notion of "restarting" a computer. Why, these days, should a computer's startup time be so long? It should simply resume, but more robustly than "sleep" or "hibernate" - restoring the state of objects as they were, not restarting from practically scratch every time.

    Could be that the OS ultimately does store data as "files", but that is an implementation detail, not a core of the paradigm. Users do not intuitively think of "files", and programmers should not force them to due to ancient rock-and-chisel backwards compatibility.

    "Those who say it cannot be done should not interrupt the person doing it."
    - Chinese proverb

  • Rebooting (Score:5, Insightful)

    by dmomo ( 256005 ) on Friday February 06, 2009 @06:23PM (#26758631)

    What does this model say for Memory Leaks? If the state is persisted... rebooting won't clear the memory. I imagine there must be a "reset state" mechanism. Perhaps this can be done without actually rebooting. I dunno.

  • by orclevegam ( 940336 ) on Friday February 06, 2009 @06:26PM (#26758667) Journal
    The biggest problem I see with this is the whole persistent process thing. There have been similar things tried in the past -- for instance, PalmOS had a behavior very similar to this -- but it tends to be more trouble than it's worth. There's also a very good reason why we use files in some instances, such as for storing documents that parallel physical ones (that is, most things that come out of Office-type products). A file represents a very convenient discrete packet of information, separate from the application that produced it, that is easily transferable, archivable, and processable, without the overhead of bundling a particular instance of an application along with it. Other problems this introduces include how to handle a crashed program, or one that has managed to get itself into an inoperable state. How difficult is it to "roll back" a process to an initial state, particularly without doing the same to every other process in the system? Does doing so wipe out your configuration options? What if those options are the reason the process isn't working?

    For an embedded device in certain specialized environments this sort of thing might work very well, but it's certainly not a good idea as a primary OS in your typical desktop or work environment.
  • by The End Of Days ( 1243248 ) on Friday February 06, 2009 @06:26PM (#26758669)

    So there's something wrong with a dude scratching an itch and having a little fun with it? There was a time when Linux was a niche system that had no real purpose aside from the fun of making it. That seems to have worked out well.

    In any case, there are interesting concepts in here that deserve to be explored, and the best way to explore programming concepts is to program them.

  • Re:Doubt it. (Score:1, Insightful)

    by Anonymous Coward on Friday February 06, 2009 @06:29PM (#26758713)
    While this is both amusing and true, the operating system is far worse placed to manage these locks than the application.
  • by ReeceTarbert ( 893612 ) on Friday February 06, 2009 @06:34PM (#26758779)

    I don't need a filename -- just give me the document based upon some quantifiable characteristic about the document, such as keywords, format, or even the visual layout.

    Maybe a long shot and not quite what you have in mind, but I think that Spotlight [apple.com] is close enough -- and it's fast too. So fast, in fact, that it's also my application launcher of choice.

    Reece

  • by maharg ( 182366 ) on Friday February 06, 2009 @06:39PM (#26758855) Homepage Journal

    Everything about this appears to be designed for developers, not users. There's absolutely nothing that indicates anything that would make a user want to use this OS.

    I expect Babbage came up against the same attitude. Good job it didn't put him off, eh! Not to compare this guy with Babbage, but really, does lack of user appeal really mean that it's not worthwhile? I think this is very interesting indeed. If you consider something like a database application, which needs to persist state changes to disk pronto, then why not let the OS handle this for you? It needs to be done either way. I just wonder how a generalised object persistence layer can handle specialised cases such as text storage (where you might want compression to save space at the expense of some speed) and video storage (where the object data is already compressed and you don't want to re-compress it). Actually, thinking about video is interesting - what would the equivalent of seeking through a huge video file be if it was stored as an object? Would the whole video object be loaded into RAM? Some *very* interesting programming challenges here, which for some people makes it all worthwhile; even if it is ultimately a dead-end commercially, it *can* advance the field.

  • Re:Doubt it. (Score:5, Insightful)

    by drik00 ( 526104 ) on Friday February 06, 2009 @06:39PM (#26758857) Homepage

    IANAP, but isn't the notion of using "files" and "folders" and a "desktop" analogous to how a normal person would work WITHOUT a computer, hence the concepts being transferred to a tool used to speed up and improve the efficiency of a person's work? How can these be called antiquated concepts? We use compartmentalized words because of the balance of efficiency with modularity. Our brains inherently compartmentalize, so why should we try to move away from that in a new OS (that I'm betting will be on the vaporware list in the near future)?

    Capt Negativity here,
    J

  • by Anonymous Coward on Friday February 06, 2009 @06:42PM (#26758879)

    Really?

    It couldn't possibly mean, say, that English isn't their primary language?

  • by molarmass192 ( 608071 ) on Friday February 06, 2009 @06:55PM (#26759033) Homepage Journal
    Yeah ... I was thinking the same thing. Goodbye to C and Assembler? Ahhh, they mean goodbye to any low level hardware I/O or custom drivers ... nice. We already have a Phantom OS, it's called HTML / JavaScript ... no files to persist, no access to hardware, no low level performance tuning, networking is built-in, everything is interpreted ... how exactly is Phantom OS any different? OSes succeed when they offer GREATER flexibility, not when they insulate developers from low level APIs. Look at what can be done on an iPhone versus what is possible on a Mac. I think I'll stick with my "dinosaur" UNIX variant, with all the terrifying freedom and non-restrictions it provides, thank you very much.
  • Re:Doubt it. (Score:5, Insightful)

    by black6host ( 469985 ) on Friday February 06, 2009 @07:06PM (#26759171)

    I mean no offense, but I can't help but read your comment and see myself, many years ago, feeling much the same when moving from DOS to Windows. I lost a level of control, at the hardware level, that made me question why I would want to give up peeking and poking video memory, etc. Back then, direct control meant a world of difference in performance. Of course, I have many more options now than I did then, and if I still want to get to the hardware bad enough, I still can. But I don't feel the need to nor do I feel the abstraction has held me back. We can do much more now, than we could then....

    Not to say that the OS in question is or isn't the way to handle the problem, but I've become a little less resistant to change, a bit more willing to be open-minded, and much more appreciative of pipe dreams :)

  • by Bryansix ( 761547 ) on Friday February 06, 2009 @07:13PM (#26759267) Homepage
    Think outside the box for a moment. Nothing you brought up could not be fixed with a simple mechanism. You could still hit "save as" when you wanted a snapshot of a document but the point is that the document will persist even though you lost power in the middle of typing it.
  • by mr_stinky_britches ( 926212 ) on Friday February 06, 2009 @07:20PM (#26759353) Homepage Journal

    Mmm..tastes like reinvention. Who does that? Oh yeah.. ;) cracked out devs

  • by Ornedan ( 1093745 ) on Friday February 06, 2009 @07:21PM (#26759363)

    So you've got this really spiffy object-oriented OS automatically persisting your objects. What's the serialized representation of those objects? Any answer other than just having the system puke the memory representation of the object onto permanent storage media means that the programmer has to have a say in determining that representation. And this system was all about not having the programmer worry about those messy details. Except having the serialized form be a memory blob means the only thing you can ever deserialize it to is the exact same version of that particular object type.
    This is why we have files. Letting the programmer do the de/serialization just means you're calling your files something else and adding some mandatory cruft on top. Also, without files (or equivalent), you can't have standard file formats. This kind of system would then be vendor lock-in heaven.
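    Python's pickle makes the lock-in point concrete: the serialized blob is welded to one exact class definition (the class name below is made up), so a reader without that class can't get the data back.

```python
import pickle

class Doc_v1:
    def __init__(self, words):
        self.words = words

# The "memory blob" form of the object:
blob = pickle.dumps(Doc_v1(["phantom"]))

# A reader that no longer has the exact class can't reconstruct it:
del Doc_v1
try:
    pickle.loads(blob)
    recovered = True
except AttributeError:
    recovered = False

print(recovered)  # False -- the blob only deserializes against Doc_v1
```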

  • by __aasqbs9791 ( 1402899 ) on Friday February 06, 2009 @07:27PM (#26759427)

    I think their whole point is to make this easier for users, and the vast majority of users I have ever known could not be taught by anyone how to properly use a version control system. Many of them can barely understand how email works (still!). I have the same concerns as the poster before you about a system like this. Having to "save" a file is a feature, not a bug, once you understand how to use it.

  • by moderatorrater ( 1095745 ) on Friday February 06, 2009 @07:45PM (#26759619)

    Think outside the box for a moment

    Agreed. We seriously need his synergy.

    However, he's got the point that it introduces problems that might have a workaround, but one that's less efficient/effective than the original problem. Why not just add a library that can be used with the dynamic programs that allows them to do this easily while still retaining the ability to do things the old fashioned way?

    In addition, files are absolutely necessary. As someone pointed out, how do you take an object from one program to another? How do you find it to send it to your mom? These are all problems for which the file paradigm works very well. The solution will either be core to the OS that very closely resembles our current situation or specific to each application, requiring you to relearn everything every time. Doesn't sound all that efficient or paradigm changing to me.

    That said, I think the idea deserves exploration. I just don't think it's going to revolutionize computing, and that eventually the good ideas will be incorporated into our current offerings and we'll all move on, grateful that they explored the idea but knowing that the original scope of his plans wasn't realized.

  • by The Mighty Buzzard ( 878441 ) on Friday February 06, 2009 @08:12PM (#26759845)

    Having a computer that never forgets what you've done is, really, what people expect a computer to be. It's just that we've been amateur sysadmins for so long we think it's normal.

    Or it could be that we've been actual sysadmins long enough that we know the value of always having a working state to fall back on. Preferably one that doesn't erase all the work done in the past few years. Saying something as foolish as that can only mean you haven't had to repair a thoroughly hosed system in far too long.

  • by Jeremi ( 14640 ) on Friday February 06, 2009 @08:45PM (#26760149) Homepage

    So, how do people "expect things to happen" when it comes to computers?

    For years my grandmother had a post-it note pasted to the bottom of her computer monitor. On it was the following message, in large letters: SAVE!

    The reason was that she would often type in a document, then turn off the computer. When she turned it back on later, she would be surprised to find that her document was gone. The concept of persistent vs non-persistent state did not come easily to her, and one has to ask: why should she have to learn about RAM and hard drives and filesystems just to type up a letter? Why can't the system work the way she expected it to, which is to say the way most other machines in the modern world work? When I stop using my notepad, my bicycle or my television, I don't have to remember to press SAVE anywhere or risk losing my work. It's an awkward and unintuitive extra step, and in an ideal world it wouldn't be necessary.

  • by AuMatar ( 183847 ) on Friday February 06, 2009 @08:47PM (#26760179)

    A database is the *last* thing that would want to let the OS handle something like this- how would it do transactions then? How would it efficiently lay out the data for reading (large databases can become disk bound easily).

    It's a rather pointless idea all in all. You want a simpler API than files? Write a function Save() that writes it, and call Save from then on. Want it to do so automatically? Do so in response to a timer. You can even write a function that sets up the timer, or have your constructor register the object with an auto-writer.
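    That whole suggestion fits in a few lines of Python (a sketch -- the names, the JSON format, and the interval are mine):

```python
import json
import threading

class AutoSaved:
    """Object with an explicit save() plus a timer that calls it
    periodically -- the 'auto-writer' idea, using threading.Timer."""

    def __init__(self, path, interval=5.0):
        self.path = path
        self.data = {}
        self._interval = interval
        self._schedule()  # constructor registers the auto-writer

    def save(self):
        with open(self.path, "w") as f:
            json.dump(self.data, f)

    def _schedule(self):
        # Daemon timer: fires after the interval, then re-arms itself.
        t = threading.Timer(self._interval, self._tick)
        t.daemon = True
        t.start()

    def _tick(self):
        self.save()
        self._schedule()
```

    No OS support needed: the application keeps full control over when and how its state hits the disk.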

    There's a reason why files exist. They work, they work well, they're simple, and they're extremely flexible. A system like this offers very small improvements in simplicity for basic work, at the cost of a lot of problems doing anything more than basic (opening data with another app, sending data to external disks, organizing data, writing efficient file formats, implementing standards, sharing data among multiple apps). It's not that this hasn't been considered before, it's that we already have a good solution.

  • Re:Doubt it. (Score:5, Insightful)

    by c0p0n ( 770852 ) <copongNO@SPAMgmail.com> on Friday February 06, 2009 @08:52PM (#26760227)
    Small correction to what you said: if the file being moved is on the same logical and physical volume, it's not copied then erased; just a reference is changed. "Folders" have always been named "directories"; calling them otherwise is fairly recent, since Windows 3 times, iirc. I do, however, fail to see your point in how the folder/file metaphor is antiquated, or perhaps inadequate; users seem well happy with it. It's a metaphor, technically, of course. But intellectually, it's a metaphor that has worked really well for over 30 years.
  • by shutdown -p now ( 807394 ) on Friday February 06, 2009 @09:01PM (#26760309) Journal

    ... it'll be a closed ecosystem: an OS that cannot run a lot of the presently popular, mainstream programming languages. C/C++ - no, obviously (they say as much) - this means that the vast majority of existing apps go right out the window. Great already. Java? That should run, but how many good Java desktop apps do you know? Now forget about those that use JNI in any way - now what? About 5-10. The rest will have to be written from scratch. C#? Nope (no pointers), though a limited subset might be possible. Perl, Python, Ruby? Sure, all will work, but have fun rewriting the interpreters themselves in Java! And despite the claims that "JITs are fast enough", for stuff like that they aren't - you really need dirty tricks such as computed goto and jump tables to code fast bytecode interpreters.

    So, in the end, this is going to suffer the same fate as all OSes that came bundled with their own language - death and extinction. Remember Lisp machines? Or, say, Oberon-3? Yes, that's how it usually ends...

  • by dangitman ( 862676 ) on Friday February 06, 2009 @09:34PM (#26760553)

    The concept of persistent vs non-persistent state did not come easily to her, and one has to ask, why should she have to learn about RAM and hard drives and filesystems just to type up a letter?

    Because that's how it works. Any alternatives also have major downsides. Your grandmother is an isolated case. Most users now understand the concepts involved. Your grandmother could simply turn on auto-save.

    Why can't the system work the way she expected it to, which is to say the way most other machines in the modern world work?

    Because more people than your grandmother use computers, and shouldn't be limited by the least competent users. I could expect my computer to work like a magic elf that makes me snacks, but it wouldn't be realistic.

    The way most other machines work? What about my 35mm camera? When I take a picture, it needs to be developed and printed, with great care taken not to expose the film to light. I can't just open the camera and see the pictures. Or perhaps a more basic example - when your grandmother types a letter does she just leave it lying around outdoors, or does she store it in a drawer or some other more protected location? When she sends the letter, does she just put it in the mailbox, or does she put it in an envelope first?

    Trying to slavishly emulate other physical devices is generally not a good idea in computing. The whole benefit of computers is that they aren't bound to the limitations of mechanical devices.

  • by vadim_t ( 324782 ) on Friday February 06, 2009 @09:35PM (#26760569) Homepage

    Simple OO orientated access to a bit of everything, that's the true benefit of Phantom.
    You are still missing the points completely, or maybe you are just trolling?
    The increased intuitiviness is mostly on the development arena how i see Phantom, it will translate to some portions of the GUI definitely, and allow many neat things done which were harder on traditional OS.

    As a programmer, this tells me absolutely nothing. Concrete code examples, and pointers to documentation, please.

    You are arguing a technical solution is bad on the basis of the irrelevant GUI. How hard it is to see that method of how things are done and accessed beneath the curtain has no correlation on the final GUI necessarily?

    We're talking here about something that necessarily must result in the user getting a different sort of behavior. If the only difference is for the programmer, and the user doesn't see any improvements, then there's no point for the user to adopt a new OS.

    As for your O(n!) algo inefficiency: You take the algo, splice it on nice swallowable chunks, and all of sudden you have a nice small problem set algo ;) /me works on huge datasets daily

    Well, take the travelling salesman problem. You have a table of just 100 cities with their locations on the map. The problem is calculating the shortest (no approximations allowed) route for visiting all of them. Please tell me how you're going to split that into nice swallowable chunks (hint: 100! is a rather large number). Also, how much more hardware will you need if you add an extra city and want to get it done in the same amount of time?
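    The factorial blow-up is easy to demonstrate with a toy brute-force solver (a sketch with made-up city coordinates; no chunking of the search space changes its growth rate, only switching to an approximation algorithm would):

    ```python
    from itertools import permutations
    import math

    # Brute-force TSP: every permutation of the visit order must be checked,
    # so the work grows factorially with the number of cities. Coordinates
    # below are invented for illustration.
    cities = {"A": (0, 0), "B": (3, 4), "C": (6, 0), "D": (3, -4)}

    def tour_length(order):
        # Close the loop back to the starting city.
        pts = [cities[c] for c in order] + [cities[order[0]]]
        return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))

    best = min(permutations(list(cities)), key=tour_length)
    print(best, round(tour_length(best), 2))   # shortest tour of the 4 cities

    # Permutation counts explode factorially:
    print(math.factorial(10))    # 3628800 orderings for just 10 cities
    print(math.factorial(100))   # ~9.3e157 -- utterly infeasible to enumerate
    ```

    Adding one more city multiplies the number of orderings by the new city count, which is why "more hardware" can never keep pace with exact brute force.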

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Saturday February 07, 2009 @12:11AM (#26761571) Journal

    Keep in mind, the whole OS is designed this way, including all programs.

    Let me give you an example of what happens when it's implemented as a library: GNOME and KDE sessions. At least in KDE, it's possible to save a session, or even to have it autosave when you logout. It will remember all open programs, and the geometry of their windows. It will even query the programs, asking them to save their state.

    Now, this would be awesome, wouldn't it? It'd be a lot more efficient than hibernate/resume, if it worked -- for example, an ODF (plus some simple geometry and state) is much smaller than the entire virtual image of OpenOffice. If the programs were written well, to load only what they need on demand (and thus start much faster), the whole system would shut down and wake faster.

    You could even start to have multiple sessions, maybe mapped to virtual desktops, maybe not, so that when you boot, you could choose whether to have it launch your web browser, text editor, and terminals, or have it launch your mail client, IM client, and softphone, or maybe have one that just launches whatever movie you were playing (which would resume from the exact moment it was at when you shut down)...

    Problem is, too many programs don't support this. Some, like Firefox, seem to supply their own session management. Some don't even try, and thus, when the DE tries to resume them, it ends up launching a fresh instance. Some can't be persisted, due to their fundamental architecture -- how would you propose to save the state of a running terminal?

    So, doing it as a library doesn't work, unless everything's using that library. If everything's using that library, that's pretty much what you get.
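    A minimal sketch of such a "session library" (a hypothetical API, not KDE's actual session protocol) shows the failure mode: apps that register a save/restore pair get their state back, while anything that doesn't cooperate simply has no saved state to restore.

    ```python
    import json
    import pathlib

    SESSION_FILE = pathlib.Path("session.json")  # made-up location

    class SessionManager:
        def __init__(self):
            self.apps = {}  # name -> (save_fn, restore_fn)

        def register(self, name, save_fn, restore_fn):
            self.apps[name] = (save_fn, restore_fn)

        def save_session(self):
            # Only registered (cooperating) apps contribute any state.
            state = {name: save() for name, (save, _) in self.apps.items()}
            SESSION_FILE.write_text(json.dumps(state))

        def restore_session(self):
            state = json.loads(SESSION_FILE.read_text())
            for name, (_, restore) in self.apps.items():
                # A non-cooperating app would get None here -- i.e. a
                # fresh launch with no state, as described above.
                restore(state.get(name))

    # A cooperating "editor" with some in-memory state:
    editor_state = {"doc": "draft.odt", "cursor": 0}
    mgr = SessionManager()
    mgr.register("editor",
                 save_fn=lambda: dict(editor_state),
                 restore_fn=lambda s: editor_state.update(s or {}))

    mgr.save_session()
    editor_state["cursor"] = 99       # state drifts after the snapshot...
    mgr.restore_session()             # ...and is rolled back from disk
    print(editor_state["cursor"])     # 0
    ```

    An OS-enforced scheme like Phantom's sidesteps the opt-in problem entirely, because no program gets to skip the persistence layer.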

    And sometimes, you do have to enforce sometimes performance-decreasing features in order to provide a better user experience. Imagine if filesystem access was just a library, and programs had access to the entire disk. It might be interesting to build an OS that way, but even if you did, I imagine you'd want to restrict most user-level programs to dealing with the POSIX API, and being bound by Unix permissions and POSIX ACLs.

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Saturday February 07, 2009 @12:15AM (#26761591) Journal

    Newton OS had the same thing. It caused me to lose data twice when I accidentally deleted a large part of a Newton Works document and then did something else. Undo only undid the something else; the deletion became permanent as soon as it passed out of the one-step undo buffer.

    Two things:

    First, your problem seems to be more with the fact that undo history was only one level deep than anything else.

    And second, you do need revision control, and it needs to be easy enough for the masses, but more powerful than just "undo".

  • by Anonymous Coward on Saturday February 07, 2009 @03:47AM (#26762327)

    Actually, it's the computer's responsibility to ensure the document is stored in some form unless the user decides to destroy it. Even then there should be safeguards. This is why we have file systems, backups, etc.

    Plenty of systems are designed to ensure that data is not lost. Loss of data through forgetting to save isn't the user's fault - it's the developer's.

    Personally I think it is developer *incompetence* that says people should *remember* to save (e.g. every 5 minutes). Computers shouldn't forget. This is why we spend time and money to build and buy them. Plenty of products exist that have auto-save.

    Not providing undo is another indication of idiocy.

  • Re:Doubt it. (Score:3, Insightful)

    by julesh ( 229690 ) on Saturday February 07, 2009 @01:34PM (#26764907)

    This could be implemented as a library. It seems to be basically a class which implements "auto-serializing" (maybe activated by a system callback) and make every class which needs to save data to extend it.

    Not really. The point is that it applies to all data in the system. And also it isn't serializing in the traditional sense (transforming objects into a format that can be written to a stream) but is rather directly storing the in-memory representations of the objects in a persistent storage system.

    In order to do this convincingly in a library, you'd need feedback from the operating system when data was modified. I don't believe most standard OSs do this.
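    The distinction is easy to illustrate: instead of serializing an object to a stream, you can back its bytes directly with a memory-mapped file, so in-memory writes *are* the persistent representation. This is a rough sketch of the mechanism an orthogonally persistent system builds on (the filename is invented):

    ```python
    import mmap
    import os

    path = "state.bin"
    if not os.path.exists(path):
        with open(path, "wb") as f:
            f.write(b"\x00" * 4096)   # one page of persistent "object" memory

    with open(path, "r+b") as f:
        buf = mmap.mmap(f.fileno(), 4096)
        buf[0:5] = b"hello"           # mutate memory; no write()/serialize step
        buf.flush()                   # kernel persists the dirty page
        buf.close()

    # After a "restart", the in-memory state is simply there again:
    with open(path, "rb") as f:
        print(f.read(5))              # b'hello'
    ```

    Doing this transparently for arbitrary language-level objects is exactly where library approaches get stuck: without OS cooperation (e.g. page-fault tracking) the library cannot know which objects were dirtied.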

  • by tylernt ( 581794 ) on Saturday February 07, 2009 @02:56PM (#26765645)

    Any alternatives also have major downsides. Your grandmother is an isolated case. Most users now understand the concepts involved. Your grandmother could simply turn on auto-save.

    Or, she could just use something like Phantom where a file is simply an object whose state is persisted.

    I'm not seeing any downsides to the "alternatives", either. In fact Phantom seems like an improvement in every possible sense except backwards compatibility. The way things are now is just a kludgy evolution of making humans work like a computer, instead of making the computer work the way a human expects.

  • by Anonymous Coward on Saturday February 07, 2009 @02:59PM (#26765687)

    Because that's how it works. Any alternatives also have major downsides. Your grandmother is an isolated case. Most users now understand the concepts involved. Your grandmother could simply turn on auto-save.

    Very practical, but also very shortsighted. "Your grandmother" is not an isolated case; most other real-world devices don't completely undo what you've done so far just because you turned them off (typewriter, camera (digital or otherwise), chainsaw). Even if "most users now understand the concepts involved", wouldn't user education be simpler with persistent state? Isn't hibernate/sleep much simpler and easier for the user than powering on/off (when it works)? Auto-save is a poor kludge working around a completely broken system.

    If the 'save' concept isn't completely broken, why is it that every single program has to prompt you to save files before you exit the program? If you found a program that didn't, wouldn't you consider that program buggy?

    Because more people than your grandmother use computers, and shouldn't be limited by the least competent users. I could expect my computer to work like a magic elf that makes me snacks, but it wouldn't be realistic.

    You could expect that, and if you had a snack-making peripheral it wouldn't be unrealistic at all. I don't see the point of this comment, nor of burdening my brain with unnecessary data/details/steps/requirements.

    The way most other machines work? What about my 35mm camera? When I take a picture, it needs to be developed and printed, with great care taken not to expose the film to light. I can't just open the camera and see the pictures. Or perhaps a more basic example - when your grandmother types a letter does she just leave it lying around outdoors, or does she store it in a drawer or some other more protected location? When she sends the letter, does she just put it in the mailbox, or does she put it in an envelope first?

    I just don't see how you think these are at all parallel.

    If saving the file were a necessary part of the utility of the file, then maybe. But it isn't. You may want to print the document, or send it to someone, or keep it around for later, but none of those things explicitly say: "serialize this into an on disk format, making sure to ask for a label for it and location to store it, and absent all of these steps, I wish to lose this work forever."

    To address your examples:

    When you take a picture with a film camera, yes there may be additional processing and steps required. I don't see that as a feature, and neither does anyone else. That's why the digital cameras that have cut a bunch of steps out of that process are infinitely more popular.

    If your grandmother types a letter, she expects it to still be in the typewriter when she gets back to it, not to have vanished into the ether.

    The mailing process is another example of an unnecessarily complex process. The user doesn't strictly need to be aware of the envelope and addressing. If we had machines to do those things for us very cheaply, they would sell like hotcakes. (Note that large businesses doing enough bulk mail for automatic enveloping and addressing to be economical already do exactly that.)
