Firefox To Get Multi-Process Browsing 383

An anonymous reader writes with news that multi-process browsing will be coming to Firefox. The project is called Electrolysis, and the developers "have already assembled a prototype that renders a page in a separate process from the interface shell in which it is displayed." Mozilla's Benjamin Smedberg says they're currently "[sprinting] as fast as possible to get basic code working, running simple testcase plugins and content tabs in a separate process," after which they'll fix everything that breaks in the process. Further details of their plan are available on the Mozilla wiki, and a summary is up at TechFragments.
This discussion has been archived. No new comments can be posted.

  • Nice (Score:3, Insightful)

    by suso ( 153703 ) * on Wednesday July 08, 2009 @12:21PM (#28624885) Journal

    This is cool. Competition is good.

    • by Anonymous Coward on Wednesday July 08, 2009 @12:43PM (#28625279)

      The clowns working on Firefox had years, YEARS, to get their act together and rewrite the STINKING PILE OF SHIT that is the Firefox codebase. But they chose to flame anyone who dared talk about the massive architectural problem in the absurdly outdated Firefox process model.

      Memory protection for each tab? Not possible! Stop asking for something that can't be done! They cried!

      Threading for Javascript? Not possible! Stop asking for something that can't be done! The Firefox devs cried!

      That is why those AC posts from Firefox devs were so vicious and venomous for everyone pointing out the massive memory/resource leaks in Firefox that have only been somewhat lessened in the latest versions. The solution for those problems involves a complete rewrite of the process and memory model for Firefox.

      Now Google came out and humiliated the Firefox devs with Chrome and its amazing real-world threaded Javascript and memory and process protection/isolation.

      Nothing but pity and absolutely no sympathy for anyone faced with retrofitting Firefox into a semblance of a modern browser architecture.

      Now with full extension support in Chrome this is like hearing about Microsoft scrambling to fix their massive security problems in IE long after you dumped it.

      • by debrain ( 29228 ) on Wednesday July 08, 2009 @01:03PM (#28625629) Journal

        Threading for Javascript? Not possible! Stop asking for something that can't be done! The Firefox devs cried!

        Opposition to threading by Firefox devs came from, among others, Brendan Eich, the inventor of Javascript. You can read his well supported arguments on Bugzilla [mozilla.org].

        That doesn't excuse Firefox devs from not supporting a parallel architecture earlier, from which users would significantly benefit. But the conversation on that link is an oculus into the reasoning behind not having a parallel architecture earlier.

        • Re: (Score:3, Insightful)

          by BitZtream ( 692029 )

          Uhm, the Firefox javascript engine supports multithreading just fine, and Gecko 1.9 supports multithreaded javascript out of the box; the previous branch required some extra effort to do so, but it most certainly would allow multiple javascript threads.

          Not really sure what the hell you're talking about, but I have a couple of Firefox extensions that depend on the fact that they can use multiple threads.

          This is all documented on mozdev, both the new methods for Gecko 1.9 and the workarounds to do it in the 1.8.x branch.

        • Re: (Score:3, Insightful)

          by b4dc0d3r ( 1268512 )

          Opposition to threading by Firefox devs came from, among others, Brendan Eich, the inventor of Javascript. You can read his well supported arguments on Bugzilla.

          BE's opposition was based on solving the problem, if there is one, rather than re-architecting the solution on principle.

          ...If you insist on defining the problem to dictate the solution, then of course "multitasking" is the OS's job.

          But responsive browser UI with windows and tabs galore is not "multitasking". I dissent. Many browsers are responsive (m

      • by suso ( 153703 ) * on Wednesday July 08, 2009 @01:08PM (#28625697) Journal

        Hey chill, give em a break. There is something to be said for filtering out every little feature request that gets sent your way. Good filters are how great software stays great (like Linux) and make sure that the project doesn't veer in the wrong direction. I don't know much about the Firefox developers, but I'd say they have good reason to be filters for a lot of things.

        As a sysadmin, I deal all the time with users asking for the latest features, but I have to weigh which ones can be done now, which ones have to wait and which ones shouldn't be done because they are stupid. I try to keep an open mind, but sometimes you get stuck in a rut because of old information or "the way things used to work", so you just have to be patient, try to show the new way and hope that it sinks in.

      • by Blakey Rat ( 99501 ) on Wednesday July 08, 2009 @01:12PM (#28625769)

        They're not doing it because Chrome has it, they're doing it because IE8 has it. Microsoft putting this in Internet Explorer before Firefox is basically equivalent to kicking Firefox developers in the nuts.

      • Re: (Score:3, Funny)

        by RichM ( 754883 )

        The clowns working on Firefox had years, YEARS, to get their act together and rewrite the STINKING PILE OF SHIT that is the Firefox codebase. But they chose to flame anyone who dared talk about the massive architectural problem in the absurdly outdated Firefox process model.

        Dunno about anyone else, but it gives me a warm fuzzy feeling to know that every time I start up Firefox there's probably a couple of lines in the code from Netscape 4.x.
        Simpler days back then, none of this Facebooktweetfrommyiphonegoog

  • by jeffb (2.718) ( 1189693 ) on Wednesday July 08, 2009 @12:23PM (#28624935)

    Mozilla's Benjamin Smedberg says they're currently "[sprinting] as fast as possible to get basic code working, running simple testcase plugins and content tabs in a separate process," after which they'll fix everything that breaks in the process.

    This sentence was a little hard to process.

    (I note that the "process" of Slashdot incremental improvement has now reached a point where clicking anywhere in the text-entry box causes the box to LOSE focus. If you don't want us using Safari, there are more efficient ways to get us to move.)

  • That's good (Score:4, Funny)

    by Junior J. Junior III ( 192702 ) on Wednesday July 08, 2009 @12:26PM (#28624963) Homepage

    I was concerned that Firefox wasn't using as much of my system's RAM as it could. I bought 8GB, and I intend to use it.

    In all seriousness, this is good. It should handle crashes and frozen processes better, like Chrome.

    Thanks Google, and thanks Mozilla, for helping to drive competition and make the web browser better.

  • Forking a process on unix-like systems is fairly lightweight but for Windows this will not scale well at all. Why not just have rendering worker threads? Have I missed something?
    • Re: (Score:2, Interesting)

      by Anonymous Coward
      They want separate processes as a crutch to deal with memory leaks ... the idea being the leak would be contained to one tab's own process rather than the entire browser, and when you close the tab, you close the process.
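
      For illustration only (a toy POSIX sketch with made-up names, not anything from Mozilla's tree): each "tab" below is a forked child, so whatever it leaks is handed back to the kernel the moment the child exits, and the long-lived parent never accumulates it.

      // Toy sketch (POSIX fork, illustrative only; not browser code).
      // Each "tab" leaks memory on purpose, but the leak lives and dies with
      // the child process: when the tab closes, the kernel reclaims it all.
      #include <cstdio>
      #include <cstdlib>
      #include <sys/wait.h>
      #include <unistd.h>

      static void run_leaky_tab() {
          for (int i = 0; i < 256; ++i) {
              char* p = static_cast<char*>(std::malloc(1024 * 1024));
              if (p) p[0] = 1;          // touch the page so the allocation is real
          }
          _exit(0);                     // tab closed: all of it goes back to the OS
      }

      int main() {
          for (int tab = 0; tab < 3; ++tab) {
              pid_t pid = fork();
              if (pid == 0)
                  run_leaky_tab();      // the child process is the leaky "tab"
              waitpid(pid, nullptr, 0);
              std::printf("tab %d closed, its memory reclaimed\n", tab);
          }
          return 0;                     // the parent's footprint never grew
      }
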
    • Security

      Speed ----------- You are here.

      Security ----------- The rest of the world is here.

      Need to catch up mate. We'll be getting rid of virtual machines next too.

    • Re: (Score:3, Informative)

      by Millennium ( 2451 )

      Forking a process on unix-like systems is fairly lightweight but for Windows this will not scale well at all.

      The Microsoft folks don't seem concerned about this; at least it didn't stop them from implementing it in IE8. While I don't doubt that Windows processes are fairly heavyweight, I doubt that they're big enough to cause trouble until the user has hundreds of tabs open.

      Why not just have rendering worker threads? Have I missed something?

      Although working in multiple threads can increase performance in much the same way that multiple processes can, that's not the major benefit of the multi-process architecture. The big benefit to multiple processes is that if one of them dies for some reason, the rest of the browser keeps running.

    • That doesn't solve the stability problem. If one of those worker threads does something naughty, the whole process is going down.

      Although process creation time on Windows is slow compared to other OSes, it's more than fast enough for spawning a process per tab. Chrome and IE8 have already proved this in the real world.
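
      To make the thread-versus-process distinction concrete, here is a minimal sketch (POSIX only, purely illustrative, not code from any browser): the same bad pointer write is just an exit status the parent reads when it happens in a forked child, but would be fatal to the whole process if it happened on a thread.

      // Minimal sketch (POSIX, illustrative only): a crash in a forked child
      // is reported to the parent, which keeps running; the same crash on a
      // thread would take the whole process down.
      #include <cstdio>
      #include <sys/wait.h>
      #include <unistd.h>

      static void misbehave() {
          volatile int* p = nullptr;
          *p = 42;                          // SIGSEGV
      }

      int main() {
          pid_t pid = fork();
          if (pid == 0)
              misbehave();                  // the "tab" process crashes

          int status = 0;
          waitpid(pid, &status, 0);
          if (WIFSIGNALED(status))
              std::printf("tab died with signal %d; browser still running\n",
                          WTERMSIG(status));

          // Running misbehave() on a std::thread instead (needs <thread>)
          // would kill this whole process:
          //   std::thread t(misbehave); t.join();
          return 0;
      }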

    • Process creation is much cheaper under Windows than it used to be.

      And one crashed thread takes out all the threads, resulting in--gasp!--the current situation, as Firefox's tabs are nominally multithreaded.

      Process segmentation is the only way to retrofit that bad codebase into some sort of working order compared to IE8 and Chrome. It should also help with their astonishing memory leaks.

    • Forking a process on unix-like systems is fairly lightweight but for Windows this will not scale well at all.

      Yeah, cuz multi-process Chrome on Windows is such a piece of shit?

      • Forking a process on unix-like systems is fairly lightweight but for Windows this will not scale well at all.

        Yeah, cuz multi-process Chrome on Windows is such a piece of shit?

        This is purely anecdotal... but as far as I can tell for my daily usage Chrome is no faster than Firefox 3.5 with adblock. (Adblock ftw)

        • This is purely anecdotal... but as far as I can tell for my daily usage Chrome is no faster than Firefox 3.5 with adblock. (Adblock ftw)

          Anecdotal is the best evidence in this case. I've run Chrome with over 90 tabs, and it kept on chugging. And what does speed have to do with a browser being multi-process? The benefits are security and reliability, with the downside being memory usage.

    • Forking a process on unix-like systems is fairly lightweight but for Windows this will not scale well at all. Why not just have rendering worker threads? Have I missed something?

      Er. This is an argument which applies to high-volume servers that handle hundreds/thousands of requests per second. Windows' process model is not so heavy-weight that you notice it opening a new browser tab once every few minutes.

    • Forking a process on unix-like systems is fairly lightweight but for Windows this will not scale well at all. Why not just have rendering worker threads? Have I missed something?

      One of the main reasons threads are more lightweight than processes is that they share an address space (so it's cheaper to switch between threads - you don't need to reload page tables), but the downside of that is that one thread can corrupt another. Processes don't share memory, hence they are better isolated from each other.
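
      A small sketch of that isolation point (POSIX, illustrative only, not browser code): two threads can scribble on the same variable because they share one address space, while a forked child only ever modifies its own copy.

      // Sketch (POSIX, illustrative only): threads share an address space, so
      // one can overwrite data another depends on; after fork() the child has
      // its own copy, and its writes never reach the parent.
      #include <cstdio>
      #include <sys/wait.h>
      #include <thread>
      #include <unistd.h>

      static int important_value = 1234;

      int main() {
          // Thread case: the scribbler clobbers the variable everyone shares.
          std::thread scribbler([] { important_value = -1; });
          scribbler.join();
          std::printf("after thread scribble:  %d\n", important_value);   // -1

          important_value = 1234;

          // Process case: the child clobbers only its own copy-on-write copy.
          pid_t pid = fork();
          if (pid == 0) {
              important_value = -1;
              _exit(0);
          }
          waitpid(pid, nullptr, 0);
          std::printf("after process scribble: %d\n", important_value);   // 1234
          return 0;
      }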

  • Nice (Score:5, Interesting)

    by Craig Davison ( 37723 ) on Wednesday July 08, 2009 @12:40PM (#28625223)

    Competition from Chrome was a good thing: first the Javascript improvements, now separate processes for the plugins.

  • by DutchUncle ( 826473 ) on Wednesday July 08, 2009 @12:50PM (#28625395)
    For users with anything pre-multi-core (and that's only a few years old), this will result in things getting *slower* because of the process overhead. I hope it senses resources and optimizes appropriately, or all of the friends and relatives I tech-support will be cursing me when the update happens. Some of them are already ticked that when they double-click on the Firefox icon, it takes longer to load than IE because of all the update-phone-home (the sort of thing for which we would all get annoyed at M$).

    Eventually we'll get to the point where the window comes up and it takes a ludicrous time to fill . . . just like Windows already does now.

    Better philosophical architecture is a good thing. Running well in the practical typical system, in front of the average user, is good too. Disruptive change is not always the way to please your users.
    • by Eric52902 ( 1080393 ) <(eric.h.squires) (at) (gmail.com)> on Wednesday July 08, 2009 @01:10PM (#28625735)
      The machine I'm currently on is a single core machine running XP (1.6 GHz if I'm not mistaken...so lazy I don't even want to pull up the specs!). I've been using Chrome for months on this thing and it's lightning fast. Your concern over speed is unfounded.
    • Re: (Score:3, Informative)

      For users with anything pre-multi-core (and that's only a few years old), this will result in things getting *slower* because of the process overhead.

      You are vastly over-estimating the impact of a process switch on any hardware from the last decade or more. Right now, my WinXP PC is running 53 processes. If I click another application on my task bar, a full screen has redrawn with the new application's window before my finger has finished releasing the mouse button I clicked. Do you have any idea how many process switches took place in the fraction of a second while that happened?

      Better philosophical architecture is a good thing.

      There's a lot more than philosophy going on here.

      Independent processes allo

  • by CannonballHead ( 842625 ) on Wednesday July 08, 2009 @01:02PM (#28625605)

    I'll bite. It's about time.

    Even explorer.exe is able to open directories using different processes, if you want. Frankly, I found it frightfully annoying to have X+ tabs open and have ONE of those tabs cause the entire program to crash, usually due to a plugin issue. Made no sense to me. Multi-process/multi-threaded/multi-whatever programming has been around for quite a while now, and multi-core cpus have been pretty common, too.

    It's one of the huge advantages that I saw with Chrome (over Firefox). That and program open/new tab open speed. FF 3.5 seems to have addressed this somewhat, but it's still slower, I think.

    Hooray for competition, and hooray for finally taking advantage of the hardware out there. Really, for one of the most heavily used applications on any system, it seems silly to restrict it to a single-process model.

  • by DrXym ( 126579 ) on Wednesday July 08, 2009 @01:05PM (#28625669)
    Most of Gecko is bound together with interfaces defined in IDL and implemented in C++ / JS. This model is called XPCOM and is based in large part on Microsoft's COM. In theory (though not always in practice), it doesn't matter in COM where the interfaces are implemented - single thread, multi-thread, multi-process or even across a network - so long as the caller and callee abide by things such as the rules for memory allocation, reference counting, object creation, etc. I say in theory because some interfaces can be horribly inefficient when called repeatedly over a network, some interfaces might have broken IDL definitions, and some interfaces might deal with things like handles or memory addresses which don't translate properly between processes.

    One way of implementing multi-process Firefox is to first allow XPCOM to work across processes, i.e. allow objects to be created via XPCOM that are actually spawned in another process, one explicitly created for the task. COM had a thing called the running object table (ROT). When you create a process-hosted object, COM looks to see if a host is running already, and if not it uses the registry info to spawn one. Then it waits for the host to start, tells it to create the object, sets up all the marshaling, etc. XPCOM could do something similar, though it would have to do so in a cross-platform manner. I assume that when creating a browser object Firefox would first have to determine whether it was chrome or content, and if it was content, spawn a host process and then set up the interfaces. Once set up, and assuming the interfaces were efficient, the effect would be largely transparent.

    The biggest performance hit would probably be on anything which tried to call across or iterate over the DOM boundary between chrome and content. For example, chrome which introspected content would suffer because all the calls would have to be serialized / deserialized.

    Personally I think it's feasible, but it would hit performance. An alternative would be to just host plugins in another process. Windowless plugins might be a pain to implement, but at least you could kill the other process if a plugin goes nuts, which seems to happen all too frequently for me.
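
    As a rough illustration of the marshaling cost being described (a toy sketch with an invented interface, nothing to do with real XPCOM): the chrome side holds a proxy whose every method call is serialized over a pipe to a content process and back, so chrome code that walks the content DOM pays a full round trip per node.

    // Toy proxy/marshaling sketch (POSIX pipes, invented names; not XPCOM).
    // Every cross-boundary call is one request/reply round trip, which is why
    // chrome code iterating content nodes one call at a time would suffer.
    #include <cstdio>
    #include <sys/wait.h>
    #include <unistd.h>

    struct Request { int node_id; };         // wire format: ask for a child count
    struct Reply   { int child_count; };

    // "Content" process: a pretend DOM where node n has n % 5 children.
    static void content_process(int req_fd, int rep_fd) {
        Request rq;
        while (read(req_fd, &rq, sizeof rq) == (ssize_t)sizeof rq) {
            Reply rp{rq.node_id % 5};
            write(rep_fd, &rp, sizeof rp);
        }
        _exit(0);
    }

    // "Chrome"-side proxy: looks like a local getter, is actually a round trip.
    struct NodeProxy {
        int req_fd, rep_fd, node_id;
        int childCount() const {
            Request rq{node_id};
            write(req_fd, &rq, sizeof rq);   // serialize the call
            Reply rp{};
            read(rep_fd, &rp, sizeof rp);    // block for the reply
            return rp.child_count;
        }
    };

    int main() {
        int to_content[2], to_chrome[2];
        if (pipe(to_content) != 0 || pipe(to_chrome) != 0) return 1;

        if (fork() == 0) {                   // content process
            close(to_content[1]); close(to_chrome[0]);
            content_process(to_content[0], to_chrome[1]);
        }
        close(to_content[0]); close(to_chrome[1]);

        // Chrome walking "the DOM": one full round trip per node.
        for (int id = 0; id < 5; ++id) {
            NodeProxy node{to_content[1], to_chrome[0], id};
            std::printf("node %d has %d children\n", id, node.childCount());
        }
        close(to_content[1]);                // EOF: the content process exits
        wait(nullptr);
        return 0;
    }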

  • Whether or not Chrome is adopted and used as a browser, the project was a success in spurring needed innovation.

  • Finally, apps are getting more multi-CPU focused. Very cool. All of us with multi processor systems thank you.

  • by Twillerror ( 536681 ) on Wednesday July 08, 2009 @02:30PM (#28627109) Homepage Journal

    One problem we have is that we want to open many of the same applications more than once. Imagine wanting to login to slashdot with two different logins.

    Right now in IE and Firefox each tab shares the same cookie space. So when you login with one tab you'll notice the cookie in the other tab getting "overwritten".

    Now with multiple processes, is this still the case? When one tab "opens another window", resulting in another tab, are those two tabs in the same process, sharing cookies and the like?

    The browser in general is a horrible state machine. It would be nice if Javascript would support some form of lighter-weight cookie that could be accessed between page loads.

"Facts are stupid things." -- President Ronald Reagan (a blooper from his speeach at the '88 GOP convention)

Working...