
 



Topics: Mozilla, The Internet, Upgrades, IT

New Firefox Project Could Mean Multi-Processor Support (300 comments)

suraj.sun writes with this excerpt from Mozilla Links: "Mozilla has started a new project to split Firefox into several processes: one running the main user interface (chrome), and one or more others running the web content in each tab. As in Chrome or Internet Explorer 8, which have implemented this behavior to some degree, the main benefit would be increased stability: a single tab crash would not take down the whole session with it, along with performance improvements on the multiprocessor systems that are progressively becoming the norm. The project, which lacks a catchy name like other Mozilla projects (TaskFox, Ubiquity, or Chocolate Factory), is coordinated by long-time Mozillian Benjamin Smedberg, with Joe Drew, Jason Duell, Ben Turner, and Boris Zbarsky also on the core team. According to the loose roadmap published, a simple implementation that works with a single tab (no session support, no secure connections, on either Linux or Windows, probably not even based on Firefox) should be reached around mid-July."
This discussion has been archived. No new comments can be posted.

  • Finally! (Score:5, Interesting)

    by nausea_malvarma ( 1544887 ) on Thursday May 07, 2009 @05:11PM (#27867137)
    About time, Mozilla. I've used Firefox since it came out, and lately I've noticed it's not the hot-rod it once was. The web is changing: it's full of in-browser video, web apps, and other resource-intensive content, and Firefox has had trouble keeping up. I look forward to better speed and stability, assuming this project is seen through to completion.

    Otherwise I'd probably switch to Google Chrome eventually, even though it doesn't have the add-on support I enjoy in Firefox.

  • How about threads? (Score:2, Interesting)

    by node159 ( 636992 ) on Thursday May 07, 2009 @05:17PM (#27867249)

    Processes vs Threads...

    I'm pretty certain that the usual 40-60 pages I have open are going to blow the memory if each runs in its own process.

  • by MoFoQ ( 584566 ) on Thursday May 07, 2009 @05:17PM (#27867251)

    I guess it can be useful in determining which site I visit tends to create the memory leaks I still experience (even with ff3).
    (as I type, this current browser session has ballooned to over 600MB...which is still better than my typical with ff2...which was 700-800MB)

    maybe they can dedicate a process just for "garbage collection".

  • by Anonymous Coward on Thursday May 07, 2009 @05:21PM (#27867337)

    Now, this sample browser with process isolation took a couple of hours to develop:

    http://ivan.fomentgroup.org/blog/2009/03/29/instant-chrome/ [fomentgroup.org]

  • by sopssa ( 1498795 ) <sopssa@email.com> on Thursday May 07, 2009 @05:21PM (#27867345) Journal

    I wish Opera would catch up to this as well. It's a great browser, but when it does crash on some page, the whole browser goes down. They'll have to soon, seeing that all the other major browsers have implemented it.
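    The crash-isolation benefit described above can be sketched in a few lines. This is a toy model, not any browser's actual architecture: `render_tab()` is a hypothetical stand-in for a content renderer, and one "tab" is made to die abnormally while the session carries on.

    ```python
    # Sketch of per-tab process isolation: each tab runs in its own process,
    # so one tab dying abnormally cannot take the session down with it.
    # render_tab() and the URLs are made up for illustration.
    import multiprocessing as mp
    import os

    def render_tab(url):
        if url == "http://crashy.example":
            os._exit(139)  # simulate a segfault-style death (128 + SIGSEGV)
        # A well-behaved tab just "renders" and exits cleanly.

    def run_session(urls):
        """Run each tab in its own process; a crash kills only that tab."""
        results = {}
        for url in urls:
            p = mp.Process(target=render_tab, args=(url,))
            p.start()
            p.join()
            results[url] = "crashed" if p.exitcode != 0 else "ok"
        return results
    ```

    In a single-process browser the equivalent of that `os._exit()` would end the whole session; here the parent just records the failure and moves on to the next tab.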

  • by Tumbleweed ( 3706 ) * on Thursday May 07, 2009 @05:26PM (#27867437)

    Will Chrome mature to have a nice system of plugins to match the advantages of Firefox before Firefox rearchitects this very low level code?

    I sometimes wonder about the FF devs - I've been wondering about the lack of a multi-threaded (at least) UI for a few years now. That project kept getting put off and put off until there was too much code to change easily. Only now that a real competitor comes along do they bother with the obvious thing that should've been put in from the start. Do FF devs not actually USE FF? Or do they not browse sites with Flash apps that go out of control and make the browser completely unresponsive? I find that hard to believe.

    Whatever. At least it'll finally happen. One wonders how many people will have switched over to Chrome by the time they get this out the door, though.

  • by TheRaven64 ( 641858 ) on Thursday May 07, 2009 @05:36PM (#27867571) Journal
    No. Just no.

    On any modern system, there is very little memory overhead to having multiple copies of the same process. They will share read-only or copy-on-write versions of the executable code and resources loaded from shared libraries and the program binary, as well as any resource files opened with mmap() or the Windows equivalent. The only real overhead is relocation symbols, which are a tiny fraction of most processes. In exchange for this small overhead, you get the huge benefit of completely isolated instances which communicate with each other only through well-defined interfaces.

    Threads are an implementation trick. They should not be exposed as a programmer abstraction unless you want people to write terrible code. Go and learn Erlang for how parallel code should be written.
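    The copy-on-write point can be seen directly with a fork. A minimal sketch, POSIX-only (`os.fork` does not exist on Windows): after the fork both processes see the same buffer, but the child's write lands in its own private copy of the page, leaving the parent's data untouched.

    ```python
    # Copy-on-write after fork(): the child's write does not affect the
    # parent's copy. A pipe carries the child's view back for comparison.
    import os

    def cow_demo():
        data = bytearray(b"parent")    # shared with the child until written
        r, w = os.pipe()
        pid = os.fork()
        if pid == 0:                   # child process
            os.close(r)
            data[0:6] = b"child!"      # write triggers a private page copy
            os.write(w, bytes(data))
            os._exit(0)
        os.close(w)
        child_view = os.read(r, 6)     # what the child saw after its write
        os.close(r)
        os.waitpid(pid, 0)
        return bytes(data), child_view # parent's copy is untouched
    ```

    The kernel only duplicates the pages that are actually written, which is why dozens of processes running the same binary cost far less memory than a naive "N full copies" estimate suggests.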

  • by RiotingPacifist ( 1228016 ) on Thursday May 07, 2009 @05:47PM (#27867765)

    I tried explaining this on Digg, but having even the headline get it wrong on Slashdot is depressing!
    I think there is an advantage to a process per tab against code injection attacks.
    Also, if you had Firefox-gui, Firefox-net, Firefox-gecko, Firefox-profile, and Firefox-file, you could give each one a different SELinux/AppArmor/UAC profile.

    I'm not sure what the performance trade-off would be like, so I sincerely hope there is a single-binary compile option. I also think a good balance to avoid the security hit of per-tab processes is to put only https tabs in separate processes (additionally, it would be smart to prevent extensions from running on those pages; GUI extensions would still work, but nothing that touches the page).

    Are processes even needed for security, though? Can threads be locked down to achieve this without the performance hit (and, additionally, to lock down extensions)?
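    The split-by-role idea above can be sketched as two processes with a pipe as their only interface. The process names and message format here are invented for illustration; the actual sandboxing (a per-role SELinux/AppArmor profile) happens outside the program and can't be shown in plain Python.

    ```python
    # Privilege-separation sketch: a hypothetical "net" process answers
    # fetch requests from a "gui" process, and the pipe is the only
    # channel between them. Message format is made up for this example.
    import multiprocessing as mp

    def net_process(conn):
        """Hypothetical network role: answers fetch requests, nothing else."""
        while True:
            msg = conn.recv()
            if msg["op"] == "quit":
                break
            if msg["op"] == "fetch":
                # A real fetcher would do socket I/O here, inside its sandbox.
                conn.send({"url": msg["url"],
                           "body": "<html>stub for %s</html>" % msg["url"]})
        conn.close()

    def fetch_via_net(url):
        """GUI side: request one page through the net process and shut it down."""
        gui_end, net_end = mp.Pipe()
        p = mp.Process(target=net_process, args=(net_end,))
        p.start()
        gui_end.send({"op": "fetch", "url": url})
        reply = gui_end.recv()
        gui_end.send({"op": "quit"})
        p.join()
        return reply["body"]
    ```

    Because the GUI side never touches the network itself, a mandatory-access-control profile could deny it socket access entirely while granting it only to the net role, which is the point of splitting by function rather than by tab.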

  • by Tony Hoyle ( 11698 ) * <tmh@nodomain.org> on Thursday May 07, 2009 @05:47PM (#27867767) Homepage

    Really? Not that I noticed. I was taught Pascal, Ada, and 68000 machine code, and they let us play with a little C off the record. Oh, and COBOL, of course. No threading at all. That was around 1990.

    Having talked to programmers who qualified more recently, it hasn't got any better, except that they now get to learn C "officially". It takes around 6-9 months for a new programmer to pick up how things are done in the real world after coming out of the education system.

  • Re:responsiveness (Score:3, Interesting)

    by coldmist ( 154493 ) on Thursday May 07, 2009 @06:26PM (#27868489) Homepage

    If I look at a page like The Drudge Report, I can ctrl-click on 10 links, creating 10 background tabs. Then, I click on the first article tab, read a bit, close the tab. That shows me the 2nd article. Close it, and I get the 3rd. etc.

    This way, I don't have to wait more than 50ms to go from article to article. They are already loaded in the background for me.

    Very handy!

    Doesn't everyone do this?

  • by MrMr ( 219533 ) on Friday May 08, 2009 @04:36AM (#27873641)
    Yes, really. I did a minor in CS in 1988, and we had to write our own semaphore-based threading code on a bunch of 3B2s connected by 10BASE2.
    I'm pretty sure that was plain textbook stuff (from the first chapter of Tanenbaum's Operating Systems), as you would fail the course if you didn't get it to work.
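    That textbook exercise in miniature: a semaphore guarding a critical section so that concurrent increments are never lost. `threading.Semaphore` stands in for the hand-rolled semaphores of the 3B2 days; the counts here are arbitrary.

    ```python
    # A binary semaphore used as a mutex around a shared counter, the
    # classic P()/V() exercise from Tanenbaum's Operating Systems.
    import threading

    def count_with_semaphore(n_threads=4, increments=10000):
        sem = threading.Semaphore(1)   # binary semaphore acting as a lock
        total = [0]                    # shared state mutated by all threads

        def worker():
            for _ in range(increments):
                sem.acquire()          # P(): enter the critical section
                total[0] += 1
                sem.release()          # V(): leave the critical section

        threads = [threading.Thread(target=worker) for _ in range(n_threads)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return total[0]
    ```

    With the semaphore in place the result is exactly `n_threads * increments`; remove the acquire/release pair and, on a runtime without an interpreter lock serializing the increment, updates can interleave and be lost.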
