Firefox Working to Fix Memory Leaks

Christopher Blanc writes "Many Mozilla community members, including both volunteers and Mozilla Corporation employees, have been helping to reduce Firefox's memory usage and fix memory leak bugs lately. Hopefully, the result of this effort will be that Firefox 3 uses less memory than Firefox 2 did, especially after it has been used for several hours." Here's hoping. Frequent restarts of things on my computer make me furious. I can't imagine why anyone would tolerate such things.
  • but but (Score:5, Interesting)

    by svendsen ( 1029716 ) on Monday September 24, 2007 @12:33PM (#20730399)
    Every time I mentioned the memory issue, I was told it was a plugin, or that there was something wrong with my system, or something about my mother and a donkey. Certainly Firefox fanboys wouldn't have just attacked me because I questioned something... would they? :-D
  • Symmetry (Score:2, Interesting)

    by Harmonious Botch ( 921977 ) * on Monday September 24, 2007 @12:34PM (#20730419) Homepage Journal
    Are there really any memory problems that cannot be cured by strict adherence to the rule of "allocate memory at the beginning of a routine, deallocate the same amount at the end"?
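
    The rule described here maps onto what C++ programmers call RAII: tie an allocation to a scope and let the scope's exit free it. A minimal sketch (illustrative only) of where the rule works, and where it stops working:

        #include <memory>
        #include <vector>

        // Scope-bound allocation: the buffer is freed when the routine
        // returns, on every exit path.
        void process_request() {
            auto buffer = std::make_unique<std::vector<char>>(64 * 1024);
            // ... use *buffer ...
        }   // buffer released here, automatically

        // The rule breaks down for objects that must outlive the routine
        // that created them (caches, DOM nodes shared across documents):
        // their lifetime has to be tracked some other way, e.g. reference
        // counting or garbage collection.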
  • Re:And on three... (Score:5, Interesting)

    by Seumas ( 6865 ) on Monday September 24, 2007 @12:41PM (#20730525)
    Actually, from what I understood over the last year, "THERE IS NO MEMORY PROBLEM".

    Every time someone mentions memory issues, the responses are either that it's supposed to consume a gigabyte of RAM so that it speeds up the back button, or that "there is no memory issue".

    Strange, then, that people are now suddenly paying attention to specifically attacking memory-use issues that supposedly don't exist.
  • by Ryzzen ( 1078135 ) on Monday September 24, 2007 @12:45PM (#20730601)
    www.opera.com ;P
  • by waterbear ( 190559 ) on Monday September 24, 2007 @12:45PM (#20730613)
    I completely agree. I only have 384MB on the machine from which I'm writing this; there isn't room on the motherboard for any more than that. I got totally tired of the system seizing up when I used to use Firefox, so I switched to Opera. That's not completely immune to seize-up/memory consumption problems either. So I'll keep an eye open for significant improvements to FF, and just possibly switch back if they fix the memory bugs.

    I hope they won't totally forget the folks using older-specification systems, but I have my worries about the FF debugging process: I looked at the blog that was referenced in the article header, and some of the comments sound ominous for quality. The way some of them read, it seems as if patches to force memory release in various situations are just going to be grafted on top of the buggy code. That looks like a recipe for performance loss, compared with fixing the problems at their root.

    -wb-
  • Re:but but (Score:5, Interesting)

    by MBCook ( 132727 ) <foobarsoft@foobarsoft.com> on Monday September 24, 2007 @12:48PM (#20730659) Homepage

    I've heard that too. I use FF on my desktop at work with one or two plugins (FlashBlock and FireBug, mostly). It does leak memory after enough time. Closing the browser always fixes it, so it's not much of a problem.

    That said, if a plugin leaks memory, there are a few options. First, the system should know. Even if the plugin is used constantly, I should be able to open the extensions options panel and see how much memory each one is using, so I can identify the culprit. There should be a warning system ("Plug-in 'MemHog2' is using 500MB of RAM, close/ignore/disable?").

    Also, when a plugin isn't in use, it shouldn't cause a problem. Let's say the problem is FlashBlock. If it isn't actively rendering (say I only have one window/tab open and it's pure text, no Flash, etc.), then it really shouldn't be using any memory. If FireBug is inactive, it should use next to no memory (when I'm actively using it to check CSS/JS/etc., I expect it to use memory).

    I'm glad they are working on this. I've heard this complaint for a while. But even if the problem is the plugins, it needs fixing or roping in.

    How about being able to set memory limits for plug-ins, Mac OS 1-9.x style? Maybe total, maybe per active page, maybe both. Just a random idea.
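
    For concreteness, a hypothetical sketch of the accounting-plus-warning idea (all names are invented; Firefox exposed no per-extension memory accounting at the time):

        #include <cstddef>
        #include <iostream>
        #include <map>
        #include <string>

        // Hypothetical per-extension memory bookkeeping with a warning
        // threshold, in the spirit of the proposal above.
        class ExtensionMemoryMonitor {
            std::map<std::string, std::size_t> usage_;  // bytes per extension
            std::size_t warn_limit_;                    // warning threshold, bytes

        public:
            explicit ExtensionMemoryMonitor(std::size_t warn_limit)
                : warn_limit_(warn_limit) {}

            void record_alloc(const std::string& ext, std::size_t bytes) {
                usage_[ext] += bytes;
                if (usage_[ext] > warn_limit_)
                    std::cerr << "Plug-in '" << ext << "' is using "
                              << usage_[ext] / (1024 * 1024)
                              << "MB of RAM: close/ignore/disable?\n";
            }

            void record_free(const std::string& ext, std::size_t bytes) {
                usage_[ext] -= bytes;
            }
        };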

  • by suv4x4 ( 956391 ) on Monday September 24, 2007 @12:50PM (#20730691)
    > If you can't code without hand-holding tools like automatic garbage collection, perhaps you belong in a different profession!

    Firefox's problem is architectural and not one of garbage collection.

    XUL is inherently single-threaded and JavaScript-based. Try any XUL application out there and you'll see the same poor performance, speed, and resource usage as Firefox (try Miro Player or Joost).

    The Firefox developers are literally throwing out more C code with every release, replacing it with JavaScript code.

    Leaks (in the classical sense) aren't what's causing Firefox's abysmal performance, which is why Firefox 2 performs worse than Firefox 1.5 even though one of the touted "features" of Firefox 2 was supposedly a pile of fixed memory leaks.

    Actually I'm pretty sure they're in denial as to the cause of their problems. Announcing they're working on fixing "memory leaks" just supports their ability to continue their delusion.
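
    As a cartoon of the single-threaded model being described (nothing Gecko-specific, just the general shape of the problem):

        #include <deque>
        #include <functional>

        // Toy single-threaded event loop: UI work and script callbacks
        // share one queue, so one slow handler stalls everything queued
        // behind it.
        int main() {
            std::deque<std::function<void()>> queue;
            queue.push_back([] { /* paint the chrome */ });
            queue.push_back([] {                      // a slow script
                for (volatile long i = 0; i < 1000000000L; ++i) {}
            });
            queue.push_back([] { /* the user's click waits for the loop above */ });

            while (!queue.empty()) {  // a single thread drains the queue in order
                queue.front()();
                queue.pop_front();
            }
        }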

  • by suv4x4 ( 956391 ) on Monday September 24, 2007 @12:58PM (#20730833)
    > Nobody forces you to use Firefox. You can use Opera, Konqueror, links or IE, or any other browser out there...

    Firefox was supposed to be able to withstand popularity, unlike IE. Look at it now: people say it's slow and a RAM hog, and hackers have started attacking it successfully just as much as IE.

    At least we see it for what it is: the stick in Microsoft's eye that made them resume IE development.

  • FireFox == Internet (Score:3, Interesting)

    by Frosty Piss ( 770223 ) on Monday September 24, 2007 @01:03PM (#20730935)

    > Nobody forces you to use Firefox. You can use Opera, Konqueror, links or IE, or any other browser out there...

    Maybe not, but in the Windows world, Opera is not a viable alternative for the many people who find the Opera UI excessively daunting for casual use.

    The thing that has irritated me about this is that, for a very long time, the Firefox leadership insisted that there were no memory issues, that it was a specific type of use profile, and that if you knew the secrets of how to tweak the configuration file, performance would improve. This is the lamest of excuses.

    Firefox is not sold as some kind of "leet" hacker browser; it's sold as a browser for the people. The Firefox leadership needs to be more responsive to feedback from "AVERAGE" users if they want Firefox to be a major player in the browser world. 10% is nice, but it's still only 10%.

  • by wizman ( 116087 ) on Monday September 24, 2007 @01:16PM (#20731141)
    I use Firefox on Mac (intel) and Windows, with the latest versions on both. I can have Firefox open for a full week on Windows without any problems, however on either Mac I have to restart Firefox about once every day or two, otherwise browsing slows to a crawl. At extremes the whole machine will start to bog down until I "force quit" (kill -9) Firefox. I'll also experience oddness where images will just stop loading.

    Running "bare bones" on all Firefox installs, no plugins other than whatever may have been included with the base distribution.

    Does anyone else notice this? I've switched back to Safari on the Mac in the meantime.
  • by Anonymous Coward on Monday September 24, 2007 @02:11PM (#20731985)
    > Actually, the big language culprits would be those with auto-garbage collection, etc. as they tend to have lazier programmers that don't "need" to manage their own resources,

    People keep saying this. I wonder if it's true -- I haven't seen any evidence of it. I've seen lazy/stupid programmers in all kinds of languages. Wouldn't the same argument work just as well against memory protection, and preemptive multitasking, and compilers?

    > and in some cases even prohibit the programmer from being able to manage their resources.

    That's a problem with your specific GC, not a problem with GC. JWZ even ranted [jwz.org] about this. You hear Java users complaining about memory, and Java programmers bitching about the GC, but you never hear Lisp programmers bitching about their GC. Any production Lisp GC gives the programmer far more control than Java's does. Hearing people rant against GC is like hearing Yugo owners rant that automobiles suck.

    > C/C++ and similar languages, on the other hand, force the programmers to manage their resources.

    No, they don't. How would they? If I forget to free something and it goes out of scope, bang, memory leak (see the sketch below). How am I forced to do anything about that by the language? (I've heard more than one story about somebody who learned Java, then went back to writing C++, and wrote many lines of code before remembering that their C++, while correct, leaked left and right.)

    > I want to be able to do more with the faster processors and additional RAM, rather than simply do the same job I could do yesterday only in "better" software.

    You're right. The easiest route to fast, working software is to get it correct, and then tune for performance -- and GCs are great for that.

    > We also need to get back to writing applications that have good, if not great, performance with minimal resource requirements (e.g. RAM and processor). [...] But in either case it doesn't get done unless the programmers do their job, and use tools that allow them to do it.

    Again, you're right. And these requirements just scream "we need a GC that doesn't suck".
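
    The "forget to free" failure mode mentioned above is easy to show in a few lines of C++ (a minimal sketch; the smart-pointer version is one common fix, not the only one):

        #include <memory>

        struct Node { int payload; };

        void leaky() {
            Node* n = new Node{42};
            if (n->payload > 0)
                return;            // early exit: nothing frees n
            delete n;
        }

        void not_leaky() {
            auto n = std::make_unique<Node>(Node{42});
            if (n->payload > 0)
                return;            // unique_ptr's destructor frees n on any path
        }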
  • java memory profile (Score:5, Interesting)

    by sentientbrendan ( 316150 ) on Monday September 24, 2007 @02:33PM (#20732285)
    > Actually, the big language culprits would be those with auto-garbage collection,
    > etc. as they tend to have lazier programmers

    Actually, it isn't lazier programmers. The problem is that existing garbage collection implementations have a horrible memory profile.

    If you look at the memory usage of a Java program, it's about as bad as a C program that does nothing but leak memory. Practically speaking, Java does little to free memory until it has *run out of memory*. Then, when it does run out and needs to clean things up, things get slow as hell.

    >The real answer is doing your job right, and using the right tool - which is not necessarily
    >the easiest tool to use either.

    Yes! Unfortunately, academics and many novice programmers (who just finished being trained by academics) are unfamiliar with powerful tools like C++. Going to school can give you the mistaken impression that garbage collection is *a good thing* because everyone uses it there. The truth is that C++ is a very complicated language with a steep learning curve, but many times it is simply the only tool suitable for the job.

    If your program is IO-bound, like a web application front end, you are in a great position, because essentially *any* tool will do the job, even if the performance is abysmal. You can use Java, Ruby, or whatever. And you should, because those languages don't present you with the complexity of C++.

    Unfortunately, many programs *are not IO-bound*, and the performance and memory profile of the underlying tool are very important. This is most true of interactive, non-parallelizable programs. A good example would be BitTorrent clients: consider utorrent vs. Azureus, one in C++ and one in Java. utorrent is fairly lightweight and easy to use because of its performance characteristics. Azureus is a powerful and well-engineered program, but it sure as hell is slow and chews up memory.
  • by Futurepower(R) ( 558542 ) on Monday September 24, 2007 @02:47PM (#20732583) Homepage
    MOD PARENT UP.

    See this +5 comment posted below: Firefox's problem is architectural and not one of garbage collection. [slashdot.org].

    Quote: Actually I'm pretty sure they're in denial as to the cause of their problems. Announcing they're working on fixing "memory leaks" just supports their ability to continue their delusion. [my emphasis]

    A +4 comment to that comment discusses Firefox's "four separate memory allocation schemes":

    "1. Custom malloc/free implementation. (Yes, custom - not from libc.)
    2. C++ new/delete operators, which for all I know may be overridden to use their malloc/free.
    3. One of the first two with reference counting to decide when to free/delete.
    4. JavaScript mark-and-sweep GC.

    Dealing with this causes some truly insane hacks..."


    Then read the comment they don't want you to see: The memory bug is also a CPU hogging bug. [slashdot.org] At present, it is marked -1 Flamebait. However, that comment begins to discuss apparent social problems at the Mozilla Foundation, and some of the same material has been marked +5 in the past.
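
    For reference, scheme 3 in the quoted list (reference counting deciding when to free) looks roughly like this simplified sketch. It is modeled loosely on the AddRef/Release convention and is not actual Mozilla code; it also shows why cycles push a codebase toward scheme 4:

        #include <cstddef>

        // Intrusive reference counting: the object deletes itself when
        // the last reference is released.
        class RefCounted {
            std::size_t refs_ = 0;
        public:
            void AddRef()  { ++refs_; }
            void Release() { if (--refs_ == 0) delete this; }
        protected:
            virtual ~RefCounted() = default;
        };

        // Limitation: if A holds a reference to B and B holds one to A,
        // neither count ever reaches zero, and both objects leak. That is
        // exactly the case a mark-and-sweep collector (scheme 4) handles.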
  • Re:Mod parent up (Score:5, Interesting)

    by BZ ( 40346 ) on Monday September 24, 2007 @02:59PM (#20732805)
    > 1. Custom malloc/free implementation. (Yes, custom - not from libc.)

    Uh... No. There are some arena allocators in use in the codebase for very specific tasks, but there is no custom malloc/free.

    > They're C++ objects, exposed as JavaScript objects, using something that's like XPCOM but isn't

    Actually, to be exposed to JS in Gecko something more or less has to be an XPCOM object at the moment. Then the XPConnect layer handles the glue between JS and C++.

    > It makes a project that's supposed to be open source effectively closed off to only the
    > "official" developers

    As someone who got into this project without being in any way "official", I beg to differ! ;)
  • by Herschel Cohen ( 568 ) on Monday September 24, 2007 @03:31PM (#20733315) Journal
    My experience has been quite counter to many of the complaining posts I have read. I now have nearly no lockups that I can ascribe to Firefox 2.0.x, whereas it was very obvious that version 1.5 ate memory and at times locked a session. At last count I had 164 tabs open, and it is likely nearer to 180 at the moment. Most of the time I have at least three sessions running: email (Thunderbird), the browser, and coding with Subversion with several files open, the last with a connection to a remote server. If anything, my system is much more reliable with 2.0 than with 1.5.

    I used to be a casual tester for both versions 1.5 and 2.0, and I switched to each well before its official release. What I noticed was that much more effort was expended on creating satisfactory working versions for Windows than for either the Mac or Linux. Nonetheless, I have been pleased with the results. My use of the coming version 3.0 has been limited to running most of the stable alpha versions; so far, my limited exposure suggests an improvement over 2.0. Another factor that could make my experience worse is that I use a Mozilla version that I install by hand; I long ago stopped bothering to update the distribution-supplied version. Nonetheless, I have no complaints.

    Another factor that should degrade my experience is that my machine's innards are not of recent vintage, having only one gig of RAM and a much older AMD CPU. Might some of the problems so vociferously expressed here be due to factors other than those so loudly proclaimed?

    I am well aware that supposedly identical machines with the same software can behave very differently. I had that experience with corporate property running NT and XP, where I could get to Unix and the version control system while a neighbor could not. Nonetheless, I find it odd that my experience is so at odds with what many here have written.
  • by Anonymous Coward on Monday September 24, 2007 @03:48PM (#20733557)
    > A computer, by definition, 'moves bytes around'.

    No. A computer, by definition, computes. At one level, this consists of moving bytes around. At another, it consists of moving bits around. At another, it's electrons flowing. At another, it's symbol manipulation. I'm a programmer, and even I don't care about bytes, or electrons; if Firefox worked by black magic, I would not care.

    If C++ is efficient, isn't assembly even more efficient? C++ isn't as good at dealing with individual bits as assembly. Won't somebody think of the bits?

    But even Linux is written in C, rather than assembly, today, even though Linus is skilled at writing assembly language. Is it less efficient now because it has less assembly than Linux 0.01 did? No, it's more efficient because Linus and friends can deal in useful abstractions. If you tried to write Firefox in machine code, you'd never finish.

    It sounds relatively safe to say "C++ yields superior performance and memory usage" because in one extreme case (say, an embedded system) it's true. You wouldn't use a GC there. But then, on a small enough system (say, 512 bytes of RAM) you wouldn't use a normal C library, either. There simply isn't room. Write your own. But this doesn't scale: on a gigabyte+ system, it's a net savings to use an existing (shared!) library for string manipulation, and it's a net savings to use an existing GC for memory allocation.
  • by Chandon Seldon ( 43083 ) on Monday September 24, 2007 @04:43PM (#20734407) Homepage

    Although Firefox does have some issues with memory usage (and occasionally memory leaks), that doesn't seem to be the primary cause of its usability issues.

    In my personal experience with Firefox, I see two problems:

    • The whole browser locks up if a bit of JavaScript is slow - even if it's just one tab being slow out of ten.
    • The whole browser locks up if a plug-in or extension locks up - even if the plug-in is in just one tab out of ten.

    Both of these problems could be solved relatively easily with threads (in a number of different ways), but for some reason the Firefox developers have an irrational paranoia about anything that even vaguely resembles native concurrency. They say "the real problem is just response time; if we can respond fast enough in a single thread, it's the same" -- but then they never actually do it, and they definitely don't do anything that would let them recover from component crashes.
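
    A minimal sketch of the per-tab threading idea using std::thread (grossly simplified; it ignores the shared-state and reentrancy problems that made this hard in a real browser engine):

        #include <functional>
        #include <thread>
        #include <vector>

        // Run each tab's script work on its own thread, so a slow script
        // blocks only its own tab rather than the whole browser.
        void run_tabs(const std::vector<std::function<void()>>& tab_scripts) {
            std::vector<std::thread> workers;
            workers.reserve(tab_scripts.size());
            for (const auto& script : tab_scripts)
                workers.emplace_back(script);
            for (auto& w : workers)
                w.join();   // a UI thread could instead poll or use callbacks
        }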

  • by Anonymous Coward on Monday September 24, 2007 @06:00PM (#20735469)

    > ... proper memory management should simply fall into place on top of a clean architecture.

    And how, pray tell, does one "cleanly" deallocate cyclic, self-referential, dynamically-typed data structures, such as the ones that naturally arise with Javascript and HTML?

    With garbage collection, that's how. All possible solutions are isomorphic with garbage collection.

    So the real choice is between (1) having a GC non-expert write their own GC covering a subset of the data structures when they should be writing their app, or (2) using a modern, solid, pre-written GC for the whole app.

    I would also like to know how you plan, using architecture alone, to coalesce data to reduce memory fragmentation and return slack space to the system. Doing that manually to the heap is a nightmare verging on total impossibility, whilst a good GC can do it automatically if you can afford the execution time. (And a really good GC could re-lay-out data w.r.t. cache lines to optimize performance on actual runtime data.)
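
    To make the point concrete, a toy mark-and-sweep pass over a heap of possibly-cyclic objects (a deliberately minimal sketch; a production collector also handles precise rooting, finalization, and compaction):

        #include <vector>

        struct Object {
            std::vector<Object*> refs;   // outgoing edges; cycles allowed
            bool marked = false;
        };

        void mark(Object* o) {
            if (o == nullptr || o->marked) return;
            o->marked = true;
            for (Object* child : o->refs) mark(child);
        }

        // Delete everything unreachable from the roots, cycles included.
        void collect(std::vector<Object*>& heap,
                     const std::vector<Object*>& roots) {
            for (Object* r : roots) mark(r);
            std::vector<Object*> live;
            for (Object* o : heap) {
                if (o->marked) { o->marked = false; live.push_back(o); }
                else delete o;           // unreachable, even if self-referential
            }
            heap = live;
        }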

  • by tknd ( 979052 ) on Monday September 24, 2007 @07:07PM (#20736295)

    > utorrent is fairly light weight and easy to use because of its performance characteristics

    I call BS. utorrent is not easy to use because of performance characteristics. In some ways I find it stupid to use. Why do I have to go through a second-level context menu to set a specific upload/download rate maximum for a specific item? Where are my keyboard shortcuts? utorrent only wins on usability by copying what already exists. That's not a bad strategy, but I'm not sure I've seen anything greatly innovative in utorrent's UI.

  • by Verte ( 1053342 ) on Tuesday September 25, 2007 @01:05AM (#20739027)

    > Maybe the chaining of the event handler code with numerous windows open is an issue.

    I think you've hit the nail on the head. Here's an experiment: find a page with a heap of links on it. Make sure what they link to is not too huge and doesn't open any plugins or anything. Now, quickly, try to open the links by right-clicking and choosing "open in new tab". You will soon be spending a while waiting for the menu to open. That the page you are viewing becomes unresponsive while a new tab is being created points at really horrible threading. It seems the browser is, to some degree, doing work the operating system should be doing: managing multiple [conceptual] threads.

    I guess this is a little off topic, but I think it shows that there are certainly resource problems in the tab implementation. Someone suggested separate processes for tabs with different SLDs, and a complete JS & plugin environment cleanup when changing SLDs, which I assume could practically eliminate many XSS problems [note: I haven't done my research on that one].
