86% of Windows 7 PCs Maxing Out Memory

CWmike writes "Citing data from Devil Mountain Software's community-based Exo.performance.network (XPnet), Craig Barth, the company's chief technology officer, said that new metrics reveal an unsettling trend. On average, 86% of Windows 7 machines in the XPnet pool are regularly consuming 90%-95% of their available RAM, resulting in slow-downs as the systems were forced to increasingly turn to disk-based virtual memory to handle tasks. The 86% mark for Windows 7 is more than twice the average number of Windows XP machines that run at the memory 'saturation' point, and this comes despite more RAM being available on most Windows 7 machines. 'This is alarming,' Barth said of Windows 7 machines' resource consumption. 'For the OS to be pushing the hardware limits this quickly is amazing. Windows 7 is not the lean, mean version of Vista that you may think it is.'"
  • by sopssa ( 1498795 ) * <sopssa@email.com> on Thursday February 18, 2010 @07:57AM (#31183000) Journal

    RAM is wasted when it isn't in use. The fact that the Windows task manager says your RAM is 95% used tells you nothing, and no, it won't "result in slow-downs as the systems were forced to increasingly turn to disk-based virtual memory to handle tasks". I'm actually really surprised, and not in a good way, that the "chief technology officer" of the company doesn't know this.

    The memory models in recent OSes try to utilize all the available RAM (as they should) to speed other things up. It makes a lot of sense to cache things from the hard drive during periods of low usage, and in such a way that it doesn't interfere with other work. When the things that are used most often are already cached in RAM, loading them is a lot faster. This covers not only files, icons, and the like, but everything the OS could use or do that takes time.

    If there's a sudden need for more RAM, the cached data can be "dropped" in no time. It doesn't matter whether usage averages 25% or 95%, just that overall performance is better when you utilize every resource you can to speed things up in general.

    • by Mr Thinly Sliced ( 73041 ) on Thursday February 18, 2010 @08:03AM (#31183036) Journal

      My understanding was that memory used for disk caching doesn't show up in task manager as "used".

      It's been a while since I booted win7 though, so I might be mistaken.

      Certainly under linux ram used as disk cache is marked "free".

      It wouldn't surprise me if Win7 has a heavier memory footprint though - as more applications move to .NET and web browsers use lots of Flash / Silverlight etc. - all of these things have a RAM cost.

      • by Anonymous Coward on Thursday February 18, 2010 @08:11AM (#31183104)

        I think the issue here is that the system is turning to swap. Caching stuff that may be referenced again is fine and dandy, but if the system regularly turns to swap just to keep itself afloat, then you have a problem.

        • by snemarch ( 1086057 ) on Thursday February 18, 2010 @08:38AM (#31183414)
          Yep, that would be a problem - but neither TFA nor xpnet mentions whether this is actually happening; it seems they're looking almost exclusively at "free physical memory", which isn't a useful stat in this regard. The xpnet site does say they factor in "how often it relies on virtual memory", but not how they do this (there are multiple metrics to choose from, some fairly uninteresting), and the fact that they seem to fold this into "memory usage" rather than keeping it as a separate stat makes me pretty wary of trusting any analysis from them.
          • Comment removed (Score:4, Interesting)

            by account_deleted ( 4530225 ) on Thursday February 18, 2010 @09:12AM (#31183814)
            Comment removed based on user account deletion
            • by snemarch ( 1086057 ) on Thursday February 18, 2010 @09:19AM (#31183898)

              Actually the Windows 7 caching model is great. In games, the difference between the first load of a level and subsequent loads is night and day, thanks to its caching model.

              That's the windows cache system generally, from way back in the NT days... Vista and later SuperFetch is more than that.

              btw, regarding the article more directly: they show no figures on actual _swap_ usage, a thing that might confirm or disprove their theory.

              Indeed. The xpnet site does mention that they factor in paging somehow, but that's still pretty useless - paging activity needs to be a separate statistic. Also, simply looking at pagefile usage isn't terribly useful; an inactive app can have its working set trimmed and pages flushed out to disk, and this won't matter much in the big picture.

              What you need to look at is the rate of pagefile activity (i.e., pages/second) as well as how often it happens - not just static numbers (even if having 1 gig of data in the pf is probably a warning sign :))

              • by afidel ( 530433 ) on Thursday February 18, 2010 @09:49AM (#31184300)
                Actually, what you really need to do is calculate hard page faults/second - i.e., those faults that actually go to the secondary backing store (disk) - which for some retarded reason isn't available as a default stat counter.
                • by TheLink ( 130905 ) on Thursday February 18, 2010 @10:02AM (#31184488) Journal
                  Yeah I don't know why they don't set up the counter by default.

                   Anyway, to set it up yourself:

                   Start perfmon.msc
                   Then add counters
                   Go to Memory and add "Pages Output/sec".

                   I'm not an authority on virtual memory, but from what I know:
                   Page Faults/sec is not usually relevant here - virtual memory will generate page faults even when it's not swapping to/from disk; that's part of how virtual memory works.
                   Page Inputs/sec can climb when you launch programs (the O/S starts paging in the stuff it needs) - it's no indication of running out of memory.
                   Pages Output/sec, on the other hand, counts when the O/S is low on memory and needs to take stuff in RAM and write it OUT to disk so that it can reuse that RAM for something else. This is the one you want to monitor.
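                   If you'd rather poll that counter programmatically than eyeball perfmon, here's a minimal sketch in Python - assuming the third-party pywin32 package, which wraps the same PDH counter API that perfmon uses:

                   import time
                   import win32pdh  # from pywin32 (assumption: it's installed)

                   # Counter path elements: (machine, object, instance, parent, instance index, counter)
                   path = win32pdh.MakeCounterPath((None, "Memory", None, None, -1, "Pages Output/sec"))
                   query = win32pdh.OpenQuery()
                   counter = win32pdh.AddCounter(query, path)

                   win32pdh.CollectQueryData(query)  # rate counters need two samples; prime the first
                   while True:
                       time.sleep(1)
                       win32pdh.CollectQueryData(query)
                       _, value = win32pdh.GetFormattedCounterValue(counter, win32pdh.PDH_FMT_DOUBLE)
                       print(f"Pages Output/sec: {value:.1f}")

                   A sustained non-zero reading while you switch between apps is the hard-paging signal discussed above; a flat zero means the box isn't memory-starved, whatever the free-RAM number says.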
                  • Re: (Score:3, Informative)

                    by Eivind ( 15695 )

                    Actually, even that is inaccurate.

                     You see, it can make sense for the OS to swap out some not-recently-used pages of a program to free up more memory for caching. For example, say you're playing a game but you've got Firefox open. It could make sense to page out the entirety of Firefox, so as to have more physical RAM free for caching game content.

                    Life ain't so simple in a virtual world :-)

                    • by TheLink ( 130905 ) on Thursday February 18, 2010 @11:04AM (#31185514) Journal
                      If stuff slows down due to that swap out, then it's still accurate enough for me.

                       Maybe the O/S could get it right and swap Firefox out and back in such that I won't notice any slowdowns. Give me an example of such an O/S, please.

                      So far in my experience, if Windows or Linux swaps out Firefox for whatever reason, if I then switch to Firefox, I have to wait for it to be swapped back in.

                      Why "page out" and not "page in"?

                      "Page in" doesn't necessarily mean that I'll have to wait if I switch to different programs- the O/S is bringing stuff from disk to ram - I believe in some cases the O/S pages in stuff as part of running a new program - so it's not such a useful metric for "not enough memory".

                      But "page out" means something in RAM is going to disk - if I ever want it back in RAM, I'll have to wait.

                      If stuff in RAM is going to disk needlessly and causing unnecessary waits then the O/S virtual memory algorithm is getting things wrong.
                    • Re: (Score:3, Informative)

                      by Bakkster ( 1529253 )

                       But more page faults don't always correlate with more slowdowns. An OS with better page-allocation prediction will run faster (from the user's perspective) with the same number of page faults. It's only a problem if the page faults are on cached data that the user is requesting at that moment.

                      Continuing the Firefox example: it might be one page of memory to each page you want to view. A smart OS will leave the pages with the main Firefox program and current tab in RAM and cache the others first. Then w

                    • Re: (Score:3, Informative)

                      by Foolhardy ( 664051 )

                      But "page out" means something in RAM is going to disk - if I ever want it back in RAM, I'll have to wait.

                      On Windows it doesn't necessarily mean that. Writing a page to disk != needing to read it back from disk later.

                      Each process has a working set. Pages in the working set are mapped actively into the process's VM with page tables. The memory manager aggressively trims these pages from the working set and puts them into standby memory. A page in standby is not mapped for reading (and more importantly for

          • by TheLink ( 130905 ) on Thursday February 18, 2010 @09:47AM (#31184270) Journal
            Yeah. I don't have low mem problems with Windows 7. There's stuff I don't like about Windows 7 but "memory hog" is not on the list.

            For work I'm using Windows 7 64-bit on a 4GB notebook PC with tons of windows open - e.g. a few Explorer windows, a few Excel "windows"[1], a few Word windows, one Visio doc, Notepad++, Google Chrome, Firefox, PuTTY, Outlook (a resource hog), Communicator, MSN Messenger windows, a VirtualBox Linux VM, and Microsoft Security Essentials (it's my work PC so it's supposed to have AV) - and it typically says 1700 to 2000MB _available_ (depending on how many Firefox tabs, Word docs, virtual machines, etc.). But overall, no memory problem.

            And guess which is using the most RAM? Not VirtualBox, not Word, Outlook or Excel. It's Firefox, with a 173MB working set and a 142MB private working set!

            Yes, it only has 500MB free memory, but so what? The O/S says there's 1700MB available. And so far I haven't had many slowdowns due to low-memory issues.

            To me, the relevant metric for "low on memory" is "Pages Output/sec" (go launch perfmon.msc and add that counter). If it stays at a constant zero as you or the O/S switch from app to app and window to window, the system isn't swapping out. If it's not swapping out and not throwing "out of memory" messages, it's not low on RAM, no matter what some random "expert" thinks. And it's zero for me.

            The Linux equivalent is the swap "so" column when you run vmstat 1 (or vmstat 2). Same thing there - stuck at zero = not swapping.
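            For reference, the kernel exposes the same counter in /proc/vmstat, so you can sample it yourself rather than watch vmstat scroll. A minimal sketch, assuming Linux with Python available - pswpout is the cumulative count of pages swapped out:

            import time

            def pages_swapped_out():
                # /proc/vmstat carries cumulative pswpin/pswpout counters (in pages)
                with open("/proc/vmstat") as f:
                    for line in f:
                        if line.startswith("pswpout "):
                            return int(line.split()[1])
                return 0

            prev = pages_swapped_out()
            while True:
                time.sleep(1)
                cur = pages_swapped_out()
                print(f"pages swapped out/sec: {cur - prev}")  # stuck at zero = not swapping
                prev = cur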

            I don't think my usage can be considered "light", so what are those users running that's using up so much memory? Symantec or McAfee antivirus? ;).

            FWIW, my laptop is not running any of the "OEM crapware" - I did a clean install of Windows 7 months ago when I got the laptop.

            If that "expert CTO" can't even give an example of one memory hogging program (or show where Windows 7 itself is using so much memory that it's a problem), then it's likely he's full of crap.

            Lastly, it's true my taskbar looks messy with two rows of task buttons, but I don't see the advantage of closing and reopening documents or programs if I'm not running out of RAM yet. I close them if I really do not need them (e.g. the document is out of date and not used for comparison). Otherwise it's much faster to just click a button to show the desired doc, rather than have to reopen it again from scratch (uses less battery power too - except in the case of MS Word which seems to use CPU even when "idle" - haven't figured that one out yet).

            [1] By default Excel actually just has one window which changes to display the relevant document depending on which Excel taskbar button you click, whereas Word actually has separate windows for each doc.
          • Re: (Score:3, Insightful)

            by Locklin ( 1074657 )

            The xpnet site does say they factor in "how often it relies on virtual memory", but not how they do this

             Even that isn't ideal. It makes sense to swap out a library that hasn't been used in hours or days. The more RAM available for disk cache, the better. The only solution is to look at memory used minus disk cache (like the 'free' command on Linux does).
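             For the curious, that "used minus cache" arithmetic is easy to do by hand against /proc/meminfo - a rough Python sketch of what free(1) reports, not anything TFA measured:

             # Rough "effectively free" memory on Linux, in the spirit of `free`
             meminfo = {}
             with open("/proc/meminfo") as f:
                 for line in f:
                     key, rest = line.split(":", 1)
                     meminfo[key] = int(rest.split()[0])  # values are reported in kB

             effectively_free = meminfo["MemFree"] + meminfo["Buffers"] + meminfo["Cached"]
             print(f"effectively free: {effectively_free // 1024} MB"
                   f" of {meminfo['MemTotal'] // 1024} MB total")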

        • Comment removed (Score:4, Interesting)

          by account_deleted ( 4530225 ) on Thursday February 18, 2010 @09:21AM (#31183942)
          Comment removed based on user account deletion
          • Re: (Score:3, Informative)

            by Lumpy ( 12016 )

            but really there's no reason not to do so on any XP machine with 2 gigs of RAM or Vista/Win 7 machine with 4 gigs of RAM.

             Yes, if you do typical, untaxing office tasks on a PC. If you do anything big, it's nowhere near enough.

             If you edit HD video or very large photo arrays you bump up against 4 gigs without effort. I hit the 16 gig mark on a regular basis.

          • by afidel ( 530433 ) on Thursday February 18, 2010 @09:57AM (#31184430)
            Windows gets really cranky when it doesn't have a pagefile. We tried it for performance reasons and we saw an almost 40% drop in performance despite the server not being under any kind of memory pressure.
            • Re: (Score:3, Interesting)

              by ztransform ( 929641 )

              Windows gets really cranky when it doesn't have a pagefile. We tried it for performance reasons and we saw an almost 40% drop in performance despite the server not being under any kind of memory pressure.

              And yet when I turn off swap on my 32-bit Vista laptop performance increases 1,000 - 10,000% easily. The difference between waiting 10 minutes for the computer to stop thrashing the swap file and near-instantaneous action is immeasurable.

              Typical "don't turn off your pagefile" responses are fraught with lack of experience. There are times when a system performs better with a pagefile. There are many times when a system performs so much better without a pagefile that one wouldn't dream of ever turning it ba

          • Re: (Score:3, Informative)

            If you have a reasonable amount of RAM there's no reason to leave it turned on.

            If the swapping algorithm is so bad that it swaps unnecessarily, then yes, turning off swap will help. But a good swapping algorithm remains useful even if you have 16 GB of RAM. Large sections of many processes are basically "run once, then ignore" or even "never run". Most processes have a decent amount of startup code that is never referenced after the first half second of execution, or load multi-megabyte shared libraries into process memory space to get two short functions (or, similarly, contain code

            • Re: (Score:3, Informative)

              One other minor note: Windows's use of pre-emptive paging makes for a much faster hybrid sleep and/or hibernation. If your page file is larger than main memory, and you're not paging excessively, most of your memory is probably already paged out. Thus, the hibernate file only needs to have the unique data written to it; on a laptop with 4 GB of mostly used RAM and a relatively slow hard disk, it could take two minutes to hibernate the machine (hope your battery lasts). Every bit of memory paged out preempti
      • by snemarch ( 1086057 ) on Thursday February 18, 2010 @08:16AM (#31183164)

        It shows up as part of the memory commit bar - which is what regular users will look at, and then go off screaming about "OMG IT USES ALL MY SYSTEM MEMORY!1!!! one one". It's also deducted from the "free" count, since technically it isn't free (it can be freed quickly, but has to be zeroed out before it can be handed to a new app - security and all).

        The Win7 task manager does show a "cached" stat, though, so your effectively free memory is "free"+"cached". And if you want more comprehensive memory stats, you should look at perfmon.msc or SysInternals' Process Explorer.

        I wonder if TFA's authors actually measured whether disk swapping happens (easy with procexp or perfmon), or are just shouting their heads off without understanding what's going on... it's well known that SuperFetch utilizes otherwise unused memory for disk caching, and does so proactively :)

        • Re: (Score:2, Insightful)

          Ahh fair enough. "Colour me learned something today". :-)

        • by Bert64 ( 520050 )

          The article mentions that they use a program of their own creation for monitoring memory usage, so how the users interpret the data is irrelevant, as the program will send whatever it believes to be correct.
          Whether the program is accurate or not is another matter, but the fact that it doesn't report every system as using 100% of its memory suggests it is at least somewhat aware of SuperFetch etc.

      • I had a great reply all typed up but the stupid filter thinks it used too many "junk" characters... not entirely sure what those junk characters are, but meh.

        I'm on a 2-year-old laptop with 4GB of RAM, which I use for gaming. Said system is running Windows 7 x64. Looking at the memory tab, there are two numbers that gauge the available memory... if they're only looking at the one labelled "Free", they're going to see that I'm using 80% of my available memory. If they look at the one labelled "Available", they

      • by Eskarel ( 565631 )

        It depends what they're monitoring.

        Looking at my Windows 7 system, it does appear that you are correct and disk cache is not included in the "In Use" category. However, it isn't included in the "Free" category either, but rather in "Available", so if their crappy software is reading the "free" category instead of "available", that won't include cached disk.

        It's also possible, of course, that their monitoring software leaks like a sieve on 86% of Windows machines, which is entirely plausible.

      • Re: (Score:3, Informative)

        by lagfest ( 959022 )

        My current* windows 7 stats say:
        Total: 2046MB
        Used: 1.26GB
        Cache: 634MB
        Available: 743MB
        Free: 132MB

        So used RAM does not include cache. And 'available' is a nice way of telling grandma that RAM used for cache is actually available to apps.

         * not a snapshot, I can't type that fast :)
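         Those categories map onto what cross-platform tools report, too. As a quick illustration (assuming Python with the third-party psutil module installed), the free-versus-available distinction looks like this:

         import psutil  # third-party, cross-platform (assumption: installed)

         vm = psutil.virtual_memory()
         # "free" is memory nobody is touching at all; "available" also counts
         # cache and buffers the OS can reclaim instantly for new allocations
         print(f"total:     {vm.total // 2**20} MB")
         print(f"free:      {vm.free // 2**20} MB")
         print(f"available: {vm.available // 2**20} MB")

         Judging a machine by the "free" line alone will always make it look maxed out; "available" is the number that actually predicts whether new work will hit swap.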

    • Re: (Score:3, Insightful)

      by dr.newton ( 648217 )

      If all that RAM was simply being used for a filesystem cache, the system would not have to "increasingly turn to disk-based virtual memory to handle tasks" - it would just drop some cache when it needed to start a new task, as you said.

      It seems that something else is going on.

    • Re: (Score:3, Insightful)

      by Bazer ( 760541 )
      I don't know how caching works in W7, but on Linux, if the system has to "turn to disk-based virtual memory to handle tasks", then the memory utilization isn't caused by buffers, because buffers are never swapped out to disk. If W7 behaves in a similar manner, then it's either a memory leak, system bloat, or a broken caching mechanism.
      • Re: (Score:2, Interesting)

        by jernejk ( 984031 )
        I used Ubuntu for almost a year and I think Linux caching / virtual memory is implemented better than Win7's. It seems the Win7 cache is too aggressive, and it dumps active programs from RAM to page files when it should not. Maybe it works OK for most desktop users, but it doesn't work very well for a development machine. I have nothing but my subjective impressions to back up my observations.
      • Re: (Score:3, Interesting)

        by spxero ( 782496 )
        ...or the caching mechanism is broken.

        I'm inclined to think it's this, at least for my Vista machine. I currently have 6GB RAM, but at any given time with Outlook, Firefox, and a handful of Explorer windows open there isn't any more than 2-3GB showing as in use. The rest is cached. This becomes a problem only when I need to fire up a 2GB Linux VM for testing: the VM will pause itself on startup, citing not enough RAM available. I'm no expert, but I have a sneaking suspicion that the caching mechanism i
    • Re: (Score:3, Insightful)

      by eldavojohn ( 898314 ) *

      The memory models in recent OSes try to utilize all the available RAM (as they should) to speed other things up. It makes a lot of sense to cache things from the hard drive during periods of low usage, and in such a way that it doesn't interfere with other work. When the things that are used most often are already cached in RAM, loading them is a lot faster. This covers not only files, icons, and the like, but everything the OS could use or do that takes time.

      If there's a sudden need for more RAM, the cached data can be "dropped" in no time. It doesn't matter whether usage averages 25% or 95%, just that overall performance is better when you utilize every resource you can to speed things up in general.

      Assuming your claims about how Windows 7 is implemented are true, then the claims from the person who actually collected all the empirical evidence must be false:

      resulting in slow-downs as the systems were forced to increasingly turn to disk-based virtual memory to handle tasks.

      If the memory were freed up dynamically as needed, then no processes would ever be forced to resort to disk-based virtual memory. So either you work at Microsoft and are assuring us that the implementation protects against this, or you're speculating against someone who claims to have gathered a sample large enough to make such accusations.

      No offense but

      • by Sockatume ( 732728 ) on Thursday February 18, 2010 @08:16AM (#31183160)

        If they'd measured page faults, they could've reported page faults. They didn't. RAM usage appears to be the sole basis for the article, so his concern is a genuine one. We don't know enough about the study at this stage to dismiss it.

        • by Anonymous Coward on Thursday February 18, 2010 @08:26AM (#31183272)

          You cannot study virtual memory performance without considering how many page faults occur.

          It is perfectly reasonable to use RAM as a filesystem cache, which is why Linux has adopted this approach. The effect is that almost all of the physical RAM is always in use. The cost is that pages are more likely to be wrongly swapped out - however, in typical cases, this increased cost is tiny in relation to the huge reduction in the number of disk accesses.

      • by phatcabbage ( 986219 ) on Thursday February 18, 2010 @08:20AM (#31183214)
        Barth acknowledged that XPnet's data couldn't determine whether the memory usage was by the operating system itself, or an increased number of applications.
        So yeah, it doesn't seem like the author really knows what's going on...
        • Barth acknowledged that XPnet's data couldn't determine whether the memory usage was by the operating system itself, or an increased number of applications. So yeah, it doesn't seem like the author really knows what's going on...

          While that's true, one would probably assume that it's normalized between XP and Windows 7 since they have no way of tracking it. What I mean is that you would assume the Windows 7 user runs the same number of programs as the XP user.

          I actually followed the blog link [blogspot.com] in the story, and while they can't pin it on applications or the OS, they can say that disk I/O is backlogged on 36% of XP machines sampled, 83% of Vista machines sampled, and 85% of Windows 7 machines sampled.

          While they don't k

      • by dhavleak ( 912889 ) on Thursday February 18, 2010 @08:21AM (#31183228)

        If the memory were freed up dynamically as needed, then no processes would ever be forced to resort to disk-based virtual memory.

        The trouble is, TFA doesn't actually say (at least not clearly) that the Win7 machines are indeed turning to swap more regularly. It just states that fetching stuff from the swap file is a consequence of running out of RAM and causes perf degradation. So if the Win7 machines are indeed utilizing all available RAM and yet not swapping at a significantly higher rate, it means they're making better use of the available RAM.

      • by dunezone ( 899268 ) on Thursday February 18, 2010 @08:27AM (#31183282) Journal
        Hi there.

        I've been running Windows 7 since the beta release. I have never experienced any issues with I/O thrashing against the hard drive as a result of all my RAM being utilized. I also have numerous friends running Windows 7; none have reported any issues like this - if anything, it's been praise for the operating system. So I am amazed to see an 86% number being thrown out there when I've never seen this problem before.

        No offense but I'm going to side with the guy who appears to make his living testing these sorts of things ... the guy who is offering me numbers.

        And now let's quote something from the article...

        Barth acknowledged that XPnet's data couldn't determine whether the memory usage was by the operating system itself, or an increased number of applications, but said that Devil Mountain would start working on finding which is the dominant factor in increased memory use.

        This single sentence makes the article rubbish. They have no clue what's causing the heavy memory usage; it's just an assumption that the OS is causing it, and they're yelling fire before looking through all the data or completely analyzing the problem.

      • by tgd ( 2822 )

        Frightening, huh? A CTO who, in fact, has no idea what he's talking about.

        Rather than blindly following presumed authority, perhaps you should make use of some critical-thinking skills and a couple of Google searches. You'll (apparently) be surprised by what you learn.

      • by cgenman ( 325138 ) on Thursday February 18, 2010 @08:45AM (#31183482) Homepage

        Read TFA. It just claims that Windows 7 consumes all available RAM. That is the "empirical evidence." System slowdown was NOT measured.

        Utilizing all available RAM is a pretty well-understood technique at this point. All web browsers do this now, as do many other applications. One would expect a well-designed modern OS to do this. Consuming all memory is not in itself a sign of poor programming, so long as disk caching of things that should be in RAM doesn't occur. This is not something that the people in the article have measured.

        I'm going to side with the guy who appears to make his living testing these sorts of things.

        Bad science is bad irrespective of the person conducting it. And whatever the original tester said is getting filtered through the viewpoint of the gentleman writing the article. Considering that he says that "Windows 7 is not the lean, mean version of Vista that you may think it is," yet never once compares statistics to Vista (or even mentions it outside of this statement), I'd take the conclusions from these stats with a grain of salt.

    • Re: (Score:3, Insightful)

      by Rockoon ( 1252108 )
      According to their website blog, they are "rethinking windows performance."

      So instead of thinking about what actually affects OS performance, they are rethinking things so that they don't have to sell real solutions to their customers, "where [they] maintain several large installations of our commercial DMS Clarity Suite performance analysis solution."
    • RAM is wasted when it isn't in use. The fact that the Windows task manager says your RAM is 95% used tells you nothing, and no, it won't "result in slow-downs as the systems were forced to increasingly turn to disk-based virtual memory to handle tasks".

      While there are approaches to virtual memory that will limit the slowdowns, the data still has to come from the HD, and even SSDs are not faster than RAM. Grandma checking her email may not notice, but data processing and other memory-intensive workloads need memory, and putting it on a HD will slow things down.

    • Re: (Score:2, Insightful)

      by slim ( 1652 )

      Yes. I have a similar problem when people running servers complain that the CPU is at 100%.

      If you're seeing an actual slowdown in performance, fine, worry about it.

      Otherwise, 100% CPU usage is a good thing: it means there's a process that's not IO bound.

      • If you're seeing an actual slowdown in performance, fine, worry about it.

        User base increases over time. Even on an intranet server, your company will probably add users as it grows. As your user base increases, you will see slowdowns. If you can catch slowdowns before they happen, you will be more prepared for the DDoS attack that comes when your site gets mentioned in a Slashdot article or when a bunch of new employees go through orientation.

        100% CPU usage is a good thing: it means there's a process that's not IO bound.

        Or it could mean that you need to optimize the process that uses the most CPU time so that it becomes I/O bound. All other things equal,

    • The memory models in recent OSes try to utilize all the available RAM (as they should) to speed other things up.

      BeOS and Haiku did/do this, but I don't think any other OS has implemented total RAM usage to such a degree.

    • by nmg196 ( 184961 ) on Thursday February 18, 2010 @08:27AM (#31183290)

      Totally agree. If you don't want Windows 7 to use the 4GB of RAM you've paid for to speed up your computer, take out 2GB and put it in a drawer. Otherwise, be thankful that it's actually making the most of the RAM you have.

      What next? People complaining that games use 100% CPU to give them maximum framerate when it could just use 30% CPU and give them 10 FPS?

      • What next? People complaining that games use 100% CPU to give them maximum framerate when it could just use 30% CPU and give them 10 FPS?

        That’s not a fair analogy. When you are playing a game, you are not multi-tasking. You are basically using the computer for a single task. I don’t care if the game sucks up 100% of the CPU and all of the remaining memory, as long as it frees it up when I close it.

        If the system is gratuitously using 95% of the RAM nearly all the time, then it’s a completely different scenario. Everything I try to open that wasn’t cached already will force the system to dump some memory to the swap fil

        • Re: (Score:3, Informative)

          by sopssa ( 1498795 ) *

          If the system is gratuitously using 95% of the RAM nearly all the time, then it’s a completely different scenario. Everything I try to open that wasn’t cached already will force the system to dump some memory to the swap file to make room for the new application.

          Uh, no. The point here is that the RAM is filled with data that speeds things up, but that can be instantly freed if needed. It doesn't need to be put in the swap file.

    • RAM is wasted when it isn't in use.

      You make decent arguments that I've heard before, many times. And I actually agree with them to a point. At least in theory, given somewhat unrealistic assumptions about the system. But there are two major flaws in this idea. First, in practice, where real developers are making complex decisions about how they structure their programs and the resources they use, I think this sentiment results in wasteful and excessive programming practices.

      In other words, the resources programs demand may be managed ver

    • Windows is notoriously reluctant to invalidate caches to free RAM for applications. I don't know if Win7 fixed that, but XP would much rather send an idle app to swap than free some disk buffers. That's why switching swap off entirely tended to speed it up so much - it was forced not to swap out any active data and to free up buffers instead.

    • Re: (Score:3, Informative)

      by Z00L00K ( 682162 )

      Just because RAM is available doesn't mean that the OS should hog it. You may want to use that RAM for something else. It may be fine to use "excess" RAM for buffers, but then those buffers must be freed fast whenever necessary.

      If you use large amounts of RAM for buffers, you will either free the least-used buffers for the application to use, and then you will get memory fragmentation - this can be bad for some applications. The other scenario is that you will just kill a block of buffers and t

  • If these claims are true, isn't it possible that this could be seen by the user as a source of the battery life problems [slashdot.org]? I suppose that disk-based virtual memory would incur a little more read/write activity on your hard disk as well ... possibly decreasing the mean time to failure for Windows 7 users.
  • by A beautiful mind ( 821714 ) on Thursday February 18, 2010 @08:01AM (#31183030)
    If it is filesystem cache, then it's not wasted or "maxed out". If it is application/system memory, then it is indeed a problem.
    • Re: (Score:3, Interesting)

      by Bazer ( 760541 )
      Would a filesystem cache cause the system to swap?
      • by A beautiful mind ( 821714 ) on Thursday February 18, 2010 @08:14AM (#31183128)
        It's Windows. It might be a bug.
      • by Spad ( 470073 )

        No, but they don't offer any actual evidence that the systems are swapping a lot, just that their memory usage is high. The "and this causes swapping" part of the claim just seems to be an assumption rather than a fact.

      • File cache will definitely swap out your applications in XP.

        In previous Microsoft OSes you could set the maximum file cache via a Windows .ini file (and it was like a breath of fresh air for Windows performance - suddenly you could do other things while burning CDs, etc.).

        On XP they took out that feature, and it has performed like a dog ever since because of it. A whole new generation of CD burners had to be developed with "SafeBurn" technology, etc.

        I'm sure the stupidity has continued in Vista/7 but I don't have i

    • Not necessarily -- it could be application / system memory that's pre-cached (based on profiling). If at any point, your machine has RAM available, and idle cycles, pre-caching would be a good way to use them. See here [wikipedia.org].
  • Bogus Story (Score:3, Insightful)

    by filesiteguy ( 695431 ) <perfectreign@gmail.com> on Thursday February 18, 2010 @08:05AM (#31183062)
    Let's start with the story (which I *did* read): "Barth acknowledged that XPnet's data couldn't determine whether the memory usage was by the operating system itself, or an increased number of applications."

    Right there I'd be suspicious of whether this is even an issue. Given that Windows (which I generally regard as inferior) is an OS with lots of functionality, I wouldn't be surprised if it takes up all available RAM prior to utilizing swap. I'm on my 2GB Ubuntu system right now and am running at 18% of 2GB with just Mozilla (with two tabs) and Thunderbird. But there's also my network layer (Network Monitor), KTorrent, and my bluetooth daemon running in the background. All told, System Monitor says I have 31 processes running.

    Let's do a like-for-like comparison - run the exact same number of apps and processes before declaring a memory leak.

    Sheesh!
  • by dhavleak ( 912889 ) on Thursday February 18, 2010 @08:09AM (#31183094)

    I guess Devil Mountain or whoever don't know about SuperFetch. Or need publicity.

    And I guess slashdot editors don't know about SuperFetch. Or maybe an article like this gets them more traffic, revenue, etc.

    The fucking bullshit that passes for articles these days..

  • by hitech69 ( 78566 ) on Thursday February 18, 2010 @08:14AM (#31183126) Homepage

    Computerworld should just close up shop after this worthless piece of journalism, or at least give the author the boot for doing any work with Craig Barth, who represents a team of morons. samzenpus should be given a troll rating for getting this onto Slashdot.

  • Page Faults (Score:5, Insightful)

    by Dr_Barnowl ( 709838 ) on Thursday February 18, 2010 @08:16AM (#31183152)

    The metric to count is the number of page faults, an indicator of the number of times that the OS addresses memory that isn't in RAM.

    As others point out, measuring just the fraction of memory consumed is stupid. I have 6GB of RAM; my processes are using about 1.7GB of that, but the OS is claiming that 3.8GB is consumed. So that's 2.1GB of cached data that I no longer have to wait for from disk. Hooray.

    TFA hints that they may be measuring page faults, and does mention that Win7 is hitting the disk for virtual memory more often. But they should make that clearer if it's the case.

  • People are either being unbelievably stupid, have only 1 gig of RAM installed, or this is FUD. Example: I'm currently running Windows 7 64 bit. On my secondary monitor, I have a bunch of system monitoring widgets... hard drive space, CPU load and temp, video card load and temp, memory usage, etc. Just last night I was playing Bioshock 2, all settings at max. Even with those widgets running, with Aqua Teen Hunger Force playing in MPC on the secondary monitor in a window, and Bioshock 2 running full bore

    • The plural of anecdote is not data... but I still have to throw in my own 2 cents. It seems that as long as you have at least 1.5GB of RAM, Win7 will use about 700MB of that at startup, and the rest goes to apps. I have 4GB of RAM and I've never seen more than 3GB in use, including running Supreme Commander with other stuff open.

      I agree that Win7 can't run on systems with less than 1GB of RAM, and only runs "alright" on systems with exactly 1GB, but if it's using RAM for something other than disk cach

  • I'm running Windows 7 x64 with 4GB of RAM; currently I'm running Outlook, Firefox, IE, Excel, FeedDemon, Office Communicator, AV, AD management tools, call management software, a couple of PowerShell instances, Context, RDTabs, PuTTY and the usual assortment of drivers, plugins and background apps. I'm at 2.4GB of RAM; even on a 2GB machine it would be usable, though I'd probably have to be a bit more zealous about closing unused apps to avoid swapping.

    I can only assume that it's the usual nonsense of vendor

    • by 3seas ( 184403 )

      You're using the wrong apps and data.

      You can take small, efficient applications like those that used to be written when RAM was too expensive, and run hundreds if not thousands of them without a slowdown, especially if the data they are dealing with is nearly empty. Or you can take one AutoCAD application and a large CAD file and tax it to its limits and beyond.

      I'm sure there are other such applications that are massive and resource-hungry, as well as their data files.

      Increased speed and memory of hardware is generally perceived

  • "Current generation hardware"? Seriously, how many machines in this very small sample set are using i series intel chips? The way windows 7 was marketed, I'd bet that many of these machines were upgraded XP boxes. Top that with the 32 bit memory caps and people's general hesitation to install a 64 bit desktop OS, and I am not surprised at all that many machines are hitting memory saturation. Add to that that the Windows 7 interface leads to leaving more apps open at any given time than the XP interface...
  • Oh come on (Score:5, Funny)

    by megla ( 859600 ) on Thursday February 18, 2010 @08:24AM (#31183258)
    First we had submitters who didn't read the stories they were posting. Then we had editors who didn't read the stories they were approving. Now we have companies who don't read the articles they put out. Seriously, it's called a file cache. That's how it's supposed to work. Nice job, idiots.
  • First off, kudos to ComputerWorld for this shocking newsflash: "New Windows Operating System is Bloated and Disappoints Users". Is it 1995 again, when I foolishly believed Microsoft and loaded Windows 95 on my happy Windows 3.1 computer, only to discover the 4MB minimum RAM requirement left my computer a useless lump of plastic with an endlessly spinning hard drive? Four more MB of memory for $130 from a shady computer dealer finally slowed the paging down. I have seen this cycle repeated 6 more times since

  • No kidding. (Score:4, Interesting)

    by endus ( 698588 ) on Thursday February 18, 2010 @11:02AM (#31185490)
    Installed x64 on my 4GB machine and the performance was just ridiculously bad. I am a photographer and do a lot of image editing... couldn't even keep Cap One, Photoshop, and iTunes open at the same time - especially if 7 was trying to thumbnail images in a folder (thumbnailing is broken and ridiculously resource-intensive despite Microsoft's claims that it's more robust in this OS). Noticed that it was swapping to disk like crazy and ordered another 4GB. Definitely much better now, though I think 12GB would not be totally out of line for x64 with heavy applications. I haven't even TRIED to edit HD video yet... that is not a prospect I am looking forward to.

    I pity people running the x86 version of the OS who are maxing out at 4GB. Definitely buy 64-bit (even though it's more of a beta than a real OS) if you do anything memory-intensive.

    The one good thing about all this is that HOPEFULLY... FINALLY... maybe this will push Microsoft to push 64-bit more. They need to abandon 32-bit and force application writers and hardware manufacturers to start making 64-bit native applications. Working in a medical environment, BELIEVE me, I understand the need for backwards compatibility, but the fact is that the resources are just not being put into 64-bit to make it a really viable platform, and even moderate power users are going to start bumping up against the 4GB limit.

    Yeah, I read that thing saying the 4GB limit is a product-based limit rather than a technical one, but either way... it appears that x64 is where MS is choosing to support > 4GB, so let's get serious about it.
