86% of Windows 7 PCs Maxing Out Memory
CWmike writes "Citing data from Devil Mountain Software's community-based Exo.performance.network (XPnet), Craig Barth, the company's chief technology officer, said that new metrics reveal an unsettling trend. On average, 86% of Windows 7 machines in the XPnet pool are regularly consuming 90%-95% of their available RAM, resulting in slow-downs as the systems were forced to increasingly turn to disk-based virtual memory to handle tasks. The 86% mark for Windows 7 is more than twice the average number of Windows XP machines that run at the memory 'saturation' point, and this comes despite more RAM being available on most Windows 7 machines. 'This is alarming,' Barth said of Windows 7 machines' resource consumption. 'For the OS to be pushing the hardware limits this quickly is amazing. Windows 7 is not the lean, mean version of Vista that you may think it is.'"
When do people get this (Score:5, Informative)
RAM is wasted when it isn't in use. The fact that the task manager in Windows says 95% of your RAM is used tells you nothing, and no, it won't "result in slow-downs as the systems were forced to increasingly turn to disk-based virtual memory to handle tasks". I'm actually really surprised, and not in a good way, that the company's "chief technology officer" doesn't know this.
The new memory models in recent OSes try to utilize all the available RAM (as they should) to speed things up elsewhere. It makes a lot of sense to cache things from the hard drive during low-usage periods, and in such a way that it doesn't interfere with other performance. When the things that are most often used are already cached in RAM, loading them works a lot faster. This doesn't only include files, icons and such, but everything the OS could use or do that takes time.
If there's a sudden need for more RAM, the cached data can be "dropped" in no time. It doesn't matter if it averages at 25% or 95%, just that overall performance is better when you utilize all the resources you can to speed things up in general.
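The idea in the two paragraphs above can be sketched in a few lines. This is a toy model under stated assumptions - it mimics the principle (cache borrows idle RAM and is dropped for free on demand), not Windows 7's actual memory manager:

```python
# Toy model of a page cache that borrows otherwise-idle RAM.
# Illustrates the principle only - not how Windows 7 (or any
# real kernel) is implemented.

class ToyMemory:
    def __init__(self, total_mb):
        self.total = total_mb
        self.app = 0      # MB held by running programs
        self.cache = 0    # MB holding cached disk data

    def fill_cache(self):
        # The OS opportunistically caches disk data in all idle RAM.
        self.cache = self.total - self.app

    def allocate(self, mb):
        # An app asks for memory: cache pages are simply dropped
        # (their backing copy is already on disk), no swap needed.
        free = self.total - self.app - self.cache
        if mb > free:
            self.cache -= (mb - free)  # instant, no disk I/O
        self.app += mb

    def used_percent(self):
        return 100 * (self.app + self.cache) / self.total

mem = ToyMemory(total_mb=4096)
mem.app = 1024
mem.fill_cache()
print(mem.used_percent())   # 100.0 -- looks "maxed out"
mem.allocate(2048)          # yet a big allocation still succeeds
print(mem.app, mem.cache)   # 3072 1024
```

The point of the sketch: "used" hits 100% the moment the cache fills, yet a 2 GB allocation is satisfied instantly by shrinking the cache, with no trip to the pagefile.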
Re:When do people get this (Score:5, Informative)
My understanding was that memory used for disk caching doesn't show up in task manager as "used".
It's been a while since I booted win7 though, so I might be mistaken.
Certainly under linux ram used as disk cache is marked "free".
It wouldn't surprise me that win7 has a heavier memory footprint though - as more applications move to .net and web browsers use lots of flash / silverlight etc - all of these things have a RAM cost.
Re:When do people get this (Score:5, Insightful)
I think the issue here is that the system is turning to swap. Caching stuff that may be referenced again is fine and dandy, but if the system regularly turns to swap just to keep itself afloat, then you have a problem.
Re:When do people get this (Score:5, Interesting)
Comment removed (Score:4, Interesting)
Re:When do people get this (Score:4, Interesting)
Actually the Windows 7 caching model is great. In games, the difference between the first load of a level and subsequent loads is night and day thanks to its caching model.
That's the Windows cache system generally, going back to the NT days... SuperFetch in Vista and later is more than that.
BTW, regarding the article more directly: they show no figures on actual _swap_ usage, a thing that may or may not disprove their theory.
Indeed. The xpnet site does mention that they factor in paging somehow, but that's still pretty useless - paging activity needs to be a separate statistic. Also, simply looking at pagefile usage isn't terribly useful; an inactive app can have its working set trimmed and pages flushed out to disk, and this won't matter much in the big picture.
What you need to look at is the rate of pagefile activity (i.e., pages/second) as well as how often it happens - not just static numbers (even if having 1 gig of data in the pf is probably a warning sign :))
Re:When do people get this (Score:5, Informative)
Re:When do people get this (Score:5, Informative)
Anyway to set it up yourself:
Start perfmon.msc
Then add counters
go to Memory, add "Pages Output/sec".
I'm not an authority on virtual memory but from what I know:
Page Faults/sec is not usually relevant for this - virtual memory will generate page faults even when it's not swapping to/from disk; that's part of how virtual memory works.
Page Inputs/sec can spike when you launch programs (the O/S starts paging in the stuff it needs) - it's no indication of running out of memory.
Pages Output/sec, on the other hand, counts when the O/S is low on memory and needs to take stuff in RAM and write it OUT to disk so that it can reuse that RAM for something else. This is the one you want to monitor.
Re: (Score:3, Informative)
Actually, even that is inaccurate.
You see, it can make sense for the OS to swap out some not-recently-used pages of a program, to free up more memory for caching. For example, say you're playing a game, but you've got Firefox open. It could make sense to page out the entirety of Firefox, so as to have more physical RAM free for caching game content.
Life ain't so simple in a virtual world :-)
Re:When do people get this (Score:4, Insightful)
Maybe the O/S could get it right and swap out and swap in Firefox in a way so I won't notice any slow downs. Give me an example of such an O/S please.
So far in my experience, if Windows or Linux swaps out Firefox for whatever reason, if I then switch to Firefox, I have to wait for it to be swapped back in.
Why "page out" and not "page in"?
"Page in" doesn't necessarily mean that I'll have to wait if I switch to different programs- the O/S is bringing stuff from disk to ram - I believe in some cases the O/S pages in stuff as part of running a new program - so it's not such a useful metric for "not enough memory".
But "page out" means something in RAM is going to disk - if I ever want it back in RAM, I'll have to wait.
If stuff in RAM is going to disk needlessly and causing unnecessary waits then the O/S virtual memory algorithm is getting things wrong.
Re: (Score:3, Informative)
But more page-faults doesn't always correlate to more slowdowns. An OS with better page-allocation prediction will run faster (from the user's perspective) with the same number of page-faults. It's only a problem if the page-faults are on cached data that the user is requesting at that moment.
Continuing the Firefox example: it might be one page of memory to each page you want to view. A smart OS will leave the pages with the main Firefox program and current tab in RAM and cache the others first. Then w
Re: (Score:3, Informative)
On Windows it doesn't necessarily mean that. Writing a page to disk != needing to read it back from disk later.
Each process has a working set. Pages in the working set are mapped actively into the process's VM with page tables. The memory manager aggressively trims these pages from the working set and puts them into standby memory. A page in standby is not mapped for reading (and more importantly for
Re: (Score:3, Insightful)
Re:When do people get this (Score:4, Funny)
You mean your machine with 8GB RAM never hits swap? Wow. Shock. Color me surprised! O_o
Available memory != Free memory (Score:5, Interesting)
For work I'm using Windows 7 64-bit on a 4GB notebook PC with tons of windows open: a few Explorer windows, a few Excel "windows"[1], a few Word windows, one Visio doc, Notepad++, Google Chrome, Firefox, PuTTY, Outlook (a resource hog), Communicator, MSN Messenger windows, a VirtualBox Linux VM, and Microsoft Security Essentials (it's my work PC so it's supposed to have AV). It typically says 1700 to 2000MB _available_ (depending on how many Firefox tabs, Word docs, virtual machines, etc.). But overall no mem problem.
And guess which is using the most RAM? Not VirtualBox, not Word, Outlook or Excel. It's Firefox, with a 173MB working set and a 142MB private working set!
Yes, it only has 500MB of free memory, but so what? The O/S says there's 1700MB available. And so far I haven't had many slowdowns due to low-memory issues.
To me the relevant metric for "low on memory" is "Pages Output/sec" (go launch perfmon.msc and add that counter). If that's a constant zero when you or the O/S switches from app to app, window to window, it means it's not swapping out. If it's not swapping out and not getting "out of memory" messages, it's not low on RAM no matter what some random "expert" thinks. And it's zero for me.
The equivalent in Linux for that is the swap "so" column when you run vmstat 1 (or vmstat 2). Same thing there - stuck at zero = not swapping.
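The vmstat check above can be automated. Here's a hedged sketch that pulls the swap `si`/`so` columns out of vmstat-style text; the sample output is made up for illustration, and on a real box you'd feed in the live output of `vmstat 1`:

```python
# Parse the swap columns out of `vmstat 1`-style output (Linux).
# SAMPLE is fabricated for illustration, not from a real system.

SAMPLE = """\
procs -----------memory---------- ---swap-- -----io---- -system-- ----cpu----
 r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa
 1  0      0 512340  55872 510200    0    0    12     3  100  200  5  1 93  1
 0  0      0 512100  55872 510400    0    0     0     0   90  180  2  1 96  1
"""

def swap_activity(vmstat_text):
    """Return (si, so) pairs, one per sample row."""
    lines = vmstat_text.strip().splitlines()
    header = lines[1].split()              # column-name row
    si_col, so_col = header.index("si"), header.index("so")
    return [(int(row.split()[si_col]), int(row.split()[so_col]))
            for row in lines[2:]]

# (si, so) pairs stuck at zero => the box is not swapping,
# regardless of how "full" the RAM looks.
print(swap_activity(SAMPLE))  # [(0, 0), (0, 0)]
```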
I don't think my usage can be considered "light". As it is, what are those users running that's using up so much memory? Symantec or McAfee antivirus?
FWIW, my laptop is not running any of the "OEM crapware" - I did a clean install of Windows 7 months ago when I got the laptop.
If that "expert CTO" can't even give an example of one memory hogging program (or show where Windows 7 itself is using so much memory that it's a problem), then it's likely he's full of crap.
Lastly, it's true my taskbar looks messy with two rows of task buttons, but I don't see the advantage of closing and reopening documents or programs if I'm not running out of RAM yet. I close them if I really do not need them (e.g. the document is out of date and not used for comparison). Otherwise it's much faster to just click a button to show the desired doc, rather than have to reopen it again from scratch (uses less battery power too - except in the case of MS Word which seems to use CPU even when "idle" - haven't figured that one out yet).
[1] By default Excel actually just has one window which changes to display the relevant document depending on which Excel taskbar button you click, whereas Word actually has separate windows for each doc.
Re: (Score:3, Insightful)
The xpnet site does say they factor in "how often it relies on virtual memory", but not how they do this.
Even that isn't ideal. It makes sense to swap out some library that hasn't been used in hours or days. The more ram available for disk cache the better. The only solution is to look at Memory used minus disk cache (like the 'free' command on Linux does).
Comment removed (Score:4, Interesting)
Re: (Score:3, Informative)
but really there's no reason not to do so on any XP machine with 2 gigs of RAM or Vista/Win 7 machine with 4 gigs of RAM.
Yes, if you do typical, non-taxing office tasks on a PC. If you do anything big, it's nowhere near enough.
If you edit HD Video or very large Photo arrays you bump up against 4gig without effort. I hit the 16 gig mark on a regular basis.
Re:When do people get this (Score:5, Interesting)
Re: (Score:3, Interesting)
Windows gets really cranky when it doesn't have a pagefile. We tried it for performance reasons and we saw an almost 40% drop in performance despite the server not being under any kind of memory pressure.
And yet when I turn off swap on my 32-bit Vista laptop, performance easily increases 1,000 - 10,000%. The difference between waiting 10 minutes for the computer to stop thrashing the swap file and near-instantaneous action is immense.
Typical "don't turn off your pagefile" responses are fraught with lack of experience. There are times when a system performs better with a pagefile. There are many times when a system performs so much better without a pagefile that one wouldn't dream of ever turning it ba
Re: (Score:3, Informative)
If you have a reasonable amount of RAM there's no reason to leave it turned on.
If the swapping algorithm is so bad that it swaps unnecessarily, then yes, turning off swap will help. But a good swapping algorithm remains useful even if you have 16 GB of RAM. Large sections of many processes are basically "run once, then ignore" or even "never run". Most processes have a decent amount of startup code that is never referenced after the first half second of execution, or load multi-megabyte shared libraries into process memory space to get two short functions (or, similarly, contain code
Re: (Score:3, Informative)
Re:When do people get this (Score:5, Informative)
It shows up as part of the memory commit bar - which is what regular users will look at, and then go off screaming about "OMG IT USES ALL MY SYSTEM MEMORY!1!!! one one". It's also deducted from the "free" count, since technically it isn't free (it can be freed quickly, but has to be zeroed out before it can be handed off to a new app - security and all).
The Win7 task manager does show a "cached" stat, though, so your effectively free memory is "free"+"cached". And if you want more comprehensive memory stats, you should look at perfmon.msc or SysInternals' Process Explorer.
I wonder if TFA has actually measured that disk swapping happens (easy with procexp or perfmon), or are just shouting their heads off without understanding what's going on... it's well-known that SuperFetch utilizes otherwise unused memory for disk caching, and does so proactively :)
Re: (Score:2, Insightful)
Ahh fair enough. "Colour me learned something today". :-)
Re: (Score:2)
The article mentions that they use a program of their own creation for monitoring memory usage, so how the users interpret the data is irrelevant as the program will send what it believes to be correct.
Whether the program is accurate or not is another matter, but the fact that it doesn't report every system as using 100% of its memory suggests it is at least somewhat aware of SuperFetch etc.
Re: (Score:3, Interesting)
To be fair, Windows 7 will swap data out. My PC atm is sitting here with 8GB of RAM: 1.1GB used by programs, 5.9GB used by cache, and it's reporting 1.2GB free, so it still pages out data. However, I have never noticed it paging, so it's likely paging out the "right" data, in other words stuff that is not used - just how an OS should work :)
Re: (Score:2)
I had a great reply all typed up but the stupid filter thinks it used too many "junk" characters... not entirely sure what those junk characters are, but meh.
I'm on a 2-year-old laptop with 4GB of RAM, which I use for gaming. Said system is running Windows 7 x64. Looking at the memory tab, there are two numbers that gauge the available memory... if they're only looking at the one labelled "Free", they're going to see that I'm using 80% of my available memory. If they look at the one labelled "Available", they
Re: (Score:2)
It depends what they're monitoring.
Looking at my Windows 7 system, it does appear that you are correct and disk cache is not included in the "In Use" category. However, it isn't included in the "Free" category either, but rather in "Available", so if their crappy software is reading the "free" figure instead of "available", it won't include cached disk data.
It's also possible, of course, that their monitoring software leaks like a sieve on 86% of Windows machines, which is entirely plausible.
Re: (Score:3, Informative)
My current* windows 7 stats say:
Total: 2046MB
Used: 1.26GB
Cache: 634MB
Available: 743MB
Free: 132MB
So used RAM does not include cache. And 'available' is a nice way of telling grandma that RAM used for cache is actually available to apps.
* not a snapshot, i can't type that fast :)
Re:When do people get this (Score:4, Insightful)
Here we see why /. needs a "-1, Wrong" mod.
Re: (Score:2, Insightful)
You're making an apples-to-oranges comparison, and you don't mention what software is running on either of your machines...
I've found Vista and Win7 to generally work better, given decent enough hardware. Because of SuperFetch, Visual Studio loads faster on my Vista64 dual-core laptop with a 7200rpm drive and 2 gigs of memory than on an XP64 quad-core workstation with a 10000rpm Raptor drive and 8 gigs of memory. But this isn't a fair comparison either, since the rest of the software suite on those two machines are
Re: (Score:3, Insightful)
>>>You're making an apples-to-oranges comparison, and you don't mention what software is running on either of your machines...
Nothing exotic. The Windows OS plus Firefox browser. My P4-XP-512MB machine runs faster than my brother's AMD X2-WIN7-3GB machine. XP is more responsive.
.
>>>I've found Vista and Win7 to generally work better, given decent enough hardware.
Well my Vista install was running Microsoft's minimum recommendation (512 megabytes) and ran like a snail through molasses. W
Re:When do people get this (Score:4, Informative)
Re:When do people get this (Score:5, Insightful)
Linux does the same things as Windows: it caches as much stuff from disk into main memory as possible. Try running:
cat large_video_file.avi > /dev/null
You'll see that after running the command, your memory usage jumps up by the size of the video file. Now try running the same command again: it's now an order of magnitude faster.
On Linux things like this are stored in main memory in the form of caches and buffers. I don't know about Windows, but Linux clears some caches and buffers if applications need real memory. Caches and buffers show up in memory usage reporting tools like 'free', so it's quite normal to see Linux systems using 90% or more of RAM, most of which goes to caches and buffers. It seems that most people who complain about memory usage don't know how memory is managed on modern operating systems, so they go all apeshit about "OMG HELP linux is using so much memory it sux0rz!!!" and I have to explain again and again how they're not getting it. Same goes for you. Now Windows is suffering from the same problem.
FYI, here's the memory usage of my Linux server:
total used free shared buffers cached
Mem: 720 702 17 0 55 510
-/+ buffers/cache: 136 583
Swap: 399 0 399
It says 702 MB of used memory. Now look at "-/+ buffers/cache", it says 136 MB. That's the amount of memory *actually* used by applications.
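That arithmetic is easy to check with a few lines. The field names follow the old procps `free -m` layout shown above; since `free` rounds each field to whole MB before printing, expect an off-by-one against its own "-/+ buffers/cache" row:

```python
# Recompute the "-/+ buffers/cache" row from a `free -m`-style
# Mem: line (old procps format, as in the output above).

def real_usage(total, used, buffers, cached):
    # Memory genuinely held by applications is "used" minus what
    # the kernel is merely borrowing for buffers and page cache.
    app_used = used - buffers - cached
    app_free = total - app_used
    return app_used, app_free

print(real_usage(total=720, used=702, buffers=55, cached=510))
# -> (137, 583); `free` itself rounds each field to whole MB,
#    which is why it printed 136 rather than 137 above
```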
Re: (Score:3, Funny)
And just to swim in anecdotal waters, when I copied big piles of files to my GRiDPad 1910 via null modem cable using Microsoft's classic INTERLNK and INTERSVR for file sharing, SMARTDRV sped up the copy operations by about an order of magnitude. Life without disk caching isn't worth living.
Re:When do people get this (Score:4, Informative)
Maybe the part about HDD caching slowing things down?
I could be wrong there, since I'm not an expert, but I remember the dark, dark days when my computer would spend 2-3 minutes just to redraw a Word document. Why? Because it was using the HDD like memory, instead of using the actual memory. It seems to me that this problem, while minimized, has never completely gone away.
Anyway telling me "you're wrong" doesn't enlighten either me, or the other readers. Please elucidate.
There are a lot of variables, but in simple terms the theory goes that something you have recently accessed (be it an application, a document or whatever) you are likely to want again in the near future. Hence it's worth keeping a copy in memory on the off chance.
On the other hand, you really don't want to be swapping. So if a program needs more physical memory than what you have immediately available, it makes more sense to allocate memory which was recently holding cached data and just reduce the cache size than it does to start swapping, which is what any sane OS will do.
If there's any real intelligence involved in this, the OS will re-allocate an area which hasn't been used in a while.
The cache would only cause a problem in the way you describe it if the OS did not dynamically resize cache to account for other demands on system RAM.
I can't explain the differences between yours and your brother's computer but I can tell you that OEM builds of Windows tend to have so much garbage loaded at boot that they often need serious work before they're genuinely usable. Some of the builds I've seen, it is a wonder they boot at all.
Windows has a bigger swapPINESS (Score:3)
If the RAM is needed for another purpose the blocks can be freed by simply changing a flag and mapping them as quickly as if they had been totally free. This does not slow things down. After a while most of the blocks you need will be in RAM when you need them. Thus caching makes the system faster, not slower.
The above describes Linux. I assume Windows works similarly.
As I understand other comments to this article, Windows runs at the equivalent of 100% swappiness [kerneltrap.org], making the system far more likely to evict a process from RAM than to evict things from disk cache. Writes are slower than reads, especially on a RAID or on a laptop with a low-cost SSD. Or to express it more graphically:
8==D : Swappiness on Linux properly tuned for slow writes
8=========D : Swappiness on a Windows box
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2, Insightful)
Actually, in modern operating systems RAM can be used as a disk cache in such a way that it can be "freed" at little to no cost when needed by programs. This in effect means you get quick access to often-used programs, as they are already cached in RAM.
The only situation in which "RAM is filled with crap and the OS goes to disk" (paraphrased) is when RAM is
Re: (Score:2)
There is no speed difference between loading data off the hard drive into empty memory vs. loading it into memory that is being used for cache. You still have to perform the same load either way, and wait for the same delays.
The reason you get memory issues when your memory is over-full is that the data being evicted from memory may either be modified (in which case it needs to be written back to disk) or re-read at a later date.
If the data doesn't need to be written back to the sys
Re:When do people get this (Score:5, Informative)
You obviously don't understand memory access design. It's all about feeding the CPU. There are two sorts of relationships we can use to make this work: temporal and sequential.
Hard drives are the largest-capacity storage (well unless you want to go to tape). But they're slow. Even the fastest high-RPM SCSI or SATA drives are SLOW compared to what's above them. This is mitigated, somewhat, by putting some cache memory on the drive's controller board itself. Still, having to "hit" the hard drive for information is, as you say, a slowdown. Same goes for "external" storage (Optical media, USB media, etc).
So you try to keep as much information as possible in RAM (next step up). Hitting RAM is less expensive than hitting the H/D in terms of a performance hit. In the original days of computing (up until the 486DX line for Intel CPUs), RAM and CPU operated on a 1:1 clock speed match, so that was that.
Once you factor in the "clock multiplier" of later CPUs, even the fastest RAM available today can't keep from "starving" the CPU. So we add in cache - L3, L2, and L1. The 486 implemented 8KB (yeah, a whole 8K, wow!) in order to keep itself from starving. L3 is the slowest but largest, L2 is faster still but smaller, and L1 is the smallest of all, but the fastest because it is literally on the same die as the CPU. That distinction is important, and in general you'll find that a "slower" CPU with more L1 cache will benchmark better than a "faster" CPU with less.
The CPU looks for what it wants as follows:
- I want something. Is it in L1? Nope.
- Is it in L2? Nope.
- Is it in L3? Nope.
- Is it in RAM? Nope.
- Is it in the H/D Cache? (helps avoid spin-up and seek times) Nope.
- Crap, it's on the H/D. Big performance hit.
Everything except the L1 check, technically, was a performance hit. The reason for pre-caching things (based on temporal and sequential relationships) is all about predicting what will be needed next and getting it into the fastest available place.
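That lookup chain can be sketched as code. The latency numbers below are rough orders of magnitude assumed for illustration, not measurements of any particular machine:

```python
# Sketch of the lookup chain described above: probe each level of
# the memory hierarchy in turn, paying that level's latency.
# Latencies are assumed orders of magnitude, not measured values.

LEVELS = [          # (name, approx access cost in nanoseconds)
    ("L1", 1),
    ("L2", 4),
    ("L3", 20),
    ("RAM", 100),
    ("HD cache", 50_000),
    ("HD", 10_000_000),   # spin-up/seek: the "big performance hit"
]

def fetch(addr, contents):
    """Walk the hierarchy; return (level found, total ns spent)."""
    cost = 0
    for name, ns in LEVELS:
        cost += ns          # every probe costs something...
        if addr in contents.get(name, ()):
            return name, cost
    raise LookupError("address not resident anywhere")

# Hypothetical residency map: which addresses live at which level.
contents = {"L1": {0x10}, "RAM": {0x20}, "HD": {0x30}}
print(fetch(0x10, contents))  # ('L1', 1)
print(fetch(0x20, contents))  # ('RAM', 125)
print(fetch(0x30, contents))  # ('HD', 10050125)
```

The five-orders-of-magnitude gap between the RAM hit and the HD hit is exactly why pre-caching into the fastest available place pays off.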
Yes, I suppose you can run an entire system where it all goes into "RAM", and you'll see it as "more responsive" simply because you never have to touch the hard drive. But turning off HDD caching is a BAD idea. It makes cache misses that much more expensive because then, instead of having even the chance of finding what you needed in RAM or in the HD's onboard cache, you have to wait for the H/D to spin up and seek to the right sector.
Re: (Score:2)
The only system where it makes sense to disable swap space is a system with no HDD at all.
Or an OS with terrible swap algorithms.
Anecdotal, subjective and unscientific: I perceived an improvement in performance when I disabled swap in XP.
Re: (Score:2)
While your point may be valid, I have a system without swap but with a hard drive (three, actually). Why? Because I never want it to use swap. Ever. I loathe the idea that my system should resort to that. If I ever find it suffering due to lack of memory, I shall buy more. Eliminating swap is done on aesthetic grounds. I renounce it entirely!
Re: (Score:3, Insightful)
Doesn't require advanced knowledge to make sense of that output,
I'd argue that knowledge of what "wired" means in that context is advanced knowledge, required to make sense of that output. I've been working with computers for a hell of a long time, and I have absolutely no clue what "wired" means in regards to allocation of memory.
I also don't have whatever knowledge (advanced or otherwise) to understand the difference between "buffer" and "cache." And I'm only guessing that "inact" means "inactive", althoug
Re: (Score:3, Insightful)
If all that RAM was simply being used for a filesystem cache, the system would not have to "increasingly turn to disk-based virtual memory to handle tasks" - it would just drop some cache when it needed to start a new task, as you said.
It seems that something else is going on.
Re:When do people get this (Score:4, Informative)
From TFA:
"On average, 86% of Windows 7 machines in the XPnet pool are regularly consuming 90%-95% of their available RAM, resulting in slow-downs as the systems were forced to increasingly turn to disk-based virtual memory to handle tasks."
Re:When do people get this (Score:5, Insightful)
You missed the most important emphasis:
Ostensibly means "to all outward appearances." In other words, they're admitting they don't really know the inner workings, the true cause of the delays. They're just supposing it's due to RAM swapping, as opposed to increased networking activity, aero glass, more concurrent programs being run on average, or any other number of other wag'd reasons. Basically, they picked two measurements that are both higher in W7, and just said, "Well, it stands to reason that A causes B." What's that phrase that internet smarty-pantses use all the time about this?
Re: (Score:3, Insightful)
Re: (Score:2, Interesting)
Re: (Score:3, Interesting)
I'm inclined to think it's this, at least for my Vista machine. I currently have 6GB RAM, but at any given time with Outlook, FireFox, and a handful of Explorer windows open there isn't any more than 2-3GB showing to be in use. The rest is cached. This becomes a problem only when I need to fire up a 2GB Linux VM for testing, the VM will pause itself on startup, citing not enough RAM available. I'm no expert, but I have a sneaking suspicion that the caching mechanism i
Re: (Score:3, Insightful)
The new memory models in recent OSes try to utilize all the available RAM (as they should) to speed things up elsewhere. It makes a lot of sense to cache things from the hard drive during low-usage periods, and in such a way that it doesn't interfere with other performance. When the things that are most often used are already cached in RAM, loading them works a lot faster. This doesn't only include files, icons and such, but everything the OS could use or do that takes time.
If there's a sudden need for more RAM, the cached data can be "dropped" in no time. It doesn't matter if it averages at 25% or 95%, just that overall performance is better when you utilize all the resources you can to speed things up in general.
Assuming your claims of how Windows 7 is implemented are true, then the claims from the person who actually collected all the empirical evidence must be false:
resulting in slow-downs as the systems were forced to increasingly turn to disk-based virtual memory to handle tasks.
If the memory was freed up dynamically as needed, then no processes would ever be forced to resort to disk-based virtual memory. So either you work at Microsoft and are assuring us that the implementation protects against this, or you're speculating against someone who claims to have gathered a sample large enough to make such accusations.
No offense but
Re:When do people get this (Score:5, Informative)
If they'd measured pagefaults, they could've reported pagefaults. They didn't. RAM usage appears to be the total basis for the article, so his concern is a genuine one. We don't know enough about the study at this stage to dismiss it.
Parent is +1 informative (Score:5, Informative)
You cannot study virtual memory performance without considering how many page faults occur.
It is perfectly reasonable to use RAM as a filesystem cache, which is why Linux has adopted this approach. The effect is that almost all of the physical RAM is always in use. The cost is that pages are more likely to be wrongly swapped out - however, in typical cases, this increased cost is tiny in relation to the huge reduction in the number of disk accesses.
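A back-of-envelope calculation makes that cost/benefit concrete. All hit rates and latencies below are illustrative assumptions, not measurements:

```python
# Back-of-envelope model of the tradeoff: caching in RAM turns
# most disk reads into RAM hits, at the cost of an occasional
# extra page-in when a page was wrongly evicted.
# All numbers are illustrative assumptions.

RAM_NS, DISK_NS = 100, 10_000_000

def avg_read_ns(cache_hit_rate, extra_fault_rate=0.0):
    # hit -> RAM latency; miss (or wrongly-evicted page) -> disk
    miss = (1 - cache_hit_rate) + extra_fault_rate
    return cache_hit_rate * RAM_NS + miss * DISK_NS

no_cache   = avg_read_ns(cache_hit_rate=0.0)
with_cache = avg_read_ns(cache_hit_rate=0.9, extra_fault_rate=0.001)
print(round(no_cache), round(with_cache))  # 10000000 1010090
```

Even charging the cache an extra wrong-eviction fault on 0.1% of reads, the assumed 90% hit rate cuts the average read latency by roughly 10x, which is the "tiny cost, huge reduction" argument above in numbers.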
Re:When do people get this (Score:5, Interesting)
So yeah, it doesn't seem like the author really knows what's going on...
They Report Disk I/O Backlog Percentages (Score:2)
Barth acknowledged that XPnet's data couldn't determine whether the memory usage was by the operating system itself, or an increased number of applications. So yeah, it doesn't seem like the author really knows what's going on...
While that's true, one would probably make the assumption that it is normalized in XP vs Windows 7 since they have no way of tracking it. What I mean is that you would assume the Windows 7 user runs the same number of programs as the XP user.
I actually followed the blog link [blogspot.com] in the story and while they can't pin it on application or OS, they can say that the disk I/O is backlogged on 36% of XP machines sampled, 83% of Vista machines sampled and 85% of Windows 7 machines sampled.
While they don't k
Re:When do people get this (Score:5, Interesting)
If the memory was freed up dynamically as needed then no processes would ever be forced to resort to disk-based virtual memory.
The trouble is, the TFA doesn't actually say (at least not clearly) that the Win7 machines are indeed turning to swap more regularly. It just states that fetching stuff from the swap file is a consequence of running out of RAM and causes perf degradation. So if the Win7 machines are indeed utilizing all available RAM and yet not swapping at a significantly higher rate, it means they're making more optimal use of available RAM.
Re: (Score:2)
the TFA
At least you didn't write "the TFA article"
Re:When do people get this (Score:4, Insightful)
I've been running Windows 7 since the beta release. I have never experienced any issues that result in I/O thrashing against the hard drive as a result of all my RAM being utilized. I also have numerous friends running Windows 7; none have reported any issues like this - if anything, it's been praise for the operating system. So I am amazed to see an 86% number being thrown out there, having never seen this problem myself.
No offense but I'm going to side with the guy who appears to make his living testing these sorts of things ... the guy who is offering me numbers.
And now lets quote something from the article...
Barth acknowledged that XPnet's data couldn't determine whether the memory usage was by the operating system itself, or an increased number of applications, but said that Devil Mountain would start working on finding which is the dominant factor in increased memory use.
This single sentence makes the article rubbish. They have no clue what's causing the heavy memory usage; it's just an assumption that the OS is causing it, and they're yelling fire before looking through all the data or completely analyzing the problem.
Re: (Score:2)
Frightening, huh? CTO and, in fact, he has no idea what he's talking about.
Rather than blindly following presumed authority, perhaps you should make use of some critical thinking skills and a couple Google searches. You'll (apparently) be surprised what you learn.
Re:When do people get this (Score:5, Insightful)
Read TFA. It just claims that Windows 7 consumes all available RAM. That is the "empirical evidence." System slowdown was NOT measured.
Utilizing all available RAM is a pretty well understood technique at this point. All web browsers do this now, as do many other applications. One would expect a well-designed modern OS to do this. Consuming all memory is not in itself a sign of poor programming, so long as disk caching of things that should be in RAM doesn't occur. This is not something that the people in the article have measured.
I'm going to side with the guy who appears to make his living testing these sorts of things.
Bad science is bad irrespective of the person conducting it. And whatever the original tester said is getting filtered through the viewpoint of the gentleman writing the article. Considering that he says that "Windows 7 is not the lean, mean version of Vista that you may think it is," yet never once compares statistics to Vista (or even mentions it outside of this statement), I'd take the conclusions from these stats with a grain of salt.
Re: (Score:3, Insightful)
So instead of thinking about what actually affects OS performance, they are rethinking things so that they don't have to sell real solutions to their customers, "where [they] maintain several large installations of our commercial DMS Clarity Suite performance analysis solution."
Re: (Score:2)
RAM is wasted when it isn't in use. The fact that the task manager in Windows says your RAM is used 95% tells nothing, and no it won't "result in slow-downs as the systems were forced to increasingly turn to disk-based virtual memory to handle tasks".
While there are approaches to virtual memory that will limit the slowdowns, the data still has to come from the HD, and even SSDs are not faster than RAM. Grandma checking her email may not notice, but data processing and other memory-intensive workloads need real memory, and putting it on a HD will slow things down.
Re: (Score:2, Insightful)
Yes. I have a similar problem when people running servers complain that the CPU is at 100%.
If you're seeing an actual slowdown in performance, fine, worry about it.
Otherwise, 100% CPU usage is a good thing: it means there's a process that's not IO bound.
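One rough way to see that distinction (a sketch using Python's stdlib timers; the workloads and thresholds are invented for illustration) is to compare CPU time consumed against wall-clock time elapsed: a CPU-bound task keeps the two nearly equal, while an IO-bound task burns wall time without burning CPU.

```python
# Rough sketch: classify a task as CPU-bound or IO-bound by comparing
# CPU time used to wall-clock time elapsed while it runs.
import time

def cpu_fraction(task):
    wall0, cpu0 = time.monotonic(), time.process_time()
    task()
    wall = time.monotonic() - wall0
    cpu = time.process_time() - cpu0
    return cpu / wall if wall > 0 else 0.0

def busy():
    # CPU-bound: pure arithmetic, no blocking calls
    return sum(i * i for i in range(2_000_000))

def waity():
    # stand-in for blocking IO (network, disk, etc.)
    time.sleep(0.2)

print(f"CPU-bound task: ~{cpu_fraction(busy):.0%} CPU")   # near 100%
print(f"IO-bound task:  ~{cpu_fraction(waity):.0%} CPU")  # near 0%
```

A server pegged at 100% CPU looks like the `busy` case: the hardware is actually doing work, which is what you paid for.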
User base increases over time. (Score:3, Informative)
If you're seeing an actual slowdown in performance, fine, worry about it.
User base increases over time. Even on an intranet server, your company will probably add users when it grows. As your user base increases, you will see slowdowns. If you can catch slowdowns before they happen, you will be more prepared for the DDOS attack that comes when your site gets mentioned in a Slashdot article or when a bunch of new employees go through orientation.
100% CPU usage is a good thing: it means there's a process that's not IO bound.
Or it could mean that you need to optimize the process that uses the most CPU time so that it becomes I/O bound. All other things equal,
Re: (Score:2)
The new memory models in recent OS's try to utilize all the available RAM (as they should) to speed up things otherwise.
BeOS and Haiku did/do this, but I don't think any other OS has implemented total RAM usage to such a degree.
Re:When do people get this (Score:5, Interesting)
Totally agree. If you don't want Windows 7 to use the 4GB of RAM you've paid for to speed up your computer, take out 2GB and put it in the drawer. Otherwise, be thankful that it's actually making the most of the RAM you're using.
What next? People complaining that games use 100% CPU to give them maximum framerate when it could just use 30% CPU and give them 10 FPS?
Re: (Score:2)
What next? People complaining that games use 100% CPU to give them maximum framerate when it could just use 30% CPU and give them 10 FPS?
That’s not a fair analogy. When you are playing a game, you are not multi-tasking. You are basically using the computer for a single task. I don’t care if the game sucks up 100% of the CPU and all of the remaining memory, as long as it frees it up when I close it.
If the system is gratuitously using 95% of the RAM nearly all the time, then it’s a completely different scenario. Everything I try to open that wasn’t cached already will force the system to dump some memory to the swap file to make room for the new application.
Re: (Score:3, Informative)
If the system is gratuitously using 95% of the RAM nearly all the time, then it’s a completely different scenario. Everything I try to open that wasn’t cached already will force the system to dump some memory to the swap file to make room for the new application.
Uh no. The point here is that the RAM is utilized with data that speeds things up, but that can be instantly freed if needed. It doesn't need to put that in the swap file.
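A minimal sketch of that point (hypothetical file names and latency; real OS caches are far more sophisticated): a clean cached copy makes repeat reads fast, and dropping it is instant because the data still lives on disk and nothing has to be written back.

```python
# Sketch: a clean (read-only) cache speeds up repeat reads and can be
# discarded instantly, since the backing copy on disk is still valid.
import time

DISK = {"app.dll": b"x" * 1024}  # invented file contents

class Cache:
    def __init__(self):
        self.pages = {}

    def read(self, name):
        if name in self.pages:          # cache hit: no device access at all
            return self.pages[name], "hit"
        data = self.slow_disk_read(name)
        self.pages[name] = data         # keep a clean copy in spare RAM
        return data, "miss"

    @staticmethod
    def slow_disk_read(name):
        time.sleep(0.01)                # stand-in for disk latency
        return DISK[name]

    def drop(self):
        self.pages.clear()              # instant: nothing to write back

cache = Cache()
_, first = cache.read("app.dll")   # miss: pays the disk latency once
_, second = cache.read("app.dll")  # hit: served straight from RAM
print(first, second)  # miss hit
```

Contrast this with a dirty application page, which must be written to the swap file before its RAM can be reused; only modified data costs anything to evict.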
Re: (Score:2)
RAM is wasted when it isn't in use.
You make decent arguments that I've heard before, many times. And I actually agree with them to a point. At least in theory, given somewhat unrealistic assumptions about the system. But there are two major flaws in this idea. First, in practice, where real developers are making complex decisions about how they structure their programs and the resources they use, I think this sentiment results in wasteful and excessive programming practices.
In other words, the resources programs demand may be managed ver
Re: (Score:2)
Windows is notoriously reluctant to invalidate caches to free RAM for applications. I don't know if Win7 fixed that, but XP would much rather send an idle app to swap than free some disk buffers. That's why switching swap off entirely tended to speed it up so much - it was forced not to swap out any active data and free up buffers instead.
Re: (Score:3, Informative)
Just because RAM is available doesn't mean that the OS should hog it. You may want to use that RAM for something different. It may be legal to use "excess" RAM for buffers, but then those buffers must be freed fast whenever necessary.
If you use large amounts of RAM for buffers, you will either free the least-used buffers and use them for the application, and then you will get memory fragmentation. This can be bad for some applications. The other scenario is that you will just kill a block of buffers and t
Re: (Score:2)
Oh, right, I can read. RAM.
Disregard the above. I'm dumb. =[
Re: (Score:2, Interesting)
You'll excuse my ignorance, but from college I remember that usually you have 0-2V represent 0 and 3-5V represent 1. Does a 0 have a corresponding increase in amperage so that it levels out and uses the same amount of power?
It seems natural to me that it would be initialized with zeroes on power-up, so that it would minimize power consumption.
Furthermore, more advanced chips, especially in mobile devices, have a variety of power-saving tricks. I would expect RAM would be no exception in having ways to clock
Re: (Score:2)
No, but they don't offer any actual evidence that the systems are swapping a lot, just that their memory usage is high. The "and this causes swapping" part of the claim just seems to be an assumption rather than a fact.
In XP? Definitely YES (Score:2)
File cache will definitely swap out your applications in XP.
In previous Microsoft OSes you could set a maximum file cache via the Windows .ini file (and it was like a breath of fresh air for Windows performance: suddenly you could do other things while burning CDs, etc.).
On XP they took out that feature and it's performed like a dog ever since because of it. A whole new generation of CD burners had to be developed with "SafeBurn" technology, etc.
I'm sure the stupidity has continued in Vista/7 but I don't have i
Re: (Score:2)
Bogus Story (Score:3, Insightful)
Right there I'd question whether this is even an issue or not. Given that Windows (which I generally regard as inferior) is an OS with lots of functionality, I wouldn't be surprised if it takes up all available RAM prior to utilizing swap. I'm on my 2GB Ubuntu system right now and am running at 18% of 2GB with just Mozilla (with two tabs) and Thunderbird. But there's also my network layer (Network Monitor), KTorrent, and my bluetooth daemon running in the background. All told, System Monitor says I have 31 processes running.
Let's do a like comparison - run the exact number of apps and processes before declaring a memory leak.
Sheesh!
It's called SuperFetch (Score:5, Insightful)
I guess Devil Mountain or whoever don't know about SuperFetch. Or need publicity.
And I guess slashdot editors don't know about SuperFetch. Or maybe an article like this gets them more traffic, revenue, etc.
The fucking bullshit that passes for articles these days..
Trollworthy rating for posting this... (Score:5, Interesting)
Computerworld should just close up shop for this worthless piece of journalism, or at least give their author the boot for doing any work with Craig Barth who represents a team of morons. samzenpus should be given a troll rating for getting this to Slashdot.
Page Faults (Score:5, Insightful)
The metric to count is the number of page faults, an indicator of the number of times that the OS addresses memory that isn't in RAM.
As others point out, measuring just the fraction of memory consumption is stupid. I have 6GB of RAM; my processes are using about 1.7GB of that, but the OS is claiming that 3.8GB is consumed. So that's 2.1GB of cached data that I no longer have to wait for from disk. Hooray.
TFA hints that they may be measuring page faults, and does mention that Win7 is hitting the disk for virtual memory more often. But they should make that clearer if it's the case.
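For what it's worth, on Unix-like systems Python's standard library exposes exactly these counters via `resource.getrusage` (this module is Unix-only; Windows would need its performance counter APIs instead). Hard (major) faults are the ones that hit the disk; soft (minor) faults are resolved from RAM and are harmless.

```python
# Sketch: reading this process's own page fault counters on a Unix-like
# system. ru_majflt counts hard faults (required disk I/O); ru_minflt
# counts soft faults (satisfied from RAM). The hard fault rate, not RAM
# occupancy, is the number that indicates real swap thrashing.
import resource

before = resource.getrusage(resource.RUSAGE_SELF)
buf = bytearray(10 * 1024 * 1024)  # touch 10 MB of fresh, demand-paged memory
after = resource.getrusage(resource.RUSAGE_SELF)

print(f"hard (major) faults: {after.ru_majflt}")
print(f"soft (minor) faults: {after.ru_minflt}")
```

Touching newly allocated memory typically raises only the minor fault count; a machine that is genuinely thrashing shows the major count climbing instead.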
Uhn...no. (Score:2)
People are either being unbelievably stupid, have only 1 gig of RAM installed, or this is FUD. Example: I'm currently running Windows 7 64 bit. On my secondary monitor, I have a bunch of system monitoring widgets... hard drive space, CPU load and temp, video card load and temp, memory usage, etc. Just last night I was playing Bioshock 2, all settings at max. Even with those widgets running, with Aqua Teen Hunger Force playing in MPC on the secondary monitor in a window, and Bioshock 2 running full bore, I was still using only 52-58% of my available 4 gigs of ram. I call BS on this article.
Re: (Score:2)
The plural of anecdote is not data... but I still have to throw in my own 2 cents. It seems that as long as you have at least 1.5GB ram, win7 will be using about 700MB of that on startup, and the rest goes to apps. I have 4GB ram and I've never seen more than 3GB of ram in use, including running Supreme Commander with other stuff open.
I agree that Win7 can't run on systems with less than 1GB RAM, and only runs "alright" on systems with exactly 1GB RAM, but if it's using RAM for something other than disk cach
My experience (Score:2)
I'm running Windows 7 x64 with 4GB of RAM; currently I'm running Outlook, Firefox, IE, Excel, FeedDemon, Office Communicator, AV, AD management tools, call management software, a couple of PowerShell instances, Context, RDTabs, PuTTY and the usual assortment of drivers, plugins and background apps. I'm at 2.4GB of RAM; even on a 2GB machine it would be usable, though I'd probably have to be a bit more zealous about closing unused apps to avoid swapping.
I can only assume that it's the usual nonsense of vendor
Re: (Score:2)
You're using the wrong apps and data.
You can take small, efficient applications like those that used to be written because RAM was too expensive, and run hundreds if not thousands of them without a slowdown, especially if the data they are dealing with is nearly empty. Or you can take one AutoCAD application and a large CAD file and tax it to its limits and beyond.
I'm sure there are other such applications that are massive and resource-hungry, as well as their data files.
Increased speed and memory of hardware is generally perceived
People are buying cheap PC's and upgrading old one (Score:2, Interesting)
Oh come on (Score:5, Funny)
New Windows OS != performance (Score:2)
First off, kudos to ComputerWorld for this shocking newsflash "New Windows Operating System is Bloated and Disappoints Users". Is it 1995 again when I foolishly believed Microsoft and loaded Windows95 on my happy Windows 3.1 computer only to discover the 4MB minimum RAM requirement left my computer a useless lump of plastic with an endlessly spinning hard drive? Four more MB of memory for $130 from a shady computer dealer finally slowed the paging down. I have seen this cycle repeated 6 more times since
No kidding. (Score:4, Interesting)
I pity people running the x86 version of the OS that are maxing out at 4GB. Definitely buy 64-bit (even though it's more of a beta than a real OS) if you do anything memory intensive.
The one good thing about all this is that HOPEFULLY... FINALLY... maybe this will push Microsoft to push 64-bit more. They need to abandon 32-bit and force application writers and hardware manufacturers to start making 64-bit native applications. Working in a medical environment, BELIEVE me, I understand the need for backwards compatibility, but the fact is that the resources are just not being put into 64-bit to make it a really viable platform, and even moderate power users are going to start bumping up against the 4GB limit.
Yeah, I read that thing saying that the 4GB limit is a product-based limit rather than a technical limit, but either way... it appears that x64 is where MS is choosing to support >4GB, so let's get serious about it.
Re: (Score:2)
If you are running a PC at 4 GHz, you are either running a Pentium 4 (in which case your technical knowledge is questionable at best) or are running a very overclocked system (in which case you would be smart enough to use something other than Windows 7 if it was causing you that much grief).
Considering your "4 GHz" claim combined with "many thousands of dollar" for your PC, I'm going to go ahead and call BS on you.
Re: (Score:2)
Considering your "4 GHz" claim combined with "many thousands of dollar" for your PC, I'm going to go ahead and call BS on you.
Yeah, or he's a regular consumer who spent a lot of money on his computer and simply believed the twenty-two year old in the tie who added the clock speed of each core of the multi-core computer together while promoting the sale. After all, when you look at MS's "System Properties" it does that math right there on the screen for you and you feel all special.
-And apparently, if you use Windows 7, rather annoyed as well.
-FL
Re: (Score:2)
Quoted from a post I made earlier in this article:
People are either being unbelievably stupid, have only 1 gig of RAM installed, or this is FUD. Example: I'm currently running Windows 7 64 bit. On my secondary monitor, I have a bunch of system monitoring widgets... hard drive space, CPU load and temp, video card load and temp, memory usage, etc. Just last night I was playing Bioshock 2, all settings at max. Even with those widgets running, with Aqua Teen Hunger Force playing in MPC on the secondary monitor in a window, and Bioshock 2 running full bore, I was still using only 52-58% of my available 4 gigs of ram. I call BS on this article.
Here are my system specs, to back up my claims. As you can see, nothing special (copy and pasted from my [H]ard|Forum sig):
Display: Asus VH236H | Dell 2005FPW
Foundation: Cooler Master Storm Scout | OCZ ModXStream Pro 700w
System: Gigabyte GA-MA785GM | AMD Athlon 64 X2 5400+ | Corsair XMS2 4GB DDR2 800 | ATI 4850
Internal Storage: Diamondmax 21 system | WD15EADS archives
External Storage: 1.25TB in a KINGWIN DK-32U-S | WDMER1600TN
Input: Kensington 64325 Expert Mouse | Saitek Eclipse II | M-Audio Axiom 25
Audio: Logitech Z4 2.1 | Audio Technica ATH-AD700
For the record, I do mail merge programming for a living and have a collection of old systems ranging from a (fully functional) TRS-80 all the way up to my current system.
Re: (Score:3, Insightful)
Ok that's retarded.
Using 100% of your RAM can easily make your computer slower. Firstly, if a page is "dirty", it needs to be saved to disk before it can be used for another activity. This causes a write to disk before a read, automatically making the read slower.
While that's true, it doesn't matter because those pages would be in-use no matter how much RAM you have.
The "free" RAM (that is, RAM not being used for applications, that is, RAM pages that won't ever be marked dirty) is all caching-- disk and DLL