Microsoft Windows

4 GB May Be Vista's RAM Sweet Spot 767

jcatcw writes "David Short, an IBM consultant who works in the Global Services Division and has been beta testing Vista for two years, says users should consider 4GB of RAM if they really want optimum Vista performance. With Vista's minimum requirement of 512MB of RAM, Vista will deliver performance that's 'sub-XP,' he says. (Dell and others recommend 2GB.) One reason: SuperFetch, which fetches applications and data, and feeds them into RAM to make them accessible more quickly. More RAM means more caching."
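As a rough illustration of the SuperFetch idea described in the summary (all names here are invented for the sketch; the real heuristics are proprietary), a frequency-based prefetcher can be modeled as a cache that learns from launch history:

```python
# Toy sketch of SuperFetch-style prefetching: track how often each
# "application" is launched, and keep the most frequently used ones
# preloaded in spare RAM so they start from memory instead of disk.
from collections import Counter

class PrefetchCache:
    def __init__(self, capacity):
        self.capacity = capacity   # how many apps fit in spare RAM
        self.launches = Counter()  # usage history the prefetcher learns from
        self.cache = set()         # apps currently preloaded in RAM

    def record_launch(self, app):
        self.launches[app] += 1
        self._prefetch()

    def _prefetch(self):
        # Preload the most frequently launched apps into spare RAM.
        hot = [app for app, _ in self.launches.most_common(self.capacity)]
        self.cache = set(hot)

    def launch(self, app):
        # A cache hit means the app starts from RAM instead of disk.
        hit = app in self.cache
        self.record_launch(app)
        return "from RAM" if hit else "from disk"
```

More RAM means a larger `capacity`, hence more hits; and the cached pages are the first thing released when a running application asks for memory.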
This discussion has been archived. No new comments can be posted.

  • x64 (Score:1, Insightful)

    by Anonymous Coward on Tuesday February 20, 2007 @08:12PM (#18089978)
    It's an MS way to get people interested in the 64-bit edition, which doesn't have the 4GB RAM limit :-)
  • Re:x64 (Score:5, Insightful)

    by sepiid ( 1060020 ) on Tuesday February 20, 2007 @08:14PM (#18090010)
    Was gonna say: you toss 4GB in a 32-bit box, you will only see about 3GB. Unless you go 64-bit, but then you will see even less driver support available.
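The "~3GB visible" figure follows from the 32-bit address space; a back-of-the-envelope sketch (the 1GB device reservation here is illustrative and varies by chipset):

```python
# A 32-bit CPU can address 2**32 bytes total. Part of that range is
# reserved for memory-mapped devices (PCI, video RAM, firmware), so the
# RAM that overlaps the reservation is invisible to a 32-bit OS.
addressable = 2**32                # 4 GiB of address space, total
mmio_reserved = 1 * 2**30          # ~1 GiB claimed by devices (varies)
visible_ram = addressable - mmio_reserved

print(addressable // 2**30)        # 4 (GiB of address space)
print(visible_ram // 2**30)        # 3 (GiB of RAM actually visible)
```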
  • by wandm ( 969392 ) on Tuesday February 20, 2007 @08:15PM (#18090034)
    Right, I have 512MB; I need to buy 3.5GB more, which is about £245 at UK prices, or about $460. Another number to add to the price of the Vista upgrade..
  • by SEMW ( 967629 ) on Tuesday February 20, 2007 @08:21PM (#18090106)

    "More RAM means more caching."
    Well, Duh...
    You say it's obvious; but it's amazing how many Slashdot posts I've seen which consist of "I've got XGB of RAM [where X>1] and Vista's using up 75% of it running the OS alone; therefore Vista must need XGB of RAM to even run, never mind applications!" -- conveniently ignoring that Vista's just using the extra RAM to cache frequently used apps, documents, etc., and it'll automatically be freed up if any application requests it...
  • by Jeremiah Cornelius ( 137 ) * on Tuesday February 20, 2007 @08:22PM (#18090126) Homepage Journal
    I remember the $40/MB RAM!

    OS/2 recommended 4MB
    Vista? 4GB

    Too bad we aren't doing exponentially better things with these boxes...
  • by dsanfte ( 443781 ) on Tuesday February 20, 2007 @08:31PM (#18090244) Journal
    What does Vista do that's really NEW and WANTED in an operating system? Not much. More eye candy? That's worth $300? The customer will decide, but I'll say this:

    This much bloat simply isn't necessary. Caching is one thing, but the RAM requirements of Vista simply for code space are massive compared to XP for roughly the same functionality. That's a center that cannot hold.

    What we expect from an OS is pretty well-known and well-defined now. This means the innovation will slow and there will be increasing reluctance to upgrade simply for the sake of upgrading, especially when the upgrade is a worse performer than the software being upgraded!

    This is fertile ground for optimization.

    An example:

    Compare the executable size and memory utilization of uTorrent and Azureus. Azureus represents the old guard of BT clients, you might say: a large, bloated code base in Java, implementing features that you wouldn't think would require that much code. And boy, is it a dog; it crawls on any sub-1.5GHz laptop. Enter uTorrent. I would say Azureus is the Vista to uTorrent's microLinux. For the uninitiated, in terms of program size (exe + libs) and memory utilization, we're talking 170kB/4MB versus 7.6MB/16.3MB, respectively. uTorrent brought just about all the features present in Azureus and compacted them into a 170kB .exe. And lo, the damn thing is snappy even on my old P233/64MB laptop.

    I think this will be the end of Microsoft. The API expected for a Windows box is known. It's publicized. The time is ripe for a competitor to come in and reimplement it, using less RAM and resources while conforming to the same standards, and for a fraction of the price. If this were to happen, and if the software companies were to realize they didn't have to sit beholden to *Microsoft's* "Windows" anymore, then we'd really see some fur fly in the marketplace.
  • Re:Seriously (Score:3, Insightful)

    by HomelessInLaJolla ( 1026842 ) * <sab93badger@yahoo.com> on Tuesday February 20, 2007 @08:34PM (#18090294) Homepage Journal
    > PC Gaming is DEAD. It's only a matter of time before MMORPGs move to the consoles too

    You have expressed the reason for the assertion. PC gaming is no longer about gaming. Gaming could be described as a system of rules around a logic puzzle. PC gaming is now about social networking and appeal (mostly visual).

    Computers are the realm of intellectuals. PC games, the really good ones, were intense intellectual puzzles. A good transition to recognize is the shift in RPG style: from symbolic display to a concentration on realism. Times of Lore [wikipedia.org] marked this event. Before ToL were games such as Ultima 3 and Phantasie (Nintendo had Zelda), and even earlier were the text-based games such as Zork. After ToL came the AD&D games and, later, the anime-style (e.g. the Final Fantasy series) realism RPGs.

    Developmentally, the earlier games had more intriguing plots and puzzles. The later games were more visually appealing and spectacular.
  • by Anonymous Coward on Tuesday February 20, 2007 @08:36PM (#18090302)
    Note to *nix users: You want to run *nix? Then shut up and pay for driver/app development.

    Note to Mac users: You want to run OS X? Then shut up and pay for the pretty hardware.

    Note to Windows users: You want to run Vista? Then shut up and buy the extra memory.
  • by tritone ( 189506 ) on Tuesday February 20, 2007 @08:39PM (#18090342) Homepage
    From Dell's website [dell.com]: A Windows Capable PC has 512 MB RAM and is "Great for... Booting the Operating System, without running applications or games."
  • by drDugan ( 219551 ) * on Tuesday February 20, 2007 @08:45PM (#18090408) Homepage
    That is the best reason yet to dump Micro$oft.

    The cycle looks something like this: Dell makes money when they sell new hardware. Microsoft makes money when they sell a new OS and software. The reality is, most people don't need either - they just want systems to surf the web, do email, buy clothes and watch porn. Dell can't force you to upgrade that 3-year-old computer, unless the software runs slllooooooowwwwwww. So, Dell LIKES Microsoft products. Microsoft writes software that needs nice shiny new hardware to run well, with an insane amount of RAM just for the OS. Ironically, the worse the efficiency of the Microsoft software, the more money they BOTH make. Intel is not out of the game either - they make money on new chips sold too - but mostly they are just along for the ride, because their product has not yet become a commodity like PC memory.

    I freed myself from the MS empire when my laptop was stolen and I switched to a Mac laptop in Nov 2005. Now everything is either OSX or Linux, and I haven't missed it at all. I still use Word and Excel on the Mac - but EVERYTHING else from Microsoft is now gone from my computing life, and I like it that way.

    I read freshmeat for the first time in about 6 months this morning. I was very happy to see many, many packages at post-1.0 release numbers. Not that it means anything quantitative, but encouraging nevertheless.

  • Re:What? (Score:3, Insightful)

    by SEMW ( 967629 ) on Tuesday February 20, 2007 @08:47PM (#18090440)

    The apps don't use that memory, the OS does. The application programs are stored in RAM (you know, like a "ram disk"), so that when the program is actually called upon, the program is already in RAM and doesn't need to be read from the hard drive (you know, cause the hard drive is slower than the RAM).
    Well, yeah. Just like the summary says. Hence my "didn't you even read the summary". If you really felt my quote was taken out of context and somehow implied that the memory use was due to running applications, the summary was only a scroll away.

    This is a "feature" of the operating system.
    Well, yes. IMHO, it's a damn good feature. If I have XGB of RAM, I may as well be using it to speed up my system, rather than have it sitting there like a lemon. Where's the harm? It frees it up when anything requests it.
  • Re:Seriously (Score:5, Insightful)

    by SilentChris ( 452960 ) on Tuesday February 20, 2007 @08:48PM (#18090452) Homepage
    Twenty years ago I remember an 80-character email program my school used that required remembering about 40 shortcuts. None of them were displayed. You could work on one email at a time -- that's it. There was no GUI email program with easy to understand menus. There was no way to work on more than one email at a time. You were fortunate if you got copy and paste.

    Twenty years ago I remember the "media" I "collected". Amazing 256-color graphic files. Mostly of stupid things like bowls of fruit (porn really wasn't all it was cracked up to be at the time). No pictures of family and friends in high detail. No means of easily storing said photos for extended periods of time.

    Twenty years ago I remember when a "state of the art" game was one that wasn't entirely text-based. When an adventure game's inventory had a max of 16 items and enemies were scripted (and therefore dumb as bricks). No photorealistic visuals to draw you in. No fairly natural AI to breathe life to the world. And certainly no way to play with thousands of others at the same time.

    My point?

    All of these changes have been the result of more memory, faster processors, etc. Yes, we use a bigger memory footprint nowadays. So what? Isn't broadening the appeal of the PC (families storing photos and grandmothers who can actually work the email program) worth it? Yes, the fundamental operations haven't changed (write email, send email, etc.). Big deal. Call that a testament to stellar original design rather than a foible of modern design.

    Fact of the matter is I *can* do more, much more, than I could with my PC from 20 years ago. And I can do it in an easier way (blame Vista/OS X all you want -- they're still better UIs than what we used in '87). That's called "progress", regardless of whether the memory footprint grows or not (and the fundamental tenets of computing stay largely the same).
  • by amcdiarmid ( 856796 ) <amcdiarm.gmail@com> on Tuesday February 20, 2007 @08:51PM (#18090490) Journal
    It's not really an interesting article. To summarize:
              Guy says you need 4GB for sweet spot.
              Same Guy says you need 2GB for XP sweet spot.

    I'll give you that nowadays you might want 1GB for XP, but 2GB is excessive for most. I know plenty who are happy with 512MB running OS + AV + Word + Browser. (Although 768MB is better.)

    Take Minimum Spec, Multiply by 4. That's more likely to be the minimum usable. (See minimum specs for previous MS operating systems for comparison purposes.)
  • by Sj0 ( 472011 ) on Tuesday February 20, 2007 @08:55PM (#18090538) Journal
    Except for the hour and a half at startup where Windows loads every application you've ever loved into memory, right?

    Ever turn off swap in a modern Windows? All things considered, I'd like to disable executable caching, and just keep swapping for file reads and writes within programs. Not swapping out the programs you're actually using is a pretty damned good first step towards a zippy system, in my experience.
  • by SimonInOz ( 579741 ) on Tuesday February 20, 2007 @08:56PM (#18090540)
    Some time back (ok, 1979) I built a system to monitor a Dutch nuclear reactor. It monitored temperatures, rod positions, and so on. Nothing important (cough). There was no suggestion of keeping costs down to save money (and I'm glad).

    The system had two colour graphic displays, a printer or two, and 4 operator terminals. It ran a real-time, multitasking operating system (called RSX-11).

    The main system had 128KB of memory. Yes, 128KB.

    Today my dev machine has 2GB of memory, and the 3GHz processor must - surely - be some thousands of times as fast.
    So I have 15,000 times as much memory, and a processor perhaps 3,000 times as fast (I'm guessing, as figures are hard to pin down). That sounds like about 45 million times as much power to me.

    And what do we do with all this grunt? Well damn, Solitaire looks good these days.

    So, were the old programmers really, really good? [We were, we were ...]
    Are the new ones really, really bad? [hang on, I'm still at it ...]
    Have we stopped caring about size and performance of programs?

    I think all of these things are slightly true - we used to care deeply about program speed and footprint. Now we don't.
    I suspect it has gone much too far - programs are far slower to load than they were even 5 years ago - they are large and bloated, and don't share things well. Anybody remember Sidekick - it was wonderful - and it was available at the touch of key (ok, 2 keys). Remember how FAST it was? I know it didn't do much, but it was dashed useful.

    And I still can't believe I still write "for" loops.
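For what it's worth, checking the poster's arithmetic with the figures given (the 3000x CPU speedup is the poster's own guess):

```python
# Rough sanity check on the numbers in the comment above.
mem_then_kb = 128                     # 128 KB in 1979
mem_now_kb = 2 * 1024 * 1024          # 2 GB today, in KB
mem_ratio = mem_now_kb / mem_then_kb  # exact memory growth factor
speed_ratio = 3000                    # guessed processor speedup

print(int(mem_ratio))                 # 16384
print(int(mem_ratio * speed_ratio))   # 49152000
```

So the combined factor comes out in the tens of millions (about 45-49 million depending on rounding), which only sharpens the point about what we do with all that grunt.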

  • by Brett Buck ( 811747 ) on Tuesday February 20, 2007 @08:56PM (#18090548)
    Sure enough, that's exactly what it says. What in the hell use is a computer with just an OS running and nothing else? This is what they call "capable"? Ay Carumba!

          Brett
  • by AcquaCow ( 56720 ) * <acquacow@nOspAM.hotmail.com> on Tuesday February 20, 2007 @08:58PM (#18090576) Homepage
    This just goes back to the old saying that "unused memory is wasted memory."

    You should always cache as much as possible.

    The problem is, if consumers saw their memory usage at 100% all the time, they would freak out.

    I've had 4GB for a while, as I use Photoshop heavily. I'm going to make the Vista jump just so that I can use more/all of that 4GB, plus get some 64-bit action.

            -- Dave
  • by Joe U ( 443617 ) on Tuesday February 20, 2007 @09:09PM (#18090686) Homepage Journal
    It is hard to see how 3G can be gobbled up by some eye candy and other "UI innovations".

    It's not, actually. Vista is much more aggressive in memory usage; it will claim as much as it can for caching and release it when needed. Once SuperFetch (and ReadyBoost) auto-optimize themselves (it takes a little while to learn what you're doing and adapt), you'll understand why the extra memory gives a nice boost.

    2GB is great, which is what I used in XP. (I'm running developer tools and VMs, so 4GB would be great, even in XP)

  • by WillAffleckUW ( 858324 ) on Tuesday February 20, 2007 @09:15PM (#18090768) Homepage Journal
    Turn off Aero. Turn on the "Windows Classic" desktop theme. You're good to go with 1GB of memory. Microsoft could tell you the same thing, but then the best features they offer in this bloated release won't even be used (and those are the features MS is stressing in print ads and commercials).

    But if you turn off Aero and all that stuff, why bother upgrading in the first place?

    So that you can see the Black Screen of Are You Sure You Want To Run That Program?
  • Re:x64 (Score:3, Insightful)

    by CastrTroy ( 595695 ) on Tuesday February 20, 2007 @09:31PM (#18090924)
    However, that's only if you enable the /3GB switch in your boot.ini. Oh, and then if you want applications like SQL Server that are PAE-aware, don't expect them to turn that feature on automatically. Oh, but then SQL Server 2000 takes all its memory up at the beginning, so if you want it to have 7GB, it's going to always have 7GB. I guess it's nice they fixed it in SQL Server 2005. I'm not too familiar with running enterprise Linux systems, but does Linux/Unix have all these crazy limitations and setup issues? What's involved in getting an Opteron with 64GB of RAM and using it for something like PostgreSQL? Is it as difficult as setting up SQL Server to use memory above 2GB?
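For reference, enabling those switches means editing boot.ini by hand; a sketch of what the relevant entry looks like (the ARC path and OS name are illustrative and vary by machine):

```ini
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003" /fastdetect /3GB /PAE
```

/3GB shrinks the kernel's share of the 4GB virtual address space so each user process gets 3GB; /PAE lets the kernel address physical memory above 4GB, which AWE-aware applications such as SQL Server can then be configured to use.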
  • "Have we stopped caring about size and performance of programs?"

    In apps like codecs and statistical analysis (both of which commonly use FFTW), we haven't. Though a lot of the time we just hand it off to good ol' SSE.

    Though I feel our dependence on interpreted languages is getting to be a bit much. Same for XML. Same for all the UI sparkliness. All that extra processing power is going to parsing human-readable data and prettiness, and I'm not exactly for it.

    Yeah. I'll stick to XFCE. I just wish there was a non-commercial bash compiler around; that would make things a bit quicker.
  • by pilkul ( 667659 ) on Tuesday February 20, 2007 @10:07PM (#18091276)

    In industry almost everybody uses loops instead of recursion unless there's a really good reason to use recursion (e.g. tree traversal). More because of readability than efficiency; in principle your optimizer should be able to convert tail recursion to iteration anyway (though whether this will actually happen or not does depend on the specific language and implementation). Academics just love recursion because it maps neatly to mathematical induction and hence makes algorithm correctness more easily provable.

    The reason "bloat" happens is more because programming teams have deadlines and if there's a choice between a new feature, a bugfix or some not-strictly-necessary optimization (and there's always a choice), the optimization's never going to get done. It's just good business sense; sure everybody complains about slowness, but if application A is mean-and-lean and application B is bloated but has a feature you need to do your job, you'll whine and cavil and buy B anyway.
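To make the tail-recursion point concrete, here's a sketch (Python, chosen just for illustration; CPython notably does not perform this transformation itself, which is part of why industry code prefers the loop):

```python
def factorial_rec(n, acc=1):
    # Tail-recursive: the recursive call is the very last thing evaluated,
    # so an optimizing compiler could reuse the current stack frame.
    if n <= 1:
        return acc
    return factorial_rec(n - 1, acc * n)

def factorial_iter(n):
    # The loop an optimizer would produce: the accumulator parameter
    # becomes an ordinary loop variable.
    acc = 1
    while n > 1:
        acc *= n
        n -= 1
    return acc

print(factorial_rec(10))   # 3628800
print(factorial_iter(10))  # 3628800
```

Both compute the same result; the iterative form just never grows the stack.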

  • by Nightspirit ( 846159 ) on Tuesday February 20, 2007 @11:12PM (#18091854)
    Sounds like you haven't used anything between 1979 and the 2000s. I can clearly remember my Commodore 64 taking 10 minutes to load up an app I had written, and about the same time for Windows 95 to boot up on a 486.

    The average consumer has seen mass improvement. Today I can simultaneously rip a DVD, listen to MP3s, browse the internet, and play a game with a core 2 duo. I was lucky to get 1 of these working at a time back in win95 days. It takes less than a second to load most apps (well, pretty much anything but adobe).

    I agree that we have stopped caring about size/performance because in most cases it doesn't matter.
  • by WhiteWolf666 ( 145211 ) <sherwinNO@SPAMamiran.us> on Tuesday February 20, 2007 @11:16PM (#18091898) Homepage Journal
    Your understanding of the LDDM (WGL, or whatever the heck you want to call it) is grossly oversimplified and vastly fanboyish.

    "Intelligently sharing textures between video card ram and system ram".

    You keep saying that, yet I do not think you know what it means.....
  • by Anonymous Coward on Tuesday February 20, 2007 @11:21PM (#18091946)
    I'm quite happy using *nix without paying anyone for anything.
  • by BeanBunny ( 936648 ) on Wednesday February 21, 2007 @12:31AM (#18092462)

    Woah, woah. That's a lot of assumptions, all grossly incorrect.

    Are you a developer who runs Vista to write and test software?

    No.

    Would you recommend Vista to people because it might reduce the number of people running XP and mean you don't need to test on XP as much?

    Let's pretend that I am a "developer who runs Vista to write and test software." Regardless of my occupation, I consider it disrespectful when you put words into my mouth. First, I would never recommend a product that I didn't think stood on its own merits. Second, I can think of very few circumstances where XP would cease to be a platform I would need to test for within the next five years, regardless of how many people I personally encouraged to adopt Vista.

    Your comment is a leading question designed to prove your point that Vista is crap and I am a shill. I have a hard time understanding where such vilification comes from that you would attempt to discredit me, and my opinion, without any sound rationale.

    Or do you honestly think Vista is better than XP and that's why people should upgrade? Cause if that's the case I'm gunna have to suggest that you're in a freakishly small minority.

    In fact, I am a developer, although not for Vista specifically. As a professional in my field, I believe I can discern whether or not a product I am using is "good" or not. To be fair, I did not upgrade, I bought a new PC with Vista pre-installed. I also will not be upgrading my XP machine to Vista. However, I am very satisfied with Vista as a product in and of itself. It performs well all of the tasks that I require of it.

    Would I recommend Vista to others? It depends on the individual and their requirements. In general, I believe that operating systems, and most software tools, should be evaluated without respect to partisanship. In this case, I do not care whether or not Microsoft made Vista. It is a fine product in most respects, although it is not without its flaws. I believe it will also continue to get better.

    Should people shy away from it? No. If a new computer comes with Vista, keep it. It does its job, and does it well.

    Should people upgrade? Probably not, unless they require specific features found only in Vista. For the average user, there isn't enough value over and above XP. However, as Vista gains marketshare, and as Vista-only products are developed, that will change.

    In any case, there is a learning curve with Vista, but I do not believe that should stop people from adopting the product. If that were the case, I would never recommend Linux, ever. As for the increased hardware requirements, this is not unusual; the same rationale that has always applied to Windows 1.0, 2.0, 286, 386, 3.0, 3.1, 3.11, 95, 98, 98SE, ME, NT 3.1, NT 3.5, NT 3.51, NT 4.0, 2000, and XP applies here. As for the problems that Vista seems to have, this is normal for a fresh product and will undoubtedly improve as it matures. Maybe this is a reason to hold off adopting Vista for now, but there are many benefits of Vista that may account for its drawbacks, depending on the willingness of the customer to put up with a few rough edges.

    I am trying to present a balanced point of view that looks at Vista in a realistic, pragmatic light. I'm not promoting Vista exclusively over other vendors' products. There is a place for Linux, Mac OS, and even other operating systems as well, depending on the customer's requirements. I'm not married to Microsoft. But neither do I think Vista is crap.

  • by adolf ( 21054 ) <flodadolf@gmail.com> on Wednesday February 21, 2007 @01:04AM (#18092680) Journal
    what sense does it make to load 450 or so MB in when I decide to just play Oblivion or something and none of that is used, and might even be written over?

    What sense does it make? It helps you out substantially when you're operating the computer in your typical fashion. You deviate from the norm, you get a cache miss. Nothing new here.

    What's new is the flurry of crazy-eyed weird fuckers like yourself who keep missing the point: It's faster this way, and it costs absolutely nothing in performance. Who gives a shit if it misses from time to time? It's -free-, and harmed you none by missing.

    At any rate, this caching happens at low priority. If the computer had something better to do (like load Oblivion), it'd be bloody doing it. Instead, it's keeping itself busy trying to prepare itself for the next thing that you might ask of it.

  • by oldhack ( 1037484 ) on Wednesday February 21, 2007 @01:36AM (#18092836)
    "...I think all of these things are slightly true - we used to care deeply about program speed and footprint. Now we don't..."

    "Sometime back" (you geezer), computers were expensive, and so people did important things with them. Now most computers, people use them to jerk off (in all the glorious senses of the phrase). Rest of the paragraph is left for all yous to make up your owns.

    Better look into NASA systems and embedded medical systems for fairer comparisons with the "good ol' days."

  • by Duhavid ( 677874 ) on Wednesday February 21, 2007 @01:37AM (#18092840)
    Just one, get the stuff I asked for done faster... :-)

    And we have had GUIs for a while now; each iteration of Windows* takes more hardware, and the things that GUI is capable of doing have not gotten any better, really. I have not seen Aero yet, so I don't know if there is something offered aside from eye candy there or not, but I am betting on eye candy, thus far.

    *I am mostly thinking NT 4 to Vista, as that is a mostly level playing field.
  • by Stewie241 ( 1035724 ) on Wednesday February 21, 2007 @01:41AM (#18092856)
    Right, but the fact is we're seeing OSs constantly eat more and more resources, and for what? Yes, there have been huge advances in the kinds of computations that computers can do. But usability doesn't have to take gigs and gigs of RAM... There are GUIs that have a smaller footprint than Windows. Usability is important, but in many cases we've gone past usability to putting on a cheap facade to make something look better. If we got organized, we could do a lot of useful things with spare CPU cycles... pick a research project and donate the cycles. Provide a good reason that Vista's system requirements have to be more than XP's.
  • by Daengbo ( 523424 ) <daengbo&gmail,com> on Wednesday February 21, 2007 @01:56AM (#18092950) Homepage Journal
    I don't see why so many /.ers are having trouble with this idea, since it appears to be exactly like top vs. free for memory usage.
  • by mcrbids ( 148650 ) on Wednesday February 21, 2007 @03:01AM (#18093228) Journal
    Have we stopped caring about size and performance of programs?

    No. But our limits of acceptability have changed. As processing power has gotten cheaper, developers (myself included) have focused more on getting features out to market faster, rather than application performance.

    I think all of these things are slightly true - we used to care deeply about program speed and footprint. Now we don't.

    That's always been correct. We care more about how many features are available at what cost, so long as performance isn't noticeably bad on commodity hardware.

    Do you remember when C was considered a "high level language"? What about the debates on how slow programs written in C were? I do. Times have changed...

    I suspect it has gone much too far - programs are far slower to load than they were even 5 years ago - they are large and bloated, and don't share things well.

    I don't know about that. Perhaps you don't remember loading DOS programs like PC-Write on an 8086 processor with 512K of RAM? That was my word processor of choice, and it got slower the longer your document was. By the time you passed 100k, it was a dog.

    Anybody remember Sidekick - it was wonderful - and it was available at the touch of key (ok, 2 keys). Remember how FAST it was? I know it didn't do much, but it was dashed useful.

    I sure do. I also remember the care with which I never hit the two space bars together in a graphics program. (That would universally crash my computer.) It shared TEXT OK, but anything graphical was another mess entirely.

    And I still can't beleive I still write "for" loops.

    If you don't mind me asking, what would you RATHER be writing?
  • Re:I disagree (Score:2, Insightful)

    by stewbacca ( 1033764 ) on Wednesday February 21, 2007 @03:41AM (#18093398)
    If anything, ram requirements on OS X will go down as Rosetta is slowly rendered useless. I run 1Gig on my MacBook, and it runs fast enough. For more heavy stuff, I'd rather be using my Intel iMac with the 2Gigs of ram (and the roughly .5ghz speed boost on the cpu).
  • by TheNetAvenger ( 624455 ) on Wednesday February 21, 2007 @05:19AM (#18093768)
    OS/2 reccomended 4MB

    Not to be picky, but OS/2 (even assuming 2.0, since it was the first 16/32-bit release) REQUIRED 4MB of RAM, but didn't run well unless you had 12MB of RAM, although I do know some people who got by with 8MB. I also know peeps who ran NT 3.1 with 8MB of RAM, even though it was just as painful to watch.

    So basically people here are making fun of Vista for wanting 512MB, and running 'much' faster than XP when configured with > 512MB...

    Last I checked OSX even wants 512MB and 1GB of RAM for acceptable performance if you run a lot of concurrent apps since the windows are double buffered in system RAM for the composer.

    Also, any *nix distribution running X with a window manager like KDE easily scales to where 512MB to 1GB is a sweet spot as well.

    Since this is the year 2007, I don't see Vista being far out of the ballpark, except for the fact it has some really smart caching technology that allows it to better use > 1GB of RAM via its Superfetch caching technology in ways other OSes don't unless they have the application load demanding it.

    Which is the point most everyone seems to keep missing in this post. They are in a fuss because Vista continues to get faster and faster as more RAM is added.

    Most OSes 'desktop performance' top out at 1-2GB of RAM and don't use the extra RAM for anything but dumb/lazy caching.

    So instead of making fun of Vista for actually taking advantage of this extra 'free' RAM and scaling it in a way that 'continues' to add performance even when applications don't need it, maybe we should focus our efforts in the OSS community to work on caching technology so all OSS OSes will scale RAM as well as Vista.

    (PS, Even though I'm responding to your OS/2 numbers, this post is meant more of a general response to everyone in here, so nothing personal to you, the OS/2 numbers were just a fun place to jump in :) .)
  • Re:More RAM (Score:5, Insightful)

    by MojoStan ( 776183 ) on Wednesday February 21, 2007 @07:11AM (#18094168)

    Had a friend who tried to buy a Dell box today. They wouldn't sell it to him with XP on it; only Vista.
    Does the friend know that "business" Dell PCs (e.g. Optiplex desktops, Latitude notebooks, Precision workstations) can be configured with XP? Only the "home" PCs (e.g. Dimension desktops, Inspiron notebooks) are restricted to Vista only. (Dimensions and Inspirons are also sold in the "business" section, but they are really meant for home users.)

    I can only imagine what kind of deals Dell and MSFT have cut...
    I think it's reasonable to believe that phasing out XP support might be worth the relatively few sales they lose by not offering XP to home users. Maybe my imagination should be more cynical.
  • by RAMMS+EIN ( 578166 ) on Wednesday February 21, 2007 @07:40AM (#18094282) Homepage Journal
    ``You say it's obvious; but it's amazing how many Slashdot posts I've seen which consist of "I've got XGB of RAM [where X>1] and Vista's using up 75% of it running the OS alone; therefore Vista must need XGB of RAM to even run, never mind applications!"''

    That sounds so much like what people have been saying about *nix systems for years. What's interesting is that, on my system, the available RAM has outgrown my ability to use it:

    Mem: 1426528k total, 1116820k used, 309708k free, 121976k buffers
    Swap: 0k total, 0k used, 0k free, 777864k cached


    In other words, even though about 900 MB is being used for buffering and caching, there is still about 300 MB free, because the actual system takes up only about 300 MB (a little more than reported here; I think the above does not include memory eaten by the kernel).
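Redoing the sums from that top header (a sketch; as the poster notes, kernel memory isn't included in these figures):

```python
# Values in KiB, copied from the top output above.
total, used, free = 1426528, 1116820, 309708
buffers, cached = 121976, 777864

reclaimable = buffers + cached        # cache the kernel frees on demand
effectively_free = free + reclaimable # what an application could actually get
really_used = used - reclaimable      # what the system itself occupies

print(effectively_free // 1024)       # 1181 (MiB available in practice)
print(really_used // 1024)            # 211  (MiB actually held by programs)
```

The same subtraction is what distinguishes top's "used" column from the memory genuinely unavailable, which is the Vista SuperFetch argument in *nix clothing.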
  • by drsmithy ( 35869 ) <drsmithy@nOSPAm.gmail.com> on Wednesday February 21, 2007 @07:45AM (#18094310)

    I was thinking the same thing, how much more than eye candy has every release been since 95. Obviously the switch to the NT kernel was big but really the biggest difference in each release has been eye candy.

    No, it hasn't. Typically the eye candy has been the _least_ significant part of OS updates (albeit the most user-visible).

    Can you imagine how fast win 95 would run on an AMD Athlon 64 6000+ with a gig of ram.

    Nowhere near as well as XP. Windows 95 was optimised for slow machines with very little RAM. It simply can't make good use of the extra hardware.

    This is a pretty common occurrence in OS development. Early versions of Linux can't make anywhere near as good use of multiple processors and large amounts of memory as more modern versions can.

  • by Lumpy ( 12016 ) on Wednesday February 21, 2007 @08:56AM (#18094646) Homepage
    Yeah, for low-end websurfing use. Me, I'm sitting here looking at it going "HOLY SHIT!" because I do video editing, which means I'd need to bump RAM up to 8GB or higher because the OS is such a pig.

    If I need 4GB for the OS's sweet spot and another 4GB for my editor app's sweet spot, I start looking at different platforms.

    Problem is that these findings of a "sweet spot" are not telling the full story. What apps are they running? If they are simply using low-impact apps like Office and IE/Firefox and a few games, then hands down the OS is being a RAM pig, which is incredibly unacceptable to those of us who use RAM-intensive applications.

    Reinforces my decision that the next upgrade I take is to the Mac. Until then I need to find an NLE that will be happy in XP for a few years.
  • by Lanu2000 ( 972889 ) on Wednesday February 21, 2007 @09:13AM (#18094754)
    While on the surface this seems like a good request, it seems to me that doing this would be more harmful to Apple's reputation than helpful. Unlike Microsoft, which (not counting peripherals) is primarily in the software market, Apple integrates their OS and hardware, so they have fewer hardware configurations to support. If they opened it up to the beige boxes of the world, the perceived quality of their OS would suffer... things wouldn't "just work" like they do now.
  • by div_2n ( 525075 ) on Wednesday February 21, 2007 @09:34AM (#18094912)
    Since this is the year 2007, I don't see Vista being far out of the ballpark,

    Since there are a very large number of desktops and laptops still being (successfully) used that won't even hold more than 1GB of RAM, I'd say the fact that 1GB of RAM will not provide good performance is beyond out of the ballpark. It's stupid.

    As far as I know, Vista is the only OS in existence that won't run that great with 1GB of RAM. So is Vista so much more advanced that it needs that much RAM? Since all the new eye-candy features seem to be ripoffs of OS X, which runs just fine on 1GB of RAM, I'd say no.
  • Re:not just Dell (Score:2, Insightful)

    by Hes Nikke ( 237581 ) on Wednesday February 21, 2007 @10:58AM (#18095732) Journal

    The only way to get XP is to go with a used, refurbished, or off lease machine, or like another person said, go to the business machine section.

    come on man! you're a geek/nerd/dork, you should be able to think outside the box! i mean you could buy the os [newegg.com] you want [ubuntu.com].... or you could just get a mac [apple.com] and not worry about it... i dunno, just a few suggestions rather than complaining that manufacturers aren't serving your needs...
  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Wednesday February 21, 2007 @02:36PM (#18099002) Homepage Journal

    While you used DOS to handle text editing, I was using Micro-Emacs (multiple buffers, copy-paste etc). I was using scalable fonts with a GUI.

    There are editors with that functionality on DOS. And there is GEOS for DOS, which is a multitasking OS/shell with scalable fonts. (GEOS supposedly uses DOS only for filesystem access.)

    You collected 256 color images, and I collected 4096 color images (and later 16 million color images, thanks to A1200).

    A1200 still suffers from inherent limitations of the Amiga architecture; only 2MB RAM accessible to the custom chips.

    Also you couldn't do HAM Animation without using up the CPU, and you couldn't do it fast enough for video, so the 8bpp mode on the PC (when VGA came along) was superior for most purposes to the Amiga, aside from the bitblt and such routines.

    You played text adventures or adventures with 16 items at most, while I was playing Shadow Of the Beast (over 400 colors on the screen, 18 levels of scrolling, screen-sized sprites, 60 frames per second, incredible digital sound).

    And still the same boring platformer play. And really lousy collision detection. And a one-button joystick, sigh. Some games I found more impressive included Powerdrome and even Blood Money, which is a pretty standard fly-n-shooter. Also Indianapolis 500; the Amiga 500 experience was much like the $4,000 386 experience.

    Amiga sound back in those days was 4 channel and 22kHz, much more impressive than PC (Adlib, FM synthesis, whee!) but still annoying :)

    Still, sound was one of the best things about the Amiga. MOD files are still neat.

    You configured interrupts manually, I had auto-configurable zorro slots.

    This is THE best thing about the Amiga, on top of the hardware-level autoconfig there's the fact that AmigaDOS is a microkernel-based system where drivers are user space processes. The driver could be just another program, stored in ROM on the card.

    The Amiga was destined to die because custom chips don't scale. You have to make new custom silicon to take advantage of modern processors. I mean, that's why cpublit exists; if you have a decent processor in your OCS machine (which in Amiga-land is like, 25MHz) the CPU is faster at doing a bitblit than the custom chips. Even if Commodore hadn't been mismanaged into oblivion the Amiga still had no chance to survive.

  • by IgnoramusMaximus ( 692000 ) on Wednesday February 21, 2007 @11:04PM (#18104604)

    Vista maps the GPU and system RAM used for 'video' together, so by using the new WDDM model and a AGP or PCI/e bus, there is no need for Vista to shove the System used RAM for video back into the GPU RAM space in order to let the GPU use it or draw with it. It can stay in System RAM, and the GPU sees it as native GPU RAM.

    Except of course that this is impossible with any GPU using dedicated on-board RAM. The whole point of on-board RAM is that its bandwidth is much, much wider than that of even the PCI/e bus. Also, because system buses are prone to becoming bottlenecks that hamper application performance when large numbers of large textures are involved, most higher-end GPUs use texture compression algorithms coupled with GPU-bound hardware decompression schemes, effectively precluding any attempt to use system RAM for such activities. In other words, what you are describing is only possible on cheesy, sub-$100 "GPU"s with laughable 3D performance.
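    A back-of-the-envelope comparison makes the gap concrete. The numbers below are my own era-typical assumptions (PCIe 1.x per-lane throughput, and a 384-bit GDDR3 bus on a 2006 high-end card), not figures from the post:

    ```python
    # PCIe 1.x: 250 MB/s per lane per direction, 16 lanes
    pcie_x16 = 16 * 250e6                # 4.0e9 bytes/s over the system bus

    # On-board GDDR3, 384-bit bus at 1.8 GHz effective data rate
    onboard = (384 // 8) * 1.8e9         # 86.4e9 bytes/s on-card

    print(onboard / pcie_x16)            # ~21x: why textures want to live on-card
    ```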

    I think this is the bigger point you are missing

    See above.

    99.9% of desktop users that have 128MB on their Video card will not see this effect, the chance that you have enough active Windows open to eat into system RAM is not as likely as you might think.

    This assumes that no other 3D apps/applets/what-not are in use (thus no virtualisation of any kind is in effect) and also that none of the textures used are duplicated in system RAM, as is usually the case with complex 3D scenes, since textures tend to expand rapidly after decompression in hardware. The moment even one non-DirectX-10 3D app is in use, the whole scheme blows apart and up to 128MB has to be virtualised per application in addition to the OS.

    Now with games, this WILL happen; however, in gaming what is more beneficial to you in terms of performance? Loading the game textures off the Hard Drive continuously, or letting the system pretend you have more GPU RAM so the textures are 'virtualized' in system RAM?

    This of course is completely irrelevant from the point of view of analysing Vista, since all current games optimise texture loading by caching textures in system RAM. They also set up complicated, fine-tuned rendering pipelines and whatnot. If anything, the virtualisation will screw them up (as is the case with most games now), since the designers were not expecting to share the GPU and consequently optimised for the dedicated case. Vista introduces unexpected timing and memory/disk access behaviour which causes most of these games to malfunction. Just check out the various gaming forums for all the moaning coming from Vista users. Turning off Aero is pretty much a prerequisite to getting most current games to run with any reasonable stability.

    If you know anything about gaming, yanking crap off the hard drive compared to being able to use another small chunk of system RAM to hold them is going to yield a far better experience. And this is also superior to just caching them, as the Game is not having to load/request the textures continuously. Instead the Game sees the textures as already loaded in GPU space and can use them as if they were sitting on the Video card itself.

    See above

    Vista will move textures to the GPU RAM if they are in high performance use. So some of the initial calls to a texture will be virtualized and used from System RAM, but if the texture is frequently used and another texture is not being used, Vista will flip them out so the higher priority texture will then be sitting in GPU RAM, which is faster.

    This, naturally, is complete nonsense.

    As I pointed out, swapping textures into the GPU RAM from system RAM is anything but "high performance".
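    For what it's worth, the policy the grandparent describes amounts to a least-recently-used cache of textures in GPU RAM. A toy sketch of that idea (illustrative names only, nothing to do with Vista's actual WDDM internals):

    ```python
    from collections import OrderedDict

    class TextureCache:
        """Toy LRU model of GPU texture residency.

        Recently touched textures stay in (simulated) GPU RAM; when space
        runs out, the least-recently-used texture is evicted back to
        system RAM and must be re-uploaded over the bus next time.
        """
        def __init__(self, capacity):
            self.capacity = capacity          # number of resident textures
            self.resident = OrderedDict()     # texture id -> data

        def touch(self, tex_id, data=None):
            if tex_id in self.resident:
                self.resident.move_to_end(tex_id)   # mark as recently used
                return "gpu-hit"
            if len(self.resident) >= self.capacity:
                self.resident.popitem(last=False)   # evict the LRU texture
            self.resident[tex_id] = data
            return "uploaded"                       # copied over the bus

    cache = TextureCache(capacity=2)
    print(cache.touch("grass"))   # uploaded
    print(cache.touch("sky"))     # uploaded
    print(cache.touch("grass"))   # gpu-hit
    print(cache.touch("rock"))    # uploaded; evicts "sky", the LRU entry
    print(cache.touch("sky"))     # uploaded again: the slow bus transfer at issue
    ```

    Even in this toy form, every "uploaded" after the cache fills is a trip across the system bus, which is exactly the cost being argued about.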

    Also, Vista has no business messing with "optimising" per-application textures since it is impossible for an OS to estimate the usage patterns an
