Microsoft Windows

Microsoft To Dump 32-Bit After Vista

SlinkySausage writes "Microsoft has used its annual hardware engineering conference to announce that Windows Vista and Server 2008 will be the last versions of Windows capable of booting on 32-bit CPUs such as Intel Pentium 4 and Core Duo. AMD, which introduced 64-bit CPUs early — much to the derision of Intel, which said there was no use for them at the time — must be delighted with Microsoft's decision. Owners of first-generation Intel Macs that used (32-bit only) Core Duo CPUs may not be so happy knowing that Vista will be the last Windows they will be able to run."
  • by Spamalope ( 91802 ) on Thursday May 17, 2007 @11:20AM (#19162387)
    made today will be able to run the Microsoft replacement for Vista. Why worry?
  • by Gharbad ( 647620 ) on Thursday May 17, 2007 @11:20AM (#19162397)
    Wasn't all the talk during Vista's development that it would be the last operating system they'd make?

    I know that was taken back a while ago. Just saying.
  • The real questions are:
    • will hardware vendors stop releasing 32-bit chips?
    • Will companies upgrade hardware in order to get the latest version of Windows?
    • Will this help provide more incentive for a Linux desktop?
    • Will this increase the amount of lead going into our landfills?
  • Huh? (Score:5, Insightful)

    by bakes ( 87194 ) on Thursday May 17, 2007 @11:24AM (#19162525) Journal
    Microsoft themselves don't fully support 64-bit yet. I installed the 64-bit version of SQL Server 2005 only to find it doesn't support 64-bit SQL Mail or SSIS; you have to run the 32-bit versions of them under WoW64. Someone else has already mentioned drivers. If Microsoft can't or won't support their own software in 64-bit environments, they are going to have a heck of a time convincing developers to push everything over.

    I fear there will be a loooooooong transition time - just as well they gave everyone an early warning.
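
    To illustrate the WoW64 point: a 32-bit program can ask Windows whether it is running under the 64-bit kernel's emulation layer via the Win32 call IsWow64Process. A minimal sketch (untested, assumes a Win32 C toolchain; the function is looked up dynamically because very old Windows versions lack it):

        /* Minimal sketch: detect whether a 32-bit process runs under WoW64. */
        #include <windows.h>
        #include <stdio.h>

        typedef BOOL (WINAPI *IsWow64Process_t)(HANDLE, PBOOL);

        int main(void)
        {
            BOOL wow64 = FALSE;
            /* Look IsWow64Process up at runtime; older kernels lack it. */
            IsWow64Process_t fn = (IsWow64Process_t)GetProcAddress(
                GetModuleHandle(TEXT("kernel32")), "IsWow64Process");

            if (fn && fn(GetCurrentProcess(), &wow64) && wow64)
                printf("32-bit process on 64-bit Windows (WoW64)\n");
            else
                printf("Native process (or no WoW64 support to query)\n");
            return 0;
        }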
  • by gumbi west ( 610122 ) on Thursday May 17, 2007 @11:25AM (#19162537) Journal
    What, are you nuts? This means that in seven years your computer won't be able to run a newer MS OS that's worse than Vista, but with MS games that lock out Vista. You'll be stuck with OS 10.8 with a dual boot to XP or Vista or any of a number of *nix OSes. A sad, sad computer it will be.

    Actually, what I thought was crazy is that Apple customers aren't the only ones using the Core processors, so why single them out? Is Apple even the largest customer for Intel's 32-bit processors?

  • by Sancho ( 17056 ) on Thursday May 17, 2007 @11:25AM (#19162543) Homepage
    I don't know. Some people attribute the raging success of Apple's computer line in the past couple of years to the switch, because virtualization is now much better. Certainly most of the geeks I know that run Apple only switched because they could use virtualization to run those apps that they could not live without, as well as for testing in other OSs.
  • by postbigbang ( 761081 ) on Thursday May 17, 2007 @11:27AM (#19162603)
    When you consider that Vista took many more years than planned, the next Windows release ought to arrive around retirement age for most of us.

    That, and since Microsoft seems to feel that your next PC will be a cell/mobile phone, I'm waiting for the advent of the 64-bit mobile phone processor. Imagine its 128-bit successor. You'll be able to address every bit in the known universe with the memory map on *that* one.

    Or perhaps 'legacy' hardware will get some much-needed added life from ultra-fast 32-bit processors that simply do work faster than their 64-bit equivalents, because code maturity will force optimizations.
  • by Chris Burke ( 6130 ) on Thursday May 17, 2007 @12:04PM (#19163415) Homepage
    will hardware vendors stop releasing 32-bit chips?

    As far as AMD and Intel are concerned, 32-bit-only processors are nearly gone already.

    Will companies upgrade hardware in order to get the latest version of Windows?

    Maybe, but it's more likely they'll upgrade for the higher system specs, as they're having to do for Vista, than specifically to get 64-bit. Any upgrade that can run Vista probably includes a 64-bit processor anyway, even if they don't run a 64-bit OS.

    Will this help provide more incentive for a Linux desktop?

    Nope.

    Will this increase the amount of lead going into our landfills?

    Can't see how, but I don't know.
  • Yeah, I know. What's the deal with the OP?

    Owners of first-generation Intel Macs that used (32-bit only) Core Duo CPUs may not be so happy knowing that Vista will be the last Windows they will be able to run.

    This leads me to a few questions:

    • Of all the people using 32-bit processors, why single out Mac users? Mac users often don't even use Windows at all.
    • ... which leads me to a second question: Is this supposed to be sarcastic?
    • What makes you think Microsoft will stick to this?
    • What makes you think we won't all have new computers before Microsoft releases their successor to Vista?

    Microsoft is notorious for having high expectations and grand plans, taking too long to execute, and dropping most of their features, improvements, and changes before the end product is released.

  • by iainl ( 136759 ) on Thursday May 17, 2007 @12:10PM (#19163537)
    2017 might be a comic exaggeration. But:

    1) Who bought a 32-bit processor for Christmas?

    2) Who bought something capable of running Vista in 2001 when XP launched?
  • by 0123456 ( 636235 ) on Thursday May 17, 2007 @12:28PM (#19163905)
    "All those other 64-bit CPUs you mention aren't x86 compatible and hence irrelevant to the workstation market."

    A few years ago, those 64-bit CPUs _WERE_ the workstation market.
  • by DrYak ( 748999 ) on Thursday May 17, 2007 @02:18PM (#19166001) Homepage

    cause most of the competing processors (PARISC, Alpha, MIPS) to abandon the market even before the Itanium shipped

    Not exactly.
    HP collaborated [wikipedia.org] on the design of the Itanium, so pulling out of the market before the Itanium arrived was a logical decision from their point of view: they were going to replace the Alpha with the newer baby jointly developed with Intel.

    MIPS continued for some time after the Itanium and was progressively dropped when its sales fell too low, in both the workstation and embedded markets.

    (PARISC: I don't know; I suspect it was a somewhat similar situation.)

    What kills processors is twofold. The main reason is a big ball and chain called "binary legacy":
    - Businesses are used to running Windows on their workstations, and Microsoft has never supported additional platforms for long (Alpha got only one NT version; Itanium got only one XP version), because supporting multiple platforms is hard for them. (The only reason they'll keep supporting x86_64 is that they'll kill 32-bit instead, so they'll still have only one main architecture to support.) So there was a lack of interest from the largest consumers because of this absence. (Of course there are a lot of shops running Unices, but they aren't profitable on the same scale as Dell's consumers.)
    - Even if you had the corresponding OS for your platform, you still needed software to run on it, and the problem whenever something new arrives is that there is a decade of legacy code to run on it. Every time a new design arrives it may be largely superior, but corporate PHBs don't give a damn about design quality; they only want to know whether their applications will run. Alpha wasn't compatible at all with x86 applications; neither was the 68k back in its day, nor MIPS, nor PARISC. Itanium had lousy compatibility because it was done mostly in software and thus ran much slower than the rest. Transmeta only survived because it had decent speed running x86 code and very low power requirements. The main reason Itanium flunked and AMD64 prevailed is that the latter was an extension of x86. Yes, this extension is a hack, but it's a hack with full backward compatibility that can be plugged into a PC and run today's OS with today's applications. Backward compatibility has been both x86's main advantage and its main problem.
    - There is also the problem of drivers. Even if the newer architecture uses a standard bus such as PCI, where you can plug in all your existing hardware, you still have to get drivers for it, and few manufacturers will go through the hassle of supporting yet another binary format. Linux is already too much for them; exotic CPUs are beyond them. Early adopters of Windows XP 64-bit (both Itanium and AMD64) will remember.
    - Compare this to the Linux world, where, because the source is freely available and there are projects such as Debian that take care to ship only very stable code, switching to a newer architecture is mostly a matter of recompiling the code for the new platform. That's why Linux was available for AMD64 almost overnight, running on Transmeta simulators before the actual hardware was available: it mostly just had to be recompiled, which was easy because it leveraged the work already done porting the code to other 64-bit platforms (Sparc64, or the other 64-bit architectures you mention).

    In a perfect world where everyone used open source, the architectures you mention could have survived, no matter what marketing came out of Intel, and everyone could benefit from the latest great processor with a clean, elegant design. But in a world where Microsoft has a huge monopoly on desktop machines and everyone runs binary apps and drivers, only hacks of the pre-existing architecture can have a significant impact. That's why you haven't seen anything revolutionary for the past 35 years (software compatible since the 8-bit 8008 in 1972, binary compatible since the
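
    To make the recompile-versus-binary-compatibility point concrete, here is a trivial sketch of what "porting by recompiling" looks like in practice: the same C source builds unchanged on each architecture, and only the compiler's predefined macros (the GCC-style names are assumed here) differ:

        /* Sketch: one source file, recompiled per architecture.
         * The macro names below are the common GCC-style ones. */
        #include <stdio.h>

        int main(void)
        {
        #if defined(__x86_64__)
            const char *arch = "x86_64 (AMD64)";
        #elif defined(__i386__)
            const char *arch = "32-bit x86";
        #elif defined(__alpha__)
            const char *arch = "Alpha";
        #elif defined(__mips__)
            const char *arch = "MIPS";
        #else
            const char *arch = "some other architecture";
        #endif
            printf("Built for %s; pointers are %u bits wide\n",
                   arch, (unsigned)(8 * sizeof(void *)));
            return 0;
        }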

  • by DrYak ( 748999 ) on Thursday May 17, 2007 @02:30PM (#19166245) Homepage
    This decision by Microsoft to drop 32-bit support *may* boost Linux (and other OS) adoption.

    Currently I know some friends who use old machines, and a lot of machines at the university (especially in labs) are still based on the P2/P3 or other CPUs of that era: ten-year-old processors.

    "A next Windows" has no chance of happening before 2013, considering their current release speed of 6 years between XP and Vista. Worse if we take into account that Microsoft has promised to build an entirely new capability-based microkernel OS. Which is very unlikely, given their tendency of scraping newer non-eyecandy idea out of Vista because of time constraints.

    By the time Microsoft finally releases their next piece of shit, there will be a lot of ten-year-old, 2003-era processors everywhere (Intel Pentium 4, 32-bit-only Intel Core, AMD Athlon XP, early 32-bit AMD Semprons):
    This means that when Windows-the-next (tm) comes out, either there will be a massive switch toward other OSes (very likely in university labs), or the new OS will see an even slower reception than Windows Vista is currently experiencing (very likely on Joe Six-pack's older 32-bit home machine).

    The last similar switch in technology requirements was Windows 95: the first consumer-oriented, widely diffused Microsoft OS that could only run on 32-bit protected-mode CPUs.
    In 1995 (okay, 1996) when it came out, the Intel 80386 was ten years old and had finished displacing the 16-bit-only 80286.
    99% of home computers were equipped with 32-bit, Windows 95-"mostly"-capable CPUs ranging from the 386 to the Pentium.
    That's why it went "somewhat more smoothly".
    Throwing out the 32-bit architecture now is far too early. Microsoft should wait until it is completely phased out of the market in most segments (if possible, including the small embedded/ITX market of people building low-power boxes; current VIA chips are 32-bit only). The problem is that maintaining compatibility for more than one architecture has always been too much work for Microsoft (Alpha and MIPS got only a couple of NT releases; Itanium hasn't got many more), unlike the open-source community.
  • by Chris Burke ( 6130 ) on Thursday May 17, 2007 @02:38PM (#19166431) Homepage
    However, there's a reason why x86 is still the dominant platform extant. Underneath all the hacks and kludges and other cruft, the basic platform is stable, completely documented, and TIME TESTED.

    BWA HA HA HA HA HA HA!

    Sorry, I'm not trying to make fun; that was an excellent post, so stumbling across that small clause thrown in there made me Laugh Out Loud.

    Sadly, there is a lot in x86 that isn't documented, especially if you're looking for all that documentation in one place; even then, you're never going to find every piece of undocumented behavior. The worst part is that a lot of it you would never think could matter, yet it ends up mattering a lot. Some of it has been discovered and documented on the net; the rest is "documented" only in the heads of the engineers who made the chips. In my opinion, this is ultimately one of the only relevant digs against x86: it makes it extremely difficult to build fully compatible x86 chips, which is part of why so few people make them.

    Still, as long as all the AMD and Intel engineers aren't wiped out simultaneously, we should be okay. Transmeta and VIA still know how to make x86 chips too. But pretty much any other ISA is better documented than x86.
  • by SEMW ( 967629 ) on Thursday May 17, 2007 @02:52PM (#19166701)
    Many other people have pointed out your mistakes: using a line graph with a non-continuous x-axis (i.e. name of release), using a linear scale to plot something that should be expected to grow exponentially (think of a RAM equivalent of Moore's law; doubling every x years is exponential, not linear), etc. I've fixed most of them by taking your data and plotting the natural logarithm of recommended RAM against the release date:

    Google spreadsheets: http://spreadsheets.google.com/ccc?key=pLElDZW8EaPdjJ0gOQ4r0MQ [google.com]

    PNG (for those who can't view Google Spreadsheets): http://img511.imageshack.us/img511/6696/memoryrequirementscb5.png [imageshack.us]

    As you can see, it's pretty much a straight line, exactly as you'd expect.
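
    For anyone who wants the one-line justification: if recommended RAM doubles every T years (a Moore's-law-style assumption, with T and the baseline R_0 left as free parameters), the straight line falls out immediately:

        \[
          R(t) = R_0 \cdot 2^{(t - t_0)/T}
          \quad\Longrightarrow\quad
          \ln R(t) = \ln R_0 + \frac{\ln 2}{T}\,(t - t_0)
        \]

    So ln(RAM) plotted against release date is linear, with slope (ln 2)/T.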
  • Re:YES! (Score:3, Insightful)

    by Chris Burke ( 6130 ) on Thursday May 17, 2007 @03:19PM (#19167277) Homepage
    You think the hardware vendors are waiting around? They are already dreaming of 128-bit CPU's.

    Well, I see what you're getting at (hardware vendors wanting to sell upgrades), but no, they aren't dreaming of 128-bit CPUs, because 64 bits really is going to be enough for a long time. 2^64 is huge.

    Previous jumps made a lot more sense. 4 to 8 to 16 bits was automatic: as soon as transistor budgets were high enough, it made sense to do it. 16 bits was never sufficient either; 64K isn't even a very long text file, and PCs already had ten times that much RAM, which had to be addressed through segments. 32 bits gives you 4GB of address space, which is starting to get pretty reasonable: more than sufficient for quite a while, but not ridiculously huge either. Servers bumped up against it first, but by the time AMD released the Opteron, even my not-too-expensive home desktop had 2GB of RAM. Intel may have been right that desktops didn't exactly need 64-bit, but overall the time was ripe to change.

    The thing is, though, that while Moore's Law is exponential, the reach of an address grows super-exponentially in the number of bits, as in 2^(2^N). So every time we double the number of address bits, we double the number of generations it takes for memory densities to catch up. A 32-bit machine can address 65,536 times more than a 16-bit machine; a 64-bit machine can address about 4 billion times more memory than a 32-bit one.

    So it's going to be a while (at least twenty years, even if the exponential growth in memory capacity continues unabated) before there's any point in even considering 128-bit addressing. Yes, hardware vendors like to promote upgrades, but it's easy enough to do that just by offering more performance, lower power, or other features. Adding bits means adding cost in datapaths and in pins, and it means convincing software vendors to rewrite their code so there's any point in having those extra bits and datapaths in the first place; if the software people don't want those bits, they won't buy in, and it's all just waste.

    Oh, and if and when we do ever switch to 128-bit addressing, which I'm predicting won't be for at least another two decades, we will never switch to 256-bit addressing, at least not until we leave the Milky Way and are no longer satisfied with being able to uniquely address every particle in a single galaxy.
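
    To put rough numbers on the catch-up argument above (a sketch; the two-year doubling period is an assumed Moore's-law-style rate, and 2^64 won't fit in a 64-bit integer, so doubles are used):

        /* Sketch of the address-space arithmetic discussed above. */
        #include <stdio.h>
        #include <math.h>

        int main(void)
        {
            double b16 = pow(2.0, 16), b32 = pow(2.0, 32), b64 = pow(2.0, 64);

            printf("16 -> 32 bits: %.0f times the address space\n", b32 / b16);
            printf("32 -> 64 bits: %.0f times the address space\n", b64 / b32);

            /* 32 extra address bits = 32 doublings of capacity; at an
             * assumed doubling every ~2 years that is ~64 years, well
             * past the "at least twenty years" claimed above. */
            printf("Catch-up time at 2-year doublings: ~%d years\n", 32 * 2);
            return 0;
        }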
  • by noewun ( 591275 ) on Thursday May 17, 2007 @03:46PM (#19167869) Journal

    None of this touches the twin problems that make Microsoft's release schedules so awful: the religion of backwards compatibility, and an over-managed yet near-chaotic corporate culture that emphasizes endless meetings and paper trails over innovation. Both stem from something Microsoft can't control: the necessity of leaning on Windows/Word as their two dominant profit engines. Essentially, Microsoft has worked its way into a position in which true innovation (of the kind Apple was forced into by the failures of Copland and Pink and the adoption of OS X) is nearly impossible, because anything that threatens to cut off a sizeable portion of the user base directly threatens the company's bottom line.

    In other words, the problem isn't Windows per se, or 32- versus 64-bit, or any other technical issue. The problem is that Microsoft needs Windows to be simultaneously the same old operating system you've been using for years and the latest, greatest thing, and it can't be both. From a technology point of view, the best thing would be to remake Windows from the ground up, as Apple was forced to do with OS X, and just tell people that if they bought their machine before 2001 they're out of luck. But they can't, and won't, do this, so their release schedule will continue to be constrained by the need to do two opposing things at the same time.

  • by Anonymous Coward on Thursday May 17, 2007 @05:22PM (#19169799)
    What are you smoking? The two top-end game consoles (Xbox360 and PS3) have 512MB of main memory, which is shared between graphics AND everything else. Compared to that, a limitation of 2-3 GB on PC games is practically no limitation at all.

    The people who are running out of memory are DBAs and server-farm operators. Maybe MMO server developers have problems with the 4GB limit.

    In practice, everybody who struggles with the 4GB limit is buying their own hardware anyway, so they can buy PAE-capable boards and run PAE-capable software on them.

    4GB is not a practical limitation for everyday consumers (yet), and I don't expect it will be for at least 10 more years, no matter HOW bloated Microsoft makes their software.
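
    If you want to check whether a given x86 box can do PAE at all, the CPU itself reports it: CPUID leaf 1 sets bit 6 of EDX when PAE is supported. A sketch using GCC's <cpuid.h> helper (assumes GCC or a compatible compiler on x86):

        /* Sketch: query the CPU's PAE feature flag (CPUID.01h:EDX bit 6). */
        #include <stdio.h>
        #include <cpuid.h>

        int main(void)
        {
            unsigned int eax, ebx, ecx, edx;

            if (__get_cpuid(1, &eax, &ebx, &ecx, &edx) && (edx & (1u << 6)))
                printf("CPU reports PAE support (>32-bit physical addressing)\n");
            else
                printf("No PAE support reported\n");
            return 0;
        }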
  • by level_headed_midwest ( 888889 ) on Thursday May 17, 2007 @06:36PM (#19171211)
    XP already has IPv6 support. Direct3D 10 can supposedly be grafted onto XP; it's only a question of whether the driver will need to be hacked to do so. New hardware could prove a thorny issue, though, as 64-bit Vista and Vienna drivers cannot load into a 32-bit XP kernel.

"What man has done, man can aspire to do." -- Jerry Pournelle, about space flight

Working...