
Microsoft Takes Step Toward Phasing Out 32-bit PC Support for Windows 10 (zdnet.com) 105

Starting with Windows 10 2004, Microsoft is changing the minimum hardware requirements for the OS. The change affects only new PCs from OEMs, not existing ones. From a report: According to the documentation, Microsoft will no longer make 32-bit Windows 10 media available to OEMs. For now, Microsoft is still allowing users to buy 32-bit Windows 10 at retail and to continue to get updates for their existing 32-bit Windows installations. Anyone with a 32-bit PC should be fine for as long as their devices remain usable. "Beginning with Windows 10, version 2004, all new Windows 10 systems will be required to use 64-bit builds and Microsoft will no longer release 32-bit builds for OEM distribution. This does not impact 32-bit customer systems that are manufactured with earlier versions of Windows 10; Microsoft remains committed to providing feature and security updates on these devices, including continued 32-bit media availability in non-OEM channels to support various upgrade installation scenarios," Microsoft wrote.
  • by xack ( 5304745 ) on Thursday May 14, 2020 @09:21AM (#60059430)
    Companies still have a lot of 30 year old software from Win 3.1 and DOS that they don't have the source to update. Windows LTSC should support 32-bit indefinitely for these situations. Apple lost a lot of credibility when it forced the dropping of 32 bit; I hope Microsoft is smarter.
    • by WoodstockJeff ( 568111 ) on Thursday May 14, 2020 @09:29AM (#60059466) Homepage

      Not necessarily - it is possible to run 16-bit and DOS software using a 32-bit VM. I've been doing that for years after I ran out of copies of Win7 32-bit, using a Microsoft-supplied XP virtual machine.

      • by MightyMartian ( 840721 ) on Thursday May 14, 2020 @09:51AM (#60059570) Journal

        You can run Windows 3.1 under DOSEMU as well, or through virtualization.

      • Comment removed (Score:4, Interesting)

        by account_deleted ( 4530225 ) on Thursday May 14, 2020 @11:46AM (#60060048)
        Comment removed based on user account deletion
      • Not necessarily - it is possible to run 16-bit and DOS software using a 32-bit VM. I've been doing that for years after I ran out of copies of Win7 32-bit, using a Microsoft-supplied XP virtual machine.

        Running a VM is a pain in the ass for non-techies. It's one thing to ask your sysadmins to perform functions on a VM. It's quite another to have non-tech staffers go "Look, you'll just have to fire up a VM and run it on an old OS there".

        Basically, companies with legacy 16 bit software are going to be forced to find a replacement. I myself run some old 16 bit software in a VM environment (Thanks, VirtualBox!), but that's as a hobbyist. Microsoft (and to an extent, Intel as well) is forcing the hand of businesses.

        • It doesn't have to be a pain. For example, old 16-bit games are often packaged with the DOSBox emulator. The virtualization isn't even apparent to the user, as all they ever see is the application.
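          For what it's worth, that kind of transparent packaging usually comes down to a bundled dosbox.conf whose [autoexec] section mounts the game directory and launches the binary, so the user only ever sees the game. A hedged sketch; the folder and GAME.EXE names here are made up for illustration:

          ```ini
          # dosbox.conf shipped next to the game files (illustrative paths/names)
          [sdl]
          fullscreen=true

          [autoexec]
          # Mount the folder holding the 16-bit binary as drive C:
          mount c .\game
          c:
          GAME.EXE
          # Close DOSBox when the game exits, so the wrapper feels native
          exit
          ```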

    • Support (Score:5, Interesting)

      by JBMcB ( 73720 ) on Thursday May 14, 2020 @09:31AM (#60059476)

      The companies I know of in that situation (mostly manufacturing job shops using old production control software) have a stack of old hardware to support their old systems. You can buy old Pentium hardware for almost nothing, and it will run Windows 3 just fine. One IT manager who really had his act together had images of all of the weird DOS/Win31/Win95/OS2 configurations and could burn them to CF cards, which all the old hardware ran off of. He could spin up a replacement commodity hardware box in minutes. These were used to load patterns into old CNC machines, which were not supported by the manufacturer anymore, and the replacement cost, each, would have been around $200,000.

      • This is smart, and it's the right way to do things.

        • Re:Support (Score:5, Informative)

          by Wolfrider ( 856 ) <{moc.liamg} {ta} {nortuengnik}> on Thursday May 14, 2020 @10:44AM (#60059806) Homepage Journal

          --Actually, the smart way to do things would be to start a plan to get off 30 year old hardware and software, unless you're running a mainframe. It's not the least expensive way, but there's literally no future in continuing to run old unsupported proprietary software. Eventually that whole house of cards will fall over. It's like relying on original PS/2 hardware to run your shop - it was never designed to run that long. If you don't have a Disaster Recovery plan, you're out of business - and so are your line workers.

          --OMG, back in the mid-2000s I had a project to (finally) convert production software that was running in BASIC, believe it or not. We hired a professional to take a look at the code, and he discovered a bunch of bugs that needed to be fixed during the conversion to a more modern platform.

          • It's a cost/benefit calculation. A lot of these machines can do their job just fine. It's just a case of planned obsolescence by the vendor to force more hardware sales. Why buy a new water jet when the old one is just as fast and precise? Depending on what the machine is, $200k may be underestimating it. Back when I did IT in aerospace we had to support hardware that got into the millions. At that cost it is completely worth it to set up the logistics of supporting old hardware. These aren't commodity systems.

            • Re:Support (Score:4, Interesting)

              by JBMcB ( 73720 ) on Thursday May 14, 2020 @12:36PM (#60060280)

              Depending on what the machine is, $200k may be under estimating it.

              That's a good point. The cost of the new hardware is around $200,000. Then you have the downtime of the swapout, re-training everyone on using the new machine, validating its output, and transferring all the old part files to the new format. The overall replacement cost is probably much higher.

              Not entirely related, but my wife works in the automotive manufacturing field, and they are famous for using equipment for *decades*. One of their sheet metal shops still runs gigantic steel presses that are fifty years old. They've replaced the control electronics and motors dozens of times, but the press itself is the size of a small house, weighs multiple tons, and is anchored into the foundation of the building. Replacing it would cost tens of millions of dollars. When she first started working in the '90s, it wasn't uncommon to see PDP-11s in factories, as they ran completely custom production software that cost millions of dollars to develop. Replacing it would have cost millions more, plus downtime for retraining, plus all the problems and bugs you are going to have to work out...

              Computers are tools, just like a press. If it works, and you can maintain it, there's no reason to stop using it.

          • Manufacturing plants have maintenance contracts for $200,000 CNC machines. Those machines are repaired, not replaced, for as long as they can produce parts that meet tolerances. They do not have replacement plans for those machines. Upgrade principles and cycles for computer hardware/software do not apply to $200,000 plant equipment.
            • by mccalli ( 323026 )
              They absolutely do. The timescales are clearly different, but the principle is exactly the same. If you have a business dependent on such machines, then you absolutely should have an obsolescence programme to go with it. Saying they cost millions...so do lots of other things. If they cost millions they presumably produce business value in the multiple millions. The problem is exactly the same.
          • The reason is cost. I've been where I am just over 10 years. When I started, 32-bit was still very common; it wasn't the most common on new equipment, but it was widely available, especially on less expensive PCs with tiny form factors or affordable laptops (i.e., not for mainstream desktop office use). Things like oscilloscopes still had floppy disks. A decade later most of them are gone, but I think some are still around. I.e., if you replace the machine with something new, then sometimes that

          • Feel free to provide out of the goodness of your heart the $200k+ per system that needs replacing.

      • I maintained an old ultrasound machine that ran windows XP with special drivers you couldn't find anywhere and the vendor couldn't provide. In fact, they required an onsite visit to reinstall windows, which was crazy. We had a clone of the HDD image that we reloaded anytime anything went wrong. We made sure never to connect it to the internet.

        It was just replaced with a new one, but that thing ran for decades because the replacement cost was 5 figures.
        • For one ultrasound machine that I was acquainted with in the late 90s, the image storage was on a Macintosh stuck inside. That also needed periodic maintenance.

          There is a LOT of medical software as well that is utterly dependent upon Windows, even though Windows is not stable. And by "not stable" I don't mean that it crashes a lot, I mean that it changes a lot. You cannot rely upon buying Windows and hoping that it remains unchanged over the years. And for medical equipment or software, they want to kee

      • You can buy old Pentium hardware for almost nothing, and it will run Windows 3 just fine.

        This was true 10-15 years ago. Now vintage computing is a hot thing, eBay has a category dedicated to it, and those old first-gen Pentium systems are expensive now. You can get a Pentium 4 box for under 30 bucks most days. A Core 2 or first-gen i3 for under 60. But that circa-1996 Pentium 120 MHz box? It's going to set you back at least a couple of hundred now.

        • I just looked at some of that stuff... and wow. Listings for old Packard Bell computers that sold for $100-$300. Those things were junk when they were new. I suppose if they still work now then maybe they were one of the good ones, but still....

          Personally I'd probably go for something like a Pentium III/early Pentium 4 era system over a Pentium I if you are into old games. They can still run DOS/Windows 98 and do most anything you could do with that old Pentium I. Due to timing issues, for the real

      • by tlhIngan ( 30335 )

        The companies I know of in that situation (mostly manufacturing job shops using old production control software) have a stack of old hardware to support their old systems. You can buy old Pentium hardware for almost nothing, and it will run Windows 3 just fine. One IT manager who really had his act together had images of all of the weird DOS/Win31/Win95/OS2 configurations and could burn them to CF cards, which all the old hardware ran off of. He could spin up a replacement commodity hardware box in minutes.

    • Companies still have a lot of 30 year old software from Win 3.1 and DOS

      You can't expect your software from Windows 3.1 era to happily run when you call up Dell and ask for a new Intel Core i7 with 8GB of RAM and an SSD.

      You are well and truly in specialist territory there, not OEM new machine production.

    • Ending support for 32bit installs doesn't mean they are removing the ability for 32bit applications to run in windows. You can run 32bit software in a 64bit OS. I do it all the time.

      • Well... most people don't have your technical acumen...

        That's simply amazing!

      • by slashcross ( 4471571 ) on Thursday May 14, 2020 @10:28AM (#60059744)

        Ending support for 32bit installs doesn't mean they are removing the ability for 32bit applications to run in windows. You can run 32bit software in a 64bit OS. I do it all the time.

        The problem isn't 32 bit applications, it's 16 bit applications. The new 64 bit processors can run 16 bit programs, but not while they're running in 64 bit mode. If you use a 32 bit OS on the 64 bit processor it will still run the 16 bit applications. And yes, that still matters. There are hardware controllers for very expensive specialized industrial equipment that still use 16 bit control programs to work.
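          As an aside to the mode discussion above: a process can at least check which kind of build it is running as. A minimal Python sketch (works identically on 32-bit and 64-bit interpreters):

          ```python
          import struct
          import sys

          # struct.calcsize("P") is the byte size of a native pointer, which tells
          # you whether this interpreter is a 32-bit or a 64-bit build
          bits = struct.calcsize("P") * 8
          print(f"This process is {bits}-bit")
          print(f"sys.maxsize = {sys.maxsize}")
          ```

          On a 64-bit build this reports 64 with sys.maxsize = 2**63 - 1; a 32-bit Python running under WOW64 on 64-bit Windows would report 32, which is exactly the kind of mode split being described here.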

        • by thereddaikon ( 5795246 ) on Thursday May 14, 2020 @11:37AM (#60060024)

          In my experience nobody is trying to run 16bit code directly within a modern OS. Either it's being emulated or they are keeping contemporary hardware going specifically for the task. I can't think of many cases where I would want to run legacy 16bit software within Windows 10, and I've supported old CNCs and such before, so I'm not talking completely out of my ass on this. Most of this software isn't compatible with Windows 10 to begin with.

          • Comment removed based on user account deletion
          • That's why I was pissed that the installer for Windows 10 (on a test system) deleted a ton of applications from the system with no warning or confirmation, stating they were "incompatible" with Win 10. Gee, did MS ever consider I might want to set up an emulator to run that old software?

            As a result of this, among other things, I never put Win 10 on any of my production systems... not even my gaming rig.

      • Well, OSX has removed all 32-bit support in Catalina, including being able to run 32-bit applications!

        • Well, OSX has removed all 32-bit support in Catalina, including being able to run 32-bit applications!

          The pundits seem to think that Apple did that to prepare for yet another architecture change in macOS, this time from Intel 64-bit processors to Apple/ARM 64-bit processors. By jettisoning all Intel 32-bit support now, they just cut their work in half.

    • Re: (Score:3, Interesting)

      by OrangeTide ( 124937 )

      I set up vDos for a small medical practice so they could continue running the same DOS-based patient records program. They didn't want to retrain the receptionist on new software. Really, they did retrain her: she had some trouble learning the new software, and she ended up with repetitive strain injuries from using the mouse-based Windows software.

      Turns out, to be HIPAA compliant you don't need to store patient records electronically on site; you can pay another company to scan your printouts and submit them electronically

      • That's because while HIPAA is great for patient privacy it's a disaster for EMR information sharing and integration.

        Well, to be honest part of the disaster was created by MUMPS and Epic and Medicare, but that's another story.

    • Comment removed based on user account deletion
    • by labnet ( 457441 )

      *This*
      My wife last year says... hey I can’t print anymore from my iMac. Husband scratches his head... eventually works out that a software update completely removes any 32bit drivers, which of course includes the laser printer driver. I have to then hack together some abortion of a driver from another brand that now only prints from the manual feed tray. Thanks Apple!... now thanks Microsoft...

    • Companies still have a lot of 30 year old software from Win 3.1 and DOS that don't have the souce to update them. Windows LTSC should support 32-bit indefinetly for these situations. Apple lost a lot of credibility when they forced the dropping of 32 bit, I hope Microsoft is smarter.

      Apple didn't lose "Credibility"; it lost about .5% (yes, pulled out of my ass) worth of application support for packages that simply would never come forward.

      Apple gave users and devs. three major revisions of macOS before finally pulling 32 bit support in the latest version (10.15, "Catalina"); it isn't like anyone wasn't given enough notice.

      Besides, Apple had no choice but to prepare Devs. for the coming of Arm-based Macs. Apple deprecated (then removed) 32 bit support from their Arm Frameworks (and OSes).

    • Note that they didn't say that they won't be supporting 32 bit apps. They only said that they wouldn't provide separate media for PCs with 32 bit processors. This move only affects those people interested in installing Windows 10 from scratch on a desktop computer with a processor made pre-2009 (the Core Duo).
    • I wouldn't say that Apple lost much credibility. It's pretty much a given that over in the Apple world that you're not going to be able to run an application that's more than a few years old on the latest and greatest version of Mac OS because Apple has a long history of breaking backwards compatibility.

      Windows is really the king of backwards compatibility, and really the 32-bit version of Windows 10 is the backwards compatibility version, since at this point if you're running Windows you're running the 64

  • by arglebargle_xiv ( 2212710 ) on Thursday May 14, 2020 @09:22AM (#60059438)
    ... whether it's marketed as 32-bit or 64-bit.
  • by lobiusmoop ( 305328 ) on Thursday May 14, 2020 @09:28AM (#60059464) Homepage

    OK, I have karma to burn, can somebody answer this?

    Is 64 bit code slower than 32 bit, in that address references, instructions, etc. are double the size, and therefore both instruction and data cache efficiency is presumably halved?

    • I'd imagine that pretty much all of the data paths within the system hardware have doubled (or more) in size compared to previous hardware generations. Some of these paths can service multiple operations simultaneously. Modern data caches are absolutely enormous, so efficiency is probably less of a concern. Depending on the type of work being done, if a program is optimized for it, 64-bit architectures will generally show measurable gains in performance.
    • Yep, it's slightly slower for normal code. Though there are 64 bit only CPU instructions, and if a program made heavy use of them it could be faster.
    • by thereddaikon ( 5795246 ) on Thursday May 14, 2020 @09:52AM (#60059584)

      No, because the processor is designed as 64bit. That means the registers hold 64bit instructions, the pipeline stages are also made to fit 64bit instructions and the caches are sized and tuned to work with them as well.

      There may be some niche circumstances where 32bit code may get a speed bump, but it's unlikely. For example, a 64bit register can in theory store two 32bit instructions. But the processor has to support actually using this method to see any benefit. And even if it did, there is no guarantee that you will see improved execution performance. That's because the different execution stages in the pipeline are designed with 64bit in mind. So if you load your two 32bit instructions as one 64bit and get to the ALU execution stage, there is no guarantee that 1) both instructions are integer math, and 2) the ALU is set up to even handle two instructions at the same time.

      • They didn't stop optimizing 32 bit code execution when the first 64 bit CPU came out; there was a 4 to 5 year changeover period. Most optimizations have been done.
      • by urusan ( 1755332 )

        Pipeline parallelization is a more well thought out way to do this: http://hpca23.cse.tamu.edu/tac... [tamu.edu]

        Not only will you be working on 2 instructions at the same time, you may very well be working on N instructions at the same time, where N can easily be 8 or more.

        Modern CPUs rely heavily on these techniques already, so going down to 32-bit won't help in this area.

        Trying to load 2 32-bit instructions simultaneously would only be useful beyond pipelining if both instructions can be simultaneously executed, and

    • A fair question. In a word, no. The 64-bit refers to the address space available to programs. 32-bit address space was limited to 4GB, while 64 bits can address more memory than most of us can afford. Performance should be about the same, unless the application has been compiled and optimized for 64-bit, in which case it may run somewhat faster.

      Where the speed can roughly double is when the instructions can process 64 bits of data in a 'word', which is useful for large numbers.
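      Both halves of that answer, the 4GB address ceiling and the wider word for big numbers, can be sketched in a few lines of Python (arbitrary-precision ints stand in for CPU registers here; the operand values are arbitrary):

      ```python
      # A 32-bit pointer can name 2**32 bytes: exactly 4 GiB
      assert 2 ** 32 == 4 * 2 ** 30

      # Multiplying two 64-bit values: one step on a 64-bit ALU, but a
      # 32-bit CPU must emulate it with four 32x32 partial products
      a, b = 0xDEADBEEFCAFEF00D, 0x123456789ABCDEF0
      MASK = 0xFFFFFFFF
      a_lo, a_hi = a & MASK, a >> 32
      b_lo, b_hi = b & MASK, b >> 32
      emulated = (a_lo * b_lo) + ((a_lo * b_hi + a_hi * b_lo) << 32) + ((a_hi * b_hi) << 64)
      assert emulated == a * b  # same answer, four multiplies instead of one
      ```

      That four-to-one ratio is roughly where the "double speed on large numbers" intuition comes from, though real gains depend on how much of a workload is actually wide arithmetic.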

    • by gweihir ( 88907 )

      OK, I have karma to burn, can somebody answer this?

      is 64 bit code slower than 32 bit, in that address references, instructions etc are double the size and therefore both instruction and data cache efficiency is presumably halved?

      Only minimally so. Also, memory is read in bursts, so cases where it matters are rare. And then there is memory segmentation, where you address, for example, data areas <= 4GB with a base pointer and a 32 bit offset. The base pointer goes into a register or gets cached (TLB?) and hence has almost no influence on run-time.

    • The question is malformed....

      64 bit software is typically running on hardware where the various data paths are at least 64 bits wide. So, per clock cycle, a 64 bit OS on 64 bit hardware can move 2X the amount of data compared to a 32 bit architecture. This is not related to instructions per clock cycle per core.

      However, 64 bit hardware will perform better when running 64 bit code, generally speaking. You could find exceptions if you looked hard.

      64 bit architecture has computing advantages as

    • If they both, let's say, have 1GB of RAM, then the 32bit system will be more efficient/faster. Windows 10 32bit supports a maximum of 4GB of memory, so on a 64bit system with more RAM, 64bit may be faster. 32bit systems also suffer more from memory hole issues, so your PCI devices may assign memory addresses to themselves, which means some of your memory might be wasted. I've seen 32bit systems with 4GB RAM, but only 2GB available. For performance, there's all kinds of things that can be done, such as

    • On the other hand, the 64 bit mode can use more registers.

      • by Gabest ( 852807 )

        And have to save more and bigger registers across function calls. That alone is a huge overhead, depending on how you optimize.

    • by DontBeAMoran ( 4843879 ) on Thursday May 14, 2020 @10:18AM (#60059694)

      Yes, no and maybe. Not necessarily in that order.

    • by Misagon ( 1135 )

      No, far from all instructions are longer, and most of them are longer by only a single byte. On the plus side, x86-64 introduces more registers -- which can save (slower) memory accesses, and thus memory-access instructions. x86-32 was already one of the densest instruction sets out there, and x86-64 is very close. (Read more: Weaver, McKee: Code Density Concerns for New Architectures [tudelft.nl])

      The biggest drawback is in the amount of memory needed for page tables: memory that does not become available for program

    • by AmiMoJo ( 196126 )

      It's not particularly large because most of the instructions are the same word size and absolute addresses (full 64 bits) are rare. Usually addresses are relative and the relative part doesn't need to be 64 bits, it can be 32 or 16.

      There is some slightly lower cache efficiency when storing 64 bit values like addresses, e.g. on the stack, but it's relatively minor, especially in comparison to other overheads.

      Any reduced performance from this is almost always more than offset by the gains coming from using 64 bit

    • No, and it isn't a dumb question either. The answer is actually fairly complicated:

      - All uses of 8-bit, 16-bit, and 32-bit quantities are still perfectly valid. They take up the same amount of space as before.
      - However, programming languages have a tendency to play it safe and use 64-bit quantities for things that would fit in a 32-bit value quite easily. This adds quite a few 64-bit values that would have been 32-bit otherwise.
      - Pointers are 64 bits instead of 32 bits. There are a lot of pointers in a typical program.
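      The pointer-size point is easy to see concretely. A small ctypes sketch (the Node layout is invented for illustration) shows how a typical linked-list node grows:

      ```python
      import ctypes

      # A typical linked-list node: two pointers plus a 32-bit payload
      class Node(ctypes.Structure):
          _fields_ = [
              ("prev", ctypes.c_void_p),
              ("next", ctypes.c_void_p),
              ("value", ctypes.c_int32),
          ]

      ptr = ctypes.sizeof(ctypes.c_void_p)  # 4 on a 32-bit build, 8 on 64-bit
      print(f"pointer: {ptr} bytes, Node: {ctypes.sizeof(Node)} bytes")
      # 32-bit build: 4 + 4 + 4 = 12 bytes per node
      # 64-bit build: 8 + 8 + 4, padded to 24 bytes: same data, double the
      # cache footprint, which is the overhead being discussed here
      ```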

    • Yes and no. Theoretically it should not make too much of a difference as long as you have more memory (please be larger than 4GB first). There is more data to read into memory for the 64-bit, and more data being pulled out of the caches. This is mostly for instructions, not necessarily for data. The same program recompiled from 32-bit to 64-bit won't necessarily see larger amounts of data, depending upon how it is written. But for code with a lot of bloat, the instructions may be a bit slower to read due

    • by BKX ( 5066 )

      Yes and no. 64-bit algorithms are often much faster than their 32-bit counterparts because the registers store twice as much data, which makes algorithms involving very large numbers more efficient, and because there are many more registers available in 64-bit mode, which makes complex algorithms much faster, due to less having to go back and forth between RAM/cache and the registers. In theory, 32-bit mode can be faster for algorithms working on small numbers, because the pointer size and instruction size
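      The bulk-data side of that argument can be illustrated in spirit with a quick Python comparison: counting set bits while consuming a buffer in 64-bit words takes half as many steps as consuming it in 32-bit words. (Pure illustration; actual speedups come from the hardware, not from Python.)

      ```python
      import struct

      data = bytes(range(256)) * 4  # 1 KiB of sample data

      def popcount_words(buf, word_bytes):
          """Count set bits, consuming the buffer one word at a time."""
          fmt = {4: "<I", 8: "<Q"}[word_bytes]  # little-endian 32/64-bit words
          total = steps = 0
          for (word,) in struct.iter_unpack(fmt, buf):
              total += bin(word).count("1")
              steps += 1
          return total, steps

      bits32, steps32 = popcount_words(data, 4)
      bits64, steps64 = popcount_words(data, 8)
      assert bits32 == bits64         # same answer either way
      assert steps64 == steps32 // 2  # but half the loop iterations
      ```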

  • Whatever it is, it's dumb and wrong because MIKKKRO$OFT did it!!
  • by jfdavis668 ( 1414919 ) on Thursday May 14, 2020 @09:32AM (#60059482)
    In supporting customers, I run into equipment that is controlled by 32 bit drivers. Companies sell you the equipment, and never update the drivers. If they had the source code, it would be trivial to update them, but I doubt they even keep the source around. I see a lot of setups where the equipment is managed by an old PC that still has all the required software versions. The best one was in my own building: the HVAC system was configured using a Windows 95 machine. This was during the Windows 7 era. I didn't support that, but I would guess the software was 16 bit. No need to connect to a network; it only needed to be booted when you wanted to change the configuration. HVAC technicians knew exactly how to use it. So much legacy hardware will need 32 bit support for many years to come.
    • Best one was in my own building, the HVAC system was configured using a Windows 95 machine.

      I highly doubt that if that machine breaks, someone is going to call up Dell, ask for a new off-the-shelf Intel box, and actually expect it to work.

      32bit software and its 16bit support will be around for some time to come yet; just don't expect to buy a new off-the-shelf PC with that capability, or expect your A/C unit to run on Windows 10.

    • by gweihir ( 88907 )

      Ah, yes, the utter stupidity of using a closed, single-source OS like Windows as an industrial controller.

    • by AmiMoJo ( 196126 )

      Most of those drivers wouldn't load on newer versions of Windows anyway. You can modify the manifest file and re-sign it and disable driver signing to get them to load but there is a good chance they won't work anyway.

      This is not just a Windows problem, anything that requires an ancient Linux kernel version that won't run on modern hardware properly is in the same boat. You can usually get old Windows/Linux to at least boot though, maybe with the basic SVGA video but good enough for one app.

    • I had some business critical hardware that used an HP-IB controller on an ISA bus. No other option but to keep some 486 machines on hand.
  • by hcs_$reboot ( 1536101 ) on Thursday May 14, 2020 @09:38AM (#60059498)
    MS will phase out the 64-bit PC as well ;-)
  • Microsoft tells us to use 32 bit software on the Surface Go to make sure it will work. Then they phase out support for 32 bit.
    Nice support job Microsoft.

    • Microsoft Hardware Team, meet the Microsoft Software Team.

      Microsoft is so big, each department doesn't even know what the other ones are doing.

  • This just means that they're phasing out support for 32-bit hardware. Since just about every modern PC has a 64-bit CPU, that makes sense. As far as 32-bit *software* goes, there will still be a way to run older programs.

    • This just means that they're phasing out support for 32-bit hardware. Since just about every modern PC has a 64-bit CPU, that makes sense. As far as 32-bit *software* goes, there will still be a way to run older programs.

      As others have said in the thread, the issue isn't 32-bit software. In all likelihood, that'll continue to work for decades. The issue is 16-bit software. Sure, mainstream software has all moved on; no meaningful number of people are still using WordPerfect 5.1 or Lotus 1-2-3 anymore. However, the 16-bit software that *is* still in use is the sort of thing typically tied to very expensive hardware. CNC machines are the go-to example in the thread; certain medical equipment, imaging equipment, and other 2

  • by King_TJ ( 85913 ) on Thursday May 14, 2020 @10:18AM (#60059698) Journal

    Like someone else pointed out, Apple showed the pitfalls of simply yanking 32-bit support out of the OS in one swoop. OS X Catalina broke compatibility with many game titles out there, which really stings when you realize how small a percentage of games exist for native OS X in the first place. Deus Ex: Human Revolution quit working, for example, and there's no sign of Feral Interactive releasing a 64-bit compatible update to it.

    At least, if Microsoft carefully phases out 32-bit in stages, it will give people more time to transition things over. My experience is, there are still some casual users out there with old laptops that only support 32-bit operating systems (original Intel Core Duo CPUs, for example). But these are all on their last legs. I recently helped a low income person in town get his old HP laptop to run Windows 10 by putting the 32-bit edition on it. It's kind of slow, sure ... but it's all he needs. If MS keeps supporting people like that for another year or two, it should be long enough that they're at least able to use their computers until they physically break down and become unfeasible to repair. That's a lot better than forcing them to trash a working PC because it can't run a current operating system anymore.

    • I'm annoyed that Apple dropped 32-bit support because it broke compatibility with an app I use almost daily, but I wouldn't say they "yanked" support. They started showing the "This app will not work with future versions of macOS" warning in High Sierra, just over two years before the release of Catalina.
      • The problem I had with that warning box is, it wasn't at all descriptive. Sure, developers should have known what was going on, or looked into it. But the typical Mac user who saw that pop up on their favorite app(s) just shrugged it off after they clicked past it. I mean, not really their problem at that point, right? Still runs just fine for them, and who knows how much work is needed to update it? Software often needs an update to keep up with some OS changes. I used to use the "Mariner Paperless

        • It seemed to me they could have at least allowed you to optionally install 32-bit compatibility if you needed it, similar to the way they built "OS 9 classic mode" into the first OS X releases?

          Yeah, they could have released it as an optional component, but they're Apple and they can do whatever they want. The pundits seem to think that Apple killed i386 libraries in order to prepare for yet another architecture change in macOS, this time from Intel 64-bit processors to Apple/ARM 64-bit processors. By jettisoning all Intel 32-bit support now, they just cut their work in half.

    • > I recently helped a low income person in town get his old HP laptop to run Windows 10 by putting the 32-bit edition on it.

      As nice as that is, do you not have a stack of old laptops that run 64 bit Windows you could have given them? (I do, well, did; I'm running short again as people have recently taken a few for various reasons.) Or indeed, since the machine could probably have run 64 bit, boxes of old RAM sticks to max it out.

      I mean I would do that for people and not bother telling them what I did as they

  • I don't think that there's a 64-bit version of Visual Studio yet.
    • by bn-7bc ( 909819 )
      Not sure if Community Edition is representative of the whole range, but if it is, no: VS2019 and VS2019 Preview are still 32-bit (maybe I'll have to do some digging on the MS site), but the downloads I got (Windows 10 Pro on a 64-bit CPU) were 32-bit anyway.
    • Userspace will be supported. 32 bit host OS is what's phased out here (and with it native support for ancient 16-bit applications).
  • What I have waited to see (for several years now) is a pop-up in Windows, asking me if I want a free upgrade to 64-bit Windows.
    Then clicking "Yes" should just take care of it ... Just like the upgrade to Windows 10.

    This should have been part of the coerced update to Windows 10 several years ago already! It wouldn't have cost Microsoft a thing (more).
    Instead, millions of users got 32-bit Windows 8 upgraded to 32-bit Windows 10 ...

    Maybe there is a button somewhere in settings that I have not seen or a free it

    • I can tell you the corporate Windows 64 bit ISO we use at work, which can install any of 12 editions of Win 10, refuses to upgrade a 32 bit install. A clean install must be done.

      Maybe Microsoft doesn't allow it?

  • by Vandil X ( 636030 ) on Thursday May 14, 2020 @12:54PM (#60060378)
    I am/was a Legacy Software Hoarder.
    As a kid who grew up in the 80s, I've used computers all my life. Apple, IBM PC, Macintosh, PET, NeXT, Sun Rays, you name it.

    My first home computer was a Windows 95 PC, which turned out to be a golden era by the time Win98SE came out, because I had access to DOS, Win-16, and Win32 games and, for the most part, they all just worked. Many of these games and select applications were never updated for Windows 7 and so you had to keep an XP machine handy or VM a Win9x machine to play them. I have no business need for them, but I enjoy them for nostalgia.

    I became a Mac user at home with the PowerMac G3. I survived the switch to OSX thanks to Classic. I survived the switch to Intel, thanks to Rosetta and Universal Binaries. I soon had to run Classic programs in Sheepshaver when Classic was killed and I had to say goodbye to programs that never updated to Intel when Rosetta was removed. Catalina killed all the remaining 32-bit Mac programs I enjoyed. Again, no business need for them, but I enjoyed them for nostalgia.

    I was an iPhone 3G owner and built up a solid software library that was killed off by iOS 11, including some very useful paid apps that were no longer in development. There is no legacy way to experience them anymore other than keeping an old iPhone around as a pseudo iPod touch that can run those programs.

    Basically, I have programs I have loved and used for years and then had modern access to those programs removed. It's allowed me to make better decisions on what software to buy. In the end, developers lose because I might not be willing to try their app if I can't run it 5 years later.
