Cloud IT Technology

Bezos's Vision of Rented Cloud PCs Looks Less Far-Fetched (windowscentral.com) 154

Amazon founder Jeff Bezos once told an audience that he views local PC hardware the same way he views a 100-year-old electric generator he saw in a brewery museum -- as a relic of a pre-grid era, destined to be replaced by centralized utilities that users simply rent rather than own. The anecdote, shared at a talk a few years ago, positioned Amazon Web Services and Microsoft Azure as the inevitable successors to the desktop tower. Bezos argued that users would eventually abandon local computing for cloud-based solutions, much as businesses once abandoned on-site power generation for the electrical grid.

Current market dynamics have made that prediction feel more plausible. DRAM prices have become increasingly untenable for consumers, and companies like Dell and ASUS have signaled price increases across their PC ranges. Micron has shut down its consumer DRAM operations entirely, prioritizing AI datacenter demand instead. SSD storage is expected to face similar constraints. Cloud gaming services from Amazon Luna, NVIDIA GeForce Now and Xbox are seeing steady growth.

Microsoft previously developed a consumer version of its business-grade Windows 365 cloud PC product, though the company deprioritized it -- the economics didn't work when cheap laptops remained available. That calculus could shift. Xbox Game Pass's 1440p cloud gaming runs $30 monthly and NVIDIA recently imposed a 100-hour cap on its cloud platform. The infrastructure remains expensive to operate, but rising local hardware costs may eventually close that gap.
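The gap the summary describes comes down to simple break-even arithmetic: up-front hardware cost versus cumulative rent. As a rough sketch (all prices below are hypothetical placeholders, not vendor quotes), the crossover point can be estimated like this:

```python
# Rough break-even sketch: renting a cloud PC vs. buying local hardware.
# All dollar figures are hypothetical placeholders, not real vendor pricing.

def breakeven_months(hardware_cost: float, monthly_rent: float,
                     local_power_cost: float = 0.0) -> float:
    """Months until cumulative rent exceeds the up-front hardware cost.

    local_power_cost: extra monthly running cost of owning
    (electricity, etc.), which narrows the gap in renting's favor.
    """
    net_monthly = monthly_rent - local_power_cost
    if net_monthly <= 0:
        # Renting never catches up if it costs no more than running your own box.
        return float("inf")
    return hardware_cost / net_monthly

# Example: a $900 mid-range desktop vs. a $30/month subscription,
# assuming owning adds roughly $5/month in electricity.
months = breakeven_months(900, 30, 5)
print(f"Break-even after {months:.0f} months")  # 36 months
```

The point the summary makes is visible in the inputs: if rising DRAM and SSD prices push `hardware_cost` up while `monthly_rent` holds steady, the break-even horizon stretches out and renting starts to look comparatively sane.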

  • by Anonymous Coward on Wednesday January 14, 2026 @12:58PM (#65924006)

    destined to be replaced by centralized utilities that users simply rent rather than own

    It's clear Bozo's vision is that he gets to look at what everybody does on "their computer", and has nothing to do with what actual ownership might become.

    • by Gilmoure ( 18428 ) on Wednesday January 14, 2026 @01:03PM (#65924022) Journal

      And the ultimate Rent income stream.

      A lot of non-techy folks would likely jump at this, especially if it's part of some 'connectivity bundle' that comes with a thin client screen and keyboard, phone service, and streaming service.

      How long before folks rolling their own are looked at sideways by LE?

      • How long before folks rolling their own are looked at sideways by LE?

        We're nearly there. I imagine the suspicion I would be under explaining to some goon that, no, I do not have any social media history to share because I do not have any social media accounts. To answer your question: we're this close.

        • We are... but people are not exactly trusting cloud stuff with all the AI slop, which keeps PC sales going. In fact, I read PC sales outpaced Mac sales this quarter by a larger percentage, which is notable.

          Yes, Bezos wants to take away our PCs. We have had many companies want to turn all our stuff into terminals for many decades now, way back to the leased lines for mainframes, JavaStations, XStations, ChromeOS, and many others. They will have some success, but the cost of leasing a VM to play games over

      • How long before folks rolling their own are looked at sideways by LE?

        Amazon and Microsoft, etc... will be doing that, so I'd start with them.

    • by taustin ( 171655 ) on Wednesday January 14, 2026 @01:06PM (#65924030) Homepage Journal

      Privacy concerns aside (and they're very real), every time we've looked at any kind of cloud vs locally owned equipment, it has not only cost more, but a lot more, like twice as much, over the life of the equipment.

      It's simply a bad deal.

      • Not to mention the whole DRAM issue is caused by these AI companies. Get rid of AI, and the DRAM prices would become affordable for consumers again.

        See also: The cost of people's electric and water bills, and recent increases in unemployment.

        The whole "rent a device" idea is being pushed by the same assholes making shit unaffordable in the first place.
    • It's clear Bozo's vision is that he gets to look at what everybody does on "their computer", and has nothing to do with what actual ownership might become.

      Not only that, but also how they're manipulating supply to make his 'vision' happen:

      Current market dynamics have made that prediction feel more plausible. DRAM prices have become increasingly untenable for consumers

      • yeah - the numbers don't stack up so now we'll just make the alternatives MORE expensive... capitalism is a beautiful thing
  • by andywest ( 1722392 ) on Wednesday January 14, 2026 @01:01PM (#65924014) Homepage
    Bezos's vision of the future is in fact a revival of a project of a group of big corporations in the 1960s to build a computing utility. It was called Multics [wikipedia.org] and, like all big corporate projects, it was bloated and sludgy. Most corporations pulled out. On the good side, some of the programmers in that project went and developed Unix and C.
    • by gweihir ( 88907 ) on Wednesday January 14, 2026 @02:02PM (#65924176)

      Indeed. We have gone back and forth between local computing and remote computing as the preferred way, simply because both ways have strong advantages and strong disadvantages. None is the "one true way" and anybody characterizing one option as such is an idiot or a liar. Hence what is better (local, remote or mix of the two) depends very much on what you do and what your requirements and limitations are.

      • Indeed. We have gone back and forth between local computing and remote computing as the preferred way, simply because both ways have strong advantages and strong disadvantages. None is the "one true way" and anybody characterizing one option as such is an idiot or a liar. Hence what is better (local, remote or mix of the two) depends very much on what you do and what your requirements and limitations are.

        You could also frame that as centralized vs distributed computing.

  • I'm not buying it. The only way you'll take my room heater is if I cannot buy hardware anymore. I don't see that happening.
    • by taustin ( 171655 )

      Lots of people have visions of things that aren't here. They're called "hallucinations," and it's the one human ability that AI has mastered fully.

      Perhaps that joke about Zuck being a robot isn't entirely a joke after all.

    • by gtall ( 79522 )

      I wouldn't want a dumb box either, but most regular folks are scared of their computers. If they even need one, on the rare occasions that happens, I think they'd be perfectly happy with a dumb box supplied by their ISP. They no longer have to worry about updates; any viruses are the ISP's problem. Most folks do not use a lot of different apps. And right now their phone has already greased this slide.

    • by gweihir ( 88907 )

      It is not going to happen. The market would need to massively shrink before consumer computer hardware would go away. It would need to shrink massively again before the same would happen for industrial computer hardware. And there are things that will not work with remote computing for a long time, like CAD applications. I talked to some people that tried. The lag was killing their engineers.

      Hence, no, the market for PC-type hardware will NOT go away.

      • The market would need to massively shrink before consumer computer hardware would go away. It would need to shrink massively again before the same would happen for industrial computer hardware.

        Between the first and second shrinks, on what machine would high school students taking a programming class do their coursework?

      • If things get really bad, people will buy Raspberry Pi-tier machines. This might even get gaming companies to actually figure out how to slim things down, because there will be developers (likely the generation after the COBOL programmers) who can bust out a game in 64k of x86 assembly code and have it be something worth playing.

    • If your load is not completely flat all day, i.e. it's a real load, do you provision for peak load and let the excess sit idle all day or do you accept instantaneous throughput constraints?

      • I provision my development workstations for a heterogeneous load. When a task is constrained by throughput of one resource, such as compiling a large program using a lot of CPU, I temporarily switch to another task that uses a different resource. This could be system library updates (which are network and disk bound), updating doc comments of the code that I wrote (which is thought bound), or reading documentation (which is network and thought bound).

  • Nope (Score:5, Insightful)

    by Revek ( 133289 ) on Wednesday January 14, 2026 @01:04PM (#65924026)
    I'll just quit computing if I have to rent a PC.
    • You need to own a PC to be able to access the rented PC.
      • This was my thought. What do the 'terminals' to these 'rented computers' look like?

        How much 'local processing' do they need to do the communication and display? Wouldn't they still need RAM and hardware of some sort?

        • A glorified Echo Show, maybe with an HDMI port for a TV, I'd expect
        • This was my thought. What do the 'terminals' to these 'rented computers' look like?

          Thin clients have been around at least since X terminals in the early 1990s [wikipedia.org].

          How much 'local processing' do they need to do the communication and display? Wouldn't they still need RAM and hardware of some sort?

          One could do a useful thin client that "just" runs X, VNC, or RDP on 256 MB of RAM and a cheap ARM processor. This isn't enough for (say) a development workstation unless you're using tools made for RAM capacities typical of 2003, which means no Visual Studio Code with LSP-driven tooltips.

        • I imagine it would be a monitor with network capabilities that you can plug a mouse and keyboard into, that's it. Everything is streamed. Nothing is processed locally. Just a dumb monitor that doesn't do anything unless it's connected to their service.

        • by ceoyoyo ( 59147 )

          What do the 'terminals' to these 'rented computers' look like?

          More processing power than the fastest computers in the world in the year 2000.

        • This was my thought. What do the 'terminals' to these 'rented computers' look like?

          Historically a thin-client [wikipedia.org] (which are still a thing), and more recently probably something like a Chromebook [wikipedia.org].

      • by Revek ( 133289 )
        Not true. You just need a TV with an app running that accesses your 'computer'.
    • I'll just move to a phone that can be attached to a keyboard, mouse, and monitor, and use the phone as a desktop PC with whatever comes after Samsung DEX.

  • by PatKa ( 1043990 ) on Wednesday January 14, 2026 @01:05PM (#65924028)
    I think that is the long term plan of the rich. Make us pay rent for anything that could be useful so they can cut access whenever you step out of line.
  • Dummy Terminal (Score:4, Insightful)

    by GoJays ( 1793832 ) on Wednesday January 14, 2026 @01:06PM (#65924032)

    They can pry my RTX 5070 from my cold dead hands....

    Fuck dummy terminals. This idea has been pushed by corporate elites for decades...

  • I don't agree (Score:5, Insightful)

    by mukundajohnson ( 10427278 ) on Wednesday January 14, 2026 @01:06PM (#65924034)

    You still need hardware to access cloud resources. May as well make that hardware capable in its own right--it's not expensive, especially if you don't care about gaming.

    • by EvilSS ( 557649 )
      This. VDI makes sense for some corporate use cases, but for individual users I don't ever see it happening and one of the main reasons is the need for an endpoint device. That device can be cheap, but most users are still going to want good screens, keyboards, etc. And it still needs some RAM and a CPU. GPU if you want to pass down decent graphics (it doesn't need to be a powerhouse and integrated will work but it does need some capabilities to handle streaming things that use the GPU on the cloud PC.) At t
      • Even for commercial users, VDIs are insanely expensive, be it cloud, or using VMWare (or whatever it is now) Horizon. There was a product called vWorkspace which was awesome back in the day, but Dell bought the company and killed it.

    • In their world, networks never go down and it's 100% uptime for everything, all the time. They also commute to work riding over the rainbow on a unicorn.
      • by dgatwood ( 11270 )

        In their world, networks never go down and it's 100% uptime for everything, all the time. They also commute to work riding over the rainbow on a unicorn.

        Not to mention that in their world, consumers will have enough bandwidth to upload a quarter terabyte video file in less than a day, which I definitely do not.

  • by Holi ( 250190 ) on Wednesday January 14, 2026 @01:07PM (#65924040)

    The idea of ownership is disappearing in this country.

  • by Shaiku ( 1045292 ) on Wednesday January 14, 2026 @01:10PM (#65924048)

    I don't "rent" electricity, I pay for someone else to generate it and then I own what I've consumed. It does not get returned to the company and there are no restrictions on how I use it.

    I'm gonna have to say that's how I want my computer, too. I might delegate certain tasks but I'm never buying into this subscription based cloud computer crap.

    • Re: (Score:3, Insightful)

      by radarskiy ( 2874255 )

      You might not "rent" electricity (actually, electricity capacity) if you can accept that your instantaneous demand may not be met. If you have peaks that have to be met, you actually do rent the electrical capacity.

      Analogies are descriptive, not prescriptive. Adding constraints merely to break the analogy doesn't disprove the analogy, especially if they also break the analog.

    • Since the invention of the computer user, it has only been a matter of time before someone managed to find a way to make the drug dealer model applicable and we're most of the way there.

      I might delegate certain tasks but I'm never buying into this subscription based cloud computer crap.

      For the most part, a lot of people are most of the way there. Specifically, they will use an online office suite which keeps most of their data offsite. I had a friend who, through a series of poor choices, managed to lose access to his google account (and several years of data) after switching phone numbers. As he described

    • by flink ( 18449 )

      I don't "rent" electricity, I pay for someone else to generate it and then I own what I've consumed. It does not get returned to the company ...

      You better, otherwise the electrons won't move and you won't get any power out of it.

      ...and there are no restrictions on how I use it.

      Sure there are. Maybe not for you personally, because your usage is so small that it is basically a rounding error to the utility, but commercial users are subject to power factor constraints and time-of-use constraints. It's a shared, limited resource.

  • Mainframe (Score:5, Insightful)

    by dotslashdot ( 694478 ) on Wednesday January 14, 2026 @01:12PM (#65924054)
    Great. We are going back to the 1950s and 60s in every way possible. Now we are returning to mainframes, where people would have to rent compute time. Society is devolving.
    • Nobody who had a dial up terminal at home and had to rent time on a mainframe cried when the modern personal computer became available to consumers in the late 1970s.
    • by Dan667 ( 564390 )
      bezos is old enough to remember mainframes. I wonder if he forgot about them and then invented them again in his subconscious. Watching his behavior after dissing William Shatner's speech after the space flight, and his $100 million cringe wedding, I have to expect he came up with this idea smelling his own farts.
      • by ceoyoyo ( 59147 )

        Bezos sells remote computing. Your mechanic doesn't think he invented oil changes and X point inspections but he sure thinks you need them every three months.

  • by Errol backfiring ( 1280012 ) on Wednesday January 14, 2026 @01:13PM (#65924060) Journal
    Given that this is called an Xstation, I would suspect Elon Musk would have come up with the idea.
  • by MpVpRb ( 1423381 ) on Wednesday January 14, 2026 @01:14PM (#65924066)

    ...in the olympics of bad ideas

  • by johnnys ( 592333 ) on Wednesday January 14, 2026 @01:15PM (#65924072)

    So the "vision" is that the ultra-rich own all the computers and everyone else gets to rent them. Sounds like the vision for real estate: The ultra-rich will own all the properties and we will all get to rent them. And food: The ultra-rich will own all the farmland and we get to buy the food from them.

    What this really is, is the WEF vision of "You will own nothing and you will be happy". There will be no middle-class allowed to own property: Just the ultra-rich owning everything and the rest of us will be a "working class" only allowed to work to survive.

    This is just "Neo-feudalism". A few ultra-rich and a serf class.

    Time to break out the guillotines, friends.

  • by TurboStar ( 712836 ) on Wednesday January 14, 2026 @01:17PM (#65924076)

    People love this idea. Look at automobiles. Everyone uses public transportation now because nobody wants to own the things important to them.

    • by dgatwood ( 11270 )

      People love this idea. Look at automobiles. Everyone uses public transportation now because nobody wants to own the things important to them.

      It's certainly what the Übers of the world want. Personally, I think robotaxis are backwards. Having cars drive themselves is great, but the removal of personal ownership implied by robotaxis is a mistake.

      The theory is that by renting what you need for only the hours that you need it, you'll pay less, because you won't have to build one for every person. The flaw in that theory is that by building fewer, the cost per unit skyrockets, so you'll end up paying not that much less for significantly reduc

  • by BrendaEM ( 871664 ) on Wednesday January 14, 2026 @01:23PM (#65924094) Homepage
    Billionaire AI caused this problem, please don't give them any more money.
  • by Registered Coward v2 ( 447531 ) on Wednesday January 14, 2026 @01:26PM (#65924104)
    Big Iron and dumb terminals. The 60's and 70's want their computers back.
  • by yababom ( 6840236 ) on Wednesday January 14, 2026 @01:36PM (#65924118)

    I see all the 'not a chance' responses, but I'd argue that the current Chromebook ecosystem comes pretty close to this: the majority of Chromebook hardware is designed to run a centrally-managed OS platform that enables access to larger services--just like a typical thin client; and the data and applications it runs are already 'rented' from Google using ad revenue & Google app subscriptions.

    If Google added remote-hosting services for apps that can't run locally (and resurrected Stadia under a new name cause that's what Google does best), I think it would fulfill this 'vision' in all the meaningful ways...

    • Is the difference between this new idea and a Chromebook just a middle-ware layer where the processing/rendering takes place? So instead of interacting with a browser, I interact with a picture of a browser and the 'computer' that generated that picture interacts with the actual webpage?

      Is there a way to 'rearchitect' everything to not need a dedicated 'middle-ware' layer, but instead your 'display terminal' receives feeds of pre-computed and pre-rendered webpages and applications from multiple sources?

      Ar

      • The AI data centers are just centralizing a shit ton of compute to process questions. Which is not something that was "distributed" previously.

        The modern web is rendered locally on the end user's device. Hence all of the JS and exploitation thereof. See also the industry's extreme focus on data caps and bandwidth usage, and why end users are able to use effective AD blockers in the first place. Rearchitecting all of that to be server side would upend many business models. (And funnily enough, remove the n
    • by Gleenie ( 412916 )

      Yeah and I won't buy one of them either.

    • by ceoyoyo ( 59147 )

      Chromebooks do pretty much all of their computing locally on capable hardware. It's not the same thing at all.

  • Was the mainframe. And when the computer was down, your business was down. But computers were expensive, and not every business needed a full computer, so the time sharing companies became a thing, and when their computer was down, your business was down. Everyone had a bad time, so as soon as personal computers became a thing, they became a thing. And the internet happened, and they forget the lessons of the old days and then reinvented it a piece at a time. So we now have the time sharing services that br

  • This has been babbled about since the mid 1990s but they just keep on trying. If they couldn't grasp by now why nobody wants this, then I don't know what to say. Try high end gaming on a "thin client". HAR HAR HAR!
  • by nealric ( 3647765 ) on Wednesday January 14, 2026 @02:16PM (#65924222)

    I think we've long hit the point of computing as a service for commercial computing. Businesses need the flexibility to scale processing and storage capacity pretty widely, and they also don't want to deal with staffing for maintenance of physical computing systems.

    But for consumers, the "power grid" analogy doesn't hold up as well. A commercial user who needs to store a bunch of data or do a bunch of number crunching doesn't need terminals proportional to those needs: you can manage petabytes of storage with one terminal the same as gigabytes. But consumer computing generally needs a terminal for each task. That terminal itself needs some amount of computing power even if the heavy lifting is offloaded, and most consumer tasks don't really need much more computing power than it takes to run the terminal in the first place. There's no physical reason you need cloud-based computing to run a word processor or web browser. The only real exception is gaming, and maybe some prosumer things like higher-end video editing, 3D modeling, etc. Even then, the only reason server-based computing looks good in comparison to owning your own RTX 5090 is that the data centers themselves are driving prices up (same for RAM). Unlike commercial users, home users don't have to worry much about operating costs: the marginal electricity savings of outsourcing your video card are pretty trivial.

    The real reason why Amazon (as well as the rest of the tech ecosystem) wants to foist this on home users is that they much prefer revenue streams to single purchases, with the added benefit of another datamine.

    • There's no physical reason you need cloud based computing to run a word processor

      When you use the phrase "run a word processor", you are implying using a PC in the way it was used when it was invented. There was no internet back then, and you had just one computing device which had your applications and files.

      Now consider that you use several devices, laptops, smartphones, etc. You edit a document on your laptop, and you want to display it on your smartphone, when you don't have your laptop with you. Pretty soon you will ask your AI secretary to show you the file that your wrote abou

      • While they have been pushing cloud storage hard, there's no reason why it should be mandatory. It can (and should) be optional. It may sound convenient for access to files to be hardware agnostic, but there are also significant cost and security reasons to keep them local. Most home end-users don't have a pressing need for cross-platform access to documents, and if they do, they can add them to the cloud selectively rather than natively.

        A web browser is indeed a terminal to other computers, but that's the p

  • What he is saying is that in his future vision, you own nothing. In the "generator" example, you own all the equipment run by grid power - and there's no secondary business digesting my privacy and selling my info to 3rd parties. BTW, I do still own a generator. I live out of town and have a home backup generator that gets used a couple times a year.
  • The old local power plants cost more to run than buying electricity from the grid. However, with Bezos' version of time share, you'll still need to pay for the network, and you'll still need to buy or rent a local terminal / Chromebook / cloud PC. That local device won't need powerful hardware, but it will need some. So the costs are almost the same, especially if you factor in a monthly 365-like subscription. Sounds like a perfect use for all those old PCs that aren't allowed to run Windows anymore ( or t
  • "You will own nothing and like it."
    - Klaus Schwab (WEF)

    You won't own your computer, you won't own your data. You will own nothing.

    Time to stock up on computer hardware if you want to continue to OWN your own stuff.
  • Because we need Bezos et al. to be looking over our shoulders at all times.

  • Bezos argued that users would eventually abandon local computing for cloud-based solutions ...

    Users aren't "abandoning" local computing, and as far as I can tell most have no interest in doing that. They're being FORCED away from local computing: witness the dystopian hellscape which Windows has become, as well as Adobe's push toward cloud-based rentware.

    The broligarchs won't rest until every single daily activity of the average citizen is spied upon, recorded, controlled, and monetized. They regard us as chattels - we are to them as livestock are to humans in general. Saying that we're "abandoning"

  • I will not be renting my PC from Amazon or anyone else. I do not care how much it costs.

  • Guess, whom I view as a giant dick peeking out of a polo shirt...

  • They have been saying that since the 1990's. We've constantly been told that personal hardware would be pointless and everything is going back to old school server-workstation architecture, and it still hasn't happened.

  • "VoidLink is a comprehensive ecosystem designed to maintain long-term, stealthy access to compromised Linux systems, particularly those running on public cloud platforms and in containerized environments," the researchers said in a separate post.

    https://it.slashdot.org/story/... [slashdot.org]

  • This is a vision of the future from the 1950s. If you want to credit someone, credit Asimov.

  • It is the same reason I still buy physical copies of books and music. Though I have mostly given up on physical copies of movies and TV. But even with the latter for the ones I own I store backups on a hard drive not connected to the internet.

    For the same reason that even though one could rent a condo or apartment and have all of your maintenance cared for, some people do in fact like to own things. If not just for the sake of something potentially increasing in value. But because when you own something,
  • Here we are, filling out the flesh on the dystopian skeleton set up by the conservative progenitors of the "ownership society" (as in "they own the society, and you"). If you are looking for me, I'll be off in the ruins, scrounging chips to put together any Linux boxes I can manage to salvage.
  • by Junta ( 36770 )

    The RAM prices are a relatively short-term effect of a lot of investment being thrown at an unprepared supply chain. This is not a durable 'end-user computing is from now on going to get more expensive'; it's an anomaly in a trend of cheaper and cheaper bang for your buck. Crash or continue, either way the AI craze buildout will decrease (either they will have built out and settle into a milder 'refresh' cycle, or crash out and obviously not be buying).

    • by ceoyoyo ( 59147 )

      RAM prices have gone up to about what they were in 2019, they affect cloud providers just as much as they affect you, and yeah, they're temporary.

      But we have to have this freakout about RAM every few years when the price briefly goes up instead of down.

  • Might be worth buying IBM stock if Big Iron is coming back!
  • We've already seen the model with the dawn of the smartphone. Everyone buys more and more cores with more and more storage and RAM to run the same apps they did ten years ago, some of it just to show your friends MyPhoneVersion = YourPhoneVersion + 1. And yet that more powerful hardware you keep upgrading isn't worth much for standalone uses. WiFi-only connectivity doesn't count because it's the same rechargeable 5% brick (I'm excluding standalone camera functions, for now, with 5%) without cloud services.

    Be

  • Disappointed there ain't no Funny. But as usual.

  • have a monthly subscription for a PC and all the apps that run on it. Consumers are stupid enough to go for this.

  • . . . when they say it out loud.

    I think there's a real growing concern (I last heard Gamers Nexus on YouTube make this point really well) that the real appeal of AI to VC and Big Tech is exactly this. They can squeeze the PC market for a few years until the cloud computing end game is finally realized, and everything is a service. It's not just Bezos and DRAM. Nvidia really wants to route consumer GPUs not to consumers, but to GeForce Now subscriptions.

    Owning a PC could very well be a protest statement so
