Programming

'Memory is Running Out, and So Are Excuses For Software Bloat' (theregister.com) 152

The relentless climb in memory prices driven by the AI boom's insatiable demand for datacenter hardware has renewed an old debate about whether modern software has grown inexcusably fat, a column by The Register argues. The piece points to Windows Task Manager as a case study: the current executable occupies 6MB on disk and demands nearly 70MB of RAM just to display system information, compared to the original's 85KB footprint.
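The scale of that regression is easy to quantify; a quick back-of-envelope using only the column's own figures:

```python
# Growth factors from the column's numbers: an 85 KB original Task
# Manager vs. a 6 MB executable using ~70 MB of RAM today.
orig_kb = 85
disk_kb = 6 * 1024
ram_kb = 70 * 1024
disk_factor = disk_kb / orig_kb    # roughly 72x larger on disk
ram_factor = ram_kb / orig_kb      # roughly 843x larger in RAM
print(f"on disk: ~{disk_factor:.0f}x, in RAM: ~{ram_factor:.0f}x")
```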

"Its successor is not orders of magnitude more functional," the column notes. The author draws a parallel to the 1970s fuel crisis, when energy shortages spurred efficiency gains, and argues that today's memory crunch could force similar discipline. "Developers should consider precisely how much of a framework they really need and devote effort to efficiency," the column adds. "Managers must ensure they also have the space to do so."

The article acknowledges that "reversing decades of application growth will not happen overnight" but calls for toolchains to be rethought and rewards given "for compactness, both at rest and in operation."

Comments Filter:
  • I just don't see incentives for most companies to seek much memory efficiency. Even if the calculator app takes a gig of RAM, few will notice and fewer will make purchasing decisions based on RAM usage. Most people function just fine with 8GB. Even when they go over, the system still works thanks to SSDs and virtual memory. Dell is not going back to mostly shipping 4GB systems.

    • by Luckyo ( 1726890 )

      No, but they're supposedly going back to 8GB systems after most low-end systems had been moving towards 16GB for a while.

    • by AmiMoJo ( 196126 ) on Friday December 26, 2025 @07:01AM (#65882563) Homepage Journal

      It costs money to optimize software, and most people don't want to pay it. A lot of software is already "free" with ads and upsells. I just can't see there being any improvement.

      • by phantomfive ( 622387 ) on Friday December 26, 2025 @10:00AM (#65882667) Journal
        Arguably the reason it costs money is because software engineers are really bad. In most cases, it doesn't take more engineering time to be efficient, and in many cases it takes less time. It's a skill issue.
        • The purpose of an employee is to achieve what his employer wants. If he does that, he gets paid.

          The purpose of an employer is to get the system out the door as quickly as possible. Spending time reducing bloat is an extra cost with no market value, so it will not be done.

          Given this outworking of our economic system, bloatware is inevitable...

          • by alexgieg ( 948359 ) <alexgieg@gmail.com> on Friday December 26, 2025 @12:07PM (#65882807)

            Given this outworking of our economic system, bloatware is inevitable...

            That isn't exactly a trait of the economic system but of the lack of professional standards in software engineering, to the point that calling it engineering at all is an offense to actual engineers.

            Consider all engineering disciplines, from civil and electrical to bioengineering. In all of those the same two forces exist. And yet, civil engineers don't, say, design bridges as fast as possible with zero regard for quality and resources. Why? Because they're personally responsible for the outcome of their engineering and, if it causes problems, they can be personally sued, both civilly and criminally, and lose their license to work.

            That utter lack of personal responsibility is what makes software engineering an "engineering" in name only. If its practitioners had to abide by ethical and professional standards similar to those required of actual engineers, then the moment an employer came telling them to ship it as-is because fuck quality, they'd retort that the employer could go fuck themselves, as they won't do a shitty job and risk losing their license to program professionally.

            Evidently, a world in which software "engineering" were a real engineering discipline would be very different from ours. There'd be less software, it'd cost more, and it'd be efficient, rock solid and almost 100% guaranteed bug free, even for the most irrelevant of apps. Some would hate it. Others would love it.

            • I think it's more the fact that you get angel'o'sphere-alikes who talk about how awesome Java is because their hello-world program starts in under two seconds and only needs a gig of memory.

            • by Bruce66423 ( 1678196 ) on Friday December 26, 2025 @04:26PM (#65883271)

              You're right, of course, but that would require the emergence of a professional body able to impose standards and discipline those who fail to respect them. Although attempts to create such a body have been made, we live in a culture that isn't good at recognising the need for such bodies and where any restraint of trade is shouted down.

              Actually that's not true; there is a remarkable range of training requirements imposed by state legislators on workers to 'protect the public' / 'enable those who've got the skills to charge more'. Programmers have not played that game so far and it is hard to see it happening; can you imagine California enforcing such a standard in Silicon Valley? The big boys would all up sticks and leave - if they hadn't prevented the passage of the legislation in the first place.

            • Re: (Score:2, Insightful)

              by johnnys ( 592333 )

              I think you've hit the nail on the head. "Engineers" have standards and responsibilities to meet those standards. This does not exist in the software industry.

              There is another strong force against "expensive" software and that is FOSS. Justifying "expensive" versus "inexpensive" when there is competition that is free is extremely difficult for manglement who can only see $$$.

              Remember that when a mangler sniffs and says "I can't hear you if you don't speak to me in MY language" they really mean "You have to

            • by AmiMoJo ( 196126 )

              The consequences of a bridge collapsing are huge: loss of life, financial liability, criminal charges.

              Your app using loads of RAM or crashing is usually the norm for the industry and widely accepted.

              • It's more than that. We've been building bridges for tens of thousands of years. We've been writing software for less than 200 years. We're in a quickly advancing field and 'safety' is often neglected in such fields. I think we're doing pretty good all things considered.

                Further, anyone can open a text editor and write code for a year and release software indistinguishable from any other software. You can't do the same with a bridge. Well actually you kind of can (a landlord fixing their apartments is

            • I am afraid you are idealizing engineers a bit, in general and in particular what we call, in my country of origin, civil construction engineers.

              Plenty of engineers rush the design of their bridges by simply over-sizing everything, since concrete is cheap. In fact, it is often an engineering technician who puts a few numbers into a piece of software. The software will do all the calculations for him with wide margins, knowing the laborers are going to pour mediocre concrete, and he will give standard instructions

          • Wow, work on your reading comprehension before posting again.

            Spending time reducing bloat is an extra cost with no market value, so it will not be done.

            You missed the part where it doesn't take more engineering time to be efficient, and in many cases it takes less time.

             • The reality is, however, a programmer does what works to get their project out the door. If, because they have bad habits, it contains bloatware, no one has any incentive to do anything about it.

              Ultimately someone has got to pay to train programmers to do it right. If projects are going out of the door on time, there is zero incentive for an employer to pay for that training. So it won't happen. So we will keep getting bloatware. Sad but true...

               • The reality is, however, a programmer does what works to get their project out the door. If, because they have bad habits, it contains bloatware, no one has any incentive to do anything about it.

                The way you write it, it sounds like you are an asshole trying to make excuses for your bad habits.

             • It does take more engineering time to be efficient. Significantly so. It's far, far easier to write a mostly single-threaded application with all related logic being performed in button callbacks than it is to break that logic into non-UI threads and have the UI and backend update each other when one side changes.

               There's tons of other examples too. I'm posting on firefox which seems to dump its entire session state to a file when you do things like close/open a tab. When you have tons of tabs open and
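A minimal sketch of the worker-thread pattern that comment describes, with illustrative names only and no real UI toolkit: the "UI" side enqueues work instead of computing inside the button callback, and picks up the result when it is ready.

```python
import queue
import threading

tasks = queue.Queue()      # UI -> backend
results = queue.Queue()    # backend -> UI

def backend():
    # Run until a None sentinel arrives; each job stands in for real work.
    while (job := tasks.get()) is not None:
        results.put(job * 2)

worker = threading.Thread(target=backend, daemon=True)
worker.start()

tasks.put(21)              # the "button callback" just enqueues a job
doubled = results.get()    # the UI receives the answer when it is ready
tasks.put(None)            # ask the worker to shut down
worker.join()
print(doubled)             # prints 42
```

The extra plumbing (queues, sentinel, join) is exactly the engineering-time cost the parent comment is pointing at.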

      • Yes, it does cost money, but they do it anyway. The problem is for the last two decades it's been "Let's solve this problem by using more RAM".

        Example: Firefox and Chrome both do everything they possibly can to avoid rendering things in front of the user because it's perceived to be slower. So, for example, they'll render the entire viewport for an HTML page as a giant ass bitmap (well, pixel map) and let you scroll it. Usually it'll only need to be rerendered if something changes or if the width of the win
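The memory cost of that pre-rendering strategy is easy to estimate. A sketch with assumed dimensions (a 4K-wide page, three viewport-heights of RGBA pixels; the figures are illustrative, not measured from any browser):

```python
# One pre-rendered layer: width x height x 4 bytes per RGBA pixel.
width, height_px, bytes_per_px = 3840, 2160 * 3, 4
layer_mib = width * height_px * bytes_per_px / 2**20
print(f"~{layer_mib:.0f} MiB for a single pre-rendered layer")
```

Under those assumptions one layer alone is roughly 95 MiB, before any tab duplication.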

        • My firefox is only using 8 GB right now, bruh.

        • by AmiMoJo ( 196126 )

          That sounds more like an OS issue. The browser should be telling the OS that the 32 cached pages can be dumped as soon as there is memory pressure. The only additional lag is then clearing that memory at the time of reallocation.

          Do they not do that?

      • It doesn't simply cost money; it also costs maintainability and robustness. Heavily optimized code is often far less straightforward and harder to read and reason about.

    • What the fuck are you running that is "fine" with 8GB of memory? Virtual memory on SSDs is a terrible idea, also. Write endurance is getting worse on drives over time as they went from SLC to MLC to whatever the fuck they have now. Case in point: Samsung doesn't even publish a clear TBW endurance value for their newer drives.
      • SSD TBW has been pretty steady to rising from what I've seen. 1000TBW is pretty good if you ask me. This 4TB SSD I'm using does 2000TBW.

        • SSD TBW has been pretty steady to rising from what I've seen. 1000TBW is pretty good if you ask me. This 4TB SSD I'm using does 2000TBW.

          Write limits have been in a persistent state of decline for a given storage capacity. If it seems higher this is due to reductions in cost allowing for purchase of higher capacity SSDs.

    • by dvice ( 6309704 )

      I once estimated that for one of my applications, shaving off half of its memory usage would only be worth the cost if I could do it, including testing, in about 8 seconds.
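The arithmetic behind an estimate like that can be reconstructed with assumed figures (the RAM value and hourly rate below are illustrative, not from the comment):

```python
# If halving the footprint frees RAM worth ~$0.35 (say ~100 MB at
# retail prices) and loaded developer time costs ~$150/hour, the
# break-even optimization budget is only a handful of seconds.
ram_value_usd = 0.35
dev_rate_usd_per_hour = 150.0
break_even_s = ram_value_usd / dev_rate_usd_per_hour * 3600
print(f"break-even: ~{break_even_s:.1f} seconds of developer time")
```

With these assumptions the answer lands at about 8 seconds, matching the spirit of the estimate.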

  • never happen (Score:5, Insightful)

    by iwulinux ( 655433 ) on Friday December 26, 2025 @06:14AM (#65882519)

    But since you can't afford a decent computer anymore, we'll let you rent one in the cloud that can run our shitty, bloated software

    • Re:never happen (Score:5, Insightful)

      by Zocalo ( 252965 ) on Friday December 26, 2025 @06:44AM (#65882547) Homepage
      Hopefully not, at least in the commercial space. If so, that presents a massive opportunity for Linux and other FOSS tools, if they do take this approach, to gain an advantage where it really matters: on the bottom line.

      At some point arguments like: $xxx for RAM (per server, per month) + $deity-knows-what in commercial software licenses (again, per month, since we're renting in the cloud), vs. a reasonably achievable target of 1/3 of that on RAM (per otherwise identical "hardware") + no software costs to do exactly the same thing, and probably do it faster, are really going to start to register in the C-suite.

      And, with a few adjustments, that argument applies even more strongly if you are doing all this on-prem rather than in the cloud.
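As a sketch of that comparison, with every price assumed purely for illustration:

```python
# Per-server monthly cost: rented RAM plus commercial licenses, vs.
# one third of the RAM and no license fees. All numbers are assumed.
ram_gb, usd_per_gb_month = 256, 3.0
license_usd_month = 800.0
current = ram_gb * usd_per_gb_month + license_usd_month  # status quo
lean = (ram_gb / 3) * usd_per_gb_month                   # FOSS + 1/3 RAM
print(f"${current:.0f}/mo vs ${lean:.0f}/mo per server")
```

Even with modest assumed prices the gap per server-month is large enough to be a line item the C-suite would notice.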
      • Hopefully not, at least in the commercial space. If so, that presents a massive opportunity for Linux and other FOSS tools to gain an advantage where it really matters: on the bottom line. [...] And, with a few adjustments, that argument applies even more strongly if you are doing all this on-prem rather than in the cloud.

        1.) How many FOSS Electron apps have you seen? I've seen A LOT. Those waste memory. A LOT.
        2.) How much memory do you think Flatpaks/Snaps/AppImages waste? A LOT
        3.) How many FOSS apps leak memory like a sieve (the Firefox 104 ESR on Mac I am using to write this comes to mind)? A LOT

        Memory hogging is not exclusive to free vs paid, or Libre vs proprietary. At some point it became ingrained in the industry

        As for Cloud, Public Clouds are not going away, and workloads will not return en-masse to proprietary datc

      • packagekitd, something I do not care much about, can take 1GB of RAM. Why is it so prevalent on my company VDI that is managed by company IT?
  • by Quakeulf ( 2650167 ) on Friday December 26, 2025 @06:23AM (#65882523)

    I am working on energy saving for software, with energy rating as part of it, at Ahodzil [ahodzil.com]. Hopefully I will be able to make the code optimisation build framework available next year along with the energy rating. It goes beyond the compiler (and PGO) by checking if the code is necessary against a baseline of an "optimal code formula" based on decades of accumulated programming (and most codebases are just manipulating a database anyhow).

    • by Unpopular Opinions ( 6836218 ) on Friday December 26, 2025 @07:52AM (#65882589)

      Good on you to provide an immediate potential solution, but you are highlighting the root of this problem: you are building just another framework. Modern developers do not know how to code, they only know how to integrate a framework with another framework. Everything became frameworks. And frameworks are, by definition, bloated. Even yours, given enough time, will cease to offer meaningful reductions because it too will get bloated with so many variables and options to circumvent others' bloat.

      • by StormReaver ( 59959 ) on Friday December 26, 2025 @08:45AM (#65882607)

        And frameworks are, by definition, bloated.

        Most are bloated, but some are not. When I was doing Assembly on the 6809 in the 1980s, I wrote a framework that contained everything I used in most of my programs (printing to the screen, letting the user input a line of text, printing to the dot-matrix printer, modem file transfers, saving and loading files, etc). My assembled projects were quite small (usually around 20-25 kilobytes when finished). I did not include anything just for the hell of it, and my framework saved me tons of time on subsequent projects.

        Fast forward to about 12 years ago, when I started creating a Web programming framework for my own use. The framework is about 50K lines of PHP code (including comments and whitespace). It does include some dead code bloat, as it originated from code I was writing for a specific project (which is easily trimmable if I ever get around to it). Barring that project-specific code, though, I would have to rewrite the vast majority of the framework for each Web project, so it is actually rather trim around the waist. It easily saves me years' worth of man-hours on each project, and each project uses about 95% of the framework.

        But most modern frameworks are indeed bloated all to hell with no justifiable reason. And the reason is probably because they want to do everything for everyone, which always results in a bloated pig. But frameworks that target a specific need can be lean and efficient.

        • Also, just because a framework is bloated, it doesn't mean that you'd be using all the bloat. It will lie dormant if you don't use it. And if you do use it, is it really bloat?

          • The way most packages, shared libraries and frameworks are distributed, compiled, linked, deployed and loaded at runtime? Then yes, it is still bloat if you aren't using it.
            • In terms of size of the package, sure. But if you don't use a particular functionality, it won't hog your RAM.

              • There are plenty of ways that unused functionality can occupy memory.

                It is true that, in certain cases, unused sections of a shared library may remain in file-backed pages that are not accessed and therefore do not contribute to physical memory usage (e.g. process resident set size). However, depending on the languages and programming patterns used, it is very common for significant sections of library code and data to be loaded into physical memory even if they are not ostensibly "used".

                For exam

                • AI says my last post was essentially correct but used technical jargon that might be hard to understand. Here’s a simpler summary from AI:

                  Summary

                  Even if you don’t intentionally use certain features in a large framework or library, those features can still end up using memory. In some cases, unused parts really do stay dormant, but that’s not always how software works in practice.

                  Many programs load more code and data than they strictly need when they start up. This can happen because

        • > When I was doing Assembly on the 6809 in the 1980s, I wrote a framework that contained everything I used in most of my programs.

          Come on now, no, you didn't. You maybe wrote a "library", but not a framework. Assembly doesn't have frameworks. Heck, Assembly doesn't even have libraries in the modern sense. Even then, I doubt you linked binary object or archive files. You likely had some .s files you copied in, just like I do to this day.

          You're right to call out that not ALL frameworks are bloated,

      • Frameworks are not "bloated by definition".
        Especially when you actually do not know what the term "framework" means in computer science.
        In general: everything the framework you use does for you, you otherwise have to code yourself.
        So basically you want to claim: you wrote half a dozen applications, and when you are about to write the 7th, you figure: hey, let's cut and paste the common parts of those programs into "a library" ... beautify it a bit and call it a "framework". And now it is suddenly bloated? Tha

      • Everyone is using some kind of abstraction today, nobody is writing opcodes manually and almost nobody is even using assembler, not only because that's done in a library or a framework but because they don't have time. If it wasn't abstracted for them it wouldn't be happening. All you can reasonably do is try to choose efficient abstractions and use them in efficient ways.

        • by tlhIngan ( 30335 )

          Frameworks have been a part of software since the '60s. Even the Apollo Guidance Computer had a framework - it ran a VM that abstracted out the 15+1 bit hardware architecture into something more useful for planning a mission to the moon, including positional tracking and correction.

          The famous error on landing happened because the radar was causing more VM tasks to be spawned than was supported by the memory or executive.

          Plus, we all use libraries and greater abstraction because going without gets tedious quick.

          Wi

      • I've seen frameworks since I entered the profession in the very early nineties. The first one I used was something for the Mac that used Aztec C (sorry, can't remember the name of the framework itself, it's been 35 years!) Macs back then rarely came with more than a megabyte of RAM, I think some in our lab only had half that.

        So... bloat is a 100% relative term. It might, slightly, increase the size of an application for it to be built around a framework, but it certainly doesn't turn megabytes into gigabytes.

      • It's not running in the code, it is running on the code at compile-time, so it's not a consistent % increase in resources, however yes, I agree. It should not be like this. Anything to increase complexity is absolutely awful.

  • by Mr. Dollar Ton ( 5495648 ) on Friday December 26, 2025 @06:27AM (#65882533)

    I'm sure the models trained on stack exchange examples will produce exactly the optimized code every CTO is hoping for when they promote the "AI".

  • We also have to look at shared memory. Once you load DirectX or something, you (should) have one copy that is shared. I guess some idiot might statically link this stuff. But yes, Windows software is like the fat ass who takes up three airline seats.
  • The funny part (Score:5, Insightful)

    by Luckyo ( 1726890 ) on Friday December 26, 2025 @06:46AM (#65882549)

    The funny part is that for the "agentic OS" shit in Windows, Copilot required 16 gigs of RAM for a system to be certified Copilot-ready (or whatever it is that Microsoft calls their Copilot branding for OEM systems).

    And rumor mill suggests that low end OEM systems are going back to 8 GB now. AI demand has caused... reduction in AI capable systems.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      No worries, in the next version the "agent" will require a constant connection and run on the server.

  • Case in point: Installed the Dire Wolf Digital Boardgame Companion(!) app last night. 333 MB. It's a neat app and it looks cool, but 333 MB for this is insane. Basically every piece of software is like this these days.

    Part of this is due to cross-platform and cross-version development, but a larger portion of it is that devs don't need to care and memory efficiency isn't a priority anymore.

  • by HnT ( 306652 ) on Friday December 26, 2025 @07:29AM (#65882579)

    Way too many layers of abstraction to enable ever less qualified unwashed masses to develop software with even less understanding and skill; we had decades of this - now you see the results.

    No, not everybody should be coding. In fact, it is better if fewer clueless wannabes are running around with these loaded guns.

    • It's not real abstraction. It is project managers (PMs) treating engineers as interchangeable cogs. I've worked on many projects with no documentation about what the project even does but had PMs manage the project through Jira tickets in an Agile way. They throw a random engineer at a specific bug ticket. The engineer is never given time to learn the project, likely they can barely run the test case in real life only the automated test. They then bang at the code till the test passes. Maybe they put a
      • by dvice ( 6309704 )

        What you describe is a software architecture problem. No decent software architect would allow code like that. Most likely the project doesn't even have one, or they have someone named as a software architect, but in reality that person is just a normal developer. I have seen that happening multiple times. It is extremely important to pick a good architect for the project. That alone determines whether the project costs 100 million or 10 million.

        I have been in agile projects that don't have such problems,

  • by DrMrLordX ( 559371 ) on Friday December 26, 2025 @07:40AM (#65882583)

    Lower memory footprint is desirable whether or not there's a shortage of commercially-available DRAM. Though there's not going to be as much growth in available memory on systems, the reality is that DRAM shortages will cause end users to deploy fewer new systems in the near future rather than significantly curtailing the amount of RAM available per system. People will be holding on to older systems longer rather than sticking to whatever upgrade path they're on.

    • Like everything software developers do, it's a cost-benefit equation. At present, software developer time is very, very expensive, and memory is still, despite recent increases, a lot cheaper than developer time.

      So from that perspective, it does *not* make sense to spend time optimizing the memory footprint of software.
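That equation does depend on scale, though; with assumed numbers, a one-off optimization cost amortizes across every deployed instance:

```python
# Optimization time is paid once; the RAM it saves is multiplied by
# every deployment. All figures below are assumed for illustration.
dev_week_usd = 6000.0              # one loaded developer-week
ram_saved_gb, usd_per_gb = 2, 4.0  # savings per deployed instance
break_even = dev_week_usd / (ram_saved_gb * usd_per_gb)
print(f"pays for itself beyond ~{break_even:.0f} instances")
```

Under those assumptions the week of tuning pays for itself once the software runs on a few hundred machines, which is why the calculus differs for desktop one-offs versus fleet-scale server software.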

  • Why use toolchains at all? Just have AI produce the application ready to run!

    And why use frameworks? Just ask AI to generate all the code needed.

    "...and argues that today's memory crunch could force similar discipline."

    It definitely WON'T, not for the people causing it.

  • by fleeped ( 1945926 ) on Friday December 26, 2025 @08:36AM (#65882605)
    You wanted AI? Now you have to learn memory management again... And in C, as an extra punishment!
    • You wanted AI? Now you have to learn memory management again... And in C, as an extra punishment!

      Mod parent up.

      I learned C in 1991, on 640k MS-DOS computers, at Uni, Electronics Engineering. I know how to do that shit ;-)

      Zig and python, on the other hand, intrigue me.

  • Windows 95 (Score:3, Informative)

    by Valgrus Thunderaxe ( 8769977 ) on Friday December 26, 2025 @09:19AM (#65882631)
    You could get it on a CD or *12* floppies. And I'd argue Windows 11 isn't better from a user's POV.
    • The Chicago beta I installed was more like 24 floppies, and it's not like it wasn't using compressed files. I wonder what happened in between there and the release. I don't remember it having anything that was removed, maybe it was just debug symbols.

      • I remember that Windows 95 install disks were formatted in an odd way. Each disk held more than the standard 1.44MB.
      • From my memory there was a 13 diskette set in DMF format (1.7MB instead of the standard 1.44MB) for the initial release. I still have a 27 diskette set of Windows 95 OSR2 with USB Support in standard 1.44MB format. However, I'm not sure if that was ever an "official" release for sale. I worked at a computer store at the time and I believe this was an OEM copy I obtained.
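The capacity figures above check out: standard 3.5" HD formatting and Microsoft's DMF layout differ only in sectors per track.

```python
# 3.5" HD diskette: 80 tracks, 2 sides, 512-byte sectors.
# Standard formatting uses 18 sectors per track; DMF squeezes in 21.
def capacity_kib(sectors_per_track):
    return 80 * 2 * sectors_per_track * 512 // 1024

print(capacity_kib(18), capacity_kib(21))  # 1440 vs 1680 KiB
```

1440 KiB is the familiar "1.44MB" label, and 1680 KiB is what gets rounded to the "1.7MB" figure quoted for the install disks.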

    • Windows 7 was the sweet spot. After that, it could have used maybe the Windows 8 kernel update, but everything else since has been a waste
  • Been here before... (Score:5, Interesting)

    by glatiak ( 617813 ) on Friday December 26, 2025 @09:55AM (#65882657)

    Brings up fond memories of the days when I did internals for a long vanished DBMS firm on PDP-11 and Vax machines. The fine art of building overlay trees and sharing memory regions to get some of the huge programs to fit. Seems fanciful today, packaging multi-megabyte code to run in 64kw of user space. Even had to overlay file buffers and built our own swap handler. One overlay map was over 8 feet long. Modern bloatware is so wasteful and seems to have lost what us oldtimers did to make things work. A memory diet might do everyone some good.

  • Some clients regularly complained our APIs were bloated and confusing, so many options and so many many {fill in the blank here}.

    Well, I was encouraged to point out that our APIs were built to satisfy the requirements of the target systems, with all their different requirements, and also to serve many different types of client users. And Ctrl+F was most useful when perusing the documentation for the specific feature *you* needed.

    As you can imagine, that sometimes resulted in an outbreak of insults, di

  • No. (Score:2, Insightful)

    by rsilvergun ( 571051 )
    We are not going to let tech Bros blame programmers for the memory shortage.

    I knew this was coming. I knew it wasn't going to be long until I was told not to be angry at billionaires on their way to becoming trillionaires and their attempts to eliminate all our jobs with AI while taking all the water and electricity for themselves.

    90% of all media you consume is owned by a billionaire or a small handful of billionaires.

    Every single piece of information you have access to is filtered through that lens.

    So
    • Two things can be true at once.

      Developers do seem to be creating programs that are less and less memory efficient and yet somehow less and less functional.

      AND

      Tech bros are pushing a shitty technology on everyone and are pushing up RAM, storage, and energy prices while doing so.

      I don't see why I should need 16GB to read some forums and check my email when a 16MB machine was perfectly capable of doing 90% of that 30 years ago.

      • In the 1980s you would literally count the bytes on an Atari 2600 cartridge. You don't do that in 2025 if you're making a game for the PS5.

        Technology is supposed to advance and when it does yeah we use it.

        I don't see developers besides Microsoft making excessive use of RAM, and Microsoft is only doing it because they are doing AI bullshit; specifically, they're adding a bunch of monitoring to computers so they can train AIs to replace us all.

        The last thing I want is developers spending time heavily
        • Does it really take 1,000 times the memory usage to get added memory safety?

          No, it really doesn't.

          Programs are bloated, less functional, and the only reason they're often "memory safe" is because they've been written in Javascript and are running under Electron. Firefox doesn't take up 1,000x as much memory as Netscape because of better functionality (it has better functionality, but that's not why) or because it's written in a memory safe language (it's written in C++), it's because they got into a dick me

    • We are not going to let tech Bros blame programmers for the memory shortage.

      I had this thought too...gotta blame the programmers for something so that they feel bad about themselves, and are easy to control (Even if they don't actually care about memory).

  • Yes, too much bloat in software!

    We could run an OS and a game or an OS and multiple other programs in 1MB of memory on an Amiga. You can't do that with Windows, Mac or Linux these days.
    • We could run an OS and a game or an OS and multiple other programs in 1MB of memory on an Amiga.

      Amiga had no security and the OS didn't even make use of an MMU when it was present unless you ran GOMF or similar. It was awesome in its day, but its day has long since passed. We also have orders of magnitude more RAM in our portable devices than we had in our desktops back then, so while there is something to efficiency, we just don't need to be that efficient any more. We can afford more safety and security.

      Amiga also had most of the OS in ROM, but this was slow, and if you had enough RAM then you'd loa

  • memory prices and availability will be normal again
  • by The MAZZTer ( 911996 ) <megazzt@gma i l . c om> on Friday December 26, 2025 @01:09PM (#65882945) Homepage
    I've observed high memory usage is always described as "bloat" even if they have no idea how it's actually being used. For example, Chrome/Firefox is going to keep a lot of cache in RAM. And why not? If it's available, use it for something! Unused memory is wasted memory. If there is memory pressure, the memory should be released for use; that is the important part, and Chrome/Firefox should both be doing this AFAIK.

    Windows itself does this as well, and if you check Task Manager you'll note it describes "Available RAM", probably because of this basic misunderstanding. RAM used for OS cache is included in this number! If you dig into Performance you can find the true Free RAM. But that number doesn't matter.
    • > Unused memory is wasted memory

      Yes, but no: If the RAM that one program uses for cache can't be easily reclaimed for use by another, that isn't cache, that's usage.
      For it to count as cache, yanking the memory out from under the process must be an option for the OS, the same as it can do for its FS cache. If it's cooperative (as in, the OS has to ask politely), then it's usage.

      I'm saying this because I'm not aware of a way to allocate memory that the OS can reclaim at any time...

  • OOP is so wonderful, and...

    I want a clipping of Godzilla's toenail, and the object gives me Godzilla, with a small frame around his toenail.

    People use the highest level abstraction, rather than the lowest that gives them just what they need.

  • Case in point: Chrome says this tab is using 82 MB. For a mostly static site, with most of the ads and trackers blocked by ghostery. Absolute nonsense.
  • You mean to tell me it's not normal for my enterprise chat client to use as much memory as the rest of the OS?
  • by Somervillain ( 4719341 ) on Friday December 26, 2025 @05:51PM (#65883361)
    When I was in college, if you had told me how much computing power I could get for less than $100 today, my mind would be blown...4 64-bit cores for $60? 15 years ago, I had a cheap-ish desktop with 32GB RAM. 16 cores in a cheap-ish desktop chip?...I would have asked future me, "wow...I bet all software is sooo fast"...and future me would say, eh, a little...things are mostly the same. Why? Well, we took fast IDEs and said...nope, we need to rewrite the whole thing in JavaScript? Why? because some guy said so...fuck that C/C++ shit...write it in the slowest language we can find!!!!

    We have fast server-side engines heavily optimized in Java?...Great...let's throw those out....now rewrite the whole thing in python with 4x the latency, 4x the RAM utilization and 1/2 the overall performance. Why? because a persuasive asshole successfully convinced your boss it made more sense to write all new software in Python than have him crack open a book and learn Java...because he argued, without any evidence or a fucking clue, that it's easier to write Python than Java....even though if you know both, the difference is minimal, or even weighs in Java's favor, if you have a larger project with more flux/contributors.

    What we're seeing is Asshole Driven Development. Whoever is the biggest asshole or most persuasive gets their way....doesn't matter what the facts are. Some dipshit with huge tits can persuade your boss tomorrow that you need to rewrite your authentication suite in brainfuck....and guess what...you're now a brainfuck developer.

    Webpages used to be fast and simple and mediocre looking. Now they're SLOW AF, bring supercomputers to their knees, just to display 1kb of text, but hey, they're using 20 of the latest and greatest frameworks!...and they do look 5% better on mobile....so....I guess that's something!....you get to send megabytes and megabytes of pointless HTML and JavaScript, just to render a simple contact-us form....but hey, you're using frameworks approved by the coolest turtleneck-wearing urban UI hipsters.
  • Maybe the author doesn't realize that software developers are expensive. A business has to decide where they want to spend their engineering dollars. Are they going to build, say, new features they can sell? Or are they going to work on reducing software footprint, that gains them basically nothing? If you were paying software developers to do things, which one would you choose?

  • Pivoting development to address a temporary shortage isn't necessarily good business strategy.

    Someone told me that the best thing is to assume that in a couple of years, when the AI bubble bursts, we'll have a deluge of RAM and GPUs with lots of RAM, and it would be best to design for that.

    I think that's too optimistic, but seriously, investing in optimisation as a response to a market shortage doesn't sound to me like the best use of development time unless it's simple to do. If it's simple to do, it would

  • memory is not running out. AI is eating it all. or a lot of it. and how does that link to the idea that modern applications are larger than they used to be? we have the memory. but now there's a problem getting the memory. if we had the memory no one would be talking about bloat would they? do companies bend over backwards to say oh watch your bloat? not usually. this is brought on by AI and its demand for memory. it's not linked to software bloat.
  • Won't create a demand for highly memory optimized versions of applications. The costs for rewriting that code in desktop specific environments is very high and the inertia behind web frameworks and applications is so large that nobody wants to move to something better, despite how bad the current frameworks can be.

    Most users expect more out of all their applications. Application frameworks have to deal with monitor scaling, use very high quality font and vector rendering and stay highly responsive. They use
