'Memory is Running Out, and So Are Excuses For Software Bloat' (theregister.com) 152
The relentless climb in memory prices driven by the AI boom's insatiable demand for datacenter hardware has renewed an old debate about whether modern software has grown inexcusably fat, a column in The Register argues. The piece points to Windows Task Manager as a case study: the current executable occupies 6MB on disk and demands nearly 70MB of RAM just to display system information, compared to the original's 85KB footprint.
"Its successor is not orders of magnitude more functional," the column notes. The author draws a parallel to the 1970s fuel crisis, when energy shortages spurred efficiency gains, and argues that today's memory crunch could force similar discipline. "Developers should consider precisely how much of a framework they really need and devote effort to efficiency," the column adds. "Managers must ensure they also have the space to do so."
The article acknowledges that "reversing decades of application growth will not happen overnight" but calls for toolchains to be rethought and rewards given "for compactness, both at rest and in operation."
Bizarre wishful thinking (Score:2)
I just don't see incentives for most companies to seek much memory efficiency. Even if the calculator app takes a gig of RAM, few will notice and fewer will make purchasing decisions based on RAM usage. Most people function just fine with 8GB. Even when they go over, the system still works thanks to SSDs and virtual memory. Dell is not going back to shipping mostly 4GB systems.
Re: (Score:2)
No, but they're supposedly going back to 8GB systems after most low-end systems had been moving toward 16GB for a while.
Re:Bizarre wishful thinking (Score:5, Insightful)
It costs money to optimize software, and most people don't want to pay it. A lot of software is already "free" with ads and upsells. I just can't see there being any improvement.
Re:Bizarre wishful thinking (Score:4, Insightful)
False definition of 'bad' (Score:3)
The purpose of an employee is to achieve what his employer wants. If he does that, he gets paid.
The purpose of an employer is to get the system out the door as quickly as possible. Wasting time reducing bloatware is an extra cost that doesn't have a market value, so it will not be done.
Given this outworking of our economic system, bloatware is inevitable...
Re:False definition of 'bad' (Score:5, Insightful)
Given this outworking of our economic system, bloatware is inevitable...
That isn't exactly a trait of the economic system, but of the lack of professional standards for software engineering, to the point that calling it engineering at all is an offense to actual engineers.
Consider all engineering disciplines, from civil and electrical to bioengineering. In all of those the same two forces exist. And yet civil engineers don't, say, design bridges as fast as possible with zero regard for quality and resources. Why? Because they're personally responsible for the outcome of their engineering and, if it causes problems, can be personally sued, both civilly and criminally, and lose their license to work.
That utter lack of personal responsibility is what makes software engineering an "engineering" in name only. Had its practitioners to abide by ethical and professional standards similar to those required of actual engineers, then the moment an employer came telling them to ship it as is because fuck quality, they'd retort that the employer can go fuck themselves, as they won't do a shitty job and risk losing their license to program professionally.
Evidently, a world in which software "engineering" were a real engineering discipline would be very different from ours. There'd be less software, it'd cost more, and it'd be efficient, rock solid and almost 100% guaranteed bug free, even for the most irrelevant of apps. Some would hate it. Others would love it.
Re: False definition of 'bad' (Score:2)
I think it's more the fact that you get angle-of-sphere-alikes who talk about how awesome Java is because their hello world program starts in under two seconds and only needs a gig of memory.
Re:False definition of 'bad' (Score:4, Interesting)
You're right, of course, but that would require the emergence of a professional body able to impose standards and discipline those who fail to respect them. Although attempts to create such a body have been made, we live in a culture that isn't good at recognising the need for such bodies and where any restraint of trade is shouted down.
Actually that's not true; there is a remarkable range of training requirements imposed by state legislators on workers to 'protect the public' / 'enable those who've got the skills to charge more'. Programmers have not played that game so far and it is hard to see it happening; can you imagine California enforcing such a standard in Silicon Valley? The big boys would all up sticks and leave - if they hadn't prevented the passage of the legislation in the first place.
Re: (Score:2, Insightful)
I think you've hit the nail on the head. "Engineers" have standards and responsibilities to meet those standards. This does not exist in the software industry.
There is another strong force against "expensive" software and that is FOSS. Justifying "expensive" versus "inexpensive" when there is competition that is free is extremely difficult for manglement who can only see $$$.
Remember that when a mangler sniffs and says "I can't hear you if you don't speak to me in MY language" they really mean "You have to
Re: (Score:2)
The consequences of a bridge collapsing are huge: loss of life, financial liability, criminal charges.
Your app using loads of RAM or crashing is usually the norm for the industry and widely accepted.
Re: (Score:2)
It's more than that. We've been building bridges for tens of thousands of years. We've been writing software for less than 200 years. We're in a quickly advancing field and 'safety' is often neglected in such fields. I think we're doing pretty good all things considered.
Further, anyone can open a text editor and write code for a year and release software indistinguishable from any other software. You can't do the same with a bridge. Well actually you kind of can (a landlord fixing their apartments is
Re: (Score:3)
I am afraid you are idealizing engineers in general, and in particular what we call, in my country of origin, civil construction engineers.
Plenty of engineers rush the design of their bridges by simply over-sizing everything, since concrete is cheap. In fact, it is often an engineering technician who puts a few numbers into a software package. The software will do all the calculations for him, with wide margins, knowing the laborers are going to pour mediocre concrete, and he will give standard instructions
Re: (Score:2)
Wasting time reducing bloatware is an extra cost that doesn't have a market value, so it will not be done.
You missed the part where it doesn't take more engineering time to be efficient, and in many cases it takes less time.
Challenging, thank you (Score:2)
The reality is, however, that a programmer does what works to get their project out the door. If, because they have bad habits, it contains bloatware, no one has any incentive to do anything about it.
Ultimately someone has got to pay to train programmers to do it right. If projects are going out of the door on time, there is zero incentive for an employer to pay for that training. So it won't happen. So we will keep getting bloatware. Sad but true...
Re: (Score:2)
The reality is, however, that a programmer does what works to get their project out the door. If, because they have bad habits, it contains bloatware, no one has any incentive to do anything about it.
The way you write it, it sounds like you are an asshole trying to make excuses for your bad habits.
Re: (Score:2)
It does take more engineering time to be efficient. Significantly so. It's far, far easier to write a mostly single-threaded application with all related logic performed in button callbacks than it is to break that logic into non-UI threads and have the UI and backend update each other when one side changes.
There's tons of other examples too. I'm posting on firefox which seems to dump its entire session state to a file when you do things like close/open a tab. When you have tons of tabs open and
Re: (Score:3)
Yes, it does cost money, but they do it anyway. The problem is for the last two decades it's been "Let's solve this problem by using more RAM".
Example: Firefox and Chrome both do everything they possibly can to avoid rendering things in front of the user because it's perceived to be slower. So, for example, they'll render the entire viewport for an HTML page as a giant ass bitmap (well, pixel map) and let you scroll it. Usually it'll only need to be rerendered if something changes or if the width of the win
Re: (Score:2)
My firefox is only using 8 GB right now, bruh.
Re: (Score:2)
That sounds more like an OS issue. The browser should be telling the OS that the 32 cached pages can be dumped as soon as there is memory pressure. The only additional lag is then clearing that memory at the time of reallocation.
Do they not do that?
Re: (Score:2)
It doesn't simply cost money, but also maintainability and correctness. Heavily optimized code is often far less straightforward and harder to read and reason about.
Re: Bizarre wishful thinking (Score:2)
Re: (Score:2)
SSD TBW has been pretty steady to rising from what I've seen. 1000TBW is pretty good if you ask me. This 4TB SSD I'm using does 2000TBW.
Re: (Score:2)
SSD TBW has been pretty steady to rising from what I've seen. 1000TBW is pretty good if you ask me. This 4TB SSD I'm using does 2000TBW.
Write limits have been in a persistent state of decline for a given storage capacity. If it seems higher this is due to reductions in cost allowing for purchase of higher capacity SSDs.
Re: (Score:2)
2000 TB is 1 TB a day for over 5 years. How is this ever an issue short of a server?
Re: (Score:2)
I estimated that if I were to shave half of the memory usage out of the application, I would have to do it in 8 seconds, including testing, for it to be worth the cost.
Re:Bizarre wishful thinking (Score:5, Insightful)
"I have not run out of memory on my computer in years."
Neither have "I", but my computer runs out of memory on a weekly basis and has done so for years.
"RAM is not the limiting factor."
No, it's software quality.
Re: Bizarre wishful thinking (Score:3, Funny)
Re: (Score:2)
Re: (Score:3)
Why? What's wrong with just using the right tools for the job in a pipeline?
wget -O - | xq -x "$(cat my-text-browser.xpath)" | less
Elephant in the room - Bloat (Score:3)
Employees can get promotions, keep their job and get pay increases by endlessly adding NEW features to products
It is politically easier to add to a product than to fight and justify removal of product features.
No one gets rewarded by blocking product feature creep
Turning everything into a content platform and subscription service is near the end of its 25 year run.
Executive leadership has aged in place with a preference for not changing while they finish out their work years.
Re:Elephant in the room - Bloat (Score:5, Insightful)
The thing is, it's not just feeping creatureism. It's also developers who think they need three layers of frameworks to say "Hello".
Re: (Score:2)
I have not run out of memory on my computer in years.
Every thirty days xdg-desktop-portal eats up about 10GB and Linux Mint would thrash on me. I wanted to do some AI and upgraded to 64GB, but I still have to kill xdg-desktop-portal every few weeks. So I'd say sloppy software is still a problem. I do leave most of my software running all the time but it uses less than 6GB. The problem has persisted for years now.
This is a very insignificant problem, but it amazes me no one notices or seems to care. I can't be the only person to have suffered or noticed this. A couple
Re: (Score:2)
Re: (Score:3)
I just hand them a 400kb binary compiled with rustc that doesn't even give a shit which libc version you have installed, because it doesn't even need that.
How many times have you done that?
Re: (Score:3)
He hands it to them on punch cards. 400kb is A LOT of punch cards.
Re: (Score:2)
400kb, really? What are they doing with that post-strip "Hello World"? I completely agree that dependencies are a scourge, but then again I lump Cargo into that pile.
Re: (Score:2)
400kb, really? What are they doing with that post-strip "Hello World"?
In a "hello world" binary, most of that would be the statically linked standard library, which is optional. A "hello world" without std is less than 2kb, and on Windows that's even when you statically link the windows crate in order to tell the OS to print your string. Most of that 2kb is just the container built by the linker.
I completely agree that dependencies are a scourge, but then again I lump Cargo into that pile.
I see you've never built native binaries before. You don't have to install the build tools everywhere you want to run native binaries, they're only needed to compile it.
Re: (Score:2)
Oh and when I say "most of that", I mean most of the file contents for a hello world, which shouldn't be that large. As of the rustup 1.93 toolchain, "hello world" release target with the default profile (optimizes for speed and low memory usage, not binary file size) compiles to exactly 132,608 bytes on windows (PE files don't include debug symbols, so no need for strip).
But we can tune that in several ways.
For example, we can add a few compiler flags to do what you want, namely optimize in favor of a smal
Re: Bizarre wishful thinking (Score:2)
I'm not talking high level abstractions compiled into regular machine code, I'm talking runtime abstractions, runtime frameworks, etc. Otherwise literally every language except assembly is pure abstraction.
See this for example:
https://github.com/johannesjo/... [github.com]
And that kind of thing is becoming increasingly common.
never happen (Score:5, Insightful)
But since you can't afford a decent computer anymore, we'll let you rent one in the cloud that can run our shitty, bloated software
Re:never happen (Score:5, Insightful)
At some point arguments like this are really going to start to register in the C-suite: $xxx for RAM (per server, per month) + $deity-knows-what in commercial software licenses (again per month, since we're renting in the cloud), vs. let's say a reasonably achievable target of 1/3 of that on RAM (per otherwise identical "hardware") + no software costs to do exactly the same thing, and probably do it faster.
And, with a few adjustments, that argument even more so than cloud if you are doing all this on-prem.
Re: (Score:3)
Hopefully not, at least in the commercial space. If so, that presents a massive opportunity for Linux and other FOSS tools to gain an advantage where it really matters if they do take this approach; on the bottom line. [...] And, with a few adjustments, that argument even more so than cloud if you are doing all this on-prem.
1.) How many FOSS Electron apps have you seen? I've seen A LOT. Those waste memory. A LOT.
2.) How much memory do you think Flatpaks/Snaps/AppImages waste? A LOT
3.) How many FOSS apps leak memory like a sieve (the Firefox 104 ESR on Mac I am using to write this comes to mind)? A LOT
Memory hogging is not exclusive to free vs paid, or Libre vs proprietary. At some point it became ingrained in the industry
As for Cloud, public clouds are not going away, and workloads will not return en masse to proprietary datc
Re: (Score:2)
I am doing my best at Ahodzil (Score:5, Interesting)
I am working on energy saving for software, with energy rating as part of it, at Ahodzil [ahodzil.com]. Hopefully I will be able to make the code-optimisation build framework available next year along with the energy rating. It goes beyond the compiler (and PGO) by checking whether the code is necessary against a baseline of an "optimal code formula" based on decades of accumulated programming (and most codebases are just manipulating a database anyhow).
Re: I am doing my best at Ahodzil (Score:5, Insightful)
Good on you for providing an immediate potential solution, but you are highlighting the root of this problem: you are building just another framework. Modern developers do not know how to code; they only know how to integrate a framework with another framework. Everything became frameworks. And frameworks are, by definition, bloated. Even yours, given enough time, will cease to offer meaningful reductions, because it too will get bloated with so many variables and options to circumvent others' bloat.
Re: I am doing my best at Ahodzil (Score:5, Informative)
And frameworks are, by definition, bloated.
Most are bloated, but some are not. When I was doing Assembly on the 6809 in the 1980s, I wrote a framework that contained everything I used in most of my programs (printing to the screen, letting the user input a line of text, printing to the dot-matrix printer, modem file transfers, saving and loading files, etc). My assembled projects were quite small (usually around 20-25 kilobytes when finished). I did not include anything just for the hell of it, and my framework saved me tons of time on subsequent projects.
Fast forward to about 12 years ago, when I started creating a Web programming framework for my own use. The framework is about 50K lines of PHP code (including comments and whitespace). It does include some dead-code bloat, as it originated from code I was writing for a specific project (which is easily trimmable if I ever get around to it). Barring that project-specific code, though, I would otherwise have to rewrite the vast majority of the framework for each Web project, so it is actually rather trim around the waist. It easily saves me years' worth of man-hours on each project, and each project uses about 95% of the framework.
But most modern frameworks are indeed bloated all to hell with no justifiable reason. And the reason is probably that they want to do everything for everyone, which always results in a bloated pig. But frameworks that target a specific need can be lean and efficient.
Re: I am doing my best at Ahodzil (Score:3)
Also, just because a framework is bloated, it doesn't mean that you'd be using all the bloat. It will lie dormant if you don't use it. And if you do use it, is it really bloat?
Re: I am doing my best at Ahodzil (Score:2)
Re: (Score:2)
In terms of size of the package, sure. But if you don't use a particular functionality, it won't hog your RAM.
Re: (Score:2)
There are plenty of ways that unused functionality can occupy memory.
It is true that, in certain cases, unused sections of a shared library may remain in file-backed pages that are not accessed and therefore do not contribute to physical memory usage (e.g. process resident set size). However, depending on the languages and programming patterns used, it is very common for significant sections of library code and data to be loaded into physical memory even if they are not ostensibly "used".
For exam
Re: (Score:2)
AI says my last post was essentially correct but used technical jargon that might be hard to understand. Here’s a simpler summary from AI:
Summary
Even if you don’t intentionally use certain features in a large framework or library, those features can still end up using memory. In some cases, unused parts really do stay dormant, but that’s not always how software works in practice.
Many programs load more code and data than they strictly need when they start up. This can happen because
Re: (Score:2)
> When I was doing Assembly on the 6809 in the 1980s, I wrote a framework that contained everything I used in most of my programs.
Come on now, no, you didn't. You maybe wrote a "library", but not a framework. Assembly doesn't have frameworks. Heck, Assembly doesn't even have libraries in the modern sense. Even then, I doubt you linked binary object or archive files. You likely had some .s files you copied in, just like I do to this day.
You're right to call out that not ALL frameworks are bloated,
Re: (Score:2)
Frameworks are not "bloated by definition". ... beautify it a bit and call it a "framework". And now it is suddenly bloated? Tha
Especially when you actually do not know what the term "framework" means in computer science.
In general: everything the framework you use does for you, you otherwise have to code yourself.
So basically you want to claim: you wrote half a dozen applications, and when you are about to write the 7th, you figure: hey, let's cut and paste the common parts of those programs into "a library"
Re: (Score:2)
Everyone is using some kind of abstraction today, nobody is writing opcodes manually and almost nobody is even using assembler, not only because that's done in a library or a framework but because they don't have time. If it wasn't abstracted for them it wouldn't be happening. All you can reasonably do is try to choose efficient abstractions and use them in efficient ways.
Re: (Score:2)
Frameworks were a part of software since the 60s. Even the Apollo Guidance Computer had a framework - it ran a VM that abstracted out the 15+1 bit hardware architecture into something more useful for planning a mission to the moon, including positional tracking and correction.
The famous error on landing happened because the radar was causing more VM tasks to be spawned than was supported by the memory or executive.
Plus, we all use libraries and greater abstraction because going without gets tedious quick.
Wi
Re: (Score:2)
I've seen frameworks since I entered the profession in the very early nineties. The first one I used was something for the Mac that used Aztec C (sorry, can't remember the name of the framework itself, it's been 35 years!) Macs back then rarely came with more than a megabyte of RAM, I think some in our lab only had half that.
So... bloat is a 100% relative term. It might, slightly, increase the size of an application for it to be built around a framework, but it certainly doesn't turn megabytes into gigabytes.
Re: (Score:2)
It's not running in the code, it is running on the code at compile-time, so it's not a consistent % increase in resources, however yes, I agree. It should not be like this. Anything to increase complexity is absolutely awful.
Good luck vibe coding for efficiency (Score:5, Funny)
I'm sure the models trained on stack exchange examples will produce exactly the optimized code every CTO is hoping for when they promote the "AI".
Shared (Score:2)
The funny part (Score:5, Insightful)
The funny part is that for the "agentic OS" shit in Windows, Copilot required 16 gigs of RAM for a system to be certified Copilot-ready (or whatever it is that Microsoft calls its Copilot branding for OEM systems).
And the rumor mill suggests that low-end OEM systems are going back to 8GB now. AI demand has caused... a reduction in AI-capable systems.
Re: (Score:3, Insightful)
No worries, in the next version the "agent" will require a constant connection and run on the server.
RAM usage has become quite very l00ny. (Score:2)
Case in point: installed the Dire Wolf Digital Boardgame Companion(!) app last night. 333 MB. It's a neat app and it looks cool, but 333 MB for this is insane. Basically every piece of software is like this these days.
Part of this is due to cross-platform and cross-version development, but a larger portion of it is that devs don't need to care and memory efficiency isn't a priority anymore.
Layers of abstraction and convenience (Score:5, Insightful)
Way too many layers of abstraction, to enable ever less qualified unwashed masses to develop software with even less understanding and skill; we've had decades of this, and now you see the results.
No, not everybody should be coding. In fact, it is better if fewer clueless wannabes are running around with these loaded guns.
Agile and project managers are the problem (Score:3)
Re: (Score:2)
What you describe is a software architecture problem. No decent software architect would allow code like that. Most likely the project doesn't even have one, or they have someone with the title of software architect who in reality is just a normal developer. I have seen that happen multiple times. It is extremely important to pick a good architect for the project. That alone determines whether the project costs 100 million or 10 million.
I have been in agile projects that don't have such problems,
Should've been doing this anyway (Score:4, Insightful)
Lower memory footprint is desirable whether or not there's a shortage of commercially-available DRAM. Though there's not going to be as much growth in available memory on systems, the reality is that DRAM shortages will cause end users to deploy fewer new systems in the near future rather than significantly curtailing the amount of RAM available per system. People will be holding on to older systems longer rather than sticking to whatever upgrade path they're on.
Re: (Score:2)
Like everything software developers do, it's a cost-benefit equation. At present, software developer time is very, very expensive, and memory is still, despite recent increases, a lot cheaper than developer time.
So from that perspective, it does *not* make sense to spend time optimizing the memory footprint of software.
toolchains? (Score:2)
Why use toolchains at all? Just have AI produce the application ready to run!
And why use frameworks? Just ask AI to generate all the code needed.
"...and argues that today's memory crunch could force similar discipline."
It definitely WON'T, not for the people causing it.
Sounds like a good line for undergrads in CS (Score:5, Funny)
Re: (Score:3)
You wanted AI? Now you have to learn memory management again... And in C, as an extra punishment!
Mod parent up.
I learned C in 1991, on 640k MS-DOS computers, at Uni, Electronics Engineering. I know how to do that shit ;-)
Zig and python, on the other hand, intrigue me.
Windows 95 (Score:3, Informative)
Re: (Score:2)
The Chicago beta I installed was more like 24 floppies, and it's not like it wasn't using compressed files. I wonder what happened in between there and the release. I don't remember it having anything that was removed, maybe it was just debug symbols.
Re: (Score:2)
Re: (Score:3)
They were known as DMF formatted diskettes and held around 1.7MB.
Re: (Score:2)
From my memory there was a 13 diskette set in DMF format (1.7MB instead of the standard 1.44MB) for the initial release. I still have a 27 diskette set of Windows 95 OSR2 with USB Support in standard 1.44MB format. However, I'm not sure if that was ever an "official" release for sale. I worked at a computer store at the time and I believe this was an OEM copy I obtained.
Re: (Score:3)
Re: (Score:2)
And it could run on 4 MB RAM but 16 MB RAM was recommended.
Nope, the official minimum was 4MB, the official recommended was 8MB. 16MB was the "word on the street".
Been here before... (Score:5, Interesting)
Brings up fond memories of the days when I did internals for a long vanished DBMS firm on PDP-11 and Vax machines. The fine art of building overlay trees and sharing memory regions to get some of the huge programs to fit. Seems fanciful today, packaging multi-megabyte code to run in 64kw of user space. Even had to overlay file buffers and built our own swap handler. One overlay map was over 8 feet long. Modern bloatware is so wasteful and seems to have lost what us oldtimers did to make things work. A memory diet might do everyone some good.
I went through this... (Score:2)
Some clients regularly complained our APIs were bloated and confusing, so many options and so many many {fill in the blank here}.
Well, I was encouraged to point out that our APIs were built to satisfy the requirements of the target systems, with their many different said requirements, and to also serve many different types of client users. And ctrl+f was most useful when perusing the documentation for the specific feature *you* needed.
As you can imagine, that sometimes resulted in an outbreak of insults, di
No. (Score:2, Insightful)
I knew this was coming. I knew it wasn't going to be long until I was told not to be angry at billionaires going on trillionaires and their attempts to eliminate all our jobs with AI while taking all the water and electricity for themselves.
90% of all media you consume is owned by a billionaire or a small handful of billionaires.
Every single piece of information you have access to is filtered through that lens.
So
Re: (Score:3)
Two things can be true at once.
Developers do seem to be creating programs that are less and less memory efficient and yet somehow less and less functional.
AND
Tech bros are pushing a shitty technology on everyone and are pushing up RAM, storage, and energy prices while doing so.
I don't see why I should need 16GB to read some forums and check my email when a 16MB machine was perfectly capable of doing 90% of that 30 years ago.
Yeah of course developers are doing that (Score:2)
Technology is supposed to advance and when it does yeah we use it.
I don't see developers besides Microsoft making excessive use of RAM, and Microsoft is only doing it because of their AI bullshit; specifically, they're adding a bunch of monitoring to computers so they can train AIs to replace us all.
The last thing I want is developers spending time heavily
Re: (Score:2)
Does it really take 1,000 times the memory usage to get added memory safety?
No, it really doesn't.
Programs are bloated, less functional, and the only reason they're often "memory safe" is because they've been written in Javascript and are running under Electron. Firefox doesn't take up 1,000x as much memory as Netscape because of better functionality (it has better functionality, but that's not why) or because it's written in a memory safe language (it's written in C++), it's because they got into a dick me
Re: (Score:2)
We are not going to let tech Bros blame programmers for the memory shortage.
I had this thought too...gotta blame the programmers for something so that they feel bad about themselves, and are easy to control (Even if they don't actually care about memory).
bloat for sure (Score:2)
We could run an OS and a game or an OS and multiple other programs in 1MB of memory on an Amiga. You can't do that with Windows, Mac or Linux these days.
Re: (Score:2)
We could run an OS and a game or an OS and multiple other programs in 1MB of memory on an Amiga.
Amiga had no security and the OS didn't even make use of an MMU when it was present unless you ran GOMF or similar. It was awesome in its day, but its day has long since passed. We also have orders of magnitude more RAM in our portable devices than we had in our desktops back then, so while there is something to efficiency, we just don't need to be that efficient any more. We can afford more safety and security.
Amiga also had most of the OS in ROM, but this was slow, and if you had enough RAM then you'd loa
Once the AI bubble pops.... (Score:2)
Most People Have No Idea (Score:4, Insightful)
Re: (Score:3)
> Unused memory is wasted memory
Yes, but no: If the RAM that one program uses for cache can't be easily reclaimed for use by another, that isn't cache, that's usage.
Yanking the memory out from under the process should be an option for the OS, same as it can do for its FS cache. If it's cooperative (as in the OS asks politely), then it's usage.
I'm saying this because I'm not aware of a way to allocate memory that the OS can reclaim at any time...
Re: (Score:2)
I'm saying this because I'm not aware of a way to allocate memory that the OS can reclaim at any time...
You don't need to, the OS knows: https://en.wikipedia.org/wiki/... [wikipedia.org]
My std. complaint (Score:2)
OOP is so wonderful, and...
I want a clipping of Godzilla's toenail, and the object gives me Godzilla, with a small frame around his toenail.
People use the highest-level abstraction, rather than the lowest that gives them just what they need.
640 TB oughtta be enuf for anyone (Score:2)
- GatesGPT
Slashdot is a great example (Score:2)
4GB chat app… (Score:2)
Asshole Driven Development + Python/JS to blame (Score:3)
We have fast server-side engines heavily optimized in Java? Great... let's throw those out... now rewrite the whole thing in Python with 4x the latency, 4x the RAM utilization and 1/2 the overall performance. Why? Because a persuasive asshole successfully convinced your boss it made more sense to write all new software in Python than to crack open a book and learn Java... because he argued, without any evidence or a fucking clue, that it's easier to write Python than Java... even though, if you know both, the difference is minimal or works in Java's favor if you have a larger project with more flux/contributors.
What we're seeing is Asshole Driven Development. Whoever is the biggest asshole or most persuasive gets their way... doesn't matter what the facts are. Some dipshit with huge tits can persuade your boss tomorrow that you need to rewrite your authentication suite in Brainfuck... and guess what... you're now a Brainfuck developer.
Webpages used to be fast and simple and mediocre looking. Now they're SLOW AF, bring supercomputers to their knees just to display 1kb of text, but hey, they're using 20 of the latest and greatest frameworks!... and they do look 5% better on mobile... so... I guess that's something!... you get to pay to send megabytes and megabytes of pointless HTML and JavaScript, just to render a simple contact-us form... but hey, you're using frameworks approved by the coolest turtleneck-wearing urban UI hipsters.
Cost/benefit (Score:2)
Maybe the author doesn't realize that software developers are expensive. A business has to decide where they want to spend their engineering dollars. Are they going to build, say, new features they can sell? Or are they going to work on reducing software footprint, that gains them basically nothing? If you were paying software developers to do things, which one would you choose?
Bad short term strategy (Score:2)
Pivoting development to address a temporary shortage isn't necessarily good business strategy.
Someone told me that the best thing is to assume that in a couple of years, when the AI bubble bursts, we'll have a deluge of RAM and GPUs with lots of RAM, and it would be best to design for that.
I think that's too optimistic, but seriously, investing into optimisation as a response to a market shortage doesn't sound to me like the best use of development time unless it's simple to do. If it's simple to do, it would
what a strange take on the situation. (Score:2)
The AI Irrational Market... (Score:2)
Won't create a demand for highly memory-optimized versions of applications. The costs of rewriting that code in desktop-specific environments are very high, and the inertia behind web frameworks and applications is so large that nobody wants to move to something better, despite how bad the current frameworks can be.
Most users expect more out of all their applications. Application frameworks have to deal with monitor scaling, use very high quality font and vector rendering and stay highly responsive. They use
Re: (Score:2)