Where Have All The Cycles Gone?
Mai writes "Computers are getting faster all the time, or so they tell us. But, in fact, the user experience of performance hasn't improved much over the past 15 years. This article takes a look at where all the precious processor time and memory are going."
My CPU Usage (Score:5, Funny)
3% gaming
5% internet
90% feet warming
Re:My CPU Usage (Score:5, Funny)
Where have all the cycles gone? (Score:5, Funny)
Re:Where have all the cycles gone? (Score:5, Funny)
Re:Where have all the cycles gone? (Score:5, Funny)
Re:Where have all the cycles gone? (Score:5, Funny)
Re:Where have all the cycles gone? (Score:5, Funny)
>> "My girlfriend has a cycle every month."
>
> Heat?
Running at about 380 nano-Hz, I would rule out heat issues.
Re:My memory Usage (Score:4, Informative)
because that's the size of the uncompressed waveform. You don't play the MP3; you play the waveform that's compressed inside the MP3. iTunes just decompresses the whole file at once and puts it into memory, instead of a bit at a time like some players.
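A back-of-the-envelope check makes the parent's point concrete. The figures below (a 4-minute stereo CD-quality track and a 128 kbps MP3) are illustrative assumptions, not measurements of iTunes:

```c
#include <stdio.h>

/* bytes of raw 16-bit stereo PCM for a track of this many seconds */
long pcm_bytes(long seconds)
{
    return 44100L * 2 /* channels */ * 2 /* bytes per sample */ * seconds;
}

/* bytes of a constant-bit-rate 128 kbps MP3 of the same length */
long mp3_bytes(long seconds)
{
    return 128000L / 8 * seconds;
}

void report(long seconds)
{
    printf("decoded: %.1f MB, on disk: %.1f MB (%.1fx)\n",
           pcm_bytes(seconds) / 1048576.0,
           mp3_bytes(seconds) / 1048576.0,
           (double)pcm_bytes(seconds) / mp3_bytes(seconds));
}
```

That works out to roughly 40 MB of waveform for under 4 MB of MP3, about an 11x expansion, which is the right order of magnitude for the memory footprints people report.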
Re:My memory Usage (Score:5, Insightful)
Not Lazy. (Score:4, Interesting)
Re:Not Lazy. (Score:5, Interesting)
Well, the process can still be paged out. So you don't really gain anything from doing that.
iTunes on the Mac is famous for not skipping no matter the system load; guess why?
Decoding an MP3 file is not a heavy task; even a 486 CPU could manage it. And Winamp hasn't skipped on my computer either, regardless of load. So I don't think it has anything to do with pre-decompressing the music.
Re:Not Lazy. (Score:4, Interesting)
It doesn't have to be a heavy task--I noticed in iTunes you can modify a playing file in all kinds of odd ways--there's no need to lock the file after it's been loaded and started playing.
WinAmp just falls down here--update a playing file's ID tags and it skips (on a modern, otherwise unfettered system), or try to rename it outside the application, and it fails miserably because the file is (obviously) locked.
You may not like Apple's approach, but it works well enough for everyone else.
Re:Not Lazy. (Score:4, Interesting)
Another annoyance: older versions of Winamp didn't like it when you deleted a file that was in a playlist, even if it wasn't playing. Windows Media Player is the same way, though.
Re:Not Lazy. (Score:4, Informative)
This is of course long before I had ever used linux, and for all I know the experience of playing MP3s on a 486 in linux could be entirely different, but MP3s + 486 + Winblows = glitches, skips, crackles, pops and not a whole lot of CPU left to do anything else.
Re:My memory Usage (Score:4, Interesting)
The parent was understandably modded a troll, but I have to say that I agree with the sentiment, if not the exact words. My recent experience with OSX and the iLife apps is exactly that: Apple writes software that is very slick and nice for the average user, but really limited and arguably even broken for the user with atypical or demanding needs.
My wife's new iBook is pretty, but if it were my iBook, it'd be running Linux by now.
Re:My memory Usage (Score:4, Informative)
Apple doesn't want to port every iTunes change to MFC. They write it in Carbon, keep the Windows-specific bits separate, and then just develop on Mac and recompile on Windows.
Yes, it makes iTunes heavyweight, slower, and a second-class citizen. But it achieves Apple's goals.
Re:My memory Usage (Score:3, Funny)
Two things. First, could it possibly be under Windows? Try minimising it and tell us again.
Next, to put your question differently: "Why does Matlab use 300MB just to add two numbers?" Because it's intended for more than that.
Just look at the size of a word document today (Score:5, Insightful)
Re:Just look at the size of a word document today (Score:5, Insightful)
But you can still fit an entire book on a floppy if you use LaTeX. The moral of the story: don't want a slow, bloated system? Then tough it out and don't use one. But don't complain when you have to type:
instead of clicking the bullet button or asking a paperclip to make a list. It's all a matter of what you want. There are plenty of lean, mean systems out there. Don't bitch about UI slowness unless you are willing to use a plain-text console with screen, mutt and elinks as your main applications.
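Slashdot's HTML filter evidently ate the snippet above; what you'd have to type instead of clicking the bullet button is presumably the standard LaTeX list, something like:

```latex
% a plain bulleted list, typed by hand
\begin{itemize}
  \item first point
  \item second point
\end{itemize}
```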
Comment removed (Score:4, Insightful)
Re:Just look at the size of a word document today (Score:4, Informative)
I'm a physics major at UW, so I do a decent bit of scientific work on my computer. I use GNUPlot, XFig and the Gimp to generate drawings for lab reports and whatnot.
I'm typing this from Firefox running on the Xorg6.7.0 server+WindowMaker 0.91. The key here is to use a lightweight window manager. Blackbox and fluxbox are other good choices (light, usable, not fugly (cough, fvwm, cough)). If you have to have that desktop environment, go with Xfce.
The only gap that I occasionally feel in my user experience is a good spreadsheet. I haven't found one. KSpread, OpenOffice Calc and Gnumeric either are, or require the use of, heavy GUI software, which we are trying to avoid (KDE and GNOME are not as big as XP, but far too big to run comfortably on my system). I've glanced at Siag, but haven't really tried it out (I don't know Scheme and don't have the time to figure it out right now--see physics undergraduate work).
I use mutt or pine, depending on which email address I'm checking. Thunderbird looks promising for being light and good, if you want a GUI based email client.
Recompile your kernel to match your hardware (trim the fat and optimize for your processors), and turn off any extra servers that you don't need (don't need telnetd, ftpd, &c. running? Turn off inetd--it's also more secure). Customize your boot sequence to only start and load that which your system needs and those things which you use.
I also boot to the command line and don't run xdm or the like. I do a lot of work from the command line, and X+light WM doesn't take long to start. It is, again, one less thing wasting clock cycles on my machine.
For reference, I'm running my Slack 10 system on an Abit BP6 with two PIII 866MHz processors underclocked to 650MHz (long story... Has to do with the fact that the BP6 doesn't technically support the PIII). I've got 384MB of RAM and a GF4 video card. It is lightning fast. The only exception to this is when I'm running X with the closed nVidia drivers (damn thing has a 3MB kernel module... grrr...), but that only adds a hang of a couple seconds when switching between X and the consoles, and that's it. If I'm not playing Quake or dealing with 3D visualization stuff, I can use the OSS driver (2D accel only), and get rid of even that performance problem.
So, yes, the middle ground is there, and it rocks. My computing experience is awesome; my slightly dated hardware is rock solid and perfectly responsive. Take a good, customizable Linux distribution, run lightweight software, turn off stuff in the background, and run a lean, mean, customized kernel, and you'll reclaim those lost cycles as interface responsiveness. I suggest Slackware [slackware.com] for this. FreeBSD, Debian, and any other distro aimed at power users will be good for setting up a configuration like this.
Mandrake, RHAT (RHEL & Fedora), SuSE and the other user-friendly distros are ill-suited to this, IMO. Not that you can't do it, but my experience with these distros and their high-level admin tools is that if you try to do something too different from the default, it gets extra hard. So Slackware and the like just end up being simpler, and now you know what Slack users mean when they say "it's simple." So stop giving us funny looks when we say it.
Jeff
Re:Just look at the size of a word document today (Score:4, Funny)
Oh, please, can we? I had one of the first PostScript printers (on my Amiga 1000, serial #27), and while the language HP lasers used (HPGL) seemed nice, PostScript seemed scary and complex; but it was obviously the one true way. Anything looking that much like FORTH had to be the answer.
So I bought the red book. I bought the blue book. I looked for the green book. I cultivated the Reid brothers as friends. I read comp.lang.postscript. I imaged film at 1250 and then 2450 dpi. I typeset a book on my Amiga that actually got printed. I became a barely competent PS hacker. I was ready for the PostScript revolution!
Wake me up when it happens. Until then, I have to put a gray border around some asshole's webpage because his accountant wants it... this, bitch.
To comment on the original topic: I was an assembly programmer from '70 to about '93, and I'm telling you people, the problem with code bloat is C++.
Go back to C, check that it generates nice code, and you can fit things on floppies again.
I'm a contract programming whore, but you can't pay me to use C++. I'd sooner do COBOL.
Now, the interesting thing is that there were TWO languages to come out of Bell Labs after C. C++ was only one of them, but its creator had a Cerfian sense of self-promotion, and the language was popularized, much to our collective dismay.
Jim Fleming acquired the rights to the other language; it's called C+@ and pronounced "cat".
A C+@ compiler, written in C+@, fits on a floppy. I have one here. It's what we should be using if you want anything other than C.
Re:Just look at the size of a word document today (Score:4, Informative)
Google Search with some hits [google.com]
Re:Just look at the size of a word document today (Score:4, Informative)
Re:Just look at the size of a word document today (Score:5, Interesting)
And how much of that bloat in Word is useful information?
If I open word, type the letter 'a', and save the document, it's a 20K document.
If you type 'a' 2000 times, it's still a 22K document.
What the heck is in that other 99.9% of the document?
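Working the parent's round numbers shows how lopsided that ratio is. A sketch (the 20K and 22K sizes are the poster's figures, not measured .doc files):

```c
/* Fraction of a Word file that is NOT the text you typed, in percent.
 * file_bytes: total file size; text_bytes: characters of actual text. */
double overhead_pct(long file_bytes, long text_bytes)
{
    return 100.0 * (file_bytes - text_bytes) / file_bytes;
}
```

The one-character document comes out over 99.9% overhead; the 2000-character one carries the same ~20K fixed block, plus roughly one extra byte per character of actual text.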
Re:Just look at the size of a word document today (Score:5, Funny)
64 bytes: Cryptic Masonic signature
64 bytes: Reserved for Carnivore
8KB: Macro playground
8KB: Random extracts from King James Bible
64 bytes: Run-length encoded document contents
8KB: Uncompressed copy of above for compatibility
Re:Just look at the size of a word document today (Score:4, Funny)
As a Mason, let me be clear: the file format may indeed be cryptic, but we had nothing to do with this one.
Besides, we're more interested in handshakes and networking. We let the Teamsters handle the obfuscation and misdirection stuff.
Re:Just look at the size of a word document today (Score:5, Informative)
This is not an excuse for a BLANK MS Word document being 19,456 bytes, of course. But there is "useful" data in there...
I'm running Win2K, and if I right-click on the file and select "Properties", there is a summary tab that displays all the info stored in that 19K. (You might have to click "Advanced".)
The data includes:
-Title
-Subject
-Category
-Keywords
-Template name
-Page Count
-Word Count
-Character Count
-Line Count
-Paragraph Count
-Scale (No idea what this means)
-"Links Dirty?" (No idea what this is... maybe it's true if there's porn links in it?)
-Comments
-Author (From computer info)
-Last Saved By... (From computer info)
-Revision Number (Number of saves?)
-Application
-Company Name (From registration info)
-Creation Date (Separate from the file system creation date)
-Last Saved Date (Separate from the file system modified date)
-Edit time
Now, is this ACTUALLY useful? I dunno. It might be in some situations. There should be an option not to save this metadata, though, for security if not for file size.
=Smidge=
Re:Just look at the size of a word document today (Score:3, Interesting)
Believe it or not, something like 60% of the document was NULLs.
So it's not really bloated, it's just full of nothing.
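The check behind a claim like "60% NULLs" is a one-liner over the file's bytes. A minimal sketch (in practice you'd read the .doc into a buffer first; the function name is mine):

```c
#include <stddef.h>

/* Fraction of a byte buffer that is NUL bytes. */
double nul_fraction(const unsigned char *buf, size_t len)
{
    size_t nuls = 0;
    for (size_t i = 0; i < len; i++)
        if (buf[i] == 0)
            nuls++;
    return len ? (double)nuls / (double)len : 0.0;
}
```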
This week's action item (Score:5, Funny)
I think it'll be the same, given the same machine.
Re:This week's action item (Score:3, Funny)
Where are all the precious cycles going? (Score:3, Insightful)
On complexity, code bloat and user interface (Score:5, Informative)
Mr. Seebach points out that "computers are, in fact, doing more than they used to. A lot of the things computers do are fairly subtle, happening beneath the radar of a user's perception. Many functions are automatic and, as discussed in last month's column, you could probably do without some of them."
This recalls an analogy drawn by a recent Economist article [economist.com]. Unlike most automobile analogies popular among Slashbots, this one is actually rather appropriate: "By the 1930s, ... the car had become more user-friendly and ready for the mass market. ... [T]he makers' increasing skill at hiding the technology from drivers ... meant that cars got hugely more complex on the inside, because most of the tasks that had previously been carried out by drivers now had to be done automatically. This presented drivers with a radically simplified surface, or 'interface' in today's jargon."
Given this lesson drawn from history, I disagree with Seebach's conclusion that "the worst is probably over" in terms of code bloat and complexity. Computers still have a long way to go before they can approach the ease of use and stability we demand of every other consumer appliance in our lives.
The aforementioned article requires a paid subscription to view, so in the interests of convenience, I'll reproduce it here.
--
SURVEY: INFORMATION TECHNOLOGY
Now you see it, now you don't
Oct 28th 2004
From The Economist print edition
[Image] [economist.com]
To be truly successful, a complex technology needs to "disappear"
THERE has never been anything quite like information technology before, but there have certainly been other complex technologies that needed simplifying. Joe Corn, a history professor at Stanford University, believes that the first example of a complex consumer technology was clocks, which arrived in the 1820s. Clocks were sold with user manuals, which featured entries such as "How to erect and regulate your device". When sewing machines appeared in the 1840s, they came with 40-page manuals full of detailed instructions. Discouragingly, it took two generations until a trade publication was able to declare in the 1880s that "every woman now knows how to use one."
At about the same time, the increase in technological complexity gathered pace. With electricity came new appliances, such as the phonograph, invented in 1877 by Thomas Alva Edison. According to Mr Norman, the computer-design guru, despite Mr Edison's genius for engineering he was a marketing moron, and his first phonograph was all but unusable (in fact, initially he had no particular uses in mind for it). For decades, Mr Edison fiddled with his technology, always going for the most impressive engineering solution. For instance, he chose cylinders over discs as the recording medium. It took a generation and the entry of a new rival, Emile Berliner, to prepare the phonograph for the mass market by making it easier to use (introducing discs instead of cylinders) and giving it a purpose (playing music). Mr Edison's companies foundered whereas Mr Berliner's thrived, and phonographs became ubiquitous, first as "gramophones" or "Victrolas", the name of Mr Berliner's model, and ultimately as "record players".
Another complex technology, with an even bigger impact, was the car. The first cars, in the early 1900s, were "mostly a burden and a challenge", says Mr Corn. Driving one required skill in lubricating various moving parts, sending oil manually to the transmission, adjusting the spark plug, setting the choke, opening the throttle, wielding the crank and knowing what to do when the car broke down, which it invariably did. People at the time hired chauffeurs, says Mr Corn, mostly because they needed to have a mechanic at hand to fix the car, just as firms today need IT staff and
Nobody give a fig about optimizing (Score:5, Insightful)
'disks are fast'
'processors are fast'
nobody cares about optimizing code anymore.
Re:Nobody give a fig about optimizing (Score:4, Insightful)
Re:Nobody give a fig about optimizing (Score:5, Insightful)
Tom
Re:Nobody give a fig about optimizing (Score:5, Insightful)
Re:Nobody give a fig about optimizing (Score:3, Interesting)
Well, a 10-year-old NeXTSTEP computer can do a lot of stuff that I still can't do with GNOME and KDE, and yet it's still faster at some things than today's computers. Same with Inkscape: I tried to play around with some of the stuff I did in CorelDraw years ago on a P90 with 24MB RAM, and Inkscape turned out to have huge problems rendering it on a 1GHz Athlon with 768MB RAM; it was almost unusable. It's true
Re:Nobody give a fig about optimizing (Score:5, Interesting)
I think all programming students should have to code for a system like this. It gives you a MUCH greater appreciation for what the compiler is doing for you, and what the consequences of simple changes can be.
Re:Nobody give a fig about optimizing (Score:5, Interesting)
I agree completely. I've done some programming for OS-9, and when we were creating some software libraries, we had to worry about things like program footprint size and memory allocation/deallocation. We were using a cross-compiler and doing development in C and C++. Something as simple as the order in which you declare the variables could make a noticeable difference in program size. Memory allocation and deallocation had to be done by the top level of the program: the support libraries had to be written to accept a memory block to use and how large it was. The last thing we wanted was to use up the 4MB of RAM we had (which had to hold the OS, plus any programs you were running) by fragmenting large chunks of it with interleaved malloc() and free() calls. And we couldn't rely on whatever garbage collection scheme existed to operate properly... assuming there even was one. (This was 1997.)
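The discipline described above, where the top level owns the memory and libraries accept a block plus its size, can be sketched as a simple bump-pointer arena. The names and the scheme are illustrative, not the actual OS-9 library code:

```c
#include <stddef.h>

/* A caller-supplied memory block and a high-water mark into it. */
typedef struct {
    unsigned char *base;
    size_t size;
    size_t used;
} arena_t;

/* The top level hands the library its block; the library never mallocs. */
void arena_init(arena_t *a, void *block, size_t size)
{
    a->base = (unsigned char *)block;
    a->size = size;
    a->used = 0;
}

/* Carve n bytes out of the caller's block; NULL when it's exhausted. */
void *arena_alloc(arena_t *a, size_t n)
{
    if (a->used + n > a->size)
        return NULL;
    void *p = a->base + a->used;
    a->used += n;
    return p;
}

/* Tiny demo: a 64-byte block fits one 40-byte request but not two. */
int arena_demo(void)
{
    unsigned char block[64];
    arena_t a;
    arena_init(&a, block, sizeof block);
    return arena_alloc(&a, 40) != NULL && arena_alloc(&a, 40) == NULL;
}
```

Nothing is returned to the system mid-run, so the 4MB can't be fragmented by interleaved allocations; the whole block is reclaimed at once when the top level is done with it.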
Of course, if you want speed, you have to learn to take advantage of the "short circuit" of && and ||. Nobody's really going to notice the several nanoseconds a single !strncmp(str1, str2, n) costs, but when you process millions of rows from a database, it can make a big difference to skip the function-call jump entirely by saying
if (str1[0] == 'a' && !strncmp(str1, str2, n))...
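Isolated into a helper, the trick looks like this; because && is guaranteed to short-circuit, strncmp() is never called when the first bytes already differ (the function name is mine):

```c
#include <string.h>

/* Cheap first-byte test guards the full strncmp(): over millions of
 * rows, most candidates fail the one-character compare and never pay
 * for the function call at all. */
int match(const char *s1, const char *s2, size_t n)
{
    return s1[0] == s2[0] && strncmp(s1, s2, n) == 0;
}
```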
The mindset we have now is a direct result of the prevailing attitude that memory is cheap and processors keep getting faster. A friend of mine is no longer asked to interview prospective candidates because he would always ask questions about optimizing code and making it run faster. The candidates nearly always had the look of a deer caught in headlights, and these supposedly knowledgeable programmers (interviewee AND interviewers) couldn't answer the questions.
Re:Nobody give a fig about optimizing (Score:3, Interesting)
Our C compiler had an output format that would list the C code and the resulting assembly language intermixed. I wrote a quick little program that would read this, count the bytes of code per line, strip the assembly, and then print out each line of C with the byte count at the beginning of the line.
This was easier to look over, and you could see if some C expression was really bloated--I'd then go and simplify the code.
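A sketch of that filter's core. The listing format here is an assumption (a hex offset, then hex byte pairs, then the mnemonic), since the original compiler's format isn't given:

```c
#include <ctype.h>
#include <stdio.h>

/* Returns the number of opcode bytes on an assembly listing line of
 * the assumed form "OFFS: HH HH ...  mnemonic", or -1 if the line
 * doesn't start with a hex offset (i.e. it's a C source line).
 * Summing the counts between C lines gives bytes-per-C-line. */
int asm_line_bytes(const char *line)
{
    unsigned off;
    int pos = 0;

    if (sscanf(line, "%x:%n", &off, &pos) != 1 || pos == 0)
        return -1;                      /* no "OFFS:" prefix: C source */

    int bytes = 0;
    const char *p = line + pos;
    for (;;) {
        while (*p == ' ')
            p++;
        /* a two-hex-digit token followed by space/end is an opcode byte */
        if (isxdigit((unsigned char)p[0]) && isxdigit((unsigned char)p[1]) &&
            (p[2] == ' ' || p[2] == '\0'))
        {
            bytes++;
            p += 2;
        } else {
            break;                      /* reached the mnemonic */
        }
    }
    return bytes;
}
```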
For example, I've been disassem
What kind of optimization? (Score:5, Insightful)
nobody cares about optimizing code anymore.
You can optimize in many different ways: for run-time performance, maintainability, extendibility, usability, compatibility, and probably a bunch of other ways I can't think of just now.
Many of these are at odds with each other. And since computers are getting faster, I think it's perfectly reasonable to start trading off run-time performance with some of these other things.
What are they talking about? (Score:5, Funny)
What about.... (Score:4, Interesting)
Ummmmm..... No.
A number of years ago, I had a project that required three days for each calculation. Just for kicks, when I got my dual G5, I ran the same calculation with the same parameters, and it completed almost instantaneously. Yes, yes... I know: memory-bound performance versus disk swapping of memory space. But at the time, the memory on that system was maxed out (128 MB for $5000).
I also know that one of the games I helped work through beta (Halo) absolutely would not run on hardware much more than a few years old.
Clippy (Score:5, Funny)
Re:Clippy (Score:5, Funny)
Re:Clippy (Score:5, Funny)
It's kinda like the Matrix, only less resource-intensive, and without as much "whoa" time.
Change isn't always a given. (Score:4, Insightful)
A few things (Score:5, Informative)
Some good things that have eaten more memory and cycles (all of which have improved the user experience, as opposed to what the summary states):
1. Programs that check your work as you go (e.g. autocalculate on spreadsheets)
2. More help dialogs, things watching for cameras, and whatnot to smooth the user experience
3. More use of IM and other software running in the background much of the time
4. Services running so that it's faster to sort and search files, open your favorite programs, etc.
In short: lots of stuff running to make your experience smoother, even if it doesn't look like it's doing much more.
Some bad things:
1. More viruses, etc.
2. The mandatory virus scanner that has to run in the background all the time because of (1)
3. All the crap adware that gets installed, more than there used to be
These are just a few of the trends I can think of. -- Paul
Reminds me of a song... (Score:5, Funny)
Where have all the cycles gone? Long time ago.
Where have all the cycles gone? Gone to spyware, every one.
When will they ever learn?
When will they ev-er learn?
Re:Reminds me of a song... (Score:5, Funny)
Where has all the spyware gone? Long time ago
Where has all the spyware gone?
Gone to spammers, every one.
When will we ever learn?
When will we ever learn?
(Apologies to Mikhail Sholokhov, Pete Seeger & parent poster)
uh... how many windows are open? (Score:3, Informative)
So a lot of my cycles are going to managing my ability to work in several programs at once. My old iBook at home allows me to have all of two windows open at once... and with noticeable performance drops.
Intel Extreme Graphics (Score:4, Interesting)
User experience unchanged .. nooo way (Score:5, Insightful)
Now that really depends on what you would call 'user experience'.
Compare a file manager 15 years old (PC Tools had one right
Compare pine to Thunderbird. (though I still use pine on my old laptop
Compare Usenet clients, or say Lynx to Firefox,
Compare Doom 3 to Pac-Man
Compare the fancy graphics on OS X to Win 3.1, or whatever OS the Mac had then
No sirrr, I say the user experience of performance HAS changed. Maybe not in direct proportion to the processor speed increase (due to code bloat?), but it's still much, much better. That's my $0.02.
Hello world of today (Score:3, Funny)
Re:Hello world of today (Score:3, Funny)
; This program displays "Hello, World!"
        dosseg
        .model small
        .stack 100h
        .data
hello_message db 'Hello, World!',0dh,0ah,'$'
        .code
main proc
        mov ax,@data
        mov ds,ax
        mov ah,9
        mov dx,offset hello_message
        int 21h
        mov ax,4C00h
        int 21h
main endp
        end main
"submitted by: bronson@engr.latech.edu (Patrick Bronson)"
Re:Hello world of today (Score:3, Funny)
Re:Hello world of today (Score:3, Funny)
Heh (Score:5, Insightful)
Where are my CPU cycles and memory going on my AMD 3500+ with 1GB of 400MHz DDR RAM? Most of the time, nowhere: 1% CPU usage, commit charge 150 megs out of a gig. Honestly, if you don't use CPU-intensive apps, there's a limit to the 'improvement' you can expect in 'graphical display' and 'word processing' speed. But a sales rep will tell you otherwise, for sure.
Re:Heh (Score:3, Insightful)
That's true and untrue, depending on how you look at it. I have a CPU meter on my personal machine and I expect it to hover around 0. Right now it's just got a little activity going as I type this, because it's spell-checking each word as I type. Much less than 10% of the CPU. (I wish OSes came with these things so that people were aware of what was going on with their machine, like if it's
Uhmmm, no. (Score:5, Funny)
Have you used a Mac lately? (Score:3, Interesting)
Stuff running in the background (Score:5, Insightful)
1) VoIP Client
2) Messaging Client
3) Word Processor
4) Multiple Web Browsers
5) Email Client
6) Probably some graphics or photo editing tool
7) Something playing music
In addition there are various other background processes like desktop indexing, things watching for my digital camera being plugged in, smart start stuff...
Linux is probably worse, since I keep Apache and often Tomcat running all the time.
Back in the day, this was never how it was done. You'd optimize config.sys to get the absolute max amount of free conventional memory.
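For those who missed the era, that tuning lived in CONFIG.SYS and looked something like this (a typical from-memory example; paths and driver choices varied by machine):

```
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB
DEVICEHIGH=C:\DOS\SETVER.EXE
FILES=30
BUFFERS=20
```

HIMEM.SYS enables extended memory, EMM386 maps upper memory blocks, and DOS=HIGH,UMB plus DEVICEHIGH move DOS itself and the drivers out of the precious 640K of conventional memory.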
Multitasking has improved to the point that many users probably run close to 100 processes at any point in time.
prstat here says i'm on a system with
Total: 3741 processes, 6739 lwps
Fair enough it's a shared box, but that scale was impossible a decade ago.
Mac OS X (Score:5, Insightful)
So Apple is bucking the trend, or their first versions of OS X were an inefficient piece of crap and they are just now optimizing it.
Re:Mac OS X (Score:3, Interesting)
Mac users are demanding and impatient. All that typical slowness you see logging in, opening apps, closing windows, etc., with no feedback on XP makes Mac users want to pluck their eyes out.
You can come out with something quite elegant, like iPhoto 2, but if the performance isn't there that's all you're going to hear about. Mac users will whine incessantly until it's fixed.
Re:Mac OS X (Score:5, Informative)
Apple pushed to get OS X released to the public and so they followed the belief of "make it work then optimise". Today we can see the fruits.
An example of this is Quartz. Quartz basically had all the components you needed in 10.0 to do some great on screen rendering and it was reasonably fast. Through each iteration of Mac OS X though it has improved. In 10.1 the speed of the code was improved. In 10.2 we had partial acceleration via the GPU. In 10.3 more optimising. In 10.4 we can see they have completely pulled apart sections of Quartz and rewritten it as well as buffering it all onto the graphics card. That is but one example though, there are plenty of others.
On the other hand, apps like iPhoto and GarageBand were really sluggish, and the system reflected that. Mac users cried foul, and now you have iPhoto 5, which is blazingly fast, and practically all the apps have been following that trend. I know as a developer myself I spend a good 20-30% of my time optimising code simply so users get the speed they're now used to. It's good, we needed it, especially when we were stuck on the G4s. Now with the G5s it's just icing on the cake.
subjective performance... (Score:4, Insightful)
The subjective performance of overall data processing hasn't changed much, but that's just because task complexity has increased as CPU speed increased.
Fifteen years ago, most applications were far less computationally complex than they are today. It has little to do with code bloat.
In the old days, ... (Score:4, Interesting)
Apple II (1 MHz 6502) did animated graphics with sound and controlled floppy access while polling the keyboard (The Bard's Tale)
Amiga (7 MHz 68000) had a complete GUI, multi-tasking, in 256K of RAM.
The old saying that "Intel giveth, Microsoft taketh away" is about right. The CPUs have gotten faster, with the Microsoft OS taking more and more cycles to do the same thing.
DCTI (Score:5, Funny)
So the problem is one of expectations (Score:3, Interesting)
And then I load up my MUD client, with simple, 16 color text in a 12 point font. This is my favorite game.
And then I load up my word processor, AbiWord, which renders as fast as I can type and has a nice spell-checker. This is my favorite word processor.
And then I load up Kmail, Mozilla, and all the other "normal applications" which have never had a problem with virii or worms.
And after all this they realize, the problem with my computer is THEIR expectations, not my software and hardware.
(And then they ask me when I'm going to replace my rotary phone... I can't win them all.)
My favorite theory... (Score:5, Insightful)
In this case there's a plentiful supply of cycles. There's an (existing) demand for a certain level of performance; if the supply outstrips that demand, then the supply is devalued, and consequently programmers don't spend as much time conserving that resource.
Or to put it another way, programs behave like a gas with respect to responsiveness and user expectation; they expand to fill the available space.
Or to reword it another way (quoting from the article): computers are, in fact, doing more than they used to. A lot of the things computers do are fairly subtle, happening beneath the radar of a user's perception. Many functions are automatic and, as discussed in last month's column, you could probably do without some of them.
I can't work 2^(years/1.5) faster... (Score:5, Interesting)
Each time I plug in a new joystick and it just works, each time I plug in a new digital camera and it's just there as another drive, each time I alt-tab out of a game, check a walkthrough website, then alt-tab back, I think back to the old days where code was really efficient and didn't do any wasteful background tasks like that.
I remember helping a friend with a C++ assignment, via the net. Each time, she'd have to exit her telnet program, run Borland's C++ compiler from the command line, check the output, quit the compiler, reopen telnet, reconnect to the MUD we were talking over, then describe what had happened. Now... She'd just show me what's on her desktop via Messenger while we kept chatting.
And if some cycles get used up doing weird UI gimmicks that I'll never use--like making the UI scalable so the partially sighted can use it--I'm willing to make that trade.
For all those reasons, I'm more than happy that my 2^(years / 1.5) faster PC "wastes" all of those extra cycles. And that's before we get on to things like built in spell checkers and real time code debugging as I write it.
I don't want a 2^(years / 1.5) faster experience. I want all those cycles put in to making things work closer and closer to how I just expect them to work.
I don't know about anyone else, but I can't code 2^(years / 1.5) faster, so I wouldn't be able to keep up with that damn responsive text-based compiler. On the other hand, I am that much faster overall, because I now call an API that adds all that "bloatware" instead of having to code my own damn mouse drivers, my code is largely debugged on the fly, and I can't remember the last time I lost several days just trying to format a newsletter into columns.
So, before saying the cycles are wasted:
Pick an everyday but semi-complex task that people do now. For example: for a homework project, go online, grab half a dozen graphics and ten blocks of text from websites, and put them all into a stylishly laid-out newsletter format. Do that on a P4, then do it on a DOS PC from 15 years ago.
See if matching the same quality of work doesn't take you 2^10 times as long on that old PC, assuming you can even do it at all.
Those cycles aren't wasted. Sure, we do the same basic tasks, but we do them with vastly more flexibility, and we don't have to waste days of our lives wrestling with configs to do what we now consider simple tasks. That's where the speed is.
I somewhat disagree.. (Score:3, Informative)
The Future of computing (Score:3, Insightful)
Software hiding the complexity (Score:4, Interesting)
The desktop files & folders paradigm is fine, if marketing dweebs would stop designing wizards that hide simplicity in a layer of complexity. What if I had a maid who said: "I see you just set a piece of paper on your desk? Do you want me to file it for you? Great, I'll just shred the original while I'm at it, and you can conveniently ask me to find it whenever you need it!"
Example 1:
My dad plugs in his digital camera, and it displays a camera wizard. Great! It asks for the album name and places it in a convenient album with a nice slide-show.
The next day, he wants to edit one of the pictures, or copy it, or rename it. Too bad. Because it's now in a proprietary format in an album management program. The wizard was completely unnecessary. It would have been easier for him to create a folder and drag the files into it. It would have functioned in the normal way files and folders work. He would know where they are, and could open, email, rename, delete, etc.
Another example: my mom inserts a CD and Media Player asks her if she wants to rip the files to the media library. It even does a CDDB lookup and names the albums accordingly. Great! So where's that .MP3 file? No? Maybe it's a .WMV file? .OGG? .WAV? No... it's in the media library. And there it lies forever. You can't play it with anything else.
Now I want to email that file. But I can't. Because it's not a file on the file system, it's hiding in some "convenient" media library for me. So I show her how to use CDEX, and click the CDDB button, then the RIP button, then whoa! And she can do whatever she wants with it.
(And back to the camera wizard: I want to view the pictures in the order the camera took them.)
Vested Interests (Score:3, Insightful)
* Excerpt from Aldous Huxley's Brave New World *
"We condition the masses to hate the countryside," concluded the Director. "But simultaneously we condition them to love all country sports. At the same time, we see to it that all country sports shall entail the use of elaborate apparatus. So that they consume manufactured articles as well as transport. Hence those electric shocks."
"I see," said the student, and was silent, lost in admiration.
By the way, current number of mouse-clicks to configure viewing an MS Outlook sender in a given color:
17
Don't $top that fat, gravy-train from rolling! Keep the bloatware coming!
Start-up time (Score:3, Interesting)
Does anyone here recall the famous if not accurate "Whoa, Win95 boots in under 3 seconds!!!" usenet thread?
Startup time is currently an area where the likes of Windows XP excel over Linux. On an Athlon 2600+, XP takes 6 seconds to boot (and become usable) while Fedora Core 3 takes closer to 90 seconds.
Yes, both use prelinking (or prefetch if you like), but Linux distros still don't load independent services in parallel, and I suspect Fedora's prelinking is far from optimized.
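The parallel-startup point can be sketched with a toy example (hypothetical service names, with sleep() standing in for real init work): started serially, independent services cost the sum of their startup times; started in parallel, roughly the max.

```python
import threading
import time

def start_service(name, startup_seconds):
    """Stand-in for an init script; sleep() models real startup work."""
    time.sleep(startup_seconds)

services = [("network", 0.2), ("syslog", 0.2)]  # hypothetical, independent

# Serial startup (classic SysV-style init): total = sum of startup times.
t0 = time.time()
for name, cost in services:
    start_service(name, cost)
serial = time.time() - t0

# Parallel startup: total is roughly the max of the startup times.
t0 = time.time()
threads = [threading.Thread(target=start_service, args=s) for s in services]
for t in threads:
    t.start()
for t in threads:
    t.join()
parallel = time.time() - t0

print(f"serial: {serial:.2f}s, parallel: {parallel:.2f}s")
```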
Isn't it obvious? (Score:3, Funny)
1) pr0n
2) Sharing pr0n.
Let's go back to assembly, non-PNP, and text mode! (Score:3, Insightful)
Yes, if I could find a floppy drive, and get a DOS boot disk to boot, I could theoretically run a wickedly fast instance of WordPerfect 5.1. I wouldn't be able to surf the web, send email, listen to MP3s, work wirelessly, or work with graphics though -- and yes, graphics (e.g. diagrams) do have a proper place in day-to-day work!
Do people even remember the non-PNP days? IRQs, I/O ports, and the rest? No multitasking? No memory protection (i.e., complete OS crashes from app errors)? These issues didn't seem so bad back then since "that's the way it was", but now I dread ever having to deal with those limitations again. Futzing with IRQs for an hour just to make a modem stop locking up a PC is not my idea of productivity.
Hardware is cheap. Time isn't. I just hope we keep finding more ways to make my use of computers even easier.
ummm... kernel compile? (Score:5, Insightful)
I'd say that's an improvement, wouldn't you?
User experience of performance HAS improved (Score:5, Interesting)
When I bought a Centris 650 in the early '90s, it was noticeably faster--so much faster that I brought it to work to show my boss, as I was sure he would not believe my stories of how fast it was.
This same thing has happened to me with every generation of PCs, too...it's not just a Mac thing. I buy a new machine, and marvel at how much faster it is.
Furthermore, I can go the other way to verify this. I still have my Centris 650 in storage, and booted it up a couple years ago. It was so slow that I could not believe that I ever found such a slow machine usable.
What is really going on is that it doesn't take us long to get used to a fast machine, and since we normally never go back, we don't realize just how much faster things are now.
That's easy (Score:4, Funny)
Crap uses up processor time.
Re:Fun fact! (Score:3, Informative)
I'm sooo sick of people looking in Task Manager then saying how much an application sucks because of how much "memory" it uses. For the most part, memory is not a factor.
Re:Fun fact! (Score:5, Insightful)
Reporting the total memory allocated by an app is meaningless, because much of that memory includes things like memory-mapping DLLs that are only loaded once, and shared amongst all processes using them.
Windows constantly trims pages from the working set of each process that haven't been accessed in a while and pages these out to disk.
When you click the minimize button, Windows assumes that the app (or at least large parts of it) will be inactive for a while, so it tries to remove as many pages from the working set as possible - hugely reducing the "memory usage" reported in Task Manager. Sometimes it is a little too aggressive in trimming pages from the working set, and the "memory usage" immediately climbs a little again, as the app accesses memory pages that Windows thought it could page out.
Try it with an app like MS Word that has background threads that constantly check spelling and stuff, and you'll see the working set goes up for a few seconds after you minimize it.
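The distinction the parent draws (working set vs. total allocated) has a rough Linux analogue in the VmRSS vs. VmSize fields of /proc/self/status. A minimal sketch, assuming a Linux system (the Windows counters themselves aren't exposed this way):

```python
# Resident vs. virtual memory for the current process on Linux.
# VmRSS is roughly the "working set" a task manager shows; VmSize counts
# everything mapped (shared libraries, reserved-but-untouched pages, ...),
# which is why "total memory" figures overstate what an app really uses.

def memory_kb():
    sizes = {}
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith(("VmRSS:", "VmSize:")):
                key, value = line.split(":")
                sizes[key] = int(value.split()[0])  # values are in kB
    return sizes

m = memory_kb()
print(f"resident: {m['VmRSS']} kB, virtual: {m['VmSize']} kB")
```

The virtual figure is always at least the resident one, usually far larger, just as the parent describes for Task Manager's numbers.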
Re:Fun fact! (Score:3, Insightful)
You think that's annoying? You should try having to put up with all the people who complain "X is sooooo bloated" because they looked at memory usage in top but have no idea how it is calculated for X, nor what the figure really means. Every single article that has any mention of X11 gets at least 15 posts all saying X is bloated based on t
Re:Fun fact! (Score:3, Informative)
Windows uses virtual memory. Each process has its own 4GB virtual address space. The address space is split into 4KB pages. Each page may be free (not backed by any real memory; access to it is invalid) or committed (there is actual memory behind it). For each committed page, there is a matching 4KB page in some file on the hard driv
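The page bookkeeping described above is just integer arithmetic on addresses; a quick sketch (the 4KB size matches the x86 pages the comment describes, though other platforms differ):

```python
PAGE_SIZE = 4 * 1024  # 4KB pages, as in the parent comment (x86); not universal

def page_number(address):
    """Index of the page an address falls in."""
    return address // PAGE_SIZE

def page_base(address):
    """Start address of the page containing `address`."""
    return address & ~(PAGE_SIZE - 1)

addr = 0x401234
print(hex(page_base(addr)), page_number(addr))  # 0x401000 1025
```

This is why per-page state (free vs. committed, resident vs. paged out) is cheap for the OS to track: any address maps to its page with a shift and a mask.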
Re:Code Bloat (Score:4, Insightful)
Re:Code Bloat (Score:5, Insightful)
Re:Code Bloat (Score:3, Informative)
Man, that word sure is thrown around a lot.
My feeling is that the people who call a lot of stuff bloat are the same ones who view managers as useless loudmouths.
Truly time-critical code is executing faster than ever. Compilers are smarter, hardware is faster, and development systems are cleaner.
Programs are more featureful, intuitive, and pleasant-looking than ever. (I know, very general.)
Then again, if you are simply one of those minimalists who thinks that everything had every feature it ne
Re:Get a clue! (Score:4, Interesting)
The Story of Mel
This was posted to Usenet by its author, Ed Nather
(nather@astro.as.utexas.edu), on May 21, 1983.
A recent article devoted to the macho side of programming
made the bald and unvarnished statement:
Real Programmers write in FORTRAN.
Maybe they do now,
in this decadent era of
Lite beer, hand calculators, and "user-friendly" software
but back in the Good Old Days,
when the term "software" sounded funny
and Real Computers were made out of drums and vacuum tubes,
Real Programmers wrote in machine code.
Not FORTRAN. Not RATFOR. Not, even, assembly language.
Machine Code.
Raw, unadorned, inscrutable hexadecimal numbers.
Directly.
Lest a whole new generation of programmers
grow up in ignorance of this glorious past,
I feel duty-bound to describe,
as best I can through the generation gap,
how a Real Programmer wrote code.
I'll call him Mel,
because that was his name.
I first met Mel when I went to work for Royal McBee Computer Corp.,
a now-defunct subsidiary of the typewriter company.
The firm manufactured the LGP-30,
a small, cheap (by the standards of the day)
drum-memory computer,
and had just started to manufacture
the RPC-4000, a much-improved,
bigger, better, faster -- drum-memory computer.
Cores cost too much,
and weren't here to stay, anyway.
(That's why you haven't heard of the company,
or the computer.)
I had been hired to write a FORTRAN compiler
for this new marvel and Mel was my guide to its wonders.
Mel didn't approve of compilers.
"If a program can't rewrite its own code",
he asked, "what good is it?"
Mel had written,
in hexadecimal,
the most popular computer program the company owned.
It ran on the LGP-30
and played blackjack with potential customers
at computer shows.
Its effect was always dramatic.
The LGP-30 booth was packed at every show,
and the IBM salesmen stood around
talking to each other.
Whether or not this actually sold computers
was a question we never discussed.
Mel's job was to re-write
the blackjack program for the RPC-4000.
(Port? What does that mean?)
The new computer had a one-plus-one
addressing scheme,
in which each machine instruction,
in addition to the operation code
and the address of the needed operand,
had a second address that indicated where, on the revolving drum,
the next instruction was located.
In modern parlance,
every single instruction was followed by a GO TO!
Put that in Pascal's pipe and smoke it.
Mel loved the RPC-4000
because he could optimize his code:
that is, locate instructions on the drum
so that just as one finished its job,
the next would be just arriving at the "read head"
and available for immediate execution.
There was a program to do that job,
an "optimizing assembler",
but Mel refused to use it.
"You never know where it's going to put things",
he explained, "so you'd have to use separate constants".
It was a long time before I understood that remark.
Since Mel knew the numerical value
of every operation code,
and assigned his own drum addresses,
every instruction he wrote could also be considered
a numerical constant.
He could pick up an earlier "add" instruction, say,
and multiply by it,
if it had the right numeric value.
His code was not easy for someone else to modify.
I compared Mel's hand-optimized programs
with the same code massaged by the optimizing assembler program,
and Mel's always ran faster.
That was beca
Re:Code Bloat (Score:4, Insightful)
You want language interop, you have to live with COM, DCOM, CORBA, XML Web services, SOAP, UNO. If not, then go back to your Amstrad or Sinclair.
FFS, you can't have your cake and eat it.
Its not bloat if you derive utility from it (Score:5, Insightful)
I derive utility from them and my hardware can handle it. This is not bloat.
Re:Its not bloat if you derive utility from it (Score:5, Insightful)
Bloating isn't just in RAM, CPU, and disk space. It's now happening to other resources, like network bandwidth, too. Ten to 15 years ago I could do a lot of very useful stuff over a 1200 baud dial-up. Now a 50K dial-up is pure crap for many purposes, mainly because of bloat.
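The scale of that gap is easy to put in numbers. A rough sketch, assuming 8N1 framing (10 bits on the wire per byte) and an illustrative 500KB "bloated" page -- both figures are my assumptions, not the parent's:

```python
def transfer_seconds(size_bytes, line_bits_per_sec, bits_per_byte=10):
    """Time to move size_bytes over a serial line; 10 bits/byte models 8N1 framing."""
    return size_bytes * bits_per_byte / line_bits_per_sec

bloated_page = 500 * 1024  # hypothetical modern page, in bytes

print(transfer_seconds(2 * 1024, 1200))       # a 2KB text page at 1200 baud: ~17 s
print(transfer_seconds(bloated_page, 50_000)) # the 500KB page at 50 kbit/s: 102.4 s
```

So even a ~40x faster line feels slower once the payload grows ~250x: the old link moved a useful page in under 20 seconds, while the new one needs nearly two minutes.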
Re:Code Bloat (Score:3, Interesting)
Re:Code Bloat (Score:5, Insightful)
With a lot more files, and a lot larger files, your performance at file access (and disk caching) will decrease unless you can increase throughput and decrease the seek time/latency of data access from the disk.
Back in 1998, most PCs were shipping with 5400 RPM hard drives; I bought a used 10,000 RPM drive for $200. Nowadays? Most PCs are still shipping with 5400 RPM drives, and 10,000 RPM is still good performance (though not the best out there; then again, mine back then wasn't top of the line either).
Disk sizes are scaling up wonderfully. Disk access speeds are not. The same holds true for RAM.
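The physics behind that stagnation is simple: average rotational latency is half a revolution, so it only improves with spindle speed. A quick sketch of the arithmetic:

```python
def avg_rotational_latency_ms(rpm):
    """Average rotational latency: half a revolution, in milliseconds."""
    revs_per_sec = rpm / 60
    ms_per_rev = 1000 / revs_per_sec
    return ms_per_rev / 2

print(avg_rotational_latency_ms(5_400))   # ~5.56 ms
print(avg_rotational_latency_ms(10_000))  # 3.0 ms
```

Going from 5400 to 10,000 RPM buys less than a factor of two in latency, which is why drives from 1998 and today feel so similar even as capacities explode.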
It's not just that... (Score:5, Insightful)
Windows 3.1 may have been incredibly snappy, but it also lacked proper memory protection, wasn't 32-bit addressed, and didn't provide an intuitive interface. Also, more advanced typography (anti-aliasing) and filesystem indexing services like Spotlight come into play, as well as all the important system daemons running in the background that are now considered stock.
It's not that things aren't getting any faster because of bloat. It's that as power increases, adding modern features to the original experience uses up that extra power, just as it should.
Re:It's not just that... (Score:3, Insightful)
I see your point, but let's put things in perspective - Windows 3.x was not snappy on the common hardware of the day, which was pretty much a 486 DX2-50/66 with 8MB (on average) of RAM and slowish IDE HDs. It really wasn't until the Pentium 60/66 became more mainstream (and I managed to get my hands on one) that 3.x actually seemed damn fast. This was especially true for WFW, which was rather slow on a DX33, but OTOH it actually got better if you threw 32MB or
No. The features existed in 1991 and were faster! (Score:4, Interesting)
Not only that, but the program was well-designed enough to provide four different levels of UI complexity (allowing new users to use it without getting lost while expert users could enable all the features and even customize the toolbars), and the PC/GEOS environment itself provided multiple threads per process and preemptive multitasking but was fast enough to be considered "fast" on my 286 with 1MB of RAM and a VGA card.
The PC/GEOS folks got around the bloat because they were interested in doing so, and they were successful in almost all respects.
Modern coders seem a lot less interested in doing so, perhaps because so many of them take the bloat for granted. It wasn't always so, as many of us remember...
Microsoft Called.... (Score:3, Insightful)
Re:So (Score:3, Insightful)