Despite Aging Design, x86 Still in Charge
An anonymous reader writes "The x86 chip architecture is still kicking, almost 30 years after it was first introduced. A News.com article looks into the reasons why we're not likely to see it phased out any time soon, and the history of a well-known instruction set architecture. 'Every time [there is a dramatic new requirement or change in the marketplace], whether it's the invention of the browser or low-cost network computers that were supposed to make PCs go away, the engineers behind x86 find a way to make it adapt to the situation. Is that a problem? Critics say x86 is saddled with the burden of supporting outdated features and software, and that improvements in energy efficiency and software development have been sacrificed to its legacy. And a comedian would say it all depends on what you think about disco.'"
Does it matter? (Score:5, Interesting)
At this point, does it matter as much? The future is clearly x86-64, which is MASSIVELY cleaned up compared to x86. Sure, at this point we still boot into 8086 mode and have to switch up to x86-64, but that's not very important; it only lasts a short while.
As we move off of x86 onto -64, are things really still that bad? Memory isn't segmented, you have 16 general-purpose registers instead of 8, and you don't have operands hardwired to particular registers (the way MUL and DIV were tied to AX, for instance) as some 16/32-bit instructions were.
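A quick taste of the difference (a minimal sketch, assuming GCC or Clang targeting x86-64, and using their explicit-register-variable extension just to make the point visible): registers r8 through r15 simply did not exist in 32-bit x86.

#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* Pin a value to r10, one of the eight registers x86-64 added;
       32-bit code had only eax..edi to work with. */
    register uint64_t tmp asm("r10") = 40;
    asm("addq $2, %0" : "+r"(tmp));
    printf("%llu\n", (unsigned long long)tmp);  /* prints 42 */
    return 0;
}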
Of course, we should have used a nice clean architecture like the 68k from the start, but that wasn't what was in the first IBM PC... and we all know how things went from there.
It's hairy to emulate, too (Score:5, Interesting)
The x86 has so many modes of operation (SMM, real/protected, a pile of vector instruction sets, 16/32/64-bit modes) and special cases that it's a pretty big project to get emulation working correctly (much less fast). You're pretty much stuck with a 10x clock-for-clock slowdown on the host. Making an emulated environment secure is hard, too; you don't necessarily need specialized hardware here (e.g., specialized MMU mapping modes), but it helps.
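Even step zero of an emulator, finding where one instruction ends and the next begins, is already messy. A minimal sketch (assuming a flat byte buffer and 32-bit mode only; a real decoder also needs ModRM, SIB, displacements, immediates, and the escape-opcode maps):

#include <stdint.h>
#include <stdio.h>

/* x86 legacy prefixes can stack in front of an opcode. */
static int is_prefix(uint8_t b) {
    switch (b) {
    case 0x66: case 0x67:               /* operand/address-size overrides */
    case 0xF0: case 0xF2: case 0xF3:    /* LOCK, REPNE, REP */
    case 0x2E: case 0x36: case 0x3E:    /* CS, SS, DS overrides */
    case 0x26: case 0x64: case 0x65:    /* ES, FS, GS overrides */
        return 1;
    default:
        return 0;
    }
}

int main(void) {
    /* "rep movs" with an operand-size override: F3 66 A5 */
    const uint8_t code[] = { 0xF3, 0x66, 0xA5 };
    size_t i = 0;

    while (i < sizeof code && is_prefix(code[i]))
        i++;
    printf("%zu prefix byte(s), then opcode 0x%02X\n", i, code[i]);
    return 0;
}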
And now, with transistor speed gains bottoming out, they want to go multicore and make *more* of the things, which is exactly the opposite direction that I want to go in...
Re:The X86 is a pig. (Score:3, Interesting)
In this day and age of multi-core CPUs, why not have a processor with an x64 ISA core and a core with the desired architecture? Let them run in parallel like 32/64-bit compatible CPUs. Old software would run on the x64 core, and newer software or updated versions could run on the newer core. Maybe this could provide a crutch for the PC world to modernize over time.
60% of transistors used for legacy modes? (Score:5, Interesting)
Who is this guy and what is he smoking? Over half of a modern processor is cache. The instruction decoding and address decoding are a small fraction of the remainder. Where does he get the 60% from?
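The arithmetic backs that up. A rough check, assuming a standard 6-transistor SRAM cell and round Core 2-era numbers (a 4 MB L2 on a die of roughly 291 million transistors; both figures are assumptions for illustration, and tag arrays and other overhead are ignored):

#include <stdio.h>

int main(void) {
    long long cache_bits = 4LL * 1024 * 1024 * 8; /* assumed 4 MB L2  */
    long long sram_cell  = 6;                     /* 6T SRAM cell     */
    long long die_total  = 291000000LL;           /* assumed die size */
    long long cache_t    = cache_bits * sram_cell;

    printf("cache data array: ~%lld transistors, %.0f%% of the die\n",
           cache_t, 100.0 * cache_t / die_total);
    return 0;
}

On numbers like those (this prints roughly 69%), decode logic, legacy or not, can't be anywhere near 60% of the die.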
Re:The X86 is a pig. (Score:2, Interesting)
It's convenient to have a consistent interface layer, and the gate-count cost of the translation is asymptotically zero. It makes writing good optimizing compilers for "generic x86" all but impossible, but fortunately the final levels of optimization are done in real time in the processor itself. It's actually a pretty cool approach to squeezing as much parallelism as possible out of non-parallel code, given a transistor budget in the neighborhood of 1e8.
--
phunctor
+/- epsilon on the details...
Legacy Support Drives It (Score:5, Interesting)
I know we all bitch about old designs and legacy support for outdated features, but one of the things that keeps people from moving from one OS to another is the existing base of installed software and knowledge of existing software. Like it or not, the major player is Microsoft. No matter how much a geek says MS UIs suck, people are comfy with them. If alternative OSes had the same software offerings with the same UI, people would be able to move to them. The same holds true for processors.
No matter how well a processor performs, if there is no application base for it, no one is going to buy a machine with that processor. In this case, perception is reality. You walk into a software store, you see 16 rows of Windows applications, half a row of Linux, and 5 rows of Apple.
What processor family runs each of these? Guess who has moved to the dominant processor?
The only way to build a software base is to build in legacy support. Then start weaning users away from the legacy features, get programmers to stop using those features (mainly those building the compilers that developers use), and move towards the more advanced features.
x86 rules for a reason. Microsoft rules for a reason. The customer is comfortable with them, and their perception is reinforced every time they go to the store.
Emulation (Score:1, Interesting)
Unfortunately, that would only eliminate a small fraction of the baggage. And I can't honestly say I'd trust Microsoft to do it right. If I depended on legacy apps for my business I would probably want to stick with the hardware implementation.
Nevermind.
Re:The X86 is a pig. (Score:4, Interesting)
Basically, x86 isn't a perfect instruction set for today's landscape, but then again UNIX isn't a perfect operating system for today's landscape; that doesn't mean it's not still very good, or that we shouldn't praise those who have made it so good.
Some say Plan 9 has a better design than Linux, some say PPC has a better design than x86, but apparently design isn't everything.
Lots of things could be better if we could get everyone to migrate from what they currently use, but would it be worth it in this case? I don't think so, at least not until we reach the limits of what better design and hardware can achieve.
Re:Windows (Score:1, Interesting)
But the real point is: since this is a monopoly, you don't have any choice, so what is the point of asking such questions? I mean, if NT was in fact just a bunch of MS-DOS scripts pretending to be a multiuser kernel, would you be able to complain about it? Would you be able to move to any other system anyway?
Re:Does it matter? Less than it did (Score:4, Interesting)
I would add to this that the ISA mattered a lot more when I wrote code in assembly language. For a clean (and simple) instruction set architecture, I fondly remember the PDP-11 [wikipedia.org]. Later on, the 680x0 offered more powerful addressing modes at the cost of some simplicity (and consistency). Compared to both, the x86 was infuriating to work with.
ISAs still mattered, but less, in my early "C" days, when source-level debugging was less robust and I needed to understand what the compiler was turning my code into so I could figure out where to optimize.
Today, it hardly matters at all. Looking at generated code tells me little about how the processor with multiple execution units is going to process it; it is necessary to trust the compiler and its optimization strategy. It matters even less with interpreted or JIT'd languages, where the work eventually performed by the processor is far removed from my code. Knowing what's happening at runtime involves much more important factors than the ISA.
Re:English is 700 years old (Score:5, Interesting)
RISC architectures don't give very good code density, so ARM has its Thumb compressed instruction set; that's how embedded processors achieve good power efficiency, by cutting down the memory traffic that instruction fetches generate.
You can think of x86 as a way to compress the storage needed for the equivalent RISC instructions that perform the same work. That means you make better use of available memory bandwidth and caches, etc.; your memory is vastly slower than the processor, so you've got to use its bandwidth efficiently.
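To put toy numbers on the density argument (the encoding sizes below are hand-tallied and should be treated as assumptions; a real comparison would disassemble actual binaries):

#include <stdio.h>

int main(void) {
    /* Bytes of instruction fetch for the same "r = r + 1" operation:
       "inc eax" is 1 byte in 32-bit x86, a classic fixed-width RISC
       add is 4 bytes, and ARM Thumb compresses it to 2. */
    struct { const char *isa; int bytes; } enc[] = {
        { "x86 (inc eax)",    1 },
        { "fixed-width RISC", 4 },
        { "ARM Thumb",        2 },
    };
    for (int i = 0; i < 3; i++)
        printf("%-18s %d byte(s)\n", enc[i].isa, enc[i].bytes);
    return 0;
}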
Re:English is 700 years old (Score:4, Interesting)
If I could stick with Windows 2000, I would have. Windows upgrades since then have just been eye candy.
Re:Does it matter? Less than it did (Score:5, Interesting)
What's with all this dissing of the X86?
Like you, I'm an old fart; I wrote assembler code for the PDP-8, PDP/LSI-11 and the 68k. They were ok: easy to learn and use, but I always preferred the X86.
Sure, it was harder to learn and I never got past having the blue book on my desk when I was coding but, in the end, it produced smaller, faster code. There were a number of apps I wrote for multiple platforms, so I got to compare. Also, (the same reason I love perl) you could do astounding things with side-effects.
Commercially, X86 has staying power because it was architected to scale. Variable-length instructions with lots of space in the opcode range let Intel adapt the design to any new demands. Most, if not all, of the complaints about X86 (e.g. too few registers) are just version features: yesterday's news as soon as there's market demand for an improvement.
Bottom line: it ain't neat, but that doesn't matter; it's programmed once and used millions of times. Programmer convenience is irrelevant.
Re:English is 700 years old (Score:4, Interesting)
Each 4k page of code is compressed with whatever scheme the processor supports. The first byte indicates the compression algorithm for the block, or 0 for no compression (this byte is skipped over like a nop when executing sequentially from one block to the next). A block can't decompress to more than 4k, so many blocks will only be, say, 50% full. This saves on memory bandwidth even more than x86 does, and it is upgradable. If you want to save memory size too, let each page decompress to more than 4k, and have each jmp take an address that is 50 bits of page number and 14 bits of offset into the decompressed page; we went to 64 bits to address data anyway, not code. Then you get both bandwidth and in-core savings.
The CPU in some cases may have to fetch and decode a whole block just to run a 10-byte function. So it could be an exceptionally bad idea, but I think compilers can learn to lay things out to minimize this (sounds like RISC...).
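A sketch of what that hypothetical scheme might look like in software terms (nothing here corresponds to real hardware; the constants come straight from the post above):

#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define STORED_BLK 4096   /* compressed blocks are stored in 4k pages */
#define OFF_BITS   14     /* offset into the decompressed page        */

/* Byte 0 names the decompressor; 0 means the block is stored raw. */
static size_t decompress_block(const uint8_t *blk, uint8_t *out) {
    switch (blk[0]) {
    case 0:
        memcpy(out, blk + 1, STORED_BLK - 1);   /* raw: just copy */
        return STORED_BLK - 1;
    default:
        return 0;   /* codes 1..255: real decompressors would go here */
    }
}

int main(void) {
    static uint8_t stored[STORED_BLK];   /* byte 0 == 0: uncompressed */
    static uint8_t page[STORED_BLK];
    printf("decompressed %zu bytes\n", decompress_block(stored, page));

    /* A jmp target splits into 50 bits of page number and 14 bits of
       offset into the *decompressed* page. */
    uint64_t target = 0x0000123456789ABCULL;   /* hypothetical address */
    printf("page %llu, offset %llu\n",
           (unsigned long long)(target >> OFF_BITS),
           (unsigned long long)(target & ((1ULL << OFF_BITS) - 1)));
    return 0;
}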
Re:In short:x86 is the result of natural selection (Score:3, Interesting)
One of the things about evolution is that it can only work with what it has, which is why our backs hurt all the time. Evolution can't just suddenly stick a good spine/leg support/locomotion system in, but works with what already exists, intended for quadrupeds. (This is, in essence, the area that the Irreducible Complexity crowd are attacking.)
But look at x86 and its dominance over Itanium. Itanium is a *good* design, but x86 is outcompeting the hell out of it, because with a kludge here and a workaround there it could be iteratively fixed up to outperform Itanium. x86 has evolved to be the top dog despite going up against the intelligent design of Itanium, showing that the criteria for success aren't always what we think they are.
Re:English is 700 years old (Score:2, Interesting)
I have to inform you that you have a very poor opinion of the easiest "natural" (vs. artificial such as Esperanto and Quenya) language on earth. I learned to speak English in six weeks!
English IS easy. Dead easy. If you find it difficult, come try French sometime: we have seven forms for every verb, and that's only the first example from our ridiculously complex conjugation system. We have around 120 possible forms for every verb. English speakers get what, three, four? And that's for the irregular verbs, which happen to be... twenty at most. Irregular means they don't follow the rule, right? Okay, here is the other rule: English monosyllabic verbs use the other vowels, but not the E. Drink, drank, drunk. No E. (Y doesn't count.) When in doubt, use euphony and talk fast enough to cover your possible mistake. (I can't remember the rule, have no English grammar book on hand, don't want to search for one online, and never need it anyway, since I write English without a single mistake. And French, which is my primary language.)
If that seems too anecdotal for evidence, let me just point out that I once wrote out an English grammar and it came to 30 handwritten A5 pages. Now go look at the hulking three-volume monster that is French grammar.
Re:English is 700 years old (Score:3, Interesting)
I almost did the same to an ex-gf's PC, but I first put an Ubuntu 6.10 CD in it, to see if it was compatible, what exactly was in the computer, and so on. Then I wiped the preloaded Vista and installed Ubuntu "for a try", and after 24 hours she told me she was not going back.
That's the woman who hated to have to use a computer for chatting and browsing. Since her first computer in 2002, she hasn't learnt one thing about how computers work, or even how to use them!
I'm not the first to point out that Vista is so confusing that switching to Linux or OSX is easier than learning to use it. But I've seen that it is true.
Re:English is 700 years old (Score:2, Interesting)
English does pull words from many other languages (a legacy of our long history of international relations, both British and American), but a huge proportion of those words are rooted in Latin, Greek, and Germanic languages. Especially in science and technology, the Latin and Greek roots are pronounced. Studying Spanish, for instance, I find it easy to make at least a few educated guesses about words because Spanish is largely Latin-based (with Greek as well, because the Romans used many Greek scientific words). Generally it is easy to look at an English word and at least guess at its origin once you have a language or two to compare it with.
Also, many relatively new words, especially in the area of technology, are the same or very similar in other languages. In Spain, computer is 'ordenador' (literally, 'orderer'), but in Latin America it is 'computadora' ... which is obviously taken from English. Likewise, in Spain, a car is 'coche' (which probably comes from the older English word 'coach'... not the coach of a team but coach as in stagecoach... or at least that is how I remembered it at first), and in Latin America it is 'carro'... which again, is obviously in line with the English 'car'.
Anyway, what I am saying is that yeah, English is irregular and steals words from all over the place, but this promiscuity often makes it easier to learn (or easier to learn a language from which we have taken words), because there are a lot of cognates and relatively few truly original words once you get past the boilerplate.
Grammar likewise isn't too convoluted. Gerunds (-ing words) exist in Spanish as well (-ando, -iendo words). Same with participles (-ed words in English, -ado and -ido words in Spanish), some other endings (-er words in English, -or and -dor words in Spanish), and even some basic structures like 'I am going to' ("I am going to swim") which is 'Voy a' in Spanish ("Voy a nadar").
I know where you are coming from, and I have had non-native speakers say that the hard part about English is the vocabulary, not because the meanings of the words themselves are that hard to figure out, but because we have no hard and fast rules of pronunciation (since the words we took from other languages were either taken verbatim or converted to be more 'English-Sounding').
Since I only speak two languages, I can't generalize to all non-native speakers, but these are my observations with Spanish at least (which is the second of my measly two).
Re:The X86 is a pig. (Score:3, Interesting)
Re:Simple! (Score:3, Interesting)
Not to me or to car manufacturers. If we started from scratch today, could we do better than current engines? I guess maybe. I don't really know. If so, you'd think someone would have, on some significant scale.
It's considerably more obvious to me that starting from scratch one could design a more efficient microprocessor instruction set than x86. People have, repeatedly.
So the engine case seems like a pretty lame way to make any kind of argument about the instruction set case; the analogy is less clear than the question.
And that's not even getting into the basic failure of the analogy in the first place: People keep adapting x86 for backward-compatibility reasons. These do not apply in the engine case. If you can make a more efficient way of deriving propulsion from gasoline, what's stopping you?
Troll? (Score:1, Interesting)
Let's look at the argument that x86 instructions are more compact than RISC instructions. They sure are more compact -- in the binary stored on the hard disk. When modern x86s pull these instructions in, they get decoded into micro-ops, which are then stored in a trace cache. That's right: the cache Intel hopes you picture holding x86 instructions actually stores RISC-style instructions, because that is far more efficient for execution. Also, since instructions are stored in traces, a single micro-op can appear many times within the trace cache -- it just leads to higher performance to fetch work this way. So again, no, RISC instructions don't screw up the instruction cache. The modern x86 instruction cache actually stores RISC-style instructions.
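A toy picture of that cracking step (this is not Intel's actual micro-op format, just the general shape of the idea): a single read-modify-write x86 instruction turns into RISC-style load/add/store micro-ops, and those are what the trace cache holds.

#include <stdio.h>

/* Hypothetical micro-op stream for the x86 instruction
 *     add [rbx], eax      ; read-modify-write in one CISC op
 * after the decoder cracks it into RISC-style steps. */
enum uop_kind { UOP_LOAD, UOP_ADD, UOP_STORE };
struct uop { enum uop_kind kind; const char *desc; };

int main(void) {
    struct uop trace[] = {
        { UOP_LOAD,  "tmp <- mem[rbx]" },
        { UOP_ADD,   "tmp <- tmp + eax" },
        { UOP_STORE, "mem[rbx] <- tmp" },
    };
    for (int i = 0; i < 3; i++)
        printf("uop %d: %s\n", i, trace[i].desc);
    return 0;
}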
The reason that x86 outperforms other architectures is largely the MHz scaling Intel was able to pull off. By playing in the large commodity CPU market, as opposed to niche supercomputer processor markets, they earned a ton of money. They wisely invested this money back into their chip fabs. Intel's VLSI and manufacturing prowess is what led the big crush against the supercomputer processors. Intel's microarchitecture actually emulates RISC microarchitecture -- their marketing just asks that nobody broadcast that fact too loudly.
Re:English is 700 years old (Score:2, Interesting)
Re:English is 700 years old (Score:2, Interesting)
http://www.verbix.com/imag/map_indoeuropean.gif [verbix.com]
Wikipedia also has an interesting article on the Germanic languages.
http://en.wikipedia.org/wiki/Germanic_languages [wikipedia.org]
store string? (Score:2, Interesting)
But there's another reason your instructor was mad at you. Yes, the mechanically generated instructions could have been replaced with more efficient sequences. But you do _not_ want to do that in your first pass with a mechanical translator, especially if it's the first one you've ever written. Once you know what you're doing you can build optimizing passes and combine passes and such, but if you knew that much there would be no reason to take the class you were taking (unless it was just for the grade) and you should also know better than to try to confuse the other students.
joudanzuki
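For readers who haven't written one: a minimal sketch of the kind of separate cleanup pass being described, over a hypothetical two-opcode IR (everything here is invented for illustration). The translation pass stays dumb and correct; a later peephole pass removes the redundancy it emits.

#include <stdio.h>

/* Collapse adjacent "PUSH r / POP r" pairs -- the kind of redundancy a
   naive one-to-one translator emits and a separate pass cleans up. */
enum op { PUSH, POP, ADD };
struct insn { enum op op; int reg; };

static int peephole(struct insn *code, int n) {
    int w = 0;
    for (int r = 0; r < n; r++) {
        if (r + 1 < n && code[r].op == PUSH && code[r + 1].op == POP &&
            code[r].reg == code[r + 1].reg) {
            r++;            /* drop both: "push x; pop x" is a no-op */
            continue;
        }
        code[w++] = code[r];
    }
    return w;               /* new instruction count */
}

int main(void) {
    struct insn prog[] = { { PUSH, 1 }, { POP, 1 }, { ADD, 2 } };
    printf("3 insns -> %d after peephole\n", peephole(prog, 3));
    return 0;
}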