MenuetOS, an OS Written Entirely In Assembly Language, Inches Towards 1.0 372
angry tapir writes "MenuetOS is an open source, GUI-equipped, x86 operating system written entirely in assembly language that can fit on a floppy disk (if you can find one). I originally spoke to its developers in 2009. Recently I had a chance to catch up with them to chat about what's changed and what needs to be done before the OS hits version 1.0 after 13 years of work. The system's creator, Ville Turjanmaa, says, 'Timeframe is secondary. It's more important to have a complete and working set of features and applications. Sometimes a specific time limit rushes application development to the point of delivering incomplete code, which we want to avoid. ... We support USB devices, such [as] storages, printers, webcams and digital TV tuners, and have basic network clients and servers. So before 1.0 we need to improve the existing code and make sure everything is working fine. ... The main thing for 1.0 is to have all application groups available'"
Assembly == SLOW ; JAVA == FAST! (Score:5, Funny)
ASSembly language? This OS will run as slow as frozen molasses.
I want my computer to run FAST!
I want an operating system written in Java, running on a java interpreter written in java [add recursion here].
Java is faster than C, so logically, java must be faster than ASSembly. More java == more speed.
Let's get to it!
Re:Assembly == SLOW ; JAVA == FAST! (Score:5, Funny)
Might actually be the case (Score:5, Insightful)
Re:Might actually be the case (Score:5, Insightful)
As someone who has spent a lot of time optimizing assembly code (17 years in the games industry) I can tell you this:
As the number of CPU registers increases, it gets harder to beat the compiler. It's not too hard to better x86 compiler output, but it's pretty difficult to improve PPC code.
What assembly optimization gives you is a significantly better understanding of the data flow. Since all CPUs are memory-bandwidth-bound now, knowing where the memory accesses are allows you to restructure the data to make the algorithm more efficient. You also must understand what the compiler is doing to your code so that you can make it access your data as efficiently as possible.
To conclude, you can beat any compiler by hand, so long as you have more information than the compiler has.
Re:Might actually be the case (Score:5, Informative)
Three points:
1) Compilers vs Humans
You have to start by doing an apples-to-apples comparison. Yes, many developers these days are ignorant of low level details about assembly language, and would therefore not produce assembly code that is as good as what comes out of a compiler. But that is because compilers aren't built by your standard run-of-the-mill code monkey; they are built by people who truly understand the issues involved in creating good assembly language. So you need to compare assembly created by a compiler vs. assembly created by someone who is at least as skilled as the people who created the compiler. In such a comparison, the humans will generate more efficient code. It will take them much longer (which is one of the two reasons why we have compilers and high-level languages), but they will generate better code.
2) Why write assembly language
No, one does not write assembly language for "fun" - there are specific business reasons to do so. Replacing inner loops in performance-critical sections with hand-coded assembly language is a common example. Most major database companies have a group of coders whose jobs are to go into those performance-critical sections and hand tune the assembly language. Would I try to write a GUI using assembly language? No, because it simply isn't that performance sensitive. Choose the tool that fills your needs. Religion about tools is just silly.
3) Out-perform C
No. Given coders of equal skill, all of the common high-level languages (Java, C, C++, etc) are identical in terms of CPU-intensive performance. That's because the issue is more one of selecting the correct algorithms and then coding them in a sane manner. It is demonstrable that Java can *never* be more efficient than a corresponding C program because one could always write a C program that is nothing more than a replacement for the standard Java JVM (might be a lot of code, but it can be done).
The place that one starts to see differences in performance is in the handling of large data sets. Efficiently managing large data sets has much more to do with management of memory. Page faults, TLB cache misses, etc have significant performance impacts when one is working on large data sets. Java works very hard to deny the developer any control over how data is placed in memory, which leaves one with few options in terms of managing locality and other CPU-centric issues related to accessing memory. C/C++ gives one very direct control over the placement and access of objects in memory, hence providing a skilled developer the tools necessary to exploit the characteristics of the CPU-CACHE-RAM interaction. It is laborious, to be sure, but C/C++ allows for that level of control.
So it all boils down to what one is implementing. If I were implementing a desktop application, I would probably use Java. The performance demands related to memory management are typically not very great and Java's simpler memory management paradigm streamlines the development of such applications (not to mention the possibility at least of having one implementation that runs on multiple platforms). If I were implementing a high volume data management platform, I would use C++ because the fine grain control of memory management provides me the necessary tools to optimize the data-intensive performance.
Re:Might actually be the case (Score:4, Insightful)
"The compilers, for the most part, are smarter than people at optimizing code."
No, they emphatically are not. No computer algorithm is any smarter than the people that wrote it (in fact it's always going to be dumber.) If the compiler is better than YOU are at optimizing code, that may well be true and understandable - presumably optimising assembler is not your specialty, after all.
But a competent assembler specialist (someone in the same league, skillwise, with the guys that write the compiler) will beat the hell out of any compiler ever made. There just is no question. He knows every technique the compiler knows, but he is better equipped to know when and where to use them.
Compilers serve many valid purposes. They allow less skilled programmers to still produce a usable product. They allow more skilled programmers to produce a usable product more quickly. They facilitate portability. Plenty of good reasons for them to exist and be used. But beating a competent human at optimisation tasks is not one of them.
"Java has the added advantage that it uses Just-In-Time compiling, so there are a lot of cases where Java, or .Net, or any other language that uses an intermediate byte-code can actually outperform C."
I know a lot of people think this, but it's nonsense, as a moment's reflection should make clear.
I have no doubt that a poor coder might find JIT improves the performance of his code, but that really doesn't justify the assertion. You would need to show that JIT can actually beat a well-written C program, and it won't. It cannot. Absolute worst case, if he has to, the C coder could simply implement a VM and JIT in his program and achieve the same results - and that is a tie. C cannot possibly lose that comparison; the worst it could possibly do is tie.
Re: (Score:3)
There is almost no reason to write code in assembly anymore, other than "because we can", which is a fine reason for a "fun" project. However I wouldn't write assembly if I was trying to run a business. Java has the added advantage that it uses Just-In-Time compiling, so there are a lot of cases where Java, or .Net, or any other language that uses an intermediate byte-code can actually outperform C.
I keep hearing the arguments but I don't see the fruits. I expect Java and .Net apps to run slow and take up irrational amounts of memory for what they do, and I am rarely disappointed. I expect the handful of ASM function libraries we use to significantly outperform C code, and they always do.
If these high level languages are so great where are the commensurate outcomes in real life? Where are the cutting edge Java and .Net games? Browsers? Operating systems?
In cases where the developers time is worth
Re:Might actually be the case (Score:5, Interesting)
Back when I was trying to write games, 20 years ago, I figured out pretty quickly to write the important parts in assembly and the rest in C. But not before I wrote a full screen graphics editor in assembly. That was about 1200 lines of awesomeness that took me about 7 months to write. Fortunately, most of the graphic work carried over to the main game itself. Recently, I did a recreation of that work in C#. What took me over 2 years to do in 1994-95 took me a weekend to do now. My how times have changed.
Re:Assembly == SLOW ; JAVA == FAST! (Score:5, Insightful)
You jest, but wikipedia says it will boot in five seconds on a 90 MHz Pentium FROM A FLOPPY. Is that fast enough for you, javaboy?
Re:Assembly == SLOW ; JAVA == FAST! (Score:5, Informative)
Re:Assembly == SLOW ; JAVA == FAST! (Score:5, Insightful)
Yeah, outside a few rather narrow cases, modern CPUs have just gotten too complicated to write efficient assembly for.
That says more about lousy CPU architecture design with bloat and incredible inefficiencies than it says about software developers who can write software for those CPUs. Then again, look up the RISC vs. CISC debates about CPU design and you might be surprised about CPU complexity as well.
Otherwise, most of the CPU complexity that currently shows up is due to the fact that the CPU speed far outstrips the memory bus speed, thus all of the concern about "local" memory caches and pipelined instruction ordering. If you could create a much faster memory bus, CPU designs could be simplified considerably from a software developer POV.
Of course we are still living with the effects of the 4004 design architecture that lives on with the laptop I'm using right now (and is opcode compatible with the 4004 instruction set still capable of being used as a strict sub-set of the opcodes used by my laptop's CPU). That is another reason for much of the added complexity in CPU architecture design, as few if any CPU designers want to abandon the software developed for earlier generations of CPUs as a way to promote their new design. Instead, they tinker on the edges and keep piling new instructions onto the existing heap of instructions and keep making the CPU more and more complex as a new generation of designers is in charge.
Re: (Score:3)
Of course we are still living with the effects of the 4004 design architecture that lives on with the laptop I'm using right now (and is opcode compatible with the 4004 instruction set still capable of being used as a strict sub-set of the opcodes used by my laptop's CPU).
Well, you have a crappy laptop then. Most laptops out there have a CPU with an instruction set such that assembler-language code for the Intel 8080 could be mechanically translated to assembler-language code for that instruction set (although the machine codes were not compatible), although the instruction set in current CPUs have been at least extended significantly to a 32-bit version and, in most cases, to a 64-bit version.
(I.e., you've confused the 4-bit 4004 with the 8-bit 8080 and you've confused "s
Re: (Score:3)
It was so bad that I had a 70 year old former transmission systems electrical engineer (so nothing below thousands of volts) rant at me for hours about how bad it was when it came out. When a guy who graduated from University before the transistor went on sale can see a dozen holes in microprocessor design then you've got a problem.
I really should retire the last netburst machine I've got but it has a lot of drive bays and freebsd can still do stuff with it.
Efficient assembly is still quite doable ... (Score:4, Interesting)
That said, I rarely use assembly in professional software development situations when targeting computers and mobile devices. Before addressing your comment let me add one more thing. Learning assembly is important with respect to making you a better C programmer. I'm not talking about understanding the asm code you see in the debugger. I really am referring to writing C code. C code can be written in architecturally specific manners. The code is still portable, yet more efficient only on a specific architecture. Understanding the architecture and assembly language can allow you to write more efficient C code.
Yeah, outside a few rather narrow cases, ...
That is nothing new, this has been true for decades. You have to go back to Apple II and Commodore 64 days to find a timeframe where assembly was appropriate for general purpose software development. I'm referring to computers, not micro controllers and other such environments.
... modern CPUs have just gotten too complicated to write efficient assembly for.
This is absolutely false. The key to writing good assembly code is not trying to out-compile the compiler; that is a newbie thing to do. The key is to leverage knowledge that cannot be transmitted to the compiler, that cannot be expressed in C. This is where the real win is; this is where assembly can still beat C even after a couple of architectural upgrades.
Re: (Score:3)
I'm not sure what level of "architecture" you are talking about. An assembly language usually reveals relatively little about the architecture underlying the CPUs that execute the code generated by the assembler.
That is a very modern x86 centric view. Even so we still have an "instruction set architecture" (ISA) in the modern x86 case. This ISA limits the underlying hardware architecture's (micro-ops) view of the software's intent. We still have the case where optimizing at the ISA level can also improve performance at the micro-op level. Furthermore understanding the underlying micro-op architecture can help to write more efficient code at the ISA or C level. This level is not documented but a little is known abou
Re:Assembly == SLOW ; JAVA == FAST! (Score:4, Funny)
So, what you're saying is that the C compiler is a better Assembly coder than you are. I feel your pain on that one.
Re: (Score:3, Insightful)
If a compiler can't produce better assembly than any one programmer, it's time to get a new compiler.
I mean, that's pretty much the point of upper level languages to begin with.
Re:Assembly == SLOW ; JAVA == FAST! (Score:5, Insightful)
If a compiler can't produce better assembly than any one programmer, it's time to get a new compiler.
I mean, that's pretty much the point of upper level languages to begin with.
Hardly.
The point of high level languages is to improve the productivity of software developers with the full knowledge and recognition that compilers will always do an inferior job... trading off efficiency and memory space for increased throughput of the developer.
That some compilers for some languages (notably C) can usually produce software within a few single-digit percentage points of the efficiency of hand-crafted assembly written by competent and knowledgeable programmers well versed in their respective languages (an important distinction: you need to compare expert C programmers with expert assembly programmers) for a few benchmark algorithms is beside the point.
Grace Hopper, one of the developers of COBOL, created the programming language with some very different goals from the ones you mention. Portability (the ability to move from one CPU architecture/operating system to another) was a key component and rationale. The other was to use "understandable" words that non-programmers could, at least in theory, be able to read. It is debatable whether COBOL actually met those goals. Another very significant goal is to increase the volume of software produced by the individual software developer (mentioned above), and perhaps finally the point of a high level language is to reduce the learning curve, so that somebody new to computers and software development can become the expert that is needed to get software developed.
I would agree that some compilers need to be tossed out the window in terms of their code efficiency, but you might be surprised at which compilers really can make the cut or not as well.
Re:Assembly == SLOW ; JAVA == FAST! (Score:4, Interesting)
Re: (Score:3)
And, under ideal circumstances . . . its gonna be hard to beat Assembler. (Slower to market . . . perhaps . . . but faster for you the next 20 years as you run it.)
You may be right, but I wouldn't bet on that. First of all, it's not unheard of for a CPU architecture to die in 20 years, but that aside: It would be interesting to take an x86 assembler program written 20 years ago and run it on modern hardware, then perform the same experiment with a C program recompiled with a modern compiler.
The 20 year o
Re: (Score:3)
That is a terrible example. Besides, if you want to do that in something like Visual BASIC it would be more like:
MsgBox("Hello, World!")
or something similar. Like I said, it would need to be seriously updated if you are going to be using current compilers. Indeed, most versions of BASIC I've seen that aren't historic compilers (and hence have been optimized to run on the current generation of computers) don't even use the PRINT command.
As I said, it would need some substantial tweaking if you
Re:Assembly == SLOW ; JAVA == FAST! (Score:5, Informative)
So, what you're saying is that the C compiler is a better Assembly coder than you are. I feel your pain on that one.
Indeed. I spent 5 years supporting a production commercial OS written entirely in assembly (one of many forks that happened when IBM started licensing the source for their old mainframe OS). Today I let a C compiler do its job on my personal projects.
Can you write faster code than the compiler - sure you can, though it requires a deep understanding. But that code will be crap unmaintainable code. There was a day when C was called a high level language, and in a meaningful way it still is. You can write good maintainable C code that doesn't look optimized and get nearly-perfect assembly that bears little resemblance to the source.
The worst choice in C is to think you need to help the compiler optimize. Seriously, the compiler doesn't care at all whether you write x = x << 1, x += x, or x *= 2; it sees them all the same, so code the one that makes sense in context.
Giving the compiler hints can be useful ... (Score:3)
The worst choice in C is to think you need to help the compiler optimize. Seriously, the compiler doesn't care at all whether you write x = x << 1, x += x, or x *= 2; it sees them all the same, so code the one that makes sense in context.
Historically helping the compiler, giving it hints, is in fact a good way to get superior code out of a compiler. For example consider 4x4 matrix multiplication. Do you use nested loops or just unroll it manually? Compilers tend not to fully unroll all the nested loops. The compiler may do better scheduling on fully unrolled non-looping code. Do you create temporary variables to preload a row or column, or do you just access each variable in memory directly? The former may generate better code on a RISC arc
Re:Assembly == SLOW ; JAVA == FAST! (Score:4, Interesting)
You can also do this with DOS & Windows 3.1 on a modern CPU: an i7 has 8MB of cache, which is more than most PCs c. 1995 had in their entirety.
Re:Assembly == SLOW ; JAVA == FAST! (Score:4, Informative)
Can fit in cache != will be in cache. On a modern multi-GB system the memory paging index alone is going to dwarf the size of this OS's code. Then there's all the rest of the OS data, plus the even more frequently used application code and data. Certainly shrinking the OS code size drastically will help free up more cache space for other uses, and the most heavily used parts may be able to remain in the cache most of the time, but it's almost a guarantee that most of the time most of the OS code won't be in the cache.
Re: (Score:3, Insightful)
It's the same way that your Bicycle is faster than your Jet Fighter Plane that is held together by duct tape and you've 'fueled up' with whipped cream.
Re: (Score:3)
If your C is faster than your Assembly, that's because your Assembly is crap.
You are kidding yourself if you think you can write programs in assembler that run faster than the equivalent in C. Look at what my compiler generates for the C statement "return x/372;" with 64 bit ints on x64:
movslq %edi, %rax
imulq $738919105, %rax, %rax
movq %rax, %rcx
shrq $63, %rcx
sarq $38, %rax
addl %ecx, %eax
popq %rbp
ret
Your only practical approach as a human writing this in assembler is to use the slower 64 bit divide instruction. Puzzling out optimizations like this is a job muc
That's not the most important thing (Score:5, Funny)
So before 1.0 we need to improve the existing code and make sure everything is working fine. ... The main thing for 1.0 is to have all application groups available
Nah, main thing is to find a working floppy disk and drive.
Re:That's not the most important thing (Score:5, Informative)
Just tried it in Virtualbox, and it has made strides since I last tried it some years ago. Some notes:
Select "Other/Unknown (64-bit)" in the Operating System type drop-down, unless you specifically download the 32-bit version.
Add a floppy controller and add the image as a floppy disk attached to that. Delete the other controllers that are present by default, unless you have a specific reason not to (like listening to your outdated music on disc from within MenuetOS, or loading a WAD or PAK file for Doom/Quake).
Does not work with my work MacBook's iSight camera (afaict).
Boots in 5 seconds, and I'm thinking of ways to demonstrate it to students at the schools where I work.
Re: (Score:2)
Apparently my company still stocks floppy disks. There's still a bunch in the supply cupboard, even though you can search all the cubes on this floor and not find a single drive. Someone really needs to update the order inventory.
Re: (Score:2)
here ya go...
http://www.newegg.com/Product/Product.aspx?Item=N82E16821121001 [newegg.com]
http://www.amazon.com/Verbatim-3-5In-1-44MB-Pre-Fmt-10Pk/dp/B0000511BI/ [amazon.com]
Re: (Score:2)
I keep an LS120 in my desktop PC just in case. Guess how many times I've used it.
I did use the Sony USB floppy once. So I consider that two dollars well spent at a yard sale
Choose your own adventure, drinkypoo (Score:5, Funny)
"Help! Help!" she cried.
Most of them just walked by, ignoring her. She might be a jaw-droppingly gorgeous 36-24-36 redhead with pouty lips in a light dress, but she was clearly crazy, waving that weird plastic thing around. "You!" she urgently growled, as she seized a suited man's lapels. "Can you help me read this? It's the only copy!" This looked like a successful man, so surely he could afford a decent computer, right?
He blushed. He wanted her, and maybe could use it to get her back to his place, but the lie would eventually come out. And this woman had just the sort of sincere bearing, that he knew the lie would be punished, not accepted. He sighed. He glanced at the disc again, just to make sure what he was seeing, and sighed. "No. Sorry. What is that, anyway?"
She let go, and cast her glance around the crowd.
And that's when it happened: Drinkypoo. The moment she saw him, she knew. This was the one. She didn't even have to ask, because she knew. But she decided to ask anyway, so that he could have the pleasure of saying yes and offering to help. It would be the first of many pleasures that he would experience.
"Can you read it? There are some really important files from the 1990s on here. I've got to have them. I'll do anything, Drinkypoo. Anything."
There was a time when Drinkypoo would have grinned nervously, or asked followup questions. But by now, he knew the story. He didn't want to hear the story anymore; he just wanted the reward. Drinkypoo didn't so much as glance at the disk. Why so many unbelievably hot women always had invested in LS120s and floppies to store their critical records, he didn't know. Maybe some freak effect of a one-shot Imation advertising campaign, long ago. No one even really knew why, for sure; they just knew that a shitload of the drives had somehow ended up in the possession of that demographic.
He squinted and recalled his depleted inventory. That blonde with the 1.4MB floppy from two nights ago. "Got to stop at the drugstore on the way. C'mon," he replied. She stuffed the disc into her purse for safekeeping and they walked to his car.
Half an hour later, at his place, she sat on the couch, nervously clutching her purse containing the precious disc. Drinkypoo sauntered over, and held out the glass of rye bourbon on ice. "Here, relax. Everything's going to be okay." She sighed with relief, set her purse upon the coffee table, accepted the glass and sipped. "Thanks."
"You can't imagine what this means to me," (though Drinkypoo actually knew very well), "I need those files so badly." She reached for his belt buckle for a brief moment, but paused. "First things first, though, I suppose."
Drinkypoo shrugged. What did it matter to him? It would all work out the same. This was going to be fun no matter which order things occurred. "Sure." He walked over to the computer desk and nudged the mouse, waking it up from sleep, casting monitor light across the room. "Let's have the disc."
She smiled, and reached into her purse, pulling out a ..
[CHOOSE YOUR OWN ADVENTURE:
1) .. gun.
2) .. Zip or Jaz disc.
3) .. 3.5" floppy or LS120 disc. (Drinkypoo, it will cost you 0.01 BTC for me to write this version.)
]
Cool! (Score:2)
Re: (Score:3)
Likewise.. and it illustrates just how bloated "modern" OS have become. Then again, the generally accepted definition of "OS" seems to have changed to include things such as desktop and window managers.
Re:Cool! (Score:4, Insightful)
People want their operating system to let them operate their computer. Conveniently. You can certainly still get command-line-only linux distros if you care, and you're the kind of technology specialist who might get utility out of that.
Re:Cool! (Score:5, Insightful)
Re: (Score:2)
People have gotten so used to the idea of a computer being a general purpose machine they've forgotten that you could just run a kernel and application set that requires no human interaction software at all. For
Re: (Score:2)
Thank you for a better explanation of what I was alluding to.
Re: (Score:2)
Without basic tools to do so, how are you going to get software on your Operating System? Could we automatically boot to floppy like an Apple ][? What do you even want?
Re: (Score:2)
And an ever-growing array of utility software. What OS is complete without a media player, web browser and Minesweeper?
Lame (Score:3)
Real Programmers write assembly language operating systems on punch cards.
Re: (Score:2)
It's been awhile, but I thought we wrote Fortran on punch cards. It's possible my memory is going.
Re: (Score:2)
Re: (Score:3)
Real programmers engineer trees to grow punch cards with self-evolving code already punched on them.
I suppose this means that, yes, God is a real programmer. I think that makes perfect sense. The source code for everything is perfectly available (MIT license), but there's no documentation and there are no comments in the code at all. In particular, the build environment is completely left as an exercise for the user. The original developer has been so silent for so long he clearly considers the code "ma
Re: (Score:2)
Hexadecimal?! Luxury! We had to code ones and zeros, on clay tablets.
Pointless? (Score:2, Insightful)
I wonder ... (Score:2)
Re: (Score:2)
I wouldn't consider comparing a manually written routine to what GCC outputs cheating. If you have an optimizing compiler available to you, why not learn its tricks so that you can write better code yourself?
Re: (Score:2)
I have no knowledge, but I'd imagine the optimizations from a compiler would have a tell-tale signature.
Changing definition of Kernel (Score:3)
If we are starting to believe that the core of an operating system should include a full GUI, video and mp3 playback, audio, USB, network, etc. for the least possible battery use, then this is a really cool way to go.
Why waste the resources? Just cause we can?
If we are to rethink what a basic operating system of today ought to have right out of the box from the first nanosecond, then I'm sure there is a lot of reengineering that would happen to any Linux or Windows kernel.
Re: (Score:2)
Touché
so.... (Score:2)
i.e., from the sounds of it, it is not multi user, and everything runs with superuser privileges. It is written entirely in assembly language which adds another level of complexity for the programmer to deal with.
Whilst it sounds like an
Re: (Score:2)
13 Years to 1.0? (Score:3, Funny)
So obviously their development cycle is much faster than HURD!
NOT OPEN SOURCE (Score:5, Funny)
"MenuetOS is an open source"
Not the x64 version, which is the version that's actually worth a shit.
Why do it? (Score:2)
Because assembly programming is really fun. It's challenging and it's a pretty unique experience for most of us who rarely touch systems at that low a level.
Re: (Score:3)
Well, technically we all "touch" systems at this level, we just don't realize we are doing it. Learning/Using Assembly is like learning/using arithmetic instead of using a calculator. It is very handy and gives you a core appreciation for what is happening in complex problems, however most professionals just plug it into a computer rather than do it because it becomes too cumbersome at a certain level.
Re: (Score:3)
Well, "technically" I dabble in electrical engineering every time I flip on a light switch. I think my point was pretty clear.
What a... (Score:2, Funny)
What a senseless waste of human life.
Assembly has very little advantage any more (Score:5, Insightful)
As someone who writes bootloaders using both C and assembly language, I can tell you there really is very little advantage to using assembly any more. The C compiler generates very good assembly code at this point, very compact if the right parameters are used. At this point it is difficult to exceed what the compiler does in terms of code density, and it's a hell of a lot easier and faster to maintain C code than assembly.
In my last bootloader I had to fit a MMC/SD bootloader in under 8K. In that space all of the assembly code fits in the first sector along with the partition table. The assembly code sets up the stack, does some basic CPU configuration, and contains the serial port routines, just because I had plenty of space. The rest of the bootloader contains all of the SD/MMC driver, FAT16/32 support, CRC32 and more. Note that this is MIPS64 code. The bootloader is able to load the next-stage bootloader from a file in the root directory of a bootable partition, validate it, load a failsafe bootloader if the validation fails, and launch the next bootloader, all in under 8K. Having disassembled the output using objdump, I find the compiled code is often better than hand-coded assembly, since the compiler can often find a smaller sequence of instructions. Not only that, but the compiler can order the instructions better for performance since it knows the CPU pipeline quite well.
You don't need to write in assembly for something to be small, just don't throw in a bunch of unneeded crap.
-Aaron
Re:Gotta ask ! (Score:5, Insightful)
* It's fun?
* They can?
* They want to?
* To learn something?
I'm sure there are a few more. Does anyone have not-boring questions?
Re:Gotta ask ! (Score:5, Funny)
Masochism?
Re:Gotta ask ! (Score:5, Insightful)
*Because someone should remember how to do this?
Re: (Score:2)
Re: Gotta ask ! (Score:2)
And security, who is going to write malware for that? Iran needs this controlling the centrifuges.
Re:Gotta ask ! (Score:5, Funny)
Because * It's fun? * They can? * They want to? * To learn something? I'm sure there are a few more. Does anyone have not-boring questions?
* he's following in the great Finnish tradition of writing your own OS... :-)
Re:Gotta ask ! (Score:5, Funny)
Neither. That's why we write things in assembly.
Re: (Score:2)
Re: (Score:2)
Outside of the hobby aspect of it, there could be a real future in lower end devices.
Consider the resources that Android takes up. If you have something that is this small, efficient and presumably stable and you need to build out a lot of very small factor devices (phones, ereaders, tablets, medical equipment) something like this would be a very good thing.
Re: (Score:3)
There's already a pretty big market for embedded devices running Linux or FreeDOS... and who's to say another competitor couldn't offer yet another, better option?
Re: (Score:3)
Re: (Score:2)
They had better get it perfect the first time, because this thing will be impossible to maintain for anybody else, and possibly for the original developers too. Forget it, there is no practical use for it; it's just a hobby (not that that's a bad thing).
Re: (Score:2)
The issue, though, is that powerful CPUs are getting really cheap. Devices that don't need such power are finding themselves with embedded GHz-class processors, because that's the lowest-end part available that has sufficient power.
There's a gap betw
Re:Gotta ask ! (Score:5, Insightful)
Why would someone want to do a rewrite of Minix 20 some odd years ago?
Re: (Score:2)
Re:Gotta ask ! (Score:5, Interesting)
Why?
Well, I'm not well versed with MenuetOS, but I've written a few toy OSs. Of particular interest to me is 16-bit x86, which retains memory segmentation; via other modes of operation (such as Unreal Mode, combined with enabling the A20 address line) one can escape the 640K and/or 1MB limits whilst retaining other hardware features. Why go low level? Because programming paradigms are platform dependent. C is a product of its Von Neumann environment, and the dialog back and forth between hardware and software designers over the years has shaped our current methodology and hardware feature set -- sometimes for the worse, IMO. E.g.: when we sacrificed a whole register and its memory segmentation capabilities to the RAM gods, we made position-independent code far more inefficient, and made things like coroutines, stack-smashing protections, multi-dimensional v-tables for stateful OOP (e.g. methods that change functionality depending on an object's "state" field), separate stacks for code pointers and parameter data, heap code-pointer isolation, etc. more difficult (or near impossible) to accomplish. Instead of ASM I use a low-level compilable and interpretable bytecode language for writing my OSs, and have multiple functional C compilers implemented atop the bytecode language in order to utilize existing hardware drivers -- however, I frequently dive into machine-specific ASM in order to implement and explore new features.
I started doing hobby OS work for fun, but after reading Ken Thompson's "Reflections on Trusting Trust" Turing Award lecture I decided it would be a worthwhile project to create and maintain a few machines, isolated from the rest of the world, which are bootstrapped and programmed entirely by me. I now use them to write my memoirs and for teaching children how to code and build custom hardware and robotics -- the parallel port is very simple to utilize without a massive OS like GNU/Linux in the way...
The better question is why anyone would attempt to implement a POSIX OS today. We have working implementations; that feature set is proven. If we want to make advancements in OSs and hardware architectures, we'll have to try doing things in different ways. Some of my OSs are cross-platform: in one, all programs are compiled down to bytecode form by the VM's ASM or C compilers. The OS itself translates bytecode into machine code ON INSTALLATION, and gives the option to instead interpret a program if it's untrusted (or in development, or self-modifying) to contain it in an actual sandbox. On 32-bit x86, with more than one bit's worth of execution privileges, I'm able to enforce at the hardware level something akin to kernel mode at the application level, so applications can isolate themselves from possibly untrusted plugins, and allow transparent lazy linking with emulated bytecode modules and optional JIT translation.
For me the aim isn't to replace GNU/Linux or any mainstream OS. The aim is to explore unexplored problem spaces, with the hope that any useful techniques may find their way into mainstream usage. Consider Google's NaCl and LLVM. I'm not saying the experimental OS projects that pre-date them are responsible for their current capability to compile C into cross-platform bytecode and then into machine code, but us hobby OS folks HAVE been doing just that for longer than those projects have existed. Our experimental niche OS endeavours occasionally blaze the trail towards new features in mainstream hardware and software. E.g.: I'm hoping for code-pointer isolation and buffer-overrun prevention via a dual-stack architecture, and for the ARM folks to give us more than 1 bit of privilege level, so that secure designs other than monolithic kernels can be implemented...
To the other commenter above who thinks "decades-of-practice" will allow optimizing compilers to utilize hardware features that are forbidden or don't exist: Ugh, no. I have decades of experience writing ASM and can write or generate much smaller & more efficient position inde
Re: (Score:2, Insightful)
Compilers can never optimise better than a human; if you disagree, you haven't witnessed the demoscene.
Re:Gotta ask ! (Score:5, Insightful)
Compilers can never optimize better than the *best* humans, operating without time constraints. Very few programmers have that level of skill, or the time to spend on the task. That's why optimizing compilers were invented.
Re:Gotta ask ! (Score:5, Interesting)
Re:Gotta ask ! (Score:5, Interesting)
Back when I first started, compilers were pretty stupid and basically did a 1-to-1 translation of source code to sequences of assembly language. It didn't take much to do better simply coding assembler by hand.
Somewhere about the mid 1980s that stopped being true. The IBM mainframe Pascal/VS compiler generated heavily-optimized code. It was sufficiently clever in its use of machine resources that it was on par with hand optimization.
The clincher, however, was that it could do this job in seconds, whereas hand-coding would take hours. And if you made significant changes to the algorithms you were coding, it would re-optimize each time. It would have been a total rewrite to get code that tight by hand. And even though we didn't have the insane time constraints that modern-day projects typically have, we didn't have enough time to make it worth doing that even to save expensive mainframe CPU cycles.
Re: (Score:3)
I have to agree on that. While I always thought ASM is great, its greatness isn't universal. At any rate, *this* language will be better than *that* language for *this* specific task/project.
E.g. industrial robot programming, where you need to obtain maximum speed and precision within a minimum output size (e.g. 4 KB of storage, be it ROM, EEPROM, whatever) -- that's where ASM shines. On the other hand, if you want to build an operating system, ASM will be better in less than 10% of the entirety of the
Re: (Score:2)
That is NOT why optimizing compilers were invented.
Re: (Score:3)
Yes they can. Try making full use of a CPU with 8 cores while writing your threaded application in assembly; someone with a compiler and a higher-level language will kick your ass back and forth. The simpler the CPU, the more truth your argument holds, but as CPUs get more complex? Nope. Look at how difficult game programmers found it to program for the PS3's Cell architecture. Try having them write an entire game in assembly. Ha!
Re: (Score:2)
And he's probably right, assuming he's a reasonably skilled coder.
It's the difference between a factory-made mass-produced "good enough for most uses" product and an artisan-made hand-crafted "best of the best" product.
Re: (Score:2)
So they're easier to carry, there is no good reason and because you have no choice in the matter, respectively.
Re: (Score:3)
cc -x c - <<'X' && ls -l a.out
#include <stdio.h>
int
main(void)
{
printf("hello world\n");
}
X
-rwxr-xr-x 1 fisted users 7318 Nov 15 17:25 a.out*
Re: (Score:2)
Now you've got to spend days trying to find a disk drive.
The researchers plan to add networking capability within the next 13 years, once they decide it's not just a temporary fad.
Re: (Score:2)
But did it support USB, TV tuners, and webcams?
Re:I can beat that (Score:5, Funny)
But did it support USB, TV tuners, and webcams?
Yes it did! GEOS supported all the existing USB TV tuners and webcams of its time.
Re: (Score:2)
I never said it wasn't an OS, I was just pointing out one of the things that would make this more of an accomplishment.
Re: (Score:3)
A lot of the bloat in modern OSes comes from having to support a wide range of hardware - it's one of the reasons Linux can scale down to run on a tiny embedded system if you strip out that hardware support and other unneeded features (such as a fancy UI). You might even find that it doesn't scale well onto higher-end systems.
Re: (Score:2)
Embarrassing the creators of all the OSs that take five minutes to reach desktop.
Re: (Score:2)
Assembler really isn't that hard. My first three years as a programmer were assembly-language-only, for embedded applications. It's like anything else -- you memorize stuff, and after a while you see every problem as a list of assembly instructions. Later, using C seemed like cheating, in a way.
Re: (Score:3)
Against the license [menuetos.net]: