F-22 Avionics Require Inflight Reboot
An anonymous reader writes "The Atlanta Journal-Constitution is fronting a lengthy piece on the USAF's new F-22 and its upcoming shootout with the existing fleet of F-15s and F-16s. One line in the article really jumped out at me: 'When avionics problems crop up now, pilots must restart the entire system as if rebooting a personal computer.' I did some googling, and this is about as much as I could find: the hardware backbone for the system is the Hughes Common Integrated Processor, which in turn appears to be built around the Intel i960 CPU. I couldn't find a name for the operating system, but it appears to be written in about one and a half million lines of Ada code; more on the Ada hardware integration and Ada i960 compilers is here. Any Slashdotters working on this project? If so, why do you need the inflight reboot? PS: Gamers will be interested to learn that nVidia's Quadro2 Go GPU and Wind River's VxWorks operating system are melded in the F-22's Multi-Function Display."
Re:Boeing's Avionics press release (Score:3, Insightful)
According to what this says, the avionics package meets or exceeds expectations. Now, this is not an MS bash, but I can recall off the top of my head that our intelligence services have database software that can only search on one term, which probably also met or exceeded expectations, and there's that ship that had to be towed back to port due to some NT failures.
Now this is more of an MS bash... people have come to expect system failures, and I've read admissions that five-nines uptime is just too difficult and expensive a goal, and so on, and of course this mostly points to MS desktop and server software. I wonder if the people who sit at desks and write specs all day for military projects decided that only having to reboot now and then exceeds expectations as set by people not flying in the aircraft.
I'll probably get modded down, but I just think this sort of thing (Boeing's press release, the actual performance as reported, and the overall state of technology in our government) is a bit troubling and it doesn't appear to be getting better.
How I solved this for a heads-up display, 15 years ago (Score:5, Insightful)
About fifteen years ago, for a prototype heads-up display, I had the exact same problem: draw the tick marks for a compass rose with no memory and no time. There was no scaling of the circle, only rotation about a fixed center.
After some thought, what I did was store in a table the tick-mark endpoints for 45 degrees of arc (I recall it being 22.5 and not 90 degrees) for all the displayable rotations of that arc. Then at runtime, my compass rose routine would exploit the symmetry of the situation to determine the endpoints of all the other displayable tick marks.
It used very little memory, since at any point in time we only displayed tick marks at 5-degree intervals. Therefore 45 degrees of arc holds 9 tick marks, or 18 ints (two ints per tick mark). At 5-degree intervals with a heading resolution of 1 degree, you only need a table of 5 x those 18 ints, or 90 ints all told.
I always loved the 3am epiphany!
Re:Finally! (Score:3, Insightful)
I also think Ada is a good language for teaching at university. You may not like it, but it will teach you a lot of important concepts, its strong typing among other things.
That being said, it's not the right tool for most software development being done currently.
grrr (Score:2, Insightful)
Is this how low Slashdot has sunk? Someone can't be assed to research the answer to a question themselves, so they post it to our x million readership?
Or maybe it's just another shameless editor troll for reams and reams of the same tired old offtopic MS / Windows 98 / BSOD jokes?
Jesus, is there any chance of getting any intelligent replies? I checked out kuro5hin recently and was surprised at how intelligent most of the posts are.
Anyway, mod me down because I haven't slagged MS, whatever.
Re:There Is Something Rotten in Software Engineeri (Score:5, Insightful)
> Software functionality should not be fundamentally different from hardware functionality.
Am I to understand that you are saying that software, like hardware, should only fail when it fails?
Granted, we have a software reliability crisis on our hands. But hardware isn't generally fault-free either. I've had a lot more Zip drives die on me than I've had kernel panics. And arguably a kernel is much more complex than the design of a removable disk drive.
> An algorithmic system is temporally inconsistent and unstable by nature.
That's an absurd claim. It's possible to prove correct behavior for algorithmic systems. Time is explicitly accounted for in most such proofs.
The biggest engineering difference between software and hardware is that people find software errors acceptable, or even normal, whereas they have never reconciled themselves to, say, collapsing bridges or wings falling off of airplanes. When that attitude changes we'll start seeing software that rivals hardware in reliability, not before. Most of the engineering concepts required for producing good software have been around for quite a while.
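To make the "provable correctness" claim concrete, here is a miniature example: a binary search whose loop invariant is stated alongside the code and (here) merely asserted at run time. This is an illustrative sketch with hypothetical names, nothing to do with avionics; a real proof would discharge the invariant formally instead of checking it dynamically.

```c
#include <assert.h>
#include <stddef.h>

/* Binary search over a sorted array, returning the index of key or -1.
   The asserted loop invariant is the core of a correctness proof:
   everything left of lo is < key and everything from hi onward is > key,
   so the key, if present, can only live in [lo, hi). */
static int find_sorted(const int *a, size_t n, int key) {
    size_t lo = 0, hi = n;
    while (lo < hi) {
        assert(lo == 0 || a[lo - 1] < key);   /* invariant, left side  */
        assert(hi == n || a[hi] > key);       /* invariant, right side */
        size_t mid = lo + (hi - lo) / 2;      /* avoids (lo+hi)/2 overflow */
        if (a[mid] < key)      lo = mid + 1;
        else if (a[mid] > key) hi = mid;
        else                   return (int)mid;
    }
    return -1;  /* interval empty: invariant says the key is absent */
}
```

Swap the run-time asserts for a machine-checked proof (for instance in SPARK, the verifiable Ada subset) and you get exactly the kind of guarantee being argued about here.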
Re:Why a reboot - because the creators are bozos (Score:1, Insightful)
Re:unfair reporting and out of context. (Score:4, Insightful)
The article was a very positive look at the F-22; however, it was from the Atlanta Journal-Constitution, which has a long history of acting as a cheerleader for aircraft from Lockheed's Marietta plant in Atlanta's suburbs. The F-22 is a kick-ass plane, but the Atlanta newspapers are not an objective source of information about any problems the project may be having. They proved this many times by glossing over problems with the C-5 (built at the same plant).
Re:There Is Something Rotten in Software Engineeri (Score:3, Insightful)
Ok, I buy it. Now show me some Cosa that can emulate my Linux kernel, my Galeon browser, and my MPlayer media player (or another tool/application of your choice) so that I can see which one's best.
[/sarcasm]
Algorithms do not make programs fail; bad logic makes them crash and be unstable. The HIGHER the language level, the lower the failure rate and the faster and cheaper the implementation. I'd love to see an OS developed in a DSP fashion.
Windows... better, but still not competitive (Score:4, Insightful)
However, to put it in perspective: doing normal development with Java, VBScript, IIS, MS SQL Server, MySQL, and Flash (I am deliberately excluding crashes that occurred while coding C/C++ and other "non-safe" systems), I observe Win2k either bluescreening, spontaneously rebooting, or getting into a state where it needs to be power-cycled approximately 2-4 times a month. This seems like heaven compared to NT4, which I used to crash daily while doing Java development and writing ASP pages for IIS. Most NT4 production servers I am aware of are rebooted regularly, often nightly, to keep them from falling apart altogether. My experience with NT4 has been unequivocal: don't use it in production unless you want to suffer.
That's not counting Win2k's constant Explorer crashes, which are generally not disruptive but still a bit unsettling. The majority of the problem appears to come from Microsoft being unable or unwilling to sanitize the GUI code and keep failures in the GUI layer from killing the entire system. That, and I still see the standard device-related problems: burning CDs and attaching new mice have both proved catastrophic for Win2k, in the latter case requiring a complete reinstall of the operating system. No, I didn't build the mouse myself; it was a Logitech mouse.
I also note that, as with all other versions of Windows, Win2k still has a tendency to "decay": that is, to continually develop small but uncorrectable problems until a reinstall is eventually required. However, the decay rate also seems to have slowed.
Compare this to Linux, which I also give daily and roughly equivalent use, and which _never_ crashes. _Ever_. In fact AFAIR the last time I had to deal with unexpected shutdowns on Linux was due to a foolish attempt to build a complicated high-speed SCSI chain a year or two ago. I am not aware of any problems on Linux which cannot be corrected without a reinstall of the OS, but perhaps there are exceptions in the crowd who can share experiences.
So... Win2k. Finally usable. But still not competitive.
To all knee-jerk anti-MS-criticism-on-slashdot and pro-MS trolls... if you're just skimming, now is the part where you hit reply and do your thing.
Not unusual (Score:2, Insightful)
Jet fighters operate in an unbelievably harsh environment. High and low temps, high G forces, vibration, etc, etc. It's a wonder they get it to work at all.
Slashdot fodder:
For maintenance, diagnostics, and troubleshooting, the ground crew uses laptops: armored, waterproof, etc. Plug one in, and the jet tells you more or less what is wrong. The maintenance manuals are all on CD. And these laptops are running on... wait for it... NT.
Why not Linux? Because even if it is demonstrably more stable, the specs for the F-22 were laid down several years ago, when Linux was but a wet dream. Too late to change now.
Embedded World (Score:3, Insightful)
Very good reason not to use Java: (Score:3, Insightful)
Fly Ada! (Score:2, Insightful)
Ada95 (it's not ADA; it's a name, not an acronym) is a language that will never become popular with the average programmer, because the compiler won't let them do a lot of the very unsafe things that they rely on in other languages. This is the stuff you always read about...
The tools an engineer uses are very important! You could build the F-22 using only slide rules, but I wouldn't fly it! You could even write the flight control system in C, but by the time the process made it as safe as the Ada program, it would be out of date. Good engineering can happen in any language; Ada helps the process, while C and C++ hinder it.
Writing the flight control software in a language (tool) like Ada makes the end product more reliable and predictable because of both the compile-time and run-time checks. I can make just about any Ada code execute as fast as C if I get rid of the run-time checks. Even then, Ada is much better than C/C++ because of the compile-time checks that C/C++ lack.
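To illustrate what those run-time checks buy you: in Ada you can declare a constrained subtype, say `subtype Aileron_Deg is Integer range -30 .. 30;`, and the compiler inserts a range check on every assignment, raising Constraint_Error on violation. In C the same discipline is entirely manual. A minimal sketch of the manual equivalent, with invented names, purely for illustration (not actual flight code):

```c
#define AILERON_MIN (-30)
#define AILERON_MAX   30

/* Checked setter: returns 0 on success, -1 where Ada would raise
   Constraint_Error.  Forgetting to route every write through a check
   like this is exactly the class of bug Ada's checks rule out. */
static int set_aileron(int *aileron_deg, int value) {
    if (value < AILERON_MIN || value > AILERON_MAX)
        return -1;      /* out of range: reject, leave old value intact */
    *aileron_deg = value;
    return 0;
}
```

The point of the parent post is that the Ada version of this costs you nothing to write and cannot be forgotten, and can still be compiled away where profiling demands it.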
Writing software is an art and a discipline. Most programmers forget the discipline part.