Technology

New Supercomputer By Star Bridge

Ronin Developer writes with word of this "interesting article on CNN about a new desktop-size supercomputer that reconfigures itself on the fly. The company name is 'Star Bridge.' Ring any bells? If I remember correctly, wasn't there something on /. about this a year ago?" Indeedy do -- Star Bridge seems to go straight from wacky-but-cool promises to Where are they now? (and back) with finesse. It's the Arnold Schwarzenegger movie plot of hardware companies -- simultaneously head-scratchingly implausible, mildly compelling, and numbingly persistent.
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    Slashdot spell checker
  • by Anonymous Coward

    Remember that story [slashdot.org] on the 27th about Actuality Systems building a 3D display? Well, guess who makes the processors... Yup, Xilinx -- the same people behind the FPGAs in the Star Bridge system unit. So we have two products that could potentially revolutionize the computing industry as we know it, and they are both tied back to the same company. Either a new Intel is born, or it's more vapor coming from the steam room that is the "internet era economy."

    What's that? You want one? Ha Ha Ha Ha! You silly peon of a civilian hacker! Did you really think you could ride the wave of human intelligence into the future? Well, we here at the NSA, CIA, and FBI disagree. No one is ever going to need the power these machines provide (besides us, of course)... No, from now on, computers suitable for civilians are to be restricted to 64-bit CPUs at most. If we can't install a backdoor into your encryption, then we'll just keep the most powerful processors that can brute-force it.

  • I'm gonna sue, I swear!
    I extensively discussed the idea of soft micro/nanoinstructions and purely combinatorial, context-reconfigurable CPUs with my buddies at a bazillion cocktail parties (there was weed there, too, of course) in the mid '70s. We had a lot of it specced out, too, including hypermultiplexed optical bussing, tagged packet payloads, the whole nine yards.
    If we weren't all so busy getting wasted at the time, we might have actually built something!
    Does a bunch of tekkie wastoids babbling in the kitchen count as prior art?
    Hmmm...


  • The stuff that appeared 4 days ago was about NASA. There was no mention of Starbridge or anything.

    But both articles came up with the same idea -- an FPGA-based computer sans CPU, achieving the speed of two thousand 800MHz Pentium machines in one small boxen.

    I did not download NASA's brain-damaged .doc file, since I do not use MS products, so I have no idea what's in it. All I know is from another PR piece, a brief summary type.

    Anyone who knows please comment.

    Thank you.

  • That is kind of odd. Didn't their marketeers ever take a composition course? You're supposed to state your position unequivocally, without resorting to "I think that..." or "We believe that..." or "In my opinion..."

    Maybe it's different when you're talking about an actual product rather than academic theories, but I've never seen other vendors use such wishy-washy language.


  • Bullshit. Starbridge was clearly mentioned last time. And FWIW, I believe they have a regular computer (with a CPU, of course) acting as an FEP (front-end processor). So the FPGA system is like a co-processor board.
  • by deadline ( 14171 ) on Sunday April 01, 2001 @06:33AM (#324241) Homepage

    Some reality is in order here.

    FPGA computing is real, and it has been shown to work for some problems. Take a look at TimeLogic [timelogic.com]. These guys have implemented search algorithms used in the human genome project on FPGAs.
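
    (For the curious: at heart, such a search is the inner loop below -- a naive exact-match scan written in C as my own illustration, not TimeLogic's actual algorithm. It is precisely this kind of tight, regular kernel that maps well onto gates.)

```c
/* Naive exact-match scan over a DNA string: the innermost loop of
 * sequence search, the sort of kernel a TimeLogic-style appliance
 * bakes into gates. A toy illustration, not TimeLogic's algorithm. */
#include <stdio.h>
#include <string.h>

static int count_matches(const char *genome, const char *probe)
{
    size_t glen = strlen(genome), plen = strlen(probe);
    int hits = 0;

    for (size_t i = 0; i + plen <= glen; i++)   /* slide the probe */
        if (memcmp(genome + i, probe, plen) == 0)
            hits++;
    return hits;
}

int main(void)
{
    const char *genome = "GATTACAGATTACAGGGATTACA";

    printf("GATTACA occurs %d times\n", count_matches(genome, "GATTACA"));
    return 0;
}
```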

    Now let's look at the difference between "works" and "price to performance." In the case of TimeLogic, they have produced a stand-alone appliance that end users do not program (i.e., users do not program the FPGAs). I believe the reason for this is that the programming abstraction (remember this) is not easy to master (i.e., it is not a mainstream programming language), nor is the "edit, compile, run" cycle easily reproduced on a desktop. (This cycle time is perhaps the single most limiting factor in software production.) So FPGA computing works, but it is expensive to implement and program. It does not support the cost-effective, general programming practices used today (i.e., unless you are building a specific-purpose machine and can justify the software development costs against a real market, the cost of programming for everyday production environments is too high).

    Which brings me to the main point: the issue is SOFTWARE. It is easy to build a Beowulf with 1000 processors and call it a supercomputer; it is hard (expensive) to write good software for that system. It is easy to string together a bunch of FPGAs and call it a supercomputer; it is hard (expensive) to write software for these things, and it is harder (expensive squared) to write parallel software for them.

    In general, there is a huge (I mean really huge) investment in the supercomputer world in programming abstractions that use FORTRAN (and, to some extent, C). Side note: before all you "FORTRAN is a dead language" boneheads start hitting the reply button, remember that there are more than a few 100,000+ line FORTRAN programs -- determining everything from airplane wings, to weather, to new drugs -- that are not going to go away because you think XML is a great way to go. Indeed, the cost of reprogramming these applications is almost an economic impossibility!

    So where were we? Ah yes, the software thing. My point is that until FPGA systems can take standard supercomputing FORTRAN or C applications and run them "out of the box" -- and thereby allow the tens of thousands of people who understand this type of programming to use FPGAs easily -- they will remain application-specific computers (albeit fast ones) and not really mainstream programmable computing devices. This is not to say that FPGA computing will not dominate in the future (maybe it will), but there is a lot of work to be done on the software side before that happens.
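
    To make that concrete: below is the sort of kernel I mean, a generic DAXPY loop (my own sketch, not anything from Star Bridge). An FPGA system that wants the existing supercomputing user base has to swallow code this ordinary, unmodified.

```c
/* DAXPY-style kernel, y = a*x + y: the bread and butter of the
 * FORTRAN/C supercomputing codes discussed above. Sketch only. */
#include <stdio.h>

#define N 1000000

static double x[N], y[N];

int main(void)
{
    const double a = 3.14159;

    for (long i = 0; i < N; i++) {      /* initialize inputs */
        x[i] = (double)i;
        y[i] = 1.0;
    }

    for (long i = 0; i < N; i++)        /* the actual kernel */
        y[i] = a * x[i] + y[i];

    printf("y[42] = %f\n", y[42]);      /* keep the optimizer honest */
    return 0;
}
```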

    BTW: I sent the Starbridge guys some simple FORTRAN benchmarks a while ago. I did not receive a response.

    Finally, remember this:

    The general always eats the specific.

    Anyone remember a company called Symbolics?

  • I have played with the FPGAs from Xilinx, and while they are very cool, they are slow at lookups and vectors. This is where people doing custom ASIC hardware will do better -- but most software problems don't require that.

    VHDL is nice, but I always thought the machine could do a better job if you described the problem better and weren't so abstract.

    This is interesting, but you have to understand that this is like software agents, but for hardware engineers.
    How many REAL applications of software agents are there? It's all mangled up in the EXPERT-systems design methodology, and it becomes a real quagmire to sort out.

    If Star Bridge Systems keeps focus, then they will be all right.
    All hell will break loose if they dream up fancy problems to solve; KISS is the order of the day.

    Hope they get somewhere.

    Oh, and Slashdot did run this story, but it is nice that it is a story in its own right.

    regards

    john jones

  • Actually, NASA didn't buy one -- they were GIVEN one by StarBridge. BIG difference.

    Go to starbridge's site [starbridgesystems.com] and poke around a bit. The "HAL 300" was enough to make me spew coffee on my monitor I was laughing so hard.

    The "faster than the IBM Pacific Blue (when simulating a 4-bit adder)" claims put the nail in their coffin for me. These guys are hucksters of the worst kind.

  • They were incredibly efficient, but Thompson couldn't understand why they worked. (He suspected such things as electromagnetic coupling and communication through the power supply.)

    I think I remember reading about something like that - a guy removed "islands" from the circuit and it stopped working, and when he put them back in it worked. Crazy :-)

    Genetic algorithms (in software) tend to be like that. Evolution doesn't value parsimony or maintainability; it only cares about what works. Turns out genetically evolved software desperately needs "junk DNA" (as safe places to recombine bits from two parent algorithms).
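
    (A minimal sketch of the idea, assuming a plain byte-string genome where only even-indexed bytes carry fitness; the odd "junk" bytes give one-point crossover safe places to cut without splitting a working gene:)

```c
/* Toy one-point crossover on byte-string genomes with "junk DNA":
 * even-indexed bytes are coding genes, odd-indexed bytes are junk.
 * The junk carries no fitness, so a crossover cut landing there
 * cannot split a working gene. A sketch of the idea only. */
#include <stdio.h>
#include <stdlib.h>

#define GENOME_LEN 32   /* bytes; even indices code, odd indices are junk */

static void crossover(const unsigned char *mom, const unsigned char *dad,
                      unsigned char *child)
{
    int cut = rand() % GENOME_LEN;          /* one-point crossover */
    for (int i = 0; i < GENOME_LEN; i++)
        child[i] = (i < cut) ? mom[i] : dad[i];
}

/* fitness = bits matching the target, counting coding bytes only */
static int fitness(const unsigned char *g, const unsigned char *target)
{
    int score = 0;
    for (int i = 0; i < GENOME_LEN; i += 2) {
        unsigned char same = (unsigned char)~(g[i] ^ target[i]);
        for (int b = 0; b < 8; b++)
            score += (same >> b) & 1;
    }
    return score;
}

int main(void)
{
    unsigned char mom[GENOME_LEN], dad[GENOME_LEN],
                  child[GENOME_LEN], target[GENOME_LEN];

    for (int i = 0; i < GENOME_LEN; i++) {
        mom[i]    = (unsigned char)(rand() & 0xff);
        dad[i]    = (unsigned char)(rand() & 0xff);
        target[i] = (unsigned char)(rand() & 0xff);
    }
    crossover(mom, dad, child);
    printf("child fitness: %d / %d bits\n",
           fitness(child, target), 8 * GENOME_LEN / 2);
    return 0;
}
```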

    On the other hand, much of the same thing seems to be true for the large C++ application I'm working on. :-(
  • Does this mean the web will eventually become wholly self-referential and crawl up its own arse? Oh, hang on.....

    Dave
  • This article seems suspiciously like the one that led to this slashdot posting [slashdot.org].

    I know that sometimes articles fade in and out of our collective consciousness but it was just posted on Wednesday and nothing new has happened since.

    Although I guess anything would be better than talking about how the Leafs managed to lose to the Habs tonight. Disgraceful that was.

  • by SMN ( 33356 ) on Saturday March 31, 2001 @07:56PM (#324247)
    Moderators: This just had to be said by someone. Remember there's no "-1: Bad Opinion" option.

    "Ring any bells? If I remember correctly, wasn't there something on /. about this a year ago?"
    If I remember correctly, wasn't there something on /. about this two days ago? [slashdot.org]

    Excerpting from this NASA press release [nasa.gov] that Slashdot linked to Friday:

    Via a Space Act Agreement, NASA Langley Research Center will receive a HAL (Hyper Algorithmic Logic)-15 Hypercomputer from Star Bridge Systems, Inc. of Midvale, Utah. The system is said to be faster and more versatile than any supercomputer on the market and will change the way we think about computational methods.
    And from this article [nasa.gov] that Slashdot linked to in the same writeup:
    Representatives of Star Bridge Systems, Inc. visited Langley Research Center on March 27 to demonstrate and deliver one of its Hyper Algorithmic Logic (HAL-15) supercomputers.

    Star Bridge President Brent Ward and Chief Executive Officer Kent Gilson presented the supercomputer to Doug Dwoyer, Langley's Associate Director for Research and Technology Competencies, after press and technical briefings in the Pearl Young Theater.

    I'm not trying to be a troll or start a flame war; I just think it's absurd that Slashdot's editors not only don't participate in posting comments (while claiming they read them), but that they don't even read their own articles!

    Strange how Slashdot was bought out, and now that our beloved editors are paid hefty sums with full editorial control, they still can't find the time to read their own site. This site was definitely better back when it was Rob & Jeff posting stuff that interested them (and that they therefore actually read). It's still an amazing site, just not as amazing. =(


  • sick and twisted? I think it's funny. although I actually checked a couple times to make sure this wasn't posted on April 1, just in case

    sean
  • by seanw ( 45548 ) on Saturday March 31, 2001 @07:22PM (#324249)
    so they invent a new supercomputer that's so smart it can reconfigure itself, and what do they name it? HAL.

    they just never learn.

    sean
  • FYI: This is the same thing as the posting about the FPGA [slashdot.org] computer that NASA just got. This thing is pretty durn cool; it works off an iconic programming language, sort of like LabVIEW. Superfast -- a few thousand times faster than a P3-800 -- well, you can read the articles. I can't wait to play with it, since we just got one here at NASA... :)

    Patrik
    -------------
    Just your ordinary BOFH :) http://pjbutler.dhs.org/me

  • Err... Next time I should read through the posts; a lot of people already caught it. It's too depressing when there are a bunch of ignorant racists making stupid postings.
    Patrik
    -------------
    Just your ordinary BOFH :) http://pjbutler.dhs.org/me
  • Is this like a very fancy version of an ordinary CAD program (i.e., the ones used to design circuits)? It's just that it's real (physical) and on-the-fly, rather than software running a model of how a circuit is predicted to work (before they go spend money to make a real one and test it).
  • These guys should be building custom silicon, with one bit processors in an array... everything clocked... they could get the cost down to almost zip. They should FORGET the need to reprogram the thing in anything less than 1 minute. If you need high speed hardware, you've got to be able to trade away something, and program setup time is it.

    --Mike--

  • On a side note, the restructuring is software-controlled.

    From what I recall about parallel computing, building the structure and then using it are the hard parts. So the "Viva" library is just as important as the hardware itself.

    However, from the look of it, the library just makes parallel processing easier -- it doesn't transform serial code into parallel algorithms. This makes porting pretty difficult.
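
    (My own illustration, with no claim about how Viva actually works: the first loop below is trivially parallel, while the second carries a dependence from one iteration to the next. It is the second kind that no library can transform for you, and that is what makes porting hard.)

```c
/* Why a library can't simply "transform serial code": loop (a) has
 * independent iterations and maps straight onto parallel hardware;
 * loop (b) needs a[i-1] before a[i], so it stays serial no matter
 * what the toolchain does. Illustration only. */
#include <stdio.h>

#define N 16

int main(void)
{
    double a[N], b[N];

    for (int i = 0; i < N; i++)
        b[i] = (double)i;

    /* (a) embarrassingly parallel: each iteration is independent */
    for (int i = 0; i < N; i++)
        a[i] = 2.0 * b[i];

    /* (b) loop-carried dependence: a prefix sum, inherently serial */
    a[0] = b[0];
    for (int i = 1; i < N; i++)
        a[i] = a[i - 1] + b[i];

    printf("prefix sum a[%d] = %f\n", N - 1, a[N - 1]);
    return 0;
}
```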

    From their web site:

    Star Bridge's unique computing environment that blurs the distinctions between hardware and software

    This seems like the opposite of abstraction to me. Of course, I could be wrong.

    The main question is, when are these things going to be available? Seems like we've been reading about them for quite a while.

  • And guess which operating system it runs? That's right, Windows 98 [starbridgesystems.com].

  • Slashdot noted the press release [nasa.gov] in this Slashdot story [slashdot.org]. I wouldn't be surprised if CNN found out about the story from Slashdot.
  • It's sort of funny how almost all the statements on the Star Bridge [starbridgesystems.com] site are prefaced with "We believe that..." It's almost as if they're not quite willing to state their own hype as facts, so they qualify them in every sentence.

    Check it out, you'll see what I mean.

  • ...but I find it very interesting that according to this bio [starbridgesystems.com], Industry Week wrote about the Chairman, CEO and CTO on April 1 1996...

    --

  • And the founder's bio page [starbridgesystems.com] still has traces of the misunderstood-whiz-kid egotism that permeated their old site:

    At age 12 Kent built a commercial-quality, space-invaders-type computer game.

  • Timothy, are you taking late night classes in English (and spelling) from the illustrious CmdrTaco?

    'Indeedy do '? I think you meant 'Indeedy so '

    ' midly compelling'?

    And my favorite, from the previous story, has got to be; 'It will also probably strike at the heart of arguments about how regulated (and by whom) ISPs ought to by. '

    ;-)

  • by Grant Elliott ( 132633 ) on Saturday March 31, 2001 @09:45PM (#324262)
    I'm sorry. I just noticed that I wrote "serialization" when I meant "parallelization" (which doesn't even come close to sounding like an actual word, but you know what I mean). Sorry for any confusion.
  • by Grant Elliott ( 132633 ) on Saturday March 31, 2001 @07:43PM (#324263)
    These things use Field Programmable Gate Arrays (FPGAs) to restructure themselves dynamically. This, in and of itself, is not a new concept. FPGAs have been used for years in prototyping or in the first products released: it's much cheaper and easier to reprogram an FPGA when a bug is found than to create a new chip design. Once the bugs are gone, the FPGAs are replaced by hard-wired silicon in the rest of the line.

    Now on to using FPGA's in supercomputers. First of all, an FPGA is slower than a hard-wired chip. These machines pick up speed from the fact that they can use portions of the chip that otherwise would have been on standby. It's super-charged serialization. By restructuring the circuitry for each task, they can take advantage of the majority of the chip at all times. This is not an easy task, and I find it quite impressive. (On a side note, the restructuring is software-controlled.)

    When I read this story, I immediately associated it with an article from several years back about Inman Harvey and Adrian Thompson. Thompson was using an FPGA to run genetic algorithms for hardware development -- essentially, make a machine design the chip. He had some very interesting results. The chip designs took advantage of the physical chip rather than just the wiring. They were incredibly efficient, but Thompson couldn't understand why they worked. (He suspected such things as electromagnetic coupling and communication through the power supply.) This is all only moderately related, but it's very interesting regardless. The article is from June 1998 and can be found here [208.245.156.153] if anyone is interested.
  • An actual, genuine, honest-to-goodness con job. Too bad they weren't able to finish it up before the Nasdaq crash - probably won't be able to find a sucker now. I suppose we should call the cops.
  • This is just laughable. Here's a snippet from the specs [starbridgesystems.com] of their HAL-300 box, supposedly the superest-duperest computer in the world:

    6. Built-in I/O:
    S-VHS video channels in/out
    RJ-11 telephony interfaces (POTS/DATA)
    50 Mbytes/s reconfigurable in/out

    I bet ASCI White wishes it had an S-VHS connection.

  • Isn't this [slashdot.org] the same thing? Come on, guys, this was 4 days ago.
  • The potential for fault-tolerance is interesting, though. If physical damage to the system occurs, the chip can simply reconfigure itself not to use the damaged portion. It would be slower than the undamaged version, but it would still work. For example, if you started the chip decoding an audio file and then began deactivating portions of the processor, the playback would gradually slow down. I wonder if the designers have played "Bicycle Built for Two" on it yet. :)
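
    (A back-of-the-envelope model of that graceful degradation, with invented numbers: treat the chip as a bank of identical compute lanes and knock some out. Throughput drops in proportion instead of the job dying.)

```c
/* Crude model of graceful degradation: the "chip" is 16 identical
 * lanes; disabling lanes lowers throughput proportionally rather
 * than stopping playback. All numbers are invented. */
#include <stdio.h>

#define LANES     16
#define LANE_RATE 125000.0   /* samples/sec per lane (made up) */

int main(void)
{
    for (int dead = 0; dead <= LANES; dead += 4) {
        int live = LANES - dead;
        printf("%2d lanes dead: %8.0f samples/sec%s\n",
               dead, live * LANE_RATE,
               live ? "" : "  (...Daisy, Daisy...)");
    }
    return 0;
}
```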

    "The new high-performance computers, developed ... in Midvale, Utah..."

    They'd better get an office in Urbana, Illinois before they get around to making the 9000 model.

  • The Langley Research Center announced this week an agreement to use one of the computers, known as HAL (Hyper Algorithmic Logic)-15. Other customers that will use HAL-15 machines include the San Diego Supercomputer Center, the Department of Defense and Hollywood film companies.

    "But Dave, I don't like Hollywood.......Dave? Aren't they just asking for trouble here?

    I don't know about you, but I sure as heck hope that this bit is someone's April Fool's joke that launched a little early.

    really [theregister.co.uk]

    Check out the Vinny the Vampire [eplugz.com] comic strip

  • Without any specifics or runtime figures, does anyone else find it odd that the "co-processor" in the HAL-15 is a PIII 750? Until I see benchmarks, this thing is still smoke and mirrors.
  • I don't think programming FPGAs would be such a stated difficulty if the approach were right. And although what little I've read still seems more difficult than it needs to be, I suspect the source of the problems is human mentality -- what you might call problems in the programming of the human mind.

    consider the details exposed in this link! [mindspring.com]


    3 S.E.A.S - Virtual Interaction Configuration (VIC) - VISION OF VISIONS!
  • So all we have to do is keep it ignorant and not have copies of Clarke's 2001 online where it can download it.
  • You're correct! Software IS the issue, which is why VIVA [nasa.gov] (programming entirely graphically) was developed, eliminating most of the previous HDL complexities. See here [nasa.gov].

    Celoxica [celoxica.com] also appears to be addressing the SOFTWARE issue with links to C.

    Computer-savvy /.ers like yourself may wish to explore these many links [nasa.gov] for more detailed info than the NASA press release (geared for the general public) allows.

  • The Langley Research Center announced this week an agreement to use one of the computers, known as HAL (Hyper Algorithmic Logic)-15. Other customers that will use HAL-15 machines include the San Diego Supercomputer Center, the Department of Defense and Hollywood film companies.

    "But Dave, I don't like Hollywood.......Dave?

    Aren't they just asking for trouble here?

  • I heard that the Crusoe processors have some sort of morphing technology that's similar to an FPGA, with the difference that Crusoe does some of this in software and not hardware.

    Perhaps someone reading this could explain the differences between the two processors.
  • I am sorry Dave, I cannot do that......

    Actually, I think the name might just be a joke; you have to understand that most brilliant minds have a sick and twisted sense of humor. They are not announcing AI or anything, just extremely fast processing. Now, if they give it an OS that has self-preservation built in and place it in control of a spaceship, then we are in trouble.
  • by cmowire ( 254489 ) on Saturday March 31, 2001 @07:27PM (#324276) Homepage
    From what I understand, it makes a lot of sense for some problems. Like, hard problems.

    Basically, an FPGA can take on the properties of any chip that can be defined in VHDL or other such languages -- with some restrictions, of course. So, theoretically, you use every last square inch of silicon for the problem at hand, minus whatever is there to make it reconfigurable.
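
    (For a feel of what "defined by VHDL" means, here is a C model -- mine, purely illustrative -- of the 4-bit ripple-carry adder everyone keeps joking about. In VHDL each full_adder below would be a component; on an FPGA it becomes a handful of lookup tables.)

```c
/* C model of a 4-bit ripple-carry adder, the toy circuit everyone
 * cites. Illustration of the kind of logic an FPGA absorbs. */
#include <stdio.h>

/* one full adder: sum and carry-out from inputs a, b, carry-in */
static void full_adder(int a, int b, int cin, int *sum, int *cout)
{
    *sum  = a ^ b ^ cin;
    *cout = (a & b) | (cin & (a ^ b));
}

static int add4(int a, int b, int *carry_out)
{
    int sum = 0, carry = 0;

    for (int i = 0; i < 4; i++) {        /* ripple the carry upward */
        int s;
        full_adder((a >> i) & 1, (b >> i) & 1, carry, &s, &carry);
        sum |= s << i;
    }
    *carry_out = carry;
    return sum;
}

int main(void)
{
    int carry, s = add4(9, 7, &carry);   /* 9 + 7 = 16 -> sum 0, carry 1 */

    printf("9 + 7 = %d (carry %d)\n", s, carry);
    return 0;
}
```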

    So that's nice, because if you are doing floating point problems, you don't really need the integer unit. Things like that.

    However, I suspect that the thing will, at least in the short term, be a pain in the arse to program efficiently (given that it's a completely different paradigm) and will probably be reserved for specialized applications that suck on an ordinary computer.

    And it isn't something that just one company thought up; it's been cooking in the academic part of the world for 5-6 years at least.

    I mean, the best part about it is that all of your parts are off-the-shelf and cheap... ;)
  • In a way, though, this is a form of AI, though perhaps not the form we're used to. The fact that the chips can test configurations and pick and choose which ones to keep may in fact be brute force, but it's kind of like brainstorming, in a way. You can't get good ideas without plowing through some bad ones. And there would be some really fascinating implications and possibilities if the chip could be taught to analyze its own approaches to evolving itself. Pretty deep stuff.
  • OK, now that is just scary...
  • A quick web search yielded this link [nec.com] to an abstract of an article entitled "The Natural Way To Evolve Hardware" (1996) by Adrian Thompson, Inman Harvey, and Philip Husbands. Links to the article in various formats are found in the upper right corner of that page. This page [nec.com] has links to several more related articles about evolutionary robotics and circuitry.
  • Alright,
    this is one sick mofo, and it's a shame he can't go with a user name... since he's so afraid, he has to hide behind his imaginary .45, lol.
    Tell us about the Pole, I DARE YOU
  • Why don't you come out and stop hiding behind Anonymous? I know this seems to be a petty fight, but this guy has hidden far too long and posted too much shit. Mod me down, I don't care, but it's time to see what this guy is made of!
  • Imagine a whole system like that...
    Forgot about that? It was a really premature April Fool's Slashdot article. Try alancoxonachip.com.
    I make wild guesses for my own amusement, ignore randomly.
  • Yeah, four days ago, it was 60,000x, and that was a downgrade from the 300,000x industrial model.

    It's just looking worse and worse for this sad machine.

  • And my favorite, from the previous story, has got to be; 'It will also probably strike at the heart of arguments about how regulated (and by whom) ISPs ought to by. '

    Interesting you mention this. After reading this I went back to the main page and saw the error. Then I refreshed the main page (about 10-15 minutes had elapsed), and the "by" had been corrected to "be". Seems Timothy either read your comment or corrected his error on his own.

    Back to the subject at hand: I found it intriguing to read the old /. article [slashdot.org] posted 2 years ago and see how many people disregarded this as a total sham. While I am still a tiny bit skeptical (preferring to experience the performance first-hand to truly understand it), it would appear the product is viable enough if NASA and other supercomputing centers are using it. It will be a good follow-up to see in the next few months the actual 'real-world' computing performance this has imparted to NASA's research.

    - A non-productive mind is with absolutely zero balance.

  • I think timothy was so darn proud of himself for actually going out and finding the URLs of the stories from a year ago that he didn't bother to check the last few days!

    I can just imagine what he was thinking:
    Stories! Plural! Boy, those darn guys that are always on me for not checking for previous Slashdot stories on the same subject will be happy now!

    If the guy who submitted the story hadn't mentioned it, he probably wouldn't have even done that.

  • Copied from NASA's .doc file:

    Via a Space Act Agreement, NASA Langley Research Center will receive a HAL (Hyper Algorithmic Logic)-15 Hypercomputer from Star Bridge Systems, Inc. of Midvale, Utah. The system is said to be faster and more versatile than any supercomputer on the market and will change the way we think about computational methods.

  • Thanks Eminem...
  • People have tried to use FPGAs to build supercomputers for as long as they have been around. So far, they have never proven cost-effective. You gain a lot in processing speed by not having to interpret instructions, but you lose a lot in speed and area because of the hooks needed for reprogrammability. It sounds good and attracts a certain brand of investor, but I wouldn't put my money on it. I think you are going to see massively parallel machines with single-package processor/memory combos succeed before FPGAs.
  • I don't know about you, but I sure as heck hope that this bit is someone's April Fool's joke that launched a little early.

    That's a rather silly attitude. Firstly, do you honestly think that NASA would have a press release up on their site if this was the case? Secondly, don't you think if it were a hoax, by now somebody with an interest in the high-performance computing market would have piped up? Cray, Intel, IBM?

    There is a lot of hype surrounding this and it's not helping. The point is, everybody who has a clue knows that standard CPUs suck - being able to do architecture emulation on an FPGA is a big step forward. It's not about being able to produce a 4-bit adder, or replacing ASICs necessarily. Just because you don't understand it and thought that your overclocked Celeron with neon lights all over it was the fastest PC in the world, doesn't mean that other people aren't going to try and beat you with alternative technology. :-)
  • They were incredibly efficient, but Thompson couldn't understand why they worked. (He suspected such things as electromagnetic coupling and communication through the power supply.)

    I think I remember reading about something like that - a guy removed "islands" from the circuit and it stopped working, and when he put them back in it worked. Crazy :-)

    --
    "May the forces of evil become confused on the way to your house"

  • I imagine you meant MILDLY compelling...and this same basic thing was posted a few days ago...
  • These chips seem to complement existing hardware perfectly. They are slower when running a multitude of tasks they are unadapted to, but they gain incredible speed when they have time to perfect a means of computing something (and I would assume similar tasks could be evolved quickly), while conventional processors can fill in where they would fail. Doesn't it make perfect sense to use existing processors in combination with these, letting the old do the drudgery of office work and the new handle the hardcore applications of rendering and database crunching (where the time it takes to optimize is more than made up)? They could work like 3D accelerator cards. Moreover, their adaptations could be stored and retrieved the next time they come into use (i.e., remembering an optimal rendering configuration to apply to another situation). Someone must have already thought of this... who knows about this stuff?

    -wisdom is simply the ability to accurately gauge stupidity
