Graphics Software

IBM's Deep View 140

BlackHat linked us to IBM's Deep View, a research system for rendering and other advanced applications (including Q3A). The "PC" is actually eight Linux boxes in a rack, which are needed to generate the content for the T221 display, which runs at 3840x2400.
  • Not an OS solution. Linux is peripheral to the whole article.
    • No need to retract, Linux isn't the most important part of the set-up at all.

      I didn't see anywhere that it said what processor it's using, though. An 866MHz what?

      Chips at that speed include the PIII and the Alpha 21264, but not, as far as I know, any Power or PPC part (or HPPA, MIPS, or SPARC).

      I know Alphas are popular in render farms, but they have gone 'out of fashion' now. Are IBM _embarrassed_ by their choice of processor?

      YAWIAR.
      • I didn't see anywhere that it said what processor it's using, though. An 866MHz what?

        Most of IBM's big machines seem to run on PPC chips of one form or another, and 866 is a common speed for G3 CPUs.

        Deep Blue ran on the old PPC604e's!

        I would think they are using IBM processors.

        • That would have been my first guess too, but in a scan down www.spec.org and a Google search for "866MHz PPC" I found _nothing_ that matched at all.

          However, if you claim they exist, that's good enough for me; as you say, it does make the most sense.

          YAWIAR
          • That would have been my first guess too, but in a scan down www.spec.org and a Google search for "866MHz PPC" I found _nothing_ that matched at all.

            Apple used to sell an 867MHz G4 from July 01 - Jan 02.

            That's a Motorola part, of course, but I'm sure IBM has a similar PPC CPU.

            • I knew that such speeds were possible, it's just that Apple etc. seem to be quite poor at posting SPEC results. It seems like Intel and Dell are in competition with each other to see who can post more SPEC results each month. If they can fill the table with 10000 P4 results, then no one will be able to see that an IBM Power system is rocking the #1 spot currently.
              Cunning, and devious...

              Anyway, thanks for the heads-up PPC-wise.

              YAWIAR.

      • IA-64 Itanium perhaps? However, looking at Intel's specs I don't see 866 listed as an available clock speed. They list the original Itanium [intel.com] at 733 and 800 MHz, and Itanium 2 [intel.com] at 900 MHz and 1 GHz.

        I would have thought that IBM would plug their own hardware whenever possible -- the T221 display is certainly phenomenal, and they provide a link so you could buy one of those... It has me wondering how I can come up with $8400 to get one. (Heck, when those things came out, they were $20,000! Ah, progress.)

        So, this leaves us to wonder... no mention of processors, "low" clock speed -- compared to what we're used to seeing -- something new from AMD? IBM pissed off at Intel? Some new massively parallel top secret silicon from IBM?

        Watch... it'll turn out to be Pentium IIIs -- they call 'em "workstations," so they might have recovered them from some other project. (Or all the engineers got new workstations and wondered what cool project they could do with their old ones... Q3A at 3840x2400? What the heck!) What's the limiting factor in this case, processor power or network bandwidth?

  • Sounds Like (Score:1, Funny)

    by Anonymous Coward
    SGI had finally lost its place at every table.
    • Seriously,
      There was a time when the only inventions in Visualization came out of SGI or Evans & Sutherland. Way to go IBM!
    • This used to be almost exclusively SGI's territory.

      A few years ago I attended a presentation at NCSA where Larry Smarr was talking about their plans for a similar display, driven by about $1M of SGI boxes. I think they wanted to call it "The Great Wall of Power".

      Some future PlayStation "n" will do this in your living room.

      Isn't Moore's Law great?

      Which leads me to ask, have we ever had a /. poll on your favourite law?
      - Moore's
      - Murphy's
      - Amdahl's
      - Newton's
      - Hooke's
      - Boyle's
      - whatever?
      • Sturgeon's
      • NCSA has been making good progress on developing this. The million bucks of SGI hardware has of course been replaced by a rack of Linux PCs. Instructions are actually online [uiuc.edu]. It's not trivial to build, though. And using LCD projectors does have real downsides. If you buy a dozen identical projectors, they won't have the same brightness and color saturation. Actually, they aren't even consistent from edge to edge. So your display isn't perfect, and you can definitely see the tile edges. Not to mention the fun of building a usable support structure which lets you get all the alignments right. If you don't need it to take up an entire wall, I think the IBM T221 display is a cheaper way to get super high-res output. But of course your high-res Quake won't be lifesize either. :-)
    • Why, do you have performance benchmarks comparing the experimental, non production DeepView with the commercially available InfinitePerformance [sgi.com] visualization system?
    • Re:Sounds Like (Score:2, Interesting)

      by YeeHarr ( 187241 )
      I have seen this display driven by one Octane2.

      http://www.sgi.com/workstations/octane2/

      Not one rack of 8 PCs and one half rack of some graphics engine.

      It was drawing the full display at about 30fps.

      It was as easy to use as any other workstation rather than the 'interesting' mix of 8 pcs in a rack and some other half rack of graphics stuff.

      The hardest thing was to read the tiny little fonts on the screen (the display is 200dpi IIRC) - you need a magnifying glass (or of course you could increase the font size).

      The Octane2 can do this because you can install two V12 graphics engines each with a dual channel adapter.

      http://www.sgi.com/workstations/octane2/dual_channel.html

      SGI software stitches the cards together transparently.

      It is a beautiful display.

      The amusing/annoying thing about these sorts of announcements is that customers have been using SGI stuff to do this for the last 8 years or so.

      If you were on the leading edge of this kind of work, would you wait that long for some kludged-together solution which might work if you have enough duct tape to stick it together?

      Or would you pay the extra cash for a solution that works and gives you a huge jump on your competition?

      The sort of software layer that can be used to make these four channels (two channels from each graphics card) into one display is stuff like:

      http://www.sgi.com/software/multipipe/sdk/

      Oh yeah - and re your 'every table' comment: the Octane2 fits on one table - it doesn't surround the table like the IBM stuff.
    • SGI did the graphics cluster thing over two years ago and even released it as a product. Very, very similar to the IBM system. They even ported several of their programming APIs and SDKs from InfiniteReality to the Linux Graphics Cluster. Not many were sold, however. Heck, Slashdot didn't even cover it. It was still pretty neat to see multiple spanned monitors and even composited high-res projectors driven by a half rack of Linux PCs. Many of the demos were actually ported from SGI's big-iron Onyx machines and worked just as well on the cluster. The basic setup was a stack of rackmount Linux boxes using nVidia AGP cards and custom PCI cards daisy-chained together to provide sync for glSwapBuffers, among other things. Also available were GigE and Myrinet for networking the machines with something better than 100BT. A compositor (similar to what's used in InfinitePerformance) was also available.

      More information (white paper and data sheet) can be found on SGI's legacy systems page:
      http://www.sgi.com/products/legacy/vis_systems.html [sgi.com]

      I believe a few .edu and .org sites have also rolled their own graphics clusters... though I don't know who supplied the compositors.
  • Links (Score:5, Informative)

    by SkipToMyLou ( 595608 ) <b@b.b> on Sunday August 04, 2002 @11:43AM (#4008025)
    An article [computer.org] about the methods IBM used to cluster eight dual processor Linux workstations to provide the necessary graphics power.

    Movies [ibm.com] of Deep View in action.
    • The movies are not really worth the download.

      They just show some quite simple models being rotated in CATIA.

      My P4 can do that in Pro/E too.

      Or did I miss something important?
  • by anticypher ( 48312 ) <anticypher.gmail@com> on Sunday August 04, 2002 @11:46AM (#4008034) Homepage
    9.2 megapixels, up to 56Hz refresh, in a 22-inch LCD screen. Want!

    This is now very high on my luxury wish list. When is the next IT bubble scheduled to happen again? I have to start my plan to get rich on other people's stupidity and greed so I can afford a system like this.

    the AC
  • Bandwidth... (Score:2, Interesting)

    by vofka ( 572268 )
    Quoth the article:
    since no single graphics adapter has the necessary horsepower and bandwidth to feed a 9.2 million pixel display (at 41Hz using 24 bits per pixel)

    Hmm, doing the math:
    3840*2400 pixels = 9216000 Pixels per Frame
    9216000*3 (3=24/8) = 27648000 Bytes per Frame
    27648000*8 = 221184000 Bits Per Frame
    221184000 (bpf) * 41 (fps) = 9068544000 Bits per Second
    9068544000/1024 = 8856000 KiloBits per Second (approx)
    8856000/1024 = 8648 MegaBits per Second (approx)
    8648/1024 = Just over 8 GigaBits per Second

    Now, with newer DX9-type graphics adapters and AGP 8x, we can do about 2.5 to 3 gigabits per second just now (over the AGP bus - I haven't calculated actual display bitrates!)... Applying Moore's law (theorem, whatever!), we can safely say that this kind of horsepower will be common in a single, average desktop PC inside of two to three years.

    Sure, this may be a boost to Hollywood today - but soon enough, it will be pretty commonplace technology. (Though I'm betting the most expensive item in that bunch of kit is the actual LCD Display, not the kit driving it!).
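
    A quick sketch re-running the arithmetic above (the figures are the article's: 3840x2400 pixels, 24 bits per pixel, 41Hz; the byte conversion at the end is roughly the number the reply below works from):

        width, height = 3840, 2400                            # T221 resolution
        bits_per_pixel = 24
        refresh_hz = 41

        bits_per_frame = width * height * bits_per_pixel      # 221,184,000
        bits_per_second = bits_per_frame * refresh_hz         # 9,068,544,000
        gibit_per_s = bits_per_second / 1024 ** 3             # ~8.4
        gib_per_s = bits_per_second / 8 / 1024 ** 3           # ~1.06

        print(f"{gibit_per_s:.2f} Gibit/s (~{gib_per_s:.2f} GiB/s)")
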
    • Re:Bandwidth... (Score:5, Informative)

      by be-fan ( 61476 ) on Sunday August 04, 2002 @12:20PM (#4008138)
      Umm, just over 8 gigabits per second is nothing. That's about 1 GB/sec, and an 8x AGP (2 GB/sec) graphics adapter (Radeon 9700) would have no trouble handling that data rate. Besides, you don't have to pump *all* of that data over the AGP bus. You send only display lists and textures and whatnot over the AGP bus. The local graphics card (where the actual data rate would hit near 1 GB/sec) has (on a Radeon 9700) about 20 GB/sec of bandwidth.
      • Grrr..

        You're right of course... As a matter of habit, I tend to think about this kind of data-transfer in GigaBits per second - I totally forgot that AGP4x is specced in GigaBytes per second.

        Time for a Homer Moment....... Doh!
  • Great game engine (Score:5, Interesting)

    by johnlcallaway ( 165670 ) on Sunday August 04, 2002 @11:53AM (#4008059)
    From one of the links [ibm.com]:
    Quake III: Using the Chromium software, we can play Quake III Arena at a resolution of 3840x2400 pixels.
    Now, where is that money I was setting aside for my kids college education....
    • Yeah, but at what frame rate?
      • Since Chromium allows applications to demonstrate pretty much linear scalability as nodes are added, I would guess that they got interactive frame rates. At least, that's what I've seen with Chromium displaying to one of those displays at Stanford.
  • Deep Thought coming?
  • It says "Click to see the full resolution image (1086x1058)", but the full-resolution image (407 KB) was already loaded just to show you a thumbnail. Funny to see such bad web design on IBM's page about a new graphics system (some kind of marketing?). Or have people just forgotten why it's polite to use thumbnails on the net?
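
    For what it's worth, a genuinely small thumbnail is a one-liner to produce; a minimal sketch using the Pillow imaging library (the filenames are made up for illustration):

        from PIL import Image

        img = Image.open("deepview_full.png")        # hypothetical full-res source
        img.thumbnail((320, 200))                    # shrinks in place, keeps aspect ratio
        img.save("deepview_thumb.jpg", "JPEG", quality=85)
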
  • 41 hz ... ouch! That thing must look like a strobe light.
    • Huh?

      Standard cinema runs at 24Hz non-interlaced, and domestic TV at 25 (UK) or 30 (US) Hz interlaced (50 or 60Hz effective, though only half of the image is rendered in each frame); neither of these shows a visible strobe effect!

      Also, the switching time of the transistors in the display matrix, coupled with signal delays which are bound to be present in the display driver unit (even if they are pico- or femtosecond delays), will add up to a smooth(er) experience anyway.

      Sure, the display would be more realistic at 100Hz or faster, but really, it's not vital... The human eye only resolves 12 to 15 frames per second, and (depending on the ambient light level, and image contrast levels) about 11 to 15 million distinct colours.
      • The human eye only resolves 12 to 15 frames per second

        Where did you get that from?? It's easily proved wrong by the fact that you can see the flicker of a 60Hz monitor, but even then, if that's all you could see it would be impossible to play sports (such as hitting a baseball).

        The whole "what is the frame rate of the eye" question is an incredibly complex one that I won't try and do here (yet again). Suffice it to say that it's not as simple as "frame rate".

        • No, it's not quite that simple - and perhaps my original response was biased by the fact that I am partially sighted (6/36 or 20/120 depending on your side of the Pond), and hence have a totally different view of the world from most of you!

          Most people I have spoken to only 'see' the 60Hz flicker of a CRT under artificial lighting, and report that the flicker disappears in natural light - the visible component of the flicker is caused by resonance between the light source and the CRT.

          As for sports - you don't need to resolve images faster than 15 'cycles' per second - your brain will quite happily fill in the missing detail about an object's position, be it a baseball or whatever, quite unconsciously - you don't have to see it to know it's there and where it's going!
          • your brain will quite happily fill in the missing detail about an object's position, be it a baseball or whatever, quite unconsciously - you don't have to see it to know it's there and where it's going!

            Spoken like someone who doesn't play sports (no offense). Anticipation is certainly an important part, but only a minor part. You will NOT be able to hit an 80-90 MPH curve ball just by watching the angle coming from the pitcher's hand. Think about the margin for error in hitting a baseball, and then think about the angle you are watching the baseball come in at, which is basically straight at you.

            Or heck, an easier experiment is to have someone toss a ball to you and close your eyes at the halfway point, when it reaches the top of the arc. It will NOT be easy to catch, but by your theory it should be, since you have data from half the travel of the ball.

            This really shouldn't be surprising. Which would be easier to design... a device that tracks a ball and then tries to predict where a grabber should be to catch it, or a device that watches the flight of the ball and continuously updates the grabber position based on where it sees the ball going? Again, anticipation is certainly a part of it, since you want a rough approximation of where the ball is going to be, but it's not nearly enough data to be accurate.

          • I've gotta call bullshit there. The only case where lighting would have any effect on the perception of a CRT would be its intensity, and even then its effect would be indirect.

            Your eyes are much more sensitive to flicker in the periphery than looking dead on. If the room is very bright, your eyes will be less dilated, and you'll be less sensitive to the flicker of your monitor.

            Artificial light does have its own flicker component, but that won't interfere with a CRT because a CRT doesn't depend on the reflection of that light for its operation. Now if you take an HP48 calculator, you will probably notice some flicker in rooms lit mostly with fluorescent lights. The refresh on fluorescents (in the US anyway) is close to the refresh of the reflective LCD on the HP48, hence the banding.

            Another big factor in flicker is the rate of decay of the phosphor elements in your monitor. The slower the glow decay, the less likely you are to see flicker - the pixel is still glowing from the last time it was hit when it is struck again. The longer this decay, the lower the refresh rate you can get away with, from a flicker point of view.

            However, now you suffer from smearing or stuttering (sometimes called ghosting). The optimal setup would be a phosphor coating with a nearly infinite decay rate, operating with an infinite refresh rate.

            Television here in the US is refreshed at something like 30Hz (non-HDTV). The reasons you don't see the flicker are: 1) the slow decay rate of the phosphor; 2) you are usually 5 or 6 feet away from a TV when you are watching, so it isn't in your peripheral vision; 3) while big-screen TVs are getting more common, most people are still below the 36-inch mark, which also means it is mostly in your non-peripheral vision.

            Try this: go up to a 13-inch TV or something small like that, and turn it on to some show that has a lot of white in it. Stand about 1 foot or so away, and look just above the TV. I guarantee that you will see flicker. Some people are more sensitive to flicker than others, and it will depend a little on the TV, but at 30Hz, I imagine everyone in the world can see it.

            LCDs - I think all of the consumer LCDs out today suck as far as pixel decay goes. I don't know the reason, capacitance maybe, but they suck. Much slower decay than CRTs. On many LCDs today, you still can't tell if you have "mouse trails" turned on or off (in MS Windows). So that is why you don't see flicker as much at such low refresh rates on LCDs. There may be other reasons too... I don't know.
            • If you had read the whole post, you would know that at 1 foot away from a 13-inch TV I, personally, would have the same visual perception as you would at 6 feet. In short, I would be able to resolve practically bugger all on the display, let alone perceive any flicker! In order to even use a PC, I have to sit less than three inches from the CRT, and I have never in over 15 years noticed considerable flicker or strobe effects - except in cases where the CRT has been located directly underneath office strip lighting.

              What kind of LCDs have you been using recently? The last time I had the 'do I have mouse trails on or off' feeling was with a DSTN [pcwebopaedia.com] type LCD, over five years ago! Modern TFT LCDs (particularly those from the past 18 months or so) exhibit excellent refresh characteristics - to the point where it is quite comfortable to watch a complete DVD on the LCD, rather than pushing it out to a CRT.

              A key reason for flicker on US television is the choice of colour encoding (NTSC, as opposed to here in the UK where PAL is used). NTSC is much more prone to interference in the chrominance burst due to its encoding system, which causes the signals for odd and even fields to interfere, leading to excess blurring and flicker. PAL encoding does not suffer so badly from these artefacts, because the odd- and even-field chrominance bursts are out of phase with each other by 120 degrees. This is probably why I have only ever heard Yanks complaining about flicker on domestic TV - it doesn't happen nearly so badly with PAL anyway.

              In the US, a full-frame refresh occurs at 30Hz, but the field refresh (the important number here!) is at 60Hz. Remember that traditional broadcast TV and analogue video encoding are all interlaced. (For the UK, frame = 25Hz and field = 50Hz.)

              Remember where the ambient and CRT light all ends up - the retina. Most flicker artefacts are at the perceptual level, or are caused by interference as different flicker components enter the eye, and not by an individual item of technology. Probably the most important factor in assessing whether a given person 'sees' flicker from a given source is not the light source itself, but that person's eye. If, like me, you suffer from a number of combined optical deficiencies, including astigmatism [seeclearly.com], albinism [lowvision.org], nystagmus [spedex.com], short sight and central-vision blind spots, I guarantee you that you will perceive flicker in a much different way to someone with the textbook '20-20' eye. There are a significant number of factors which account for visible flicker - and experience in the world tells me that you are quite wrong to discount the effect of ambient lighting on perceived flicker from a CRT/LCD display.

              You can say what you want, and you can Mod this post all the way to oblivion, but you cannot change how anyone else sees the world.
              • Hey listen, all you said was that you were "partially sighted", which doesn't mean squat, and that your vision was 20/120. Maybe you meant something else, but 20/120 in the US means you can almost get by without glasses at all.

                "You can say what you want, and you can Mod this post all the way to oblivion, but you cannot change how anyone else sees the world" ...nor can you enlighten a fool.
                • Not to mention these little things called "glasses"; apparently they use advanced algorithms to bend light, thus increasing vision. Perhaps this should just get its own story...
    • by MosesJones ( 55544 ) on Sunday August 04, 2002 @12:48PM (#4008242) Homepage
      Films are 24Hz, but you don't worry about that, and using an LCD screen means that you don't get the blurring and flashing of a normal monitor.
      • You're right, but I just had a brainwave..

        Could you make a monitor that has multiple skewed electron beams so that you get a more LCD-like effect of the whole image appearing at once?
        • Someone's already done this. I forget if it was Sony or Hitachi, but I remember seeing a proof-of-concept model of a 1280x1024 monitor made from several smaller units. The whole thing was about an inch thick. The CRTs used some high-somethingorother phosphor that only needed a couple of hundred volts between anode and cathode. This was about a year after TFTs hit the mainstream, I think, so it died a death. Nice idea, though.
      • Actually, film runs at 24 frames a second, and even though this is enough for the illusion of motion, persistence of vision does not stretch that far. Projectors in movie theatres open and shut twice every frame, so standard theatre films are shown at 48Hz.
      • Well, the difference between film at 24Hz and video games at 24Hz is that film is motion blurred, so each frame captures everything from the start of the frame to the end of the frame (temporally). I remember reading about video cards using motion blur to improve upon the monitor's natural refresh rate. Even if the monitor is only going at 85Hz, using motion blur (combining multiple frames into one) should make games smoother.
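
        A minimal sketch of that temporal-accumulation idea: render several sub-frames across the shutter interval and average them into one output frame (render_subframe here is a hypothetical callback that returns an image as a NumPy float array):

            import numpy as np

            def motion_blurred_frame(render_subframe, frame_time, shutter=1.0, samples=4):
                # Sample the scene at evenly spaced times while the "shutter" is open,
                # then average the sub-frames into a single blurred frame.
                accum = None
                for i in range(samples):
                    t = frame_time + shutter * (i + 0.5) / samples
                    sub = render_subframe(t).astype(np.float64)
                    accum = sub if accum is None else accum + sub
                return accum / samples
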
    • 30Hz has been the standard for vis-sim for quite a while. The Onyx IR has always been built around how much realism it can put into those 30 frames per second, not on making the frame rate faster.

      The fact is, most games move through space at unrealistic speeds, and a higher frame rate helps. But most airline and military flight simulators are driven by SGI Onyx IR systems, running at 30 Hz.
  • by HawaiianMayan ( 550426 ) on Sunday August 04, 2002 @12:27PM (#4008165)
    IBM brought one of these screens by Alias|Wavefront to show. The image detail is unbelievable.

    In fact, you don't need a zoom tool in your paint program anymore... you just need a real magnifying glass sitting next to the monitor (IBM brought one), because it's showing much more detail than you can really see!

    One thing it shows, though, is the need for vector-based scalable interfaces... the default Windows UI was so tiny on that screen it was really hard to use!
    • I believe that ROX Desktop (http://rox.sf.net) is vector-based, so it would scale very well to this screen. It's quick as all-get-out and light enough that it would be wise to use on a graphics-intensive workstation; though I think this bad boy (DeepView) could handle just about any environment you throw at it.
    • That's actually something I've been thinking about. The idea I finally came up with after brainstorming it a bit is that you would need two coordinate systems: one in pixels, and one in some other unit system of your choosing. You could then specify a button as 5 units by 1 unit, and set the screen to be 64 units by 48 units or something. This would allow you to a) change the size of the workspace independent of the screen resolution and b) change the screen resolution independent of the workspace size. Of course, you would probably need some safeguard to make sure you didn't have a workspace that was much larger than your current resolution could handle, which might result in unreadable text, etc... Back to your original point, though: I definitely second the notion that it's high time for some vector-based interfaces.
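
      A minimal sketch of that two-coordinate-system idea (the Workspace class and the 64x48 unit grid are illustrative assumptions, not any real toolkit's API):

          class Workspace:
              def __init__(self, units_wide, units_high, px_wide, px_high):
                  self.units_wide, self.units_high = units_wide, units_high
                  self.set_resolution(px_wide, px_high)

              def set_resolution(self, px_wide, px_high):
                  # Changing resolution only changes the scale factors;
                  # widget geometry stays defined in units.
                  self.sx = px_wide / self.units_wide
                  self.sy = px_high / self.units_high

              def to_pixels(self, x, y, w, h):
                  return (round(x * self.sx), round(y * self.sy),
                          round(w * self.sx), round(h * self.sy))

          # A 5x1-unit button on a 64x48-unit workspace, shown on the T221:
          ws = Workspace(64, 48, 3840, 2400)
          print(ws.to_pixels(10, 40, 5, 1))   # -> (600, 2000, 300, 50)
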
    • One thing it shows, though, is the need for vector-based scalable interfaces...

      Like Mac OS X? ;)

      • Like Mac OS X? ;)

        Um, no. A huge percentage of OS X is drawn with fixed size bitmaps. Some things, if you're lucky, are drawn with very large bitmaps scaled down (so they could conceivably be made bigger), but most of it is fixed-size and small.

        Amazingly, OS X is actually LESS scalable than Windows (at least prior to XP), which allows font scaling and draws the window glyphs with TrueType fonts.

        When Aqua was first shown in public a few optimistic types believed the system was actually drawing those pretty buttons from scratch, and that therefore they could be scaled, but it's simply not the case.

        (I wish Aqua really DID draw them... then you could change the color! Aqua's almost sick reliance on bitmaps is the major reason you can't choose a custom interface color (they actually store two sets of bitmaps for blue and graphite. Unbelievable!))

  • Quake III: Using the Chromium software, we can play Quake III Arena at a resolution of 3840x2400 pixels.

    Where can I buy it and how much does it cost?

    Really, it always amazes me what a laid back company IBM is.

  • I use dual 21" monitors with an nvidia geforce4 900. Can't remember the exact resolution but it's around 3600x1500. Just standard parts you can get from CompUSA.
    • He has a point. A dual-headed GeForce 4 can drive two displays at 1600x1200 without problem, which comes out to 3200x1200. Okay, that's not quite 3840x2400, but we're not talking miles off. Why does it require 8 Linux machines just to rustle up the power to display images on a screen of that resolution?

      I could understand if the Linux boxes were running a powerful simulation or something, but surely we can get devices of that resolution running on a single PC with some pretty intense hardware.

      Or, is GeForce 4 et al really on the cutting edge? What do the people with millions of dollars use? Do they have to start using multiple machines like IBM? Sounds unlikely to me. What about the military? Surely someone is one step ahead of the latest consumer technology?
  • PS3... (Score:4, Interesting)

    by MosesJones ( 55544 ) on Sunday August 04, 2002 @12:46PM (#4008236) Homepage
    Ummm and Sony have announced that IBM and Toshiba will be joining up to develop the architecture and processors for the Playstation 3.

    The odds on these two pieces of work not being related have to be pretty slim. It's a pretty clear game plan: XBOX2 is a "Windows Home Gateway", PS3 is a "Multimedia Home Gateway" that happens to be running Linux.
    • This is the stupidest thing I've ever heard
      The odds on these two pieces of work not being related have to be pretty slim. It's a pretty clear game plan: XBOX2 is a "Windows Home Gateway", PS3 is a "Multimedia Home Gateway" that happens to be running Linux.

      You have absolutely no proof to back up these claims. Do you know HOW much research IBM does?
      Here is a list of research projects [ibm.com]. That's about 400 different projects that IBM is currently working on. Given the number of projects, I'd say there is a greater probability that they're NOT related. Except that they both involve parallel processing.
  • Bright Idea (Score:2, Funny)

    by SeanTobin ( 138474 )
    ... I know! Let's have a press release about our high-resolution graphics products, and put a high-resolution picture in with the article!

    Oh, and while we're at it, let's make it a 256-color GIF!
  • by rnd() ( 118781 )
    One thing puzzles me about the images that were shown from popular games (Flight Simulator, Quake III, etc.): Why did the graphics look so fuzzy at that ultra-high resolution? The site mentioned that texture-mapping on Quake III isn't optimized for that resolution, but my question is:

    Is the point that it runs in real time at that resolution (even though it looks mediocre) or is the point that it supposedly looks great at that resolution? If it's about real-time high-res, then that makes sense to me, but practically speaking who cares what the resolution is if the image quality doesn't improve with higher resolution as seems to be the case with Quake III and Flight Simulator.
    • Re:hmmm (Score:3, Insightful)

      by glwtta ( 532858 )
      Um, you can't expect the actual textures to look better just because they are shown at a higher resolution (in fact, quite the opposite). The "point" is that you can run it in real time at that resolution, and make software that has high-enough-resolution textures and looks really good. It's kind of like bumping your monitor resolution to 1600x1200 and expecting old DOS games to look better as a result.
    • Possibly the reason why the screenshots look fuzzy is because they're photographs of the screen, as opposed to full-resolution screen captures.
    • Man, you're right, those Q3A graphics suck. Why would they even bring it up? It looks horrid; my GeForce 2 Ultra blows the shit out of that. Maybe with games like Doom 3 it'll look good, but Q3A is just too old for something like that. Yuck.
  • For such a nice setup, the presentation is dull!

    The site design & choice of image formats (gif) aren't all that appealing.

    The videos could use better lighting, but it's still nice to see what the thing is capable of.

  • What's happened to IBM's Power4? I'd have thought it would be much better for this kind of task. Or is Power4 dead?
    • I would venture to guess it's because the Power4 processor is not optimized for, nor designed to be used in, a workstation environment. I would also suspect that it is not designed to be used as a graphics processor.

      Power chips are not PowerPC chips. If you are referring to PowerPC Generation 4 processors, your message is a misleading question. Additionally, I don't believe that the PPC G4 is quite up to an x86 processor (at off-the-shelf speeds) yet. Don't get me wrong - for the work I do, a G4 Mac would probably be sufficient, but if you are talking raw graphics processing, that's a different market.

      -Rusty
  • With this resolution and clarity, rendered images of virtual objects come to life and become harder to distinguish from the real thing.

    sigh... what a man must go through to get high quality Pr0n.

  • I have two fully functional eyes. Why the hell isn't this in 3D? (When I say 3D, I mean using both of your eyes.) It's not really that tough to do.

    Nvidia (amongst others) has kick-ass drivers that let you see all sorts of groovy stuff in real 3D (games, movies, pictures).

    I am not a cyclops.

  • Chromium is a project at Stanford. There was a paper published at this year's SIGGRAPH which discusses how the T221 is driven by a cluster and the SGE, as well as other applications, including a parallel volume renderer (http://graphics.stanford.edu/papers/cr/). Chromium is an open source project and you can get it from http://chromium.sourceforge.net. Chromium is designed to let people harness the power of a "graphics cluster" and/or use multiple displays. You don't have to buy a T221 and an SGE to render Quake at high resolution; you can use multiple monitors/projectors instead. -Mike
  • There are already many packages that allow for distributed rendering across a network. One of them is chromium [sourceforge.net] (a spin-off of WireGL), which according to some can run Quake in a VR cave (3 walls, stereo).

    Like the first post said, this isn't new.

  • Way to go IBM! Now let's see you put it to a good use... say...

    CAVE Quake [uiuc.edu]!!

    : )

    Now if only my .edu would build this excellent... um, tool!
  • Imagine a Beo...

    I mean, imagine single-node workstations of these.
  • I have seen it at TJ Watson, connected to an SP3 through the SP's switch. It's really nice. You can read two A4 pages of PDF side by side, and it's gorgeous. It's also useful for programming - you can open plenty of Emacs windows, but you have to use big fonts, otherwise it's unreadable. One problem, when it's connected to a single PC, is the graphics card. I have seen it with a single quad-output PCI Matrox card; the card can't keep up. You really need a cluster to drive the four dual-DVI inputs.

"And remember: Evil will always prevail, because Good is dumb." -- Spaceballs

Working...