Posted by CmdrTaco from the hardware-to-lust-after dept.
BlackHat linked us to IBM's Deep View, a research system for rendering and other advanced applications (Q3A). The system is eight Linux boxes in a rack, which are needed to generate the content for the T221 display, which runs at 3840x2400.
I knew I misphrased that as soon as I hit submit. This is an advancement in algorithms (hardware-based in this case), rather than any special functionality of an OS.
The 'SGE' is similar to SGI's compositor, an off-the-shelf product that composites the output of multiple graphics cards in hardware:
http://www.sgi.com/visualization/onyx/3000/ip/t
The IBM research device does some other things that the SGI IP compositor doesn't - but at least the SGI compositor has been available for around a year, so you can use it today.
(Not that I am SGI biased or anything.)
I knew that such speeds were possible, it's just that Apple etc. seem to be quite poor at posting SPEC results. It seems like Intel and Dell are in competition with each other to see who can post more SPEC results each month. If they can fill the table with 10000 P4 results, then no one will be able to see that an IBM Power system is rocking the #1 spot currently. Cunning, and devious...
IA-64 Itanium perhaps? However, looking at Intel's specs I don't see 866 listed as an available clock speed. They list the original Itanium [intel.com] at 733 and 800 MHz, and Itanium 2 [intel.com] at 900 MHz and 1 GHz.
I would have thought that IBM would plug their own hardware whenever possible -- the T221 display is certainly phenomenal, and they provide a link so you could buy one of those... It has me wondering how I can come up with $8400 to get one. (Heck, when those things came out, they were $20,000! Ah, progress.)
So, this leaves us to wonder... no mention of processors, "low" clock speed -- compared to what we're used to seeing -- something new from AMD? IBM pissed off at Intel? Some new massively parallel top secret silicon from IBM?
Watch... it'll turn out to be Pentium IIIs -- they call 'em "workstations," so they might have recovered them from some other project. (Or all the engineers got new workstations and wondered what cool project they could do with their old ones... Q3A at 3840x2400? What the heck!) What's the limiting factor in this case, processor power or network bandwidth?
This used to be almost exclusively SGI's territory.
A few years ago I attended a presentation at NCSA where Larry Smarr was talking about their plans for a similar display, driven by about $1M of SGI boxes. I think they wanted to call it "The Great Wall of Power".
Some future PlayStation "n" will do this in your living room.
Isn't Moore's Law great?
Which leads me to ask: have we ever had a /. poll on your favourite law? Moore's, Murphy's, Amdahl's, Newton's, Hooke's, Boyle's, whatever?
NCSA has been making good progress on developing this. The million bucks of SGI hardware has of course been replaced by a rack of Linux PCs. Instructions are actually online [uiuc.edu]. It's not trivial to build, though. And using LCD projectors does have real downsides. If you buy a dozen identical projectors, they won't have the same brightness and color saturation. Actually, they aren't even consistent from edge to edge. So your display isn't perfect, and you can definitely see the tile edges. Not to mention the fun of building a usable support structure which lets you get all the alignments right. If you don't need it to take up an entire wall, I think the IBM T221 display is a cheaper way to get super high-res output. But of course your high-res Quake won't be lifesize either. :-)
Why, do you have performance benchmarks comparing the experimental, non production DeepView with the commercially available InfinitePerformance [sgi.com] visualization system?
http://www.sgi.com/workstations/octane2/
Not one rack of 8 PCs and one half rack of some graphics engine.
It was drawing the full display at about 30fps.
It was as easy to use as any other workstation, rather than the 'interesting' mix of 8 PCs in a rack and some other half rack of graphics stuff.
The hardest thing was to read the tiny little fonts on the screen (the display is 200dpi IIRC) - you need a magnifying glass (or of course you could increase the font size).
The Octane2 can do this because you can install two V12 graphics engines each with a dual channel adapter.
http://www.sgi.com/workstations/octane2/dual_channel.html
SGI software stitches the cards together transparently.
It is a beautiful display.
The amusing/annoying thing about these sorts of announcements is that customers have been using SGI stuff to do this for the last 8 years or so.
If you were on the leading edge of this kind of work, would you wait that long for some kludged-together solution which might work if you have enough duct tape to stick it together?
Or would you pay the extra cash for a solution that works and gives you a huge jump on your competition?
The sort of software layer that can be used to make these four channels (two channels from each graphics card) into one display is stuff like:
http://www.sgi.com/software/multipipe/sdk/
Oh yeah - and re your 'every table' comment: the Octane2 fits on one table - it doesn't surround the table like the IBM stuff.
SGI did the graphics cluster thing over two years ago and even released it as a product. Very, very similar to the IBM setup. They even ported several of their programming APIs and SDKs from InfiniteReality to the Linux Graphics Cluster. Not many were sold, however. Heck, Slashdot didn't even cover it. It was still pretty neat to see multiple spanned monitors and even composited high-res projectors driven by a half rack of Linux PCs. Many of the demos were actually ported from SGI's big-iron Onyx machines and worked just as well on the cluster. The basic setup was a stack of rackmount Linux boxes using nVidia AGP cards and custom PCI cards daisy-chained together to provide sync for glSwapBuffers, among other things. Also available were gigE and Myrinet for networking the machines with something better than 100BT. A compositor (similar to what's used in InfinitePerformance) was also available.
9.2 megapixels, up to 56 Hz refresh, in a 22 inch LCD screen. Want!
This is now very high on my luxury wish list. When is the next IT bubble scheduled to happen again? I have to start my plan to get rich on other people's stupidity and greed so I can afford a system like this.
the AC
It's an LCD. It's not like a monitor that flickers. The glow is fully solid; it's just the color of the pixels that refreshes at 56Hz. That's plenty fast when the whole damn thing ain't flashing at you.
Well, the refresh rate on an LCD doesn't matter that much. 56 Hz doesn't cause any flicker. And the response time is low enough (25 ms) that you probably wouldn't see any ghosting in games.
No, if you look (well, if I look) it chops if the minimum goes below about 60. I would rather have a card that has a minimum of 60 fps and a max of 65 than a card with an average of 200 fps that routinely drops to 40-50.
Quoth the article: since no single graphics adapter has the necessary horsepower and bandwidth to feed a 9.2 million pixel display (at 41Hz using 24 bits per pixel)
Hmm, doing the math:
3840*2400 pixels = 9,216,000 pixels per frame
9,216,000 * 3 (3 = 24/8) = 27,648,000 bytes per frame
27,648,000 * 8 = 221,184,000 bits per frame
221,184,000 (bits per frame) * 41 (fps) = 9,068,544,000 bits per second
9,068,544,000 / 1024 = 8,856,000 kilobits per second (approx)
8,856,000 / 1024 = 8,648 megabits per second (approx)
8,648 / 1024 = just over 8 gigabits per second
Now, with newer DX9-type graphics adapters and AGP 8x, we can do about 2.5 to 3 gigabits per second just now (over the AGP bus; I haven't calculated actual display bitrates!)... Applying Moore's law (theorem, whatever!), we can safely say that this kind of horsepower will be common in a single, average desktop PC inside of two to three years.
Sure, this may be a boost to Hollywood today - but soon enough it will be pretty commonplace technology. (Though I'm betting the most expensive item in that bunch of kit is the actual LCD display, not the kit driving it!)
Umm, just over 8 gigabits per second is nothing. That's about 1 GB/sec, and an 8x AGP (2 GB/sec) graphics adapter (Radeon 9700) would have no trouble handling that data rate. Besides, you don't have to pump *all* of that data over the AGP bus. You send only display lists and textures and whatnot over the AGP bus. The local graphics card (where the actual data rate would hit near 1 GB/sec) has (on a Radeon 9700) about 20 GB/sec of bandwidth.
You're right of course... As a matter of habit, I tend to think about this kind of data transfer in gigabits per second - I totally forgot that AGP 4x is specced in gigabytes per second. Time for a Homer moment....... Doh!
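For anyone who wants to re-run the arithmetic from the posts above, here is a quick back-of-the-envelope sketch (mine, not IBM's); the 3840x2400, 24 bpp and 41 Hz figures are the ones quoted from the article, everything else is just arithmetic:

# Back-of-the-envelope check of the display bandwidth figures quoted above.
WIDTH, HEIGHT = 3840, 2400
BITS_PER_PIXEL = 24
REFRESH_HZ = 41

pixels_per_frame = WIDTH * HEIGHT                    # 9,216,000 pixels
bits_per_second = pixels_per_frame * BITS_PER_PIXEL * REFRESH_HZ

print(f"{bits_per_second:,} bits/s")                 # 9,068,544,000 bits/s
print(f"{bits_per_second / 1024**3:.2f} Gbit/s")     # ~8.4 Gbit/s
print(f"{bits_per_second / 8 / 1024**3:.2f} GB/s")   # ~1.1 GB/s, roughly half of AGP 8x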
Why? (Good) TFT displays are not fed with a pure analog signal like CRTs are; they are fed with a purely digital signal which addresses each colour pixel individually and sequentially. Surely the analog mindset does not apply here!
Since Chromium allows applications to demonstrate pretty much linear scalability as nodes are added, I would guess that they got interactive frame rates. At least, that's what I've seen with Chromium displaying to one of those displays at Stanford.
It states "Click to see the full resolution image (1086x1058)", but the full-resolution image (407 KB) was already loaded to show you a thumbnail. Funny to see this bad web design on IBM's page about a new graphics system (kind of marketing?). Or have people just forgotten why it's polite to use thumbnails on the Net?
Standard cinema runs at 24 Hz non-interlaced, domestic TV at 25 (UK) or 30 (US) Hz interlaced (50 or 60 Hz effective, though only half of the image is rendered in each field); neither of these shows a visible strobe effect!
Also, the switching time of the transistors in the display matrix, coupled with signal delays which are bound to be present in the display driver unit (even if they are pico- or femtosecond delays), will add up to a smooth(er) experience anyway.
Sure, the display would be more realistic at 100Hz or faster, but really, it's not vital... The human eye only resolves 12 to 15 frames per second, and (depending on the ambient light level, and image contrast levels) about 11 to 15 million distinct colours.
The human eye only resolves 12 to 15 frames per second
Where did you get that from?? It's easily proved wrong by the fact that you can see the flicker of a 60Hz monitor, but even then, if that's all you could see it would be impossible to play sports (such as hitting a baseball).
The whole "what is the frame rate of the eye" question is an incredibly complex one that I won't try and do here (yet again). Suffice it to say that it's not as simple as "frame rate".
No, it's not quite that simple - and perhaps my original response was biased by the fact that I am Partially Sighted (6/36 or 20/120 depending on your side of the Pond), and hence have a totally different view on the world to most of you!
Most people I have spoken to only 'see' the 60Hz flicker of a CRT in artificially lit conditions, and report that the flicker disappears in Natural Lighting - the visible component of the flicker is caused by resonance between the light-source and the CRT.
As for sports - you don't need to resolve images faster than 15 'cycles' per second - your brain will quite happily fill in the missing detail about an object's position, be it a baseball or whatever, quite unconsciously - you don't have to see it to know it's there and where it's going!
your brain will quite happily fill in the missing detail about an object's position, be it a baseball or whatever, quite unconsciously - you don't have to see it to know it's there and where it's going!
Spoken like someone who doesn't play sports (no offense). Anticipation is certainly an important part, but only a minor part. You will NOT be able to hit an 80-90 MPH curve ball by just watching the angle coming from the pitcher's hand. Think about the margin for error in hitting a baseball, and then think about the angle you are watching the baseball come in at, which is basically straight at you.
Or heck, an easier experiment is to have someone toss a ball to you and close your eyes at the halfway point, when it reaches the top of the arc. It will NOT be easy to catch, but by your theory it should be, since you have data from half the travel of the ball.
This really shouldn't be surprising. Which would be easier to design... a device that tracks a ball and then tries to predict where a grabber should be to catch it, or a device that watches the flight of the ball and continuously updates the grabber position based on where it sees the ball going? Again, anticipation is certainly a part of it, since you want a rough approximation of where the ball is going to be, but it's not nearly enough data to be accurate.
I've gotta call bullshit there. The only case where lighting would have any effect on the perception of a CRT would be its intensity, and even then its effect would be indirect.
Your eyes are much more sensitive to flicker in the periphery than looking dead on. If the room is very bright, your eyes will be less dilated, and you'll be less sensitive to the flicker of your monitor.
Artificial light does have its own flicker component, but that won't interfere with a CRT, because a CRT doesn't depend on the reflection of that light for its operation. Now if you take an HP48 calculator, you will probably notice some flicker in rooms lit mostly with fluorescent lights. The refresh of fluorescents (in the US anyway) is close to the refresh of the reflective LCD on the HP48, hence the banding.
Another big factor in flicker is the rate of decay of the phosphor elements in your monitor. The slower the glow decay, the less likely you are to see a flicker - the pixel is still glowing from the last time it was hit when it is struck again. The longer this decay, the lower the refresh rate you can get away with, from a flicker point of view.
However, now you suffer from smearing or stuttering (sometimes called ghosting). The optimal setup would be a phosphor coating with a nearly infinite decay rate, operating with an infinite refresh rate.
Television here in the US is refreshed at something like 30 Hz (non-HDTV). The reasons you don't see the flicker are: 1) slow decay rate of the phosphor; 2) you are usually 5 or 6 feet away from a TV when you are watching, so it isn't in your peripheral vision; 3) while big-screen TVs are getting more common, most people are still below the 36 inch mark, which also means it is mostly in your non-peripheral vision.
Try this: go up to a 13 inch TV or something small like that, turn it on to some show that has a lot of white to it, stand about 1 foot or so away, and look just above the TV. I guarantee that you will see flicker. Some people are more sensitive to flicker than others, and it will depend a little on the TV, but at 30 Hz I imagine everyone in the world can see it.
LCD - I think all of the consumer LCDs out today suck as far as pixel decay. I don't know the reason, capacitance maybe, but they suck. Much slower decay than CRTs. On many LCDs today, you still can't tell if you have "mouse trails" turned on or off (in MS Windows). So that is why you don't see flicker as much at such low refresh rates on LCDs. There may be other reasons too... I don't know.
If you had read the whole post, you would know that at 1 foot away from a 13 inch TV, I, personally, would have the same visual perception as you would at 6 feet. In short, I would be able to resolve practically bugger all on the display, let alone perceive any flicker! In order to even use a PC, I have to sit less than three inches from the CRT, and I have never in over 15 years noticed considerable flicker or strobe effects - except in cases where the CRT has been located directly underneath office strip lighting.
What kind of LCDs have you been using recently? The last time I had the 'do I have mouse trails on or off' feeling was with a DSTN [pcwebopaedia.com] type LCD, over five years ago! Modern TFT-breed LCDs (particularly those from the past 18 months or so) exhibit excellent refresh characteristics - to the point where it is quite comfortable to watch a complete DVD on the LCD, rather than pushing it out to a CRT.
A key reason for flicker on US television is the choice of colour encoding (NTSC, as opposed to here in the UK where PAL is used). NTSC is much more prone to interference in the chrominance burst due to its encoding system, which causes the signals for odd and even fields to interfere, leading to excess blurring and flicker. PAL encoding does not suffer so badly from these artefacts, due to the fact that the odd- and even-field chrominance bursts are out of phase with each other by 120 degrees. This is probably why I have only ever heard Yanks complaining about flicker on domestic TV - it doesn't happen nearly so badly with PAL anyway.
In the US, a full-frame refresh occurs at 30 Hz, but the field refresh (the important number here!) is at 60 Hz. Remember that traditional broadcast TV and analogue video encoding are all interlaced. (For the UK, frame = 25 Hz and field = 50 Hz.)
Remember where the ambient and CRT light all ends up - the retina. Most flicker artefacts are at the perceptual level, or are caused by interference as different flicker components enter the eye, and not by an individual item of technology.
Probably the most important factor in assessing whether a given person 'sees' flicker from a given source is not the light source itself, but that person's eye. If, like me, you suffer from a number of combined optical deficiencies, including astigmatism [seeclearly.com], albinism [lowvision.org], nystagmus [spedex.com], short sight and central-vision blind spots, I guarantee you that you will perceive flicker in a much different way to someone with the textbook '20-20' eye.
There are a significant number of factors which account for visible flicker - and experience in the world tells me that you are quite wrong to discount the effect of ambient lighting on perceived flicker from a CRT/LCD display.
You can say what you want, and you can Mod this post all the way to oblivion, but you cannot change how anyone else sees the world.
Hey listen, all you said was that you were "partially sighted", which doesn't mean squat, and that your vision was 20/120. Maybe you meant something else, but 20/120 in the US means you can almost get by without glasses at all.
"You can say what you want, and you can Mod this post all the way to oblivion, but you cannot change how anyone else sees the world"...nor can you enlighten a fool.
Not to mention these little things called "glasses"; apparently they use advanced algorithms to bend light, thus increasing vision. Perhaps this should just get its own story.....
Now we have anonymous fools. My vision is 20/300 in my right eye, 20/400 in my left, and with glasses I see just fine. So quit this cry-baby shit about 20/120 vision being some kind of handicap.
On a side note, I think the tendency here in America is to measure vision in diopters.
Someone's already done this. I forget if it was Sony or Hitachi, but I remember seeing a proof-of-concept model of a 1280x1024 monitor made from several smaller units. The whole thing was about an inch thick. The CRTs used some high-somethingorother phosphor that only needed a couple of hundred volts between anode and cathode. This was about a year after TFTs hit the mainstream, I think, so it died a death. Nice idea, though.
Actually film runs at 24 frames a second, and even though this is enough for the illusion of motion, persistence of vision does not stretch that far. Projectors in movie theatres open and shut twice every frame, so standard theatre film movies are shown at 48 Hz.
Well, the difference between film at 24 Hz and video games at 24 Hz is that film is motion-blurred, so each frame captures everything from the start of the frame till the end of the frame (temporally). I remember reading about video cards using motion blur to improve upon the monitor's natural refresh rate. Even if the monitor is going at only 85 Hz, using motion blur (combining multiple frames into one) should make games smoother.
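A minimal sketch of the "combining multiple frames into one" idea, purely as illustration - blend_frames and the four-sub-frame count are my own assumptions, and real hardware would do this in an accumulation buffer rather than in NumPy:

import numpy as np

def blend_frames(frames):
    # Naive motion blur: average several sub-frames rendered at closely
    # spaced points in time into one displayed frame, approximating the
    # temporal smear a film camera captures over the same interval.
    acc = np.zeros(frames[0].shape, dtype=np.float32)
    for f in frames:
        acc += f.astype(np.float32)
    return np.clip(acc / len(frames), 0, 255).astype(np.uint8)

# e.g. render four sub-frames for every frame the monitor actually shows
subframes = [np.random.randint(0, 256, (2400, 3840, 3), dtype=np.uint8)
             for _ in range(4)]
displayed = blend_frames(subframes)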
30 Hz has been the standard for viz sim for quite a while. Onyx IR has always built around how much realism it can put into that 30 FPS, not on making the frame rate faster.
The fact is, most games move through space at unrealistic speeds, and a higher frame rate helps. But most airline and military flight simulators are driven by SGI Onyx IR systems, running at 30 Hz.
IBM brought one of these screens by Alias|Wavefront to show. The image detail is unbelievable.
In fact, you don't need a zoom tool in your paint program anymore... you just need a real magnifying glass sitting next to the monitor (IBM brought one), because it's showing much more detail than you can really see!
One thing it shows, though, is the need for vector-based scalable interfaces... the default Windows UI was so tiny on that screen it was really hard to use!
I believe that ROX Desktop http://rox.sf.net is vector-based, so it would scale very well to this screen. It's quick as all-get-out and light to the point that it would be wise to use on a graphics-intensive workstation; though I think this bad boy (DeepView) could handle just about any environment you throw at it.
That's actually something I've been thinking about... The idea I finally came up with after brainstorming it a bit is that you would need two coordinate systems: one in pixels, and one in some other unit system of your choosing. You could then specify a button as 5 units by 1 unit, and set the screen to be 64 units by 48 units or something. This would allow you to a) change the size of the workspace independent of the screen resolution and b) change the screen resolution independent of the workspace size. Of course, you would probably need some safeguard to make sure you didn't have a workspace that was much larger than your current resolution could handle, which might result in unreadable text, etc. Back to your original point, though: I definitely second the notion that it's high time for some vector-based interfaces.
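A rough sketch of what that two-coordinate-system idea might look like; the Workspace class and the 64x48 unit figures just illustrate the comment above and aren't any real toolkit's API:

from dataclasses import dataclass

@dataclass
class Workspace:
    units_wide: float          # layout is specified in abstract units...
    units_high: float
    screen_px: tuple           # ...and mapped onto whatever display you have

    def scale(self):
        # one uniform scale factor so the whole workspace fits on screen
        return min(self.screen_px[0] / self.units_wide,
                   self.screen_px[1] / self.units_high)

    def to_pixels(self, w_units, h_units):
        s = self.scale()
        return round(w_units * s), round(h_units * s)

t221 = Workspace(64, 48, (3840, 2400))
print(t221.to_pixels(5, 1))    # a "5x1 unit" button -> (250, 50) px on the T221
vga = Workspace(64, 48, (1280, 1024))
print(vga.to_pixels(5, 1))     # the same button -> (100, 20) px on a smaller screen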
Like Mac OS X? ;)
Um, no. A huge percentage of OS X is drawn with fixed-size bitmaps. Some things, if you're lucky, are drawn with very large bitmaps scaled down (so they could conceivably be made bigger), but most of it is fixed-size and small.
Amazingly, OS X is actually LESS scalable than Windows (at least prior to XP), which allows font scaling and draws the window glyphs with TrueType fonts.
When Aqua was first shown in public a few optimistic types believed the system was actually drawing those pretty buttons from scratch, and that therefore they could be scaled, but it's simply not the case.
(I wish Aqua really DID draw them... then you could change the color! Aqua's almost sick reliance on bitmaps is the major reason you can't choose a custom interface color (they actually store two sets of bitmaps for blue and graphite. Unbelievable!))
I use dual 21" monitors with an nvidia geforce4 900. Can't remember the exact resolution but it's around 3600x1500. Just standard parts you can get from CompUSA.
He has a point. A dual headed GeForce 4 can drive two displays at 1600x1200 without problem, which comes out to 3200x1200. Okay, that's not quite 3600x2400, but we're not talking miles off. Why does it require 8 Linux machines just to rustle up the power to display images on a screen of that resolution?
I could understand if the Linux boxes were running a powerful simulation or something, but surely we can get devices of that resolution running on a single PC with some pretty intense hardware.
Or, is GeForce 4 et al really on the cutting edge? What do the people with millions of dollars use? Do they have to start using multiple machines like IBM? Sounds unlikely to me. What about the military? Surely someone is one step ahead of the latest consumer technology?
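Putting rough numbers on the "not talking miles off" remark above - my own arithmetic, using the display's actual 3840x2400 resolution:

t221_pixels = 3840 * 2400            # 9,216,000 pixels on the IBM display
dual_head   = 2 * (1600 * 1200)      # 3,840,000 pixels across two 1600x1200 heads
print(t221_pixels / dual_head)       # -> 2.4, so the T221 pushes ~2.4x the pixels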
Ummm... and Sony has announced that IBM and Toshiba will be joining it to develop the architecture and processors for the PlayStation 3.
The odds on these two pieces of work not being related have to be pretty slim. It's a pretty clear game plan: XBOX2 is a "Windows Home Gateway", PS3 is a "Multimedia Home Gateway" that happens to be running Linux.
This is the stupidest thing I've ever heard:
The odds on these two pieces of work not being related have to be pretty slim. It's a pretty clear game plan: XBOX2 is a "Windows Home Gateway", PS3 is a "Multimedia Home Gateway" that happens to be running Linux.
You have absolutely no proof to back up these claims. Do you know HOW much research IBM does? Here is a list of research projects [ibm.com]. That's about 400 different projects that IBM is currently working on. Given the number of projects, I'd say there is a greater probability that they're NOT related. Except that they both involve parallel processing.
One thing puzzles me about the images that were shown from popular games (Flight Simulator, Quake III, etc.): Why did the graphics look so fuzzy at that ultra-high resolution? The site mentioned that texture-mapping on Quake III isn't optimized for that resolution, but my question is:
Is the point that it runs in real time at that resolution (even though it looks mediocre) or is the point that it supposedly looks great at that resolution? If it's about real-time high-res, then that makes sense to me, but practically speaking who cares what the resolution is if the image quality doesn't improve with higher resolution as seems to be the case with Quake III and Flight Simulator.
Um, you can't expect the actual textures to look better just because they are shown at a higher resolution (in fact, quite the opposite). The "point" is that you can run it in real time at that resolution, and then make software with high enough resolution textures that it looks really good. It's kinda like bumping your monitor resolution to 1600x1200 and expecting old DOS games to look better as a result.
Man, you're right, those Q3A graphics suck. Why would they even bring it up? It looks horrid; my GeForce 2 Ultra blows the shit outa that. YUKKY. Maybe with games like Doom 3 it'll look good, but Q3A is just too old for something like that, yuk.
I would venture to guess because the Power4 processor is not optimized, nor designed to be used in a workstation environment. I would also suspect that it is not designed to be used as a graphics processor.
Power chips are not PowerPC chips. If you are referring to PowerPC generation 4 processors, your message is a misleading question. Additionally, I don't believe that the PPC G4 is quite up to an x86 processor (at off-the-shelf speeds) yet. Don't get me wrong, for the work I do a G4 Mac would probably be sufficient, but if you are talking raw graphics processing, that's a different market.
-Rusty
Chromium is a project at Stanford. There was a paper published at this year's SIGGRAPH which discusses how the T221 is driven by a cluster and the SGE, as well as other applications including a parallel volume renderer. (http://graphics.stanford.edu/papers/cr/)
Chromium is an open source project and you can get it from http://chromium.sourceforge.net. Chromium is designed to enable people to harness the power of a "graphics cluster" and/or use multiple displays. You don't have to buy a T221 and an SGE to render Quake at high resolution, you can use multiple monitors/projectors instead.
-Mike
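For the curious, the core trick behind this kind of tiled ("sort-first") cluster rendering is just carving the framebuffer into per-node rectangles. The sketch below only illustrates the idea - make_tiles and the 4x2 split are my assumptions, not Chromium's actual configuration interface:

def make_tiles(width, height, cols, rows):
    # Split a width x height framebuffer into cols x rows tiles; each render
    # node draws only its own rectangle and the results are reassembled
    # (in hardware by something like the SGE, or on separate projectors).
    tile_w, tile_h = width // cols, height // rows
    return [{"node": r * cols + c,
             "x": c * tile_w, "y": r * tile_h,
             "w": tile_w, "h": tile_h}
            for r in range(rows) for c in range(cols)]

# 3840x2400 split across 8 nodes as a 4x2 grid: each node renders 960x1200
for tile in make_tiles(3840, 2400, cols=4, rows=2):
    print(tile)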
There are already many packages that allow for distributed rendering across a network. One of them is chromium [sourceforge.net] (a spin-off of WireGL), which according to some can run Quake in a VR cave (3 walls, stereo).
I have seen it at TJ Watson, connected to an SP3 through the switch of the SP. It's really nice. You can read 2 A4 pages of PDF side-by-side, and it's gorgeous.
It's also useful for programming: you can open plenty of Emacs windows, but you have to use big fonts, otherwise it's unreadable.
One problem, when connected to a single PC, is the graphics card. I have seen it with a single quad-output PCI Matrox card. The card can't keep up. You really need a cluster to drive the 4 dual-DVI inputs.
Re:This is a software solution (Score:2, Interesting)
I didn't see anywhere where it said what processor it's using, though. An 866 MHz what?
Chips at that speed include the PIII and Alpha 21264, but not, as far as I know, a Power or PPC speed (or HP-PA, MIPS, or SPARC).
I know Alphas were popular in render farms, but have gone 'out of fashion' now. Are IBM _embarrassed_ by their choice of processor?
YAWIAR.
Re:This is a software solution (Score:2)
Most of IBM's big machines seem to run on PPC chips of one form or another, and 866 is a common speed for G3 CPUs.
Deep Blue ran on the old PPC 604e's!
I would think they are using IBM processors.
Re:This is a software solution (Score:1)
However, if you claim they exist, that's enough proof for me, since, as you say, it does make the most sense.
YAWIAR
Re:This is a software solution (Score:2)
Apple used to sell an 867 MHz G4 from July '01 to Jan '02.
That's a Motorola part of course, but I'm sure IBM has a similar PPC CPU.
Re:This is a software solution (Score:1)
Cunning, and devious...
Anyway, thanks for the heads-up PPC-wise.
YAWIAR.
Who the FUCK cares if it's Linux--It's cool anyway (Score:1)
Re:Sounds Like (Score:1)
There was a time when the only inventions in Visualization came out of SGI or Evans & Sutherland. Way to go IBM!
SGI did this two years ago, as a *product* (Score:2)
More information (white paper and data sheet) can be found on SGI's legacy systems page:
http://www.sgi.com/products/legacy/vis_systems.ht
I believe a few
Links (Score:5, Informative)
Movies [ibm.com] of Deep View in action.
Re:Links (Score:1)
Just showing some quite simple models being rotated in Catia.
My P4 can do that in Pro/E too.
Or did I miss something important?
Re:Bumps the Mig-19 from my list (Score:1)
Really cool would be to have all that in _one_ computer. And, of course, at 100Hz refresh rate.
Re:So when's .... (Score:2)
http://www.azer.com/aiweb/categories/magaz
Re:So when's .... (Score:3, Funny)
It is rumored that the Bush admin is using IBM computers to plan an Iraqi invasion. It will be called "deep shit" (for which side is unclear).
Try this? (Score:2)
Add a little traveling salesman [freshmeat.net]
and a few neurons [freshmeat.net]
Re:Which is why you use an LCD... (Score:1)
Could you make a monitor that has multiple skewed electron beams so that you get a more LCD-like effect of the whole image appearing at once?
No one read to page 3 yet? (Score:1)
Quake III: Using the Chromium software, we can play Quake III Arena at a resolution of 3840x2400 pixels.
Where can I buy it and how much does it cost?
Really, it always amazes me what a laid back company IBM is.
Not quite off?? (Score:1)
3600x2400 = 8 640 000
Which is 2.5 times more. So you didn't really mean to make that casual remark did you?
Bright Idea (Score:2, Funny)
Oh, and while we're at it, let's make it a 256-color GIF!
Ugly Presentation! (Score:1)
The site design & choice of image formats (GIF) aren't all that appealing.
The videos could use better lighting, but it's still nice to see what the thing is capable of.
uh huh... definitely want (Score:1)
Sigh... what a man goes through to get high quality Pr0n.
IANAC (Score:1)
Nvidia (amongst others) has kick-ass drivers that let you see all sorts of groovy stuff in real 3D (games, movies, pictures).
I am not a cyclops.
not new or interesting (Score:1)
Like the first post said, this isn't new.
Let's see it run a CAVE! (Score:2)
CAVE Quake [uiuc.edu]!!
: )
Now if only my
Imagine... (Score:1)
I mean, imagine single-node workstations of these.