Monitor One-Upmanship From IBM

openSoar writes: "So here is a solution for your lounge or media room setup, and a nice display for your office. 61 inches of plasma sounds sweet, but a $28K price tag doesn't. The IBM LCD will do 3840x2400, which would make me SO much more productive ;-)" Who says 200dpi is only for the labs? I'd rather have two of these than one 61" display anyhow. 3840x2400 would mate nicely with the Nikon D1x I also don't have.
This discussion has been archived. No new comments can be posted.

  • by motherhead ( 344331 ) on Friday October 05, 2001 @03:28AM (#2391200)
    it only costs 18K
    • by Tony-A ( 29931 )
      I'm looking at 16K IBM Web Price.
      • Yeah, I messed up... I was going to make up some bullshit about shipping and handling and tax and a graphics card that could drive it. But nah, I just had clumsy fingers.
      • It's also a better value in the long run because, from what I understand, the average plasma display dies in about 18 months when the plasma "wears out" (though the technology is improving), while the LCD would last quite a bit longer (though I'm not sure of the exact lifespan).
    • by Anonymous Coward
      it only costs 18K

      Is this because they throw away every screen that comes with a dead pixel? If that's the case, I'd like to know where IBM's landfill is located. ;)
  • by yokimbo ( 525881 ) on Friday October 05, 2001 @03:33AM (#2391204) Journal
    when you think about it, it's cheaper than a wife and kids...
  • Confusing write-up. (Score:4, Informative)

    by supine ( 94843 ) on Friday October 05, 2001 @03:33AM (#2391205) Homepage
    I thought I was being taken to something about a 61" flat panel for $28,000, but the first link actually takes you to where you can purchase a 22" IBM flat panel for US$16,000. After finding that out, the write-up starts to make sense...

    marty
  • You also have to take into account the fact that you don't need reading glasses any more with this baby. Just sit back 10ft or more
    • What? It IS only a 22-inch widescreen. That means it has less image height than a normal 20-inch CRT. If you sit 10ft away, you won't see shit, especially as it's 200ppi, which means your Win2k icons, buttons and text are gonna be SMALL
      • OK, so I'm fucked either way. Let's go out and buy a normal screen and a pair of glasses then...
      • it IS only a 22-inch widescreen. That means it has less image height than a normal 20-inch CRT.

        If you read the chart at the bottom of the web page describing the T220, you will see that the T220 has a (diagonal) viewable image size of 22.2 inches, or the equivalent of approximately a 24-25 inch CRT, since CRTs are measured by the outside tube dimensions while flat panels are generally measured by actual viewable area.

        On the other hand, if you only want viewable image size, you can get some pretty nice projectors or big monitors for a lot less than $16k.
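
        As a rough check of those numbers (a quick Python sketch; the 3840x2400 resolution and 22.2-inch viewable diagonal come from IBM's spec page, the rest is simple geometry):

        import math

        h_px, v_px = 3840, 2400            # native resolution
        diag_in = 22.2                     # viewable diagonal, inches

        diag_px = math.hypot(h_px, v_px)   # pixels along the diagonal (~4528)
        ppi = diag_px / diag_in            # ~204 pixels per inch
        width_in = h_px / ppi              # ~18.8 in
        height_in = v_px / ppi             # ~11.8 in
        print(f"{ppi:.0f} ppi, {width_in:.1f} x {height_in:.1f} in viewable")

        So the "200dpi" figure in the write-up checks out almost exactly.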

  • Mine eyes have seen the glory ...

    I now know my goal in life: this screen must one day sit upon my desk. I could finally be able to see the facial expressions of those that I frag... um... I mean, I could finally be able to increase my, uh, productivity.

    Such wondrous inventions...

    Kataklyzm
  • Do you think they'll get many impulse buy sales?
  • Bit pricey! (Score:1, Informative)

    by jweatherley ( 457715 )
    You could buy 2 Apple 22 inch displays for $5000 AND a maxed out G4 and still have $1000s in change!

    I realise the IBM has a higher pixel density than the Apple models, but I can't see many people rushing out to buy this one.
    • You could buy 2 Apple 22 inch displays for $5000 AND a maxed out G4 and still have $1000s in change!

      That's silly.
      If I wanted a maxed out abacus with a cool display, I would buy one. But not from Apple.

    • I know a few CAD pros who would consider it.
    • It is pricey, but it might make an impact in some places. I have a cousin who works for an ad company editing commercials. Some of the editing workstations they have cost in the neighborhood of 125k. So 16K for a monitor isn't really all that unreasonable. (Of course he'd need two, but still.)

  • QUXGA-W (Score:3, Informative)

    by stuffman64 ( 208233 ) <stuffman@gmail.SLACKWAREcom minus distro> on Friday October 05, 2001 @03:49AM (#2391227)
    Ah yes, the acronyms keep stacking up. OK, this is how I remember it: MGA->CGA->EGA->VGA->SVGA->XGA->SXGA->UXGA, and now the almighty QUXGA-W (I'm sure I missed a few in there, especially the Apple ones). This stands for "Quad Ultra eXtended Graphics Array [something]." Does anyone know what the "-W" means?

    Why don't manufacturers use a simple naming convention instead of these hideously long acronyms (hell, "Quxgaw" sounds like a word), and use something a little more descriptive, namely resolution? I think saying "Hey, my monitor is 3840x2400!" sounds better than "I got a Quxgaw sitting on my desk!"
    • Re:QUXGA-W (Score:1, Funny)

      by Anonymous Coward
      Why don't manufacturers use a simple naming convention [...] 3840x2400

      Using screen resolution would be a good idea in principle, but marketing people haven't found a way to cheat with those numbers yet, so they don't like it. Remember, these guys are used to measuring the size of a screen diagonally, counting the black border, rounding up to the next integer inch, and adding one for luck.
    • Does anyone know what the "-W" means?

      Widescreen - so you can frag in letterbox format

    • "I'm sure I missed a few in there, especially the Apple ones" No, you didn't. Apple don't use these stoopid terms. Incidentally, the Apple Cinema Display - also 22 inch but 1600x1024res - is VERY good value compared to this. There is also a Samsung 24 inch TFT available, and don't forget the 17inch SGI display which is EXTREMELY good value (same res as the Apple), but a fair bit smaller. Did anyone else notice that the display use 4 960x2400 stripes - using ONE stripe for VGA? That sure could look crap.
      • I just bought a Samsung SyncMaster 760V TFT (17", 1280x1024). Not quite as nice as those displays, but I only paid $650, which is less than I spent on a 17" CRT 4 years ago.... It's really an incredible screen, and Mac OS X running on this thing could cause keyboard damage (from all the drool, you know). The crispness and the brightness (and the total lack of flicker!) are just amazing, and anyone who tells you that an LCD can't match the color of a CRT hasn't seen the most recent batch.

        If you spend a large fraction of your day staring at a screen, the time to migrate is now.
        • [...] anyone who tells you that an LCD can't match the color of a CRT hasn't seen the most recent batch.

          An interesting report on Solarism's new ultra-bright LCDs didn't make Slashdot, but check out this [gamers.com] if you're interested.
        • There is every chance that your Samsung is the same panel as Apple's 17-incher, as Samsung is Apple's LCD supplier. Bet you don't get the Apple Display connector though :-[ Still, you save, what, $150?
    • -W is widescreen. Also, you're missing my favorite: QVGA! That's right, the almighty Quarter VGA... 320x240.
    • Why don't manufacturers use a simple naming convention instead of these hideously long acronyms

      Hey, they could use the Japanese convention: Graphics Alpha 3 EX Perfect!!.

      (yes, the exclamation points are part of the name)
    • Why don't manufacturers use a simple naming convention instead of these hideously long acronyms (hell, "Quxgaw" sounds like a word),

      Heh heh heh... gotta remember THAT word next time I play Scrabble ;{>

      • That's a 26-point word, even before score multipliers! Just hope you are playing someone who knows what Quxgaw is, or an argument is sure to ensue!
    • Actually, it's CGA first, EGA, then MCGA and VGA. MCGA was very much like VGA (the most important aspect was 256 colors at 320x200), except it didn't support 16 colors at 640x480, only 2.
    • QUXGA...


      Clearly the next one needs to be QUUXGA.


      --j

    • WOOT!
    • MGA->CGA->EGA->...

      'Hercules' should be in there somewhere too. It became a de facto standard for hi-res (720x348!) graphics.

      Why don't manufacturers use a simple naming convention instead of these hideously long acronyms

      Outside of marketing literature, I haven't heard anything past "VGA" used very often in real life. People usually say either the exact res ("ten-twenty-four by seven-sixty-eight"), the abbreviated res ("ten by seven"), or just the horizontal res ("ten-twenty-four"). I usually use the last one.

    • MGA->CGA->EGA

      Actually, I think that CGA was first. Also, MGA is what is currently being used by Matrox products, and I can guarantee that my G400 Max can do a lot more with its MGA than CGA, EGA, or VGA, especially considering it can do 2048x1536 True Color.

      Matrox has been using MGA for quite a while. I know that their original Millennium products were MGA, and I was looking at them in '96. It is actually quite powerful, especially in the OpenGL area.
    • There was no MGA before the CGA. There was an MDA, but that didn't do graphics. You might be thinking of the HGC (Hercules Graphics Controller), which was MDA-compatible but could do a cool 720x348 in black-and-white.

      MGA nowadays actually refers to Matrox Graphics Adapter.
  • The IBM LCD will do 3840x2400

    Forgive me for being a mister poopie pants here, because I realize how cool this thing is. Coming from a strong publishing background, there have been times I would have given my arm for one of these.

    But unless you are doing CAD/CAM or publishing, imagine how tiny your icons are going to look at 3840x2400... that is a huge expanse of desktop. Not to mention our beloved porn; hell, 800x600 pictures of Anna K bending over will be postage-stamp sized... can't even appreciate the image....

    Since I haven't seen specs for the included PCI graphics card, I am going to go ahead and assume Nvidia had nothing to do with it, so don't get your Quake III ya-yas in a lather....

    So basically, you spend the 16k+ and then what? I guess you gloat like a son of a bitch while running six applications at the same time, all in their own little piece of screen real estate....

    Oh yeah, photography, the Nikon thing... yeah, that too.
    • But unless you are doing CAD/CAM or publishing, imagine how tiny your icons are going to look at 3840x2400... that is a huge expanse of desktop. Not to mention our beloved porn; hell, 800x600 pictures of Anna K bending over will be postage-stamp sized... can't even appreciate the image

      If you can afford that display, you can also afford a Nikon D1x, and can probably afford to hire models... Just make your own hi-rez porn.

    • Something like this would be great for me. I do work in virtual reality using large table top displays like the VR Bench© [ctc.com] and the Barco Baron [barco.com].

      These displays are bulky, heavy and take up way too much room. Something like this plasma display would be great. Unfortunately, they still don't have the refresh rate to do stereo very well yet. :-(

      Just my 2 cents...


  • When I used to work on radar displays I had the ultimate dev setup (a few years ago, this was): my own quad-processor Alpha box, one 30" 2048x2048 display, and two 21" displays on either side, all running off a 50k dedicated graphics generator.

    One monitor is never enough; you need at least two: one the boss sees with work on it, the other playing xconq.
  • Included PCI graphics card - 'out of the box' readiness for work

    How about a computer to go with it? That would make it REALLY plug and play (just plug in the power plug!).

    The price justifies such convenience :)

    • I did notice that while it apparently came with a PCI graphics card, the "Prerequisites" line in the specs implied that it required an Intellistation E/M/Z computer from IBM. So yeah, it probably does come with an IBM computer, but not quite in the way you mean: you have to pay extra on top!
  • Is there any true relationship between res and productivity? I know I FELT a hell of a lot more productive for a few days every time I switched up a size: 12 > 14 > 15 > 17 > 19 over the past however many years.

    BUT I recently started using my 14.1" LCD laptop as my main machine simply because I found it more convenient most of the time - and I can't say, thinking about it, that I've ACTUALLY become any less productive.

    Maybe I'll switch to my 19" again and report back in a week as to how much more or less work I get done! I know I alt-tab more than I used to.
    • Well, I do believe extra screen real estate is good; less scrolling and MUCH better window tiling. But I think that the ultimate kicker, since I sit in front of these damn boxes all day, is REDUCED EYE STRAIN.
      As my monitors get bigger, my eyes hurt less. This is why I am in love with my InFocus [infocus.com] LP-530 [infocus.com] (XGA resolution, progressive scan component video, DVI connection, and 2000 lumens!)... it kicks butt, especially at LAN parties.
    • At my last job, I had a big editor's monitor, one at work and one for my home, both about 21", and at first it was great because I could have all kinds of documents, graphics, etc. open at the same time. But after a while, I started getting a persistent "crick" in my neck from craning my neck to see the stuff at the top of the screen. It even hurt to sit on the couch and watch TV on a big screen across the room. I couldn't just crank my desk chair up a little higher, because I am somewhat small in stature, and have to crank it low enough to keep my feet on the ground. Post-Tech Wreck, I started a new but similar job and was a little disappointed with the 17" monitor that came with it. But I'm getting just as much done, and my neck doesn't hurt all the time. So I think there may be an optimum size, perhaps related to user dimensions.

      What would Rain-in-the-Face do?
      • Spot-on. We use 2 x 22-inch Mitsubishis on each of our Avid setups. To be comfortable over long periods, your straight-ahead eye level should be at or around the top edge of the screen. With a larger screen, that means setting the screen down below your desk/kbd level. Works out great; just make sure you have no lights above and behind you :-]
      • Walking around work, I'm amazed how many people have poor ergonomic setups.

        And if you're small in stature, you're probably always fighting with averages. I had a friend that was so small, she couldn't sit on a bus and have her feet touch the floor.

        You'll probably *always* have to adjust things for your size - maybe even with a 17" monitor setup.

        Sounds like you really need to lower your desk down towards your keyboard. Your keyboard should probably be a little over your lap so your elbows are at a right angle and your forearms are parallel to the floor when typing. Your eye level should be right about level with the top of the viewing area. Also make sure you're sitting square and looking straight on at the monitor - I'm amazed at how many people look one way and type another...
  • Hopefully it comes with some clever drivers to optionally scale application windows. Many apps seem to use absolute pixels for at least their UI if not the rest of their dimensions. These are all going to be half (or is it 1/1.44? I never get this right; rough numbers are sketched at the end of this thread) the size that they were on a not-much-bigger screen.

    This is what shows that Atheos' GL-based scalable windows are a good idea, as long as you have the texture-ram to go with it, and texture RAM is cheap after a $16k monitor.

    [insert obligatory grumble grumble: Win2k drivers only [and PCI only]]
    • and texture RAM is cheap after a $16k monitor

      so is my car.

    • Many apps seem to use absolute pixels for at least their UI if not the rest of their dimensions.

      Yes, pixel sizing on the web is already hell.

      I use a Sony GDM-500R 21-inch monitor with a 1856x1392 desktop. I use Windows Classic (Large) appearance to make much of the UI larger, I increase the font in all my applications, and use Mozilla's Ctrl-mousewheel to zoom windows. Everything's groovy and productive.

      Then some dumb Web designer specifies a 9-pixel font size that looks fine to him because he's too clueless to increase his desktop beyond 1024x768. He's even got 640x480 and 800x600 outlines on his desktop background, so he thinks he's being accessible because the page still looks good when he sizes the browser small.

      But nine pixels is only 8/100 of an inch on my screen! It's damn well invisible. I could specify my own CSS to override his pixel settings, but I resent feeling disabled just because I actually used my video card's hi-res mode. The HTML to do proportional sizing while specifying font sizes is a little tricky; you have to do browser-dependent stylesheets. CNN and Macromedia do it, but it's apparently beyond the average Webmaster.

      Bigger monitors are great, but until Web designers wise up, realize that increasing dpi without increasing monitor dimensions just leads to incompatibility and eyestrain. I guess that's why RAMDAC progress has stalled at 350MHz and 2048x1536. The sweet spot for a CRT is about 1600x1200 on a 19-inch GDM-F400 (my other monitor).

      At least on a CRT you can drop back to a lower resolution. An LCD is going to look bad scaling 1024x768 up to fit its pixels. But I've not been able to see the Samsung SyncMaster 240T (1920 x 1200, 24 inches) or Viewsonic VP230mb (1600x1200, 23 inches) in person to compare. No computer store carries these things!

      --
      You can never have too many inches or too many pixels.
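
      To put rough numbers on both complaints in this thread, the shrinking UI above and the 9-pixel web fonts, here is a small Python sketch; the ppi figures are approximate and purely illustrative:

      def physical_size_in(pixels, ppi):
          # Physical extent, in inches, of a run of pixels at a given pixel density.
          return pixels / ppi

      displays = [
          ("1024x768 desktop, ~86 ppi", 86),
          ("21in CRT at 1856x1392, ~110 ppi", 110),
          ("IBM T220, ~204 ppi", 204),
      ]
      for name, ppi in displays:
          print(f"9 px font on {name}: {physical_size_in(9, ppi):.3f} in")
      # ~0.105 in, ~0.082 in and ~0.044 in respectively

      The same ratio answers the half-versus-1/1.44 question at the top of the thread: UI drawn in absolute pixels shrinks linearly by the ratio of pixel densities, so going from roughly 100 ppi to roughly 204 ppi cuts elements to about half their size (and a quarter of their area).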

  • Rather than seeing gigantic LCD panels with gigantic price tags, how about gfx card manufacturers start playing with the idea of "virtual resolutions"? ... No, no, no, not the old boring "If I move my mouse off screen, the whole view follows me!"... That's called a virtual desktop... What I'm talking about falls more along the lines of... You have a virtual display of 3200x3200, anti-aliased and scaled down on the fly to fit within 1600x1600 at 60-90 FPS. You accomplish the same effect in a smaller piece of real estate. Seems like it would be easy to pull off, actually. Does such a thing already exist?

    Cheers,
    • Applications could be written to use such anti-aliasing features of course, but I haven't seen them incorporated under-the-covers in 3D graphics drivers and thus used automatically by all applications. If they did so, they'd have to be careful to market around the 4x drop in fill rate performance in that mode.

      If you did such resampling in the video display device itself, you'd have a bandwidth problem communicating all that info over the DVI graphics-card-to-LCD bus.

      Frankly, I'd prefer to see people working on tiling together LCD panels to make economical but larger displays, somewhat like Mass Multiples [slashdot.org]. But even there you still have the problem with DVI bandwidth. Even IBM's display required dual DVI cables between graphics card and monitor which is why they tossed in the card.

      --LP
    • You have a virtual display of 3200x3200, anti-aliased and scaled down on the fly to fit within 1600x1600 at 60-90 FPS. You accomplish the same effect in a smaller piece of real estate. Seems like it would be easy to pull off, actually. Does such a thing already exist?

      It's called supersampling, and it's how anti-aliased fonts work today (a minimal sketch of the idea appears at the end of this thread). I don't think there's much point to doing "FSAA" on the desktop. It makes sense for arbitrary angled-polygon-style graphics, where you're blending edge detail. But desktops are generally generated as bitmaps, which would only get blurry with AA.

      What I'm talking about falls more along the lines of... You have a virtual display of 3200x3200, anti-aliased and scaled down on the fly to fit within 1600x1600 at 60-90 FPS. You accomplish the same effect in a smaller piece of real estate.

      If you want to see an example of what you are talking about, try using an LCD projector at a higher resolution than it supports. The result is ugly and unreadable.

      Scaling is just another way to do antialiasing, and it's not the best way. If you take a look at MacOS X, programs written with the "Carbon" (old Mac/QuickDraw) APIs use antialiased fonts that are rendered at double size and scaled down. They look awful. Programs written with "Cocoa" (NeXT-derived/Display PostScript) are antialiased using a smarter technique, and they are gorgeous.

      I've noticed that MacOS X and Windows XP have started using bigger fonts and icons. Maybe this is just considered more "user-friendly" especially when screen real estate is larger. Maybe they want to better support people with bad eyesight. Or maybe they actually want the state of the art to progress beyond the current 75-90dpi range where it's stayed since the original Mac in 1984. Even when we've had the technology to produce really small pixels, the GUIs we have tend not to support them, so they don't sell except in very specialized markets.
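
      For what it's worth, the supersampling idea discussed in this thread boils down to something like the following (a minimal NumPy sketch assuming a plain 2x2 box filter; real drivers and font rasterizers use fancier resampling):

      import numpy as np

      def downsample_2x(frame):
          # Average each 2x2 block of an (H, W, 3) framebuffer rendered at double
          # resolution, producing an anti-aliased (H/2, W/2, 3) image.
          h, w, c = frame.shape
          return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

      hires = np.random.rand(3200, 3200, 3)   # stand-in for a frame rendered at 3200x3200
      lores = downsample_2x(hires)            # what actually goes to the 1600x1600 panel
      print(lores.shape)                      # (1600, 1600, 3)

      As the replies note, this blends nicely for rendered geometry and glyph outlines, but only blurs content that was authored as a bitmap at the target resolution.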

  • by neema ( 170845 ) on Friday October 05, 2001 @05:36AM (#2391362) Homepage
    They have a cool monthly plan where you pay only 566 dollars a month, 288 required hours of slave work from your children and your soul. I figure I can get this baby easy.
  • So what graphics card does it use? I noticed that the specs implied that you needed dual DVI connections to supply all the bandwidth between graphics card and LCD display. (And you can't use 2 AGP cards to get 2 DVI connections, since AGP is designed to be limited to a single point-to-point bus; one AGP slot per system.) The only card I know of with those is from Appian Graphics [appiangraphics.com], but even there, I don't know which 3D chipset they're using these days. (Still 3Dlabs?) Anybody know?

    --LP
    • The Appian Gemini cards use the Savage/MX chipset. The Jeronimos use 3Dlabs Permedias. I still prefer a good G400, me.
      • G400s (or 450s) are great until you try anything with OpenGL. Very poor. Radeon boards have a RAMDAC ALMOST as sharp as the G400, and decent 3D performance too. And then there's the GeForce 3...
      • Hmm, looks like Appian's latest "AppianX" uses some custom chip they might have developed, based on 3Dlabs VHDL? At least, that's my reading between the lines of this press release [appiangraphics.com], where Osman Kent mentions licensing VHDL cores but the current products mentioned there use off-the-shelf 3Dlabs parts, and this press release [appiangraphics.com] announcing AppianX but not specifying who made the chipset, thus implying that Appian did, presumably not totally from scratch given the complexity required, right? Ah, speculation.

        Still pretty removed from the question of whose (3D) graphics chipset goes with this display, which is what I really want to know. Kind of a relevant question for the CAD market, which could afford these things, no?

        --LP
    • Appian designed their own chip based on the 3Dlabs VX1, if I remember things right. And yes, this IBM panel is using the AppianX video card with some modified drivers.

      It's really cool, but don't plan on using it for gaming, since pushing pixels to that much real estate is eating up most of your bandwidth before you even start using 3D acceleration.

      But I would still love to have one.
  • 18k (Score:2, Funny)

    by donabal ( 116308 )
    18k isn't too bad.

    Didn't someone say that 640K would be enough for everyone?

    --donabal
  • Does it run on Linux?
  • It'd be one thing if this was for a PowerPC machine... MacOS X's PDF-based display system could actually make use of this resolution.

    But at over 200dpi, the file menu in Win2k is going to be less than a tenth of an inch tall!

    So you set your display settings to "really really big fonts and a big theme", but that won't really solve the problem, as a huge portion of the interface (web pages) is still designed for a raster-based scale and will either a) look like crap, or b) have parts of the interface be too tiny to hit with the mouse.

    Even so, I hope this causes folks to start realizing that screen scale and resolution need to be independent. The "just squint your eyes, you wuss!" attitude to separating the two doesn't really work with this new tech.

    -Erik
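
    The separation being asked for above amounts to specifying sizes in physical units and letting the system map them to pixels; a tiny Python sketch of the idea (the 72-points-per-inch convention is standard, the dpi values are just examples):

    def points_to_pixels(size_pt, dpi):
        # A typographic point is 1/72 inch, so the physical size stays constant
        # while the pixel count scales with the display's density.
        return size_pt * dpi / 72.0

    for dpi in (96, 204):
        print(f"10 pt at {dpi} dpi -> {points_to_pixels(10, dpi):.0f} px")
    # 10 pt at 96 dpi -> 13 px; 10 pt at 204 dpi -> 28 px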
  • On the cheap. (Score:2, Interesting)

    by WildTurk ( 317470 )
    How about hooking up 4 15" LCD displays at 1024x768? Costs are about $1100 for the 4 displays. Just hack (hacksaw :-)) the cases and put them in a bigger case. Yes, you need 4 adapters to run them. But throw in a machine just for running an X server and you are still in the $2000 range for a 2048x1536 resolution 30" monitor.

    Sheesh, my first 21" monitor (NEC) cost $3600 back in 1982.
    • Yes you need 4 adapters to run them

      Actually, I was just visiting the Matrox Graphics site, and their new Millennium G200 MMS has quad display capabilities at 1920x1600 with 16 million colors. Sure it costs $800+, but if you are willing to spend the money on 4 LCD screens then what is another $800?
  • So... (Score:1, Insightful)

    by Anonymous Coward
    What's going on with the recent fashion of starting posts with "So..."? Your post isn't a conclusion of some previous hidden statement, is it? I've noticed this several times lately. What will we see next, people opening their Slashdot articles with "But..." or "Nevertheless..."?!
  • Hey, Tim (Score:3, Funny)

    by ackthpt ( 218170 ) on Friday October 05, 2001 @07:47AM (#2391586) Homepage Journal
    Dropping a few hints, hoping Santa reads /. and will load up his sleigh with these goodies?


    Uh... not a bad idea, I hope writing Christmas lists online hasn't been patented [uspto.gov] yet.

  • by foobar104 ( 206452 ) on Friday October 05, 2001 @07:57AM (#2391623) Journal
    I saw this monitor last November in Dallas, at SuperComputing 2000. It was sort of stashed away inside the IBM booth behind some of their big iron. It was big and bright and sharp, and I got the story of its origin from an IBM guy standing nearby.

    Warning: the person who told me this may have been a salesman. I can't claim to know this to be absolutely true.

    According to the IBM guy, the folks from Livermore National Labs wanted, for some reason related to monitoring or surveillance or something like that, a monitor that could display four HDTV-resolution images in a tile. IBM tiled four 1920x1200 images (HDTV's 1920x1080 fits nicely inside the 1920x1200 display standard) on one monitor and sold bunches of them to LLNL for a red-blooded American fortune.

    At that time, IBM called the monitor "Big Bertha." That was the official name and everything; they had data sheets printed up to hand out at the show.

    And everything everybody has said so far is true: at that kind of resolution, your desktop icons are about a quarter of an inch across. And xterms? Forget it. You've gotta set the font size to 36 points just to be able to read it comfortably!

    But then the IBM guy opened up a full-color satellite image of some city or other; I forget which one. He full-screened it, and then used the mouse to pan and rotate around it. I actually got dizzy; it was like looking through a window. It was AMAZING. I've never, ever seen anything like that before.

    Of course, to push about 10 million full-color pixels around in real time like that required something more than a $99 graphics card; the monitor was hooked up to an SP node or something similarly impressive.

    But damn, what a show.
    • Me too.

      Some minor tidbits:

      Give scientists bigger computers and they'll tackle larger datasets. Big datasets mean "Big Visualization", so there is a branch of the Accelerated Strategic Computing Initiative called VIEWS [llnl.gov] that deals with the design of systems and software that can do "terascale" visualization. This includes things like Bertha displays, large power walls, big SGI systems, terabyte-sized filesystems, and the occasional Linux cluster.

      [Disclaimer: I'm a sysadmin with VIEWS (and we have an open sysadmin position).]

  • The downside is that the graphics card is a Matrox G200/Quad-MMS. The Matrox card itself is a sexy little beast, meant to drive 4 DVI or SVGA monitors. But it's not so sexy when used this way.

    With this monitor, you don't get one big framebuffer, you get four, so you'll need to run Xinerama or similar. If you want to run a game, it'll be in a single head of that card, which (on that monitor) turns into a tall strip about 1/4 the width of the display, and at the speed of a Matrox G200 card.

  • It looks like a great product, but I notice a few minor nits:

    Vertical refresh rate: 41-56Hz. Since no multiple of 30Hz is available, playback of DVDs will not be as perfectly smooth as it could be. On the other hand, people generally do not run CRTs at a multiple of 30Hz, due to issues of phosphor image persistence and 60Hz AC power in many countries.

    26.4 pounds, 7.7-inch depth, 111-watt power consumption (so it probably has a fan). In terms of lightness, sleekness, power efficiency and quiet, this display is about a third of the way toward a CRT. So it's not as appealing as a really expensive high-tech toy.

    Video card is PCI rather than AGP. With 24-bit pixels, the frame buffer is at least 26MB, and a 33MHz 32-bit PCI bus can only transfer a maximum of 133MB/second, so the entire screen can only be redrawn from scratch at about five frames per second (the arithmetic is sketched below). Maybe the PCI card is 66MHz or 64-bit (probably not).

    On the positive side, I wonder if the card that comes with the T220 can be obtained separately at a reasonable price and can drive the ITQX20's digital inputs (the 2048x1536 20.8" TFT display that is in the T210 monitor). Then you could build something for a few thousand dollars that would still be a big step up from the 1600x1024 flat panels.
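
    That bandwidth arithmetic works out as follows (a quick Python sketch; theoretical PCI peak, ignoring bus overhead and any 32-bit padding of pixels):

    h_px, v_px = 3840, 2400
    bytes_per_px = 3                               # 24-bit color
    frame_mb = h_px * v_px * bytes_per_px / 2**20  # ~26.4 MB per full frame

    pci_mb_per_s = 133                             # 33 MHz x 32-bit PCI, peak
    redraws_per_s = pci_mb_per_s / frame_mb        # ~5 full-screen redraws per second
    print(f"{frame_mb:.1f} MB/frame, {redraws_per_s:.1f} full redraws/second")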
  • If I recall correctly, the GeForce3 doesn't do LCD screens... and probably isn't aware of that pixel depth either.

    So when are they going to make me a kick-ass graphics card to go with that kick-ass monitor? (And I suppose, just to make it fair, the graphics card really should be way out of my price range as well... perhaps a lil' something from SGI.)
    • by Anonymous Coward
      It's a Matrox card with 4 digital outputs. No 3D acceleration. I've been using it for a few weeks now; here's a mini-review:

      After seeing it for 10 minutes, I said "I *want it* as my main monitor"...

      And after 20 minutes, I added "when it'll be possible to use it with a different setup"... Right now, the card driving it is slow, and it only runs under Windows :-P

      Needless to say, Windows does *not* scale well to that resolution.

      Mac OS X would do much better, with picture-like icons, antialiasing, etc. But there are no drivers. Although it would be a perfect monitor for A/V applications (AKA multimedia editing), apparently IBM targets a much more specific market at the moment.

      And the biggest problem as of now: **content**. So far, the only thing that makes sense is viewing large maps, images, etc., for both reasons mentioned above (slow, no SW/OS support).

      But we're working on it...

      (sorry, AC-posting to do it quickly)
      • Why the hell is this AC's post modded down? It is very informative!

        The one thing I've thought about this monitor is that I'd actually have to turn my head to see stuff on screen, whereas now I can run a 19-21" at 1600x1200 and it is just right to fit a good amount on there; things are small enough that my eyes are happy (I like things "far away") and I don't have to turn my head, it is all in my field of vision...
  • Register: [theregister.co.uk] "Sources familiar with the W3C's patent policy have confirmed that demands for the standards body to adopt RAND licensing were initiated by IBM."

    Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents, Patents....

  • by openSoar ( 89599 ) on Friday October 05, 2001 @09:44AM (#2392039)
    It seems something happened to my posting. I did check it thoroughly before I posted, honestly...

    Anyway, the plasma TV I was referring to is here [nectech.com], and it is indeed 61" and $28K.
  • If I don't stop drooling over this I may have to get a new keyboard.
  • I'm running my 22" monitors at 1600x1200, and it's PLENTY high enough resolution for me. At the resolution the IBMs are running, you'd either have to sit 6" away, or use a magnifying glass, or turn your default fonts way, way up. Linux apps may be able to handle that but lots of Windows progs don't work right with nonstandard font sizes. And my dual flat-screen Viewsonic 22" monitors cost 1/10 as much as that thing.
  • *Zap* Can't see *Zap* stupid *Zap* letters.*Zap* *Ka-BOOM*!
  • by TheGratefulNet ( 143330 ) on Friday October 05, 2001 @11:41AM (#2392574)
    When the 1600SW [sgi.com] was in the process of being discontinued (it's now fully discontinued), there were many available on eBay and other outlets [firesale.com]. I bought 2 of them along with an AGP and a PCI card (one of each; both purely digital). I asked around to see if anyone could confirm that you could run Xinerama (dualhead single-logical-screen), but no one at SGI could confirm this (at least regarding XFree86) except the xf86 driver developer himself! So I gave it a try and I couldn't be happier for it. Having 2 1600x1024 LCD screens with a contrast ratio of 350 is just amazing. And the LCD+video card combo was usually less than $1k each (yes, I spent about $2k on my dualhead setup. Is that a lot? Well, for folks with poor/failing eyesight, all-digital LCDs are a godsend!)

    However, since these are long gone from the usual retail channels, their used price has skyrocketed, and used prices are now approaching the price of the units as if they were new! Guess that tells SGI that they shouldn't have retired this design. (And they replaced it with a far inferior unit that only does 1280x1024, and via analog, too!) ;-(

    The downside of the LCDs is that they aren't the best for doing photo retouch work. Interesting that you mentioned the Nikon D1x: I just bought a used Nikon D1 (original), and while it's "only" 2000x1300 in output resolution, it's still a darned good camera body, and being able to shoot off 4.5 frames/sec with no noticeable shutter lag or latency is still state of the art. But I do have to do my last stage of retouching on an actual CRT.

    CRTs will never go away. LCDs are uber-cool, but bright highlights get blown out when you view on an LCD. I do mostly C coding and sysadmin type stuff at home (and only occasional photo work), so the dual LCDs pretty much fit my needs. But don't think that they're a complete substitute for a CRT in all cases, 'cause they're not.

  • This thing comes with its own video controller card.

    This means that it's damn unlikely to work straight out with your GeForce3.
  • What, no AGP? Fuck that, it's worthless :)
  • The monitor's spec page lists the desktop versions of IBM's Intellistation brand PC as a hardware requirement. Here's the glossy page [ibm.com] for the Intellistations. Basically, it's another $1300-1500 for the PC to drive the monitor (if the 'hardware requirement' is legit, and not a 'marketing requirement'). I wonder what makes the Intellistation required (custom logic on either component?)
  • Not only is this not the world's highest-res computer display (as IBM claims on their site), but it's late by about 15 years. For a good discussion of why high-res displays fail to sell, see:

    http://www.useit.com/alertbox/980531_comments.html

    which is a post by the inventor of a 4096 x 3300 pixel display.
  • I use a 16:10, 23" wide-screen monitor (Sony) at work, at 1800x1200.
    I think that if you keep the monitor at a comfortable distance, that's about the right resolution for that screen size, maybe 2048x1280 tops.
