Technology

ViewSonic shows 200 dpi display

prostoalex writes "At the Intel Developer Forum, ViewSonic introduced its 200 dpi display. The 22.2-inch 3840x2400 monitor will sell for around $8,000." Maybe there's hope for all those obsessive folks trying to run Quake 3 at insane resolutions. Provided they'd rather have a monitor than eight grand!
  • Give them 20 more years to get these down to $100 each, and ebooks might actually start selling.
  • The resolution can go as high as 3840x2400. That is insane. I think the question is no longer how high the resolution can go, but rather how high I can set the resolution without having to squint to see the letters I'm typing. I can barely see the letters I type at 1600x1200. I can only imagine what 3840x2400 would look like.
    • What we could do is invent some sort of font scaling mechanism so your letters could be displayed larger.

      What's the phone number of the patent office?
    • by Gryffin ( 86893 ) on Wednesday September 11, 2002 @11:20AM (#4238126) Homepage
      how high I can set the resolution without having to squint to see the letters I'm typing. I can barely see the letters I type at 1600x1200. I can only imagine what 3840x2400 would look like.

      How high? That depends on whether or not OS developers get their sh*t together.

      Current, mainstream operating systems, or more properly, windowing systems (Windows, Mac OS X, X11) all tend to assume a screen resolution, or offer limited capability to change the resolution.

      • X Window System: for font scaling, allows you to choose from 75dpi or 100dpi. Woo whee.
      • Mac OS X: no capability to scale the display resolution at all, despite the fact that its Quartz rendering engine, with its PostScript base, should be able to handle the chore in its sleep.
      • Windows: While it allows the user to choose the DPI of her monitor, this seems to be applied to application fonts only; the fonts used in many user interface elements are not scaled, making it difficult to use many UI elements at high resolutions.

      None of these systems has truly separated the "internal" measurement of graphic objects from their display size; all rely on an assumed point-to-pixel ratio. The cost, of course, of this level of abstraction would be performance, i.e. display speed.

      But it seems to me that modern display adapters should be more than capable of doing this. What's lacking are the APIs to make the graphics hardware do the math, and the OS support to enable the feature. I think Mac OS X already has most of the capability; let's see if they actually take the next step.
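
      To make that point-to-pixel dependency concrete, here's a minimal sketch (not any particular OS's API, just the arithmetic a resolution-independent windowing system would apply at draw time):

      ```python
      # Sketch only: the core arithmetic of a resolution-independent UI.
      # A point is 1/72 inch; a DPI-aware system converts physical sizes
      # to pixels at draw time instead of assuming a fixed ratio.

      def points_to_pixels(points: float, dpi: float) -> float:
          """Convert a size in typographic points to device pixels."""
          return points * dpi / 72.0

      for dpi in (72, 100, 200):
          px = points_to_pixels(12, dpi)   # a 12-point font
          print(f"12 pt text at {dpi:3d} dpi -> {px:5.1f} px tall")

      # 72 dpi -> 12 px, 100 dpi -> ~17 px, 200 dpi -> ~33 px: same
      # physical size on screen, but far more pixels per glyph shape.
      ```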

      • X Window System: for font scaling, allows you to choose from 75dpi or 100dpi

        That's just for bitmapped fonts, since the X server knows the physical dimensions of the screen (and XFree86 seems to be the nicest implementation since it queries the monitor(s) via DDC and automatically computes the horizontal and vertical resolution). Just run xdpyinfo and look for the "dimensions" and "resolution" lines: you may be surprised.

        If your display is in the 100dpi range, it makes more sense to scale bitmapped fonts using bitmaps designed for that resolution, instead of ones designed for a resolution of 72dpi (and vice versa).

        Of course, this is moot for vector fonts, or if one specifies font sizes in pixels instead of tenths of a point.
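
        (For the curious, the resolution xdpyinfo reports is just pixels divided by the physical size DDC hands back. A rough Python equivalent, with example numbers rather than a real DDC query:)

        ```python
        # Rough sketch of what the X server computes from DDC data:
        # dots per inch from pixel dimensions and physical size in mm.
        # The numbers below are examples, not a real hardware query.

        MM_PER_INCH = 25.4

        def dpi(pixels: int, millimetres: float) -> float:
            return pixels / (millimetres / MM_PER_INCH)

        # e.g. a 3840x2400 panel on a 22.2" diagonal is ~478 x 299 mm
        print(f"horizontal: {dpi(3840, 478):.0f} dpi")  # ~204 dpi
        print(f"vertical:   {dpi(2400, 299):.0f} dpi")  # ~204 dpi
        ```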

    • Re:How high? (Score:2, Informative)

      > The resolution can go as high as 3840x2400.
      > That is insane. I think the question is no

      On one hand, there's the (apocryphal?) saying floating around regarding memory: "640K is more than enough memory for anyone," or something to that effect.

      On the other hand, I think you have a legitimate point. To some extent, I think the CPU battle is basically over for most people. For an office environment, who the hell needs more than 500 MHz? My secretary (does word processing, some light spreadsheet stuff) probably doesn't ever need her 300 MHz machine upgraded.

      > longer how high can the resolution go. But on
      > the otherhand, how high can I set the
      > resolution with having to be able to squint to
      > see the letters that I am typing.

      I think another legitimate issue is whether a monitor should be replaced when it absolutely does not need to be. One issue in technology is "can X be done?" An often overlooked issue is "should we bother implementing X?"

      I suspect that on newer machines fancier-pants monitors will be de rigueur as manufacturing techniques improve and prices ultimately fall to consumer levels, but should you consider dumping an old monitor (and its many pounds of lead) into a landfill, or sending it to China for "recycling," when the marginal benefits of the new technology are minimal?

      By all means, let's keep doing research and development. Let's let the market consider what to do with the technology that companies develop. Let's not forget that technology should serve us, and not the other way around. At least until the machines start to think, and then all bets are off.

      guac-foo
    • Re:How high? (Score:4, Insightful)

      by marmoset ( 3738 ) on Wednesday September 11, 2002 @11:21AM (#4238140) Homepage Journal

      You're approaching it the wrong way. On a 72 dpi screen, a 12-point character can be represented by 144 pixels. (I know, I'm deliberately omitting the effects of subpixel aliasing / anti-aliasing, hinting, and all the other tricks that modern display technologies use to boost perceived resolution, in order to make this easier to follow.) On a 200 dpi screen, over 1,100 pixels can be brought to bear on that very same character. This means the character can be rendered with much greater fidelity, so if it's rendered at the same height as on the 72 dpi screen, it should be far more readable. Of course, your OS has to be smart enough to compensate for the much smaller pixels, but modern GUIs have this one figured out, for the most part.
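
      For anyone who wants to check the arithmetic behind those figures, a quick sketch:

      ```python
      # Pixel budget for a 12-point glyph cell at different densities.
      # A point is 1/72 inch, so the cell is 12/72 = 1/6 inch on a side.

      def glyph_cell_pixels(point_size: float, dpi: float) -> float:
          side = point_size / 72.0 * dpi   # cell edge in pixels
          return side * side               # total pixels in the cell

      print(f"72 dpi:  {glyph_cell_pixels(12, 72):6.0f} px")   # 144
      print(f"200 dpi: {glyph_cell_pixels(12, 200):6.0f} px")  # ~1111
      ```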



      • Re:How high? (Score:3, Insightful)

        by Ramjet350 ( 582868 )
        Unfortunately, the misconception that higher resolution = smaller fonts needs to be broken. If OSes handled it properly, higher resolution would = nicer-looking fonts.
      • Good explanation, but your math is off :) On a 72 dpi screen, one pixel is one point. A 12-point character is 12 pixels high. On a 200 dpi screen, the same character is 33 pixels high (and looks much nicer).

        My laptop has a 133 dpi screen (1600x1200 pixels, 15" diagonal), and text looks much better than it does on a regular CRT. I'd love a 200dpi display, although I can wait until the prices come down some :)
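
        (The 133 dpi figure falls straight out of the geometry; here's a quick sanity check:)

        ```python
        # Sanity check: dots per inch from a resolution and a diagonal.
        import math

        def dpi_from_diagonal(w_px: int, h_px: int, diag_in: float) -> float:
            return math.hypot(w_px, h_px) / diag_in

        print(f"{dpi_from_diagonal(1600, 1200, 15):.0f} dpi")    # ~133 (the laptop)
        print(f"{dpi_from_diagonal(3840, 2400, 22.2):.0f} dpi")  # ~204 (the ViewSonic)
        ```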

    • by frovingslosh ( 582462 ) on Wednesday September 11, 2002 @11:27AM (#4238195)
      The resolution can go as high as 3840x2400. That is insane. I think the question is no longer how high the resolution can go, but rather how high I can set the resolution without having to squint to see the letters I'm typing. I can barely see the letters I type at 1600x1200. I can only imagine what 3840x2400 would look like.

      This isn't insane, although running a display at a resolution you claim to hardly be able to read might be. The extra resolution gives you more dots, so you end up with easier-to-read type. It's easy to demonstrate how this works: hold a piece of paper with small but clearly readable printed text next to text on your monitor. You'll likely find (if you can read the text on your monitor at all) that the printed text is both smaller and more readable. The reason is the greater dot density of the printed text, which helps you read it despite its apparently small size. Most current monitors just don't have the dot density to match this, so once text shrinks beyond a certain point it's the compromise in pixel selection, not the small size itself, that makes the type hard to read. A higher-density monitor does help in this area. Of course, if you make characters the same number of pixels tall on the new screen then your problem only gets worse, but you can have both more pixels and smaller text, which can result in a very readable display.

      Then again, maybe you just need reading glasses.

    • but it's a good start.

      Did you know that to produce a reasonable quality printed page, you need an image with at least 300dpi resolution?

      Once LCD monitors reach those resolutions, reading from a monitor will be as easy and relaxing on the eyes as reading from a piece of paper.
  • Seriously. 200 DPI is still not enough.

    Let's take a quick survey. All those of you who'd be happy with a 200 DPI printer, please raise your hand. Right--I thought so.

    I'll say that displays have matured when they're at least 1,000 DPI--though most people can still tell the difference between 1,000 DPI and 2,000 DPI.

    Yes, you can play games with AA. Yes, we need resolution-independent display mechanisms lest bitmapped graphics vanish. Folks, this has all been done before--with printers. When the display engineers catch up to the printer engineers (and, granted, their problems are much harder), those problems will also be solved.

    Cheers,

    b&

    • Re:Still not enough (Score:3, Informative)

      by sh4de ( 93527 )

      You forget that printers and imagesetters don't render colour gradations the way monitors can. Any video card can feed the 200 DPI display with at least 8 bits per channel, effectively hiding the "low" resolution from your prying eyes.

      Inkjet printers do mix the primaries (CMYK) to produce different colours, but I'd be surprised if the number of gradient steps were anywhere near the 256 per primary that monitors enjoy.

      Imagesetters don't produce contone images at all. Each dot is either on or off. That's why you need 2400 DPI or more resolution to render a fine screen for high quality offset printing.

    • I'll say that displays have matured when they're at least 1,000 DPI--though most people can still tell the difference between 1,000 DPI and 2,000 DPI.

      Suggesting we need 1000 dpi monitors just doesn't make sense. Even 300 dpi would be better than common laser printer output. (Yes, some now print at higher resolutions, but even those are usually run at the basic 300 dpi setting because of the quality vs. speed tradeoff.) 200 dpi with all the possible grey levels for AA and sub-pixel font enhancement could give results that compare nicely with 300 dpi laser print output.

      Of course, you can complain that even 1000 dpi is not good enough for you. And then if you ever get 2000 dpi you can complain about how slow the screen updates are. I'm more concerned with seeing that $8000 price come down.

    • Just for the heck of it, get a stopwatch and time how long it takes to print a 600 dpi graphic to your printer.

      Now try to imagine what it would be like to wait that long for your screen to update.

      Now, I know that the bus between your video and monitor is faster than the bus to the printer, but the point is that as you increase resolution you significantly increase the amount of data which significantly decreases your frame rates.

      Extremely rough calculations:
      200 dpi is 40,000 pixels per square inch

      My 20" monitor has a 15.5" by 11.75" viewable space, or 182.125 square inches

      If my system were 200 dpi it would be displaying a total of 7,285,000 pixels instead of the 1,310,720 I have it displaying now, or a factor of 5.56 times as much data.

      Jumping to 1000 dpi as you suggest takes us to 1,000,000 pixels per square inch, which on a screen the size of my monitor would equal 182,125,000 pixels, or a factor of 138.95 times as much data.

      That's a heck of a lot of pixels to be calculating and transmitting and still maintaining a non-headache inducing flicker.
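
      Here's the same back-of-the-envelope math in code, in case you want to plug in your own screen:

      ```python
      # Back-of-the-envelope pixel counts for a 15.5" x 11.75" viewable area.

      WIDTH_IN, HEIGHT_IN = 15.5, 11.75
      area = WIDTH_IN * HEIGHT_IN              # 182.125 square inches
      current = 1280 * 1024                    # 1,310,720 px today

      for dpi in (200, 1000):
          total = dpi * dpi * area
          print(f"{dpi:4d} dpi: {total:13,.0f} px "
                f"({total / current:.2f}x today's 1280x1024)")

      #  200 dpi:   7,285,000 px (5.56x)
      # 1000 dpi: 182,125,000 px (138.95x)
      ```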
      • DVI (or whatever the pop digital interface is today) might have that problem...

        VGA (arguably the most popular monitor connection standard), however, doesn't. The wonders of analog media continue. :)
        • VGA (arguably the most popular monitor connection standard), however, doesn't

          True, but your video card would still have to be able to handle 182,125,000 pixels regardless of what kind of monitor connection you have. I freely admit that I phrased my initial message very poorly (OK, downright stupidly), since there is no 'data' transfer between a video card and an analog monitor, but there is inside the PC, between your software and your video card.
      • A couple of things need to be pointed out:

        • Printers use slow processors, so of course they're going to handle large chunks of PostScript slowly.
        • The original AGP bus is oodles and oodles faster than even USB 2.0. Until they're using the same data transfer interface, live-graphics devices(monitors) will always be a heck of a lot faster than static-graphics devices(printers and the eventual niche-product digital picture-frames.)
      • ...by the time we're looking at 1000 dpi monitors, 182 Mpix/sec won't be a big deal.


        The parent's point is well taken - at 200 dpi, the monitor is just at fax quality. (Albeit helped by the additional color information.)

    • DPI is NOT ppi (Score:3, Informative)

      by MadCow42 ( 243108 )
      There's a difference... 200 pixels per inch is just that... a pixel can be any shade of any color.

      200 dots per inch can only really render about 25 pixels per inch (with full 24-bit color) because it takes an array of 8x8 "dots" OF EACH COLOR INK (on a printer) to be able to represent 256 shades of each color.

      So, to equal 200ppi resolution on an inkjet printer, you need somewhere around 1600dpi resolution (OK, there are some "tricks" that newer inkjets do to make it look higher with fewer dots, but that's beside the point).

      So, to answer your question: a 200ppi monitor is much HIGHER resolution than a 1000dpi printer.
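
      (One nit: an 8x8 cell of binary dots actually gives 8*8 + 1 = 65 levels; 256 levels needs a 16x16 cell, which is the 16:1 guideline cited elsewhere in this thread. A sketch of the classic square-cell model, with that correction; real inkjets dither more cleverly:)

      ```python
      # Square halftone-cell model: each contone pixel becomes an n x n
      # block of on/off printer dots, giving n*n + 1 levels per primary.
      # This is why printer "dpi" and monitor "ppi" aren't comparable.
      import math

      def levels(cell_edge: int) -> int:
          return cell_edge * cell_edge + 1

      def printer_dpi_for(image_ppi: float, grey_levels: int = 256) -> float:
          cell_edge = math.ceil(math.sqrt(grey_levels - 1))  # 16 for 256
          return image_ppi * cell_edge

      print(levels(8))             # 65  (an 8x8 cell)
      print(levels(16))            # 257 (a 16x16 cell)
      print(printer_dpi_for(200))  # 3200 dpi of binary dots for 200 ppi contone
      ```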

      madCow.
      • There's a difference. Right.
        Color is a property of area, not of a point. Just try seeing the exact shade of a tiny paint chip. Paint half a wall. Then try to exactly match the shade on the other half.
        Resolution has to do with where or how many. A 1000 dpi printer can draw 500 lines in one inch. A 200 ppi monitor can draw 100 lines in one inch.
        There are tricks that can be pulled on both sides, but translating between a 200ppi monitor and a 1000dpi printer loses in BOTH directions.
        • Actually, I suspect that most 1000dpi printers cannot draw 500 lines to an inch, due to the fact that a dot from a 1000dpi printer is likely to be a fair bit larger than 1/1000th of an inch wide. What a 1000dpi printer *can* do is position the centre of its dots to within 1/1000th of an inch.

          I reckon the smallest that most printers can print dots is about 1/10mm, or about 1/250th of an inch. Therefore a 1000dpi printer is going to be able to do about 125 lines/inch. However I would imagine that the printer is better at doing shallow slopes and curves without aliasing, as it has better positional accuracy.
    • A 300 dpi printer cannot make a line 1/300th of an inch wide. However, it _can_ position this too-wide line with 1/300th inch precision.

      However, a monitor can do both.

      Not that this matters much: you seldom need feature sizes that small.
  • Reviewer. (Score:5, Funny)

    by undeg chwech ( 589211 ) on Wednesday September 11, 2002 @11:05AM (#4237975) Homepage
    Dear ViewSonic

    I have decided, today, to become a professional monitor reviewer. Please send me one of these ASAP so I can get my new career started.

    Thank you very much,

    Undeg.
    • Re:Reviewer. (Score:3, Interesting)

      by unicron ( 20286 )
      We used to try that shit on beer companies in college. We rarely got a reply but every now and again we'd find a case or 2 of beer waiting for us at the P.O. Box, completely wrapped up and inconspicuous. Hell, we even got bottles because they weighed less.

      I've heard cigarette manufacturers are a lot more giving than the beer companies, but I never tried it.
  • Not only does this cost $8,000, but it must also be rather taxing on the CPU to drive such a resolution.
    • That's just a boatload of pixels to be pumping down the pipe. I wonder how long it will be until hi-res monitors incorporate some vector rendering hardware internally, in order to assuage the bandwidth problems of DVI.

      You probably don't want a full-blown PostScript renderer, but something along the lines of Display PostScript, or even QuickDraw, would probably reduce bandwidth enormously.

      I've often mused about sending proto-MPEG down the display connector as well, since scaling only needs to be done at the very last stage. However, this would be more complicated, as you'd need a way to download new drivers into the monitor somehow. Perhaps you could write those drivers in PostScript... (loop to top of post)
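
      The raw numbers show why something like this could help. Assuming 24-bit colour and uncompressed scanout, the bandwidth is already enormous at modest refresh rates (for scale, single-link DVI carries roughly 4 Gbit/s of pixel data, which is why panels like this need multiple links):

      ```python
      # Uncompressed video bandwidth for a 3840x2400 panel at 24-bit colour.

      def gbit_per_s(w: int, h: int, bits_per_px: int, hz: float) -> float:
          return w * h * bits_per_px * hz / 1e9

      for hz in (12, 30, 60):
          print(f"{hz:2d} Hz: {gbit_per_s(3840, 2400, 24, hz):6.2f} Gbit/s")

      # 12 Hz: 2.65, 30 Hz: 6.64, 60 Hz: 13.27 Gbit/s
      ```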
  • I wonder (Score:5, Funny)

    by cr@ckwhore ( 165454 ) on Wednesday September 11, 2002 @11:06AM (#4237983) Homepage
    I wonder how many folks will look at the picture of that monitor, and honestly think to themselves, "Wow, that looks like a really clear picture."
    • They've done it with ads for TVs in TV commercials for years.

      They do it to advertise stereo systems on the radio.

      Evidently it actually works.
    • I wonder how they actually make it look clearer. Is it because they make the surroundings blurry, or use more colors, or is it simply the power of suggestion...
  • This, with the 320 GB drives, means more and better-looking pr0n than ever!
  • reinvention (Score:2, Interesting)

    by sloth jr ( 88200 )
    years and years ago ('88/'89), some companies were trying to sell 300dpi monitors to the desktop publishing set. No one bought them, and they died. There's not much point in them now, given the widespread use of anti-aliasing.

    I wonder how useful this will be for CAD - won't the thin lines be too difficult to see?
    • Umm, on a 300 DPI monitor, the thin lines would get more pixels. And yes, one-pixel lines would be impossible to see. Even on my LCD, one-pixel-wide text is very difficult to read.
  • by u8nogard ( 546370 ) on Wednesday September 11, 2002 @11:09AM (#4238018) Homepage
    That's still pretty pricey -- what makes this panel so special?

    Try 9.2 million pixels, for one thing.

    Me, to Loan Officer: Ah, yes, I would like to take out a loan?
    Loan Officer: Good, what type of loan are you interested in?
    Me: A Monitor Loan.
    Loan Officer: ...
    Me: It has 9.2 million pixels...
    Loan Officer: Ahh, I'll...be right back...

  • IBM T221 is $8400 (Score:4, Interesting)

    by jlund ( 73067 ) on Wednesday September 11, 2002 @11:09AM (#4238019)
    The article makes it sound as if the IBM is still $20K; this is not the case.

    http://commerce.www.ibm.com/cgi-bin/ncommerce/ProductDisplay?cntrfnbr=1&prmenbr=1&prnbr=9503DG3&cntry=840&lang=en_US
    • Go here [ibm.com] on IBM's product page, scroll down to the "Display Size:" link and click on it. This will open up a popup showing IBM's description of their Thinkpad displays.

      It's pretty obvious what they meant, but what they meant is not what they said. I emailed them a few months back, but it remains unchanged to this day. Here's the text:

      Display size is determined by the diagonal measurement of the TFT display (i.e. 14.1"), while resolution is the degree of sharpness of a displayed image. Resolution is also expressed in a matrix of dots such as 1024 X 768 representing the number of pixels per square inch.

      * ThinkPad X Series 12.1" TFT display with resolution up to 1024x768 dpi (dots per inch)
      * ThinkPad T Series up to 14.1" with resolution up to 1400x1050 dpi
      * ThinkPad R Series up to 14.1" with resolution up to 1024x768 dpi
      * ThinkPad A Series up to 15" with resolution up to 1600x1400 dpi

  • Video Card (Score:3, Insightful)

    by phorm ( 591458 ) on Wednesday September 11, 2002 @11:09AM (#4238020) Journal
    It might be difficult to find a 3D card that renders 3D properly at the max resolution. Actually, it might be hard to find something that renders 2D at that resolution.

    I'd rather go for a regular 22" display, or a really good projector or something, instead of paying $8,000 for a super-resolution display. As mentioned in the article, this would be pretty good for 3D design work... although the mini-pixels would probably hurt your eyes when you're trying to click on one little line or dot.

    Then again, I only have a 15" monitor that I run at 1024x768; maybe I'm just outdated.

    One of these days, my video card will have more RAM than my computer, I just know it - phorm
    • You are outdated. My 15" LCD runs at 133 dpi. It makes reading /. like reading the morning paper. However, it still has to use tricks (like sub-pixel antialiasing -- ClearType) to do that. That creates lots of problems for me on Linux, because Xft really isn't as good at ClearType-style rendering as Windows XP. When we get ultra-high-res flat panels, we can stop using all these tricks and enjoy incredible text rendering.
  • Instead buy a 50" HDTV and use it for a monitor.

    Problem solved.
      That's nice, but from my own experience with 50+ inch HDTV units (we have one), they are cool for gaming but totally unsuitable for text. And I am talking about a real DB-15 hooked up to the television with some proper discrete timing tweaks. It is still very rough on the eyes.

      The fact is, HDTV units (all of them) are still not "HD" enough for use as a monitor. They work, but my parents' mid-'90s Packard Bell monitor had a much crisper image than any projection HDTV that I have seen.

      A high quality projector will still get you a better image than any consumer grade HDTV.
      • Most HDTV screens are 1080 interlaced, which I believe puts your resolution at about 1080 by 768 -- less than awe-inspiring for reading text on a 50" screen. Even the really nice ones I have seen are not ready to replace a good monitor yet. Projectors seem to be the current preference of the rich home-theater-PC set. The biggest flat panel monitor I know of is Samsung's 24T; I think it supports resolutions of 1920x1600 or something similar. Sun or SGI might have something similar to go along with their high-end workstations. Last I checked they were also at about 22" or so.
  • by ip_vjl ( 410654 ) on Wednesday September 11, 2002 @11:10AM (#4238036) Homepage
    Not that anyone cares, but it should be 200ppi (pixels per inch).

    DPI (dots per inch) more accurately describes print devices, where a number of printed dots are needed to accurately reproduce a single pixel.

    For example, to show a single 50% black square pixel, you'd need a 2x2 array of dots (two black, two white) - so if your image is 100 PPI, you need a print device of at least 200 DPI to show the same resolution. For a monitor this doesn't really apply, as each pixel of the display corresponds to a single pixel of image data. (Unless of course they were talking about the individual R, G, B elements - but the article seemed to indicate the contrary.)

    ---

    Just a pet peeve, as it's often hard to get people to understand that there ARE differences between DPI, PPI, and LPI in the print world.

    - vin

    • Just a pet peeve, as its often hard to get people to understand that there ARE differences between DPI, PPI, and LPI in the print world.


      That may be true, but the standard way to describe monitors is dpi. Look at the box of any monitor at your favorite computer store and it will tell you the dpi.
    • For example, to show a single 50% black square pixel - you'd need a 2x2 array of black dots (BWBW) - so if your image is 100PPI - you need a print device at least 200PPI to show the same resolution.

      Wrongo. That's not even close to how single-color printing works. A 50% black square pixel will appear, on the printed page, as a round spot occupying about 50% of the area of a square cell. The process of rendering continuous tone images, like black-and-white photographs, as a pattern of cells filled with spots of varying sizes is called halftoning. The size of the cells you use, in cells per linear inch, is called the line screen. That's where we get the idea of lines-per-inch (or "LPI") from.

      In order to draw those round dots, the printing device you use (typically an imagesetter that exposes photographic film or a metal printing plate) will use a laser to expose tiny spots in a pattern that forms a dot. The perfect imagesetter would be able to draw 256 different sizes of dots, to print what is effectively 257 levels of color, counting 0% and 100%. Obviously, to draw dots this way the resolution of your imagesetter has to be many times your line screen. A good guideline is 16 : 1. So printing a line screen of 150 lpi requires an imagesetter capable of resolving at least 2,400 spots to the linear inch. This isn't a problem for modern laser imagesetters, but it's a bit much for your average Lexmark.

      Of course, it's important to realize that the idea of lines-per-inch only applies to tints or continuous tone images. For printing areas of solid ink, such as letters on a page, the imagesetter draws the shapes at full 2,400 (or whatever) spot-per-inch resolution. Film and printing plates actually have jagged edges on curved and angled edges, but because the imagesetter resolution is so high, you can only see them through a magnifying glass. And once ink hits paper, it bleeds just enough to smooth out all of those jaggies anyway.
      • I understand how halftoning works.

        My example was at the simplest level to show that printer dot != pixel, and that it takes many printer dots to show one pixel.

        The process of halftoning is far more complex than my example - but was enough to show the point for this discussion.


        The perfect imagesetter would be able to draw 256 different sizes of dots, to print what is effectively 257 levels of color, counting 0% and 100%. Obviously, to draw dots this way the resolution of your imagesetter has to be many times your line screen. A good guideline is 16 : 1. So printing a line screen of 150 lpi requires an imagesetter capable of resolving at least 2,400 spots to the linear inch.


        Using the formula of:
        levels = (printer_res / screen_rule)^2 + 1

        means that a print device at 2400dpi running 150lpi will be able to reproduce 256 levels (257, actually), so that is good - BUT the image you are trying to reproduce doesn't need to be 2400 PPI, since a good deal of the printer dots are used just to represent a single shade.

        rule of thumb - image_res = screen_rule * 2
        So my whole point was that, in the case of a grayscale image to a 2400dpi printer (running 150lpi) - you only need to send image data at 300+PPI.

        Which means your final printout would be:
        300 PPI
        2400 DPI
        150 LPI
        All three would be true, which is why it is important to distinguish between DPI and PPI. If somebody asked "what DPI is that picture?", the answer is 2400, but the resolution of the image is no better than 300 PPI.
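
        Those two rules of thumb, as runnable code:

        ```python
        # The two print-prep rules of thumb from this thread, as functions.

        def grey_levels(printer_dpi: float, screen_lpi: float) -> float:
            # levels = (printer_res / screen_rule)^2 + 1
            return (printer_dpi / screen_lpi) ** 2 + 1

        def image_ppi_needed(screen_lpi: float) -> float:
            # rule of thumb: sample the image at twice the line screen
            return 2 * screen_lpi

        print(grey_levels(2400, 150))    # 257 levels
        print(image_ppi_needed(150))     # 300 ppi of image data is plenty
        ```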

        Nothing you said was wrong, I think you just missed my point because I had used a very simple example in my first post.

        - vin
        • BUT, the image you are trying to reproduce doesn't need to be 2400PPI - since a good deal of the printer dots are used just to represent a single shade.

          Uh-oh. You're confusing (deliberately or otherwise) "pixel" with "sample."

          In context of this discussion, pixels are physical objects, spots on a screen, transistors in the LCD. In a different context, pixels are numbers representing the color or luminance of a point in a raster, but that's a different context.

          Here, it's more accurate to talk about "samples-per-inch," rather than pixels-per-inch. When you send data to an imagesetter for halftoning, you should have four image data samples for every halftone cell. In other words, your linear sample resolution should be twice your linear halftone resolution, which in turn is usually one-sixteenth the linear spot resolution of your imagesetter.

          If you're trying to clarify things, you're not doing a very good job by using the ambiguous words "dot" and "pixel" over and over again.

          Since if somebody asked, what DPI is that picture, the answer is 2400, but the resolution of the image is no better than 300 PPI.

          Actually, once you halftone screen the image, you have completely thrown away the excess resolution that was originally captured in scanning. The resolution of a printed photograph is its line screen, full stop. If that line screen is 133 lpi, then the resolution of the image is 133 lpi, and no more.
    • Why do we have to use dots per inch at all? Why not just measure resolution in micrometres (as is done for integrated circuit manufacturing, for example)? So a '75dpi' screen has a resolution of about 340 micrometres. Professional printing might use a resolution of 21 micrometres. And so on.

      (Yeah, I know dots per inch is the established standard and it's not going to go away any time soon. But I still think metric is the way to go. Alternatively you could measure dots per millimetre - but something measured in 'per unit of length' is a more complex measurement than just a unit of length. Still, there's often the feeling that higher is better (e.g. CPU speeds are measured in Hz, not cycle time), so maybe dots per millimetre is it.)
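
      The conversion is trivial either way, which is rather the point:

      ```python
      # Converting between dots-per-inch and feature size in micrometres.
      MICRONS_PER_INCH = 25400.0

      def dpi_to_microns(dpi: float) -> float:
          return MICRONS_PER_INCH / dpi

      def microns_to_dpi(um: float) -> float:
          return MICRONS_PER_INCH / um

      print(f"{dpi_to_microns(75):.0f} um")    # ~339 um (a '75 dpi' screen)
      print(f"{dpi_to_microns(1200):.1f} um")  # ~21.2 um (imagesetter-class)
      print(f"{microns_to_dpi(340):.0f} dpi")  # ~75 dpi
      ```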
  • Let's see, 73-inch HDTV or 22-inch monitor... Hang on, I'm thinking.
  • IBM has been selling 9 MP displays for over a year. I saw one at last year's supercomputer convention. It's hard to read standard X Windows fonts on it because they are so small. High-end photos (5 MP and up) are fabulous and look like prints.
  • Too small (Score:3, Interesting)

    by bobdehnhardt ( 18286 ) on Wednesday September 11, 2002 @11:22AM (#4238148)
    I'm currently running my 21-inch monitor at 1280x1024, and the icons and text are starting to get a little difficult to see (yeah, go ahead and laugh now - you'll break 40 someday, too). At 3840x2400 on a display marginally bigger than this one, the icons will be about 1cm square.

    This thing may find a place in CAD work, but the raw resolution will be utterly useless in normal day-to-day applications.
    • Re:Too small (Score:2, Informative)

      by tomkarlo ( 15606 )
      You're confusing the problem of raster icons with problems associated with higher screen resolutions.

      Many operating systems are already using some form of vector icon, or considering moving to it (KDE, Mac OS X)... it takes more compute time, but not a lot (you only have to compute the raster equivalent once for a given screen size).

      Once that happens, you'll be happy using the high-res screen and setting the icon size to, say, 1 inch, while others might choose a 0.5-inch icon.
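
      A sketch of that idea (hypothetical interfaces, not KDE's or OS X's actual code): derive the raster size from a physical size and the screen DPI, render the vector art once, and cache the result:

      ```python
      # Hypothetical sketch of "vector icons, rasterized once": the pixel
      # size comes from a physical size and the screen DPI, and the
      # expensive vector-to-raster step runs only once per distinct size.
      from functools import lru_cache

      def icon_pixels(physical_inches: float, screen_dpi: float) -> int:
          return round(physical_inches * screen_dpi)

      @lru_cache(maxsize=None)
      def rasterize(icon_name: str, size_px: int) -> str:
          # Stand-in for real vector rendering (imagine SVG -> bitmap).
          return f"<{icon_name} bitmap, {size_px}x{size_px}>"

      dpi = 200                                          # e.g. this panel
      print(rasterize("trash", icon_pixels(1.0, dpi)))   # rendered once: 200x200
      print(rasterize("trash", icon_pixels(1.0, dpi)))   # cache hit, no re-render
      print(rasterize("trash", icon_pixels(0.5, dpi)))   # new size: 100x100
      ```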
    • Re:Too small (Score:5, Insightful)

      by Phork ( 74706 ) on Wednesday September 11, 2002 @11:49AM (#4238407) Homepage
      Here's a thought: maybe you need bigger icons. There is nothing that says icons MUST be 64 pixels tall (well, maybe Windows UI guidelines, but they don't count). The idea behind these new displays is that you will use GUI elements designed to be rendered on 200dpi displays, not on 72 or 100 dpi displays. So if things were done properly, an icon on this monitor would be the same size or larger than one on your current monitor; it would just be higher quality.
    • That's assuming that you'll be using pixel-for-pixel copies of your icons. I imagine that, by the time this is actually affordable, maybe OpenGL accelerated desktops [apple.com] will be a norm. Then all your icons reside as textures on polygons and stretch to the desired size.
    • At 3840x2400 on a display marginally bigger than this one, the icons will be about 1cm square.

      The problem isn't the monitor, the problem is our windowing systems, be they MS Windows, X-Windows, MacOS, or otherwise. 150 dpi was the effective upper limit on resolution for so long that people started treating it like it was carved in stone.

      A good windowing system (and any software under it) should assume that a 1,500 dpi monitor might appear tomorrow. How will you make use of it? Don't just assume that I'll still want my text to be 30 pixels high. We should all be enjoying 150 dpi text on screen now, and looking forward to 300 dpi soon. Resolution improvements in printers from 150 to 300 to 600 to 1200 were heralded as great improvements, but no one seems excited about nice crisp text on screen! Most modern systems handle fonts pretty well, but the setting isn't obvious or easy enough. (I've known too many people who keep MS Windows in 640x480 "because the text is bigger" instead of increasing the resolution and the font size.) Less correctly handled are icons and buttons. Apple has made some improvements by requiring high-resolution bitmaps for their new task manager bar thing. As a result, high-resolution displays get nicer-looking icons. Some systems support vector-based icons that will scale to any size (Irix's default file manager comes to mind).

      But if your windowing system shows you unusably small icons or other widgets on high-resolution monitors, complain to your windowing system / operating system provider!

    • Perhaps that's why OS X went with 128x128 icons?

      Even on a 200dpi screen, that means an icon would be good for 0.7" on the screen.

      Perhaps that's also why OS X is going with wysiwyg screen fonts, with the assumption that higher resolution displays will mean better font fidelity without additional font tweaking?
  • by DnemoniX ( 31461 ) on Wednesday September 11, 2002 @11:25AM (#4238177)
    I used to love ViewSonic monitors, until the day one of them failed and I called customer service. Upon initial review of the warranty I noticed that a CRT is covered for 3 years, parts and labor. Great, I thought! I called the technician. I had already troubleshot the monitor: I changed power cords, I changed outlets, I changed the machines I plugged into it. It was done. He still made me jump through hoops for the better part of a day before they told me I would need all of the following (cut and pasted from their website):

    To obtain warranty service, you will be required to provide
    The original dated sales slip
    Your name
    Your address
    The serial number of the product
    A description of the problem.

    A dated sales slip? Even after 3 years? Come on! OK, fine, I can dig out an invoice. But they also want you to ship it back in the ORIGINAL box! Who has that after three years? This is ridiculous. They wouldn't take it since I didn't have the original box! My yearly IT budget is only around $150,000, but rest assured they won't see a dime. After that I started buying HP monitors only: one goes bad, I call, no runaround, they next-day ship a replacement and pay for the return shipping. Class act right there.
    • "But they also want you to ship it back in the ORIGINAL box! "

      Have you ever shipped glassware by freight? No?

      Maybe the people who make and ship thousands of monitors made out of brittle glass know how to pack them, and maybe they want to eliminate another variable. My Daytek monitors require you to use the original boxes and materials for warranty shipping, and I'm right with them on that. I don't want them broken by some drunken mover, nor do they want users claiming a monitor is broken because of improper shipping.

      If you really had a budget of $150,000 USD, you could easily have the boxes unfolded and stored with the foam, right next to the room where you keep all those Windows license papers.
    • A former roomie of mine had a similar problem - no IT budget here, but when you live in a dorm room, every square foot counts. So, toss toss toss go the monitor boxes. When the monitor failed, they were picky about the box, because most companies will have the monitor destroyed before it gets to the destination in any other enclosure. So, there was a $20 or so fee to get an (approved) monitor box sent to him. I would imagine a polite request probably would have gotten you a reasonably priced box sent your way.

      If that didn't work, a couple more calls probably would have done the trick.

      YMMV. I believe the company was ViewSonic, but I might be wrong.

  • Anyone have a URL for a video card that can drive this? We got some end-of-the-year money....
  • Now I only need to set Windows to use 300x300 icons and 50pt print in order to be able to use regular software on this thing!
  • Something to think about: at 24-bit color depth, which is 16,777,216 colors for the math-challenged, we've already vastly surpassed the number of colors that the human eye can distinguish between.

    So at what PPI do we surpass the ability of the human eye to distinguish the individual pixels? I run my desktop at 1600 x 1200 and it's *very* tough to see the individual pixels. At what point does it become impossible?
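
    (A rough answer, assuming the usual one-arc-minute figure for normal visual acuity; the limit depends entirely on viewing distance:)

    ```python
    # Rough estimate of the pixel density at which the eye stops
    # resolving individual pixels, assuming ~1 arc-minute of acuity
    # (a common textbook figure for 20/20 vision; individuals vary).
    import math

    ACUITY_RAD = math.radians(1 / 60)   # one arc-minute

    def dpi_limit(viewing_distance_in: float) -> float:
        return 1 / (viewing_distance_in * math.tan(ACUITY_RAD))

    for d in (12, 24, 36):
        print(f"{d}\" away: ~{dpi_limit(d):.0f} dpi")

    # 12" away: ~286 dpi, 24" away: ~143 dpi, 36" away: ~95 dpi
    ```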

    • How many times does it need to be proven that this is COMPLETELY UNTRUE before people get it???? The human eye can EASILY tell the difference between a large majority of any two colors in a 24-bit spectrum separated by only one bit. Place two colors on screen, each taking up half the screen, and you will see the interface where they join, through a process referred to as "Mach banding".

      If they are NOT touching, then you are right. But since most images are made up of colors that touch each other, it's a very important phenomenon.

      Also, find something that draws a black line diagonally at 10 degrees across a white background and tell me that 1600x1200 is enough. The entire reason there is so much attention paid to anti-aliasing in games, fonts, and graphics programs is precisely because there is nowhere near enough resolution on a monitor. 200 dpi is a step in the right direction, but it'll be at least 300 dpi before computer displays start approaching the comfort level of looking at a printed page.
    • Regardless of what colors the human eye can distinguish *between*, the main issue in color is the gamut. Monitors have a much larger color gamut than four-color printing (of any kind), and we can easily see the difference. Try making neon green on your inkjet printer sometime. Or even a bright blue... half the time it will come out looking navy even though it's electric on the screen.

      We face a similar issue with resolution. It's not whether we can see the individual dots, but what things look like in the aggregate because of the number of dots. Yes, there is a vanishing point, or at least an issue of diminishing returns, but we can continue to see improvements in overall quality when we can't actually focus on individual pixels anymore.
  • At 200 pixels/inch, you could very nicely use this with a "lenticular lens screen" to display 3-D images without the need for special glasses or other accessories.

    Remember those cool little "flip cards" you got in Cracker-Jacks, where the image changed when you rotated the card? Well, that's lenticular imaging. This technology is also used for 3-d imagery because the image that you see depends on the angle at which you view the image. Because your eyes each see the same point on the screen from a slightly different angle, the screen shows each eye a different image (allowing proper 3-D).

    Using this screen (200ppi) and a 40-line-per-inch lenticular screen, you could see 5 different images depending on the angle you are viewing from... not bad at all.
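
    The 5-image figure is just pixel density divided by lens pitch:

    ```python
    # Views available under a lenticular lens sheet: one column of
    # pixels per view behind each cylindrical lens.

    def views(screen_ppi: float, lens_lpi: float) -> int:
        return int(screen_ppi // lens_lpi)

    print(views(200, 40))   # 5 interleaved images at 40 lenses per inch
    print(views(100, 40))   # only 2 on a ~100 ppi panel
    ```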

    (BTW, I write "shareware" to produce lenticular images... http://www.lenticularshareware.com)

    MadCow.
    • I always thought it would be a great idea for 3-d games to include a lenticular lens in the box, for the true 3-d effect. Positioning it on the monitor would be fiddly, but should work OK: many CRTs give you screen re-sizing controls, and you could use subpixel positioning on LCDs.

      I would suggest just interleaving two images -- or at most three. I've written about this before, but can't be bothered to fight with slashsearch to find it.
  • The other part of the article mentioned the PIV running at 4.7 GHz. They need to get a PIV at 4.77 GHz and an 8088 at 4.77 MHz side by side.

    It'd make a neat statement.
      They need to get a PIV at 4.77 GHz and an 8088 at 4.77 MHz side by side. It'd make a neat statement.

      Windows XP would still take about as long to boot on the 4.77 GHz machine as MS-DOS 2 would on the 8088.

      Gates's law: The time taken to perform simple operations in mass-market software, measured in microprocessor clock cycles, will increase in subsequent versions of a software product at a rate roughly proportional to the increase in clocks per second of newer microprocessors. Thus, given a lack of funding for increasing hardware speed and a requirement to "keep up with the Joneses" dictated by changing proprietary file formats, the speed of software halves every 18 months [tuxedo.org].

  • Well, I'm not a gamer, or a doctor, or anybody else who needs that kind of resolution. But I'd kill to have a 19" by 11" LCD monitor (that's almost enough room for three full-page windows!) at an ordinary resolution. But nobody seems to be selling that. Oh well, couldn't afford it anyway.

  • I have little faith left in ViewSonic monitors. I suffered through eight RMAs on an 817-series monitor in the course of two years. All except one failed for the same reason: they just went "poof". One monitor died within 8 hours of receiving it. The final straw was when they shipped a monitor that looked like something had broken loose inside the tube and bounced around, scarring up the back of the panel.

    Each and every time I called, they professed ignorance and told me that there was no quality control problem with the 817. But they had an abundant supply of refurbished 817s. And I had to pay freight for each and every return. At 71 lbs, those babies weren't cheap to ship.

    After about the third return, I tried to convince them to ship me a different model. They wouldn't.

    Funny though, my 815, which sits on the same table, has been lit for going on 5 years now, with not a problem.
  • Please don't go on about "I won't spend XXX money to hook up that monitor to my Voodoo2-SLI boxen." A monitor of this capability is for medical, military, research applications, etc. Not for your pr0n and LAN parties (or your LAN and pr0n parties, either way).

    Think brain surgery, high-res scans, super-accurate 3D models to further help reduce prototyping stages.

    Maybe Duke Nukem Forever, though, 'cause this monitor will be around $500 by the time that game comes out.

  • Me, ME ME!!! I'd rather have that monitor than 8 grand.

    Of course I'll bet that if I had 8 grand sitting in a bank account my tune would change, but since I'm unemployed that doesn't seem likely.

  • We saw two different examples of the IBM T221 (equivalent) at SIGGRAPH last year, driven by FireGLs and by a special Matrox card of some kind, IIRC.

    We oohed & aahed at the wonderful clarity of the picture. We marvelled at the sheer detail that 200dpi can give you. We were awed at the expansive range of the viewing angle. We were even impressed by the quantity of zeroes on the price tag. Then we saw the picture change.

    It does not update fast. In fact, one system took nearly a third of a second to draw the next picture, and the other took closer to two-thirds of a second. Worse, they updated in quadrants or vertical strips, and the effect was quite jarring. This is not a monitor you could use for animation.

    An AC posted elsewhere here that they can get up to 20 Hz updates. If so, that's a huge improvement over what I saw, but it still sucks big time for most usage.

  • but how will I really see the quality of a 200 dpi monitor when that picture is only a small picture on my 72 dpi monitor?

    If it weren't a 200 dpi monitor but only 100 dpi, I probably couldn't see the difference.
  • So ViewSonic can make 200-dot-per-inch, 4000x2000-pixel, 17-inch displays? How about putting some 200-dot-per-inch displays on our PDAs, which currently have crappy 320x200 displays? With 200 dots per inch we could get PDAs with 800x600 displays. Okay, maybe I would need to get glasses to read them.

    Heck, give me one of those and you can call me "four eyes".

  • ViewSonic sent a demo unit out to our group here at the NCSA. To answer a couple of questions I've seen in these posts: it has two dual-DVI inputs, so you can use two dual-output cards or one quad-output card (the one we have is a GeForce4 Quadro, or something similar). It can be run in Linux, and the speed with OpenGL is just fine. Note that the refresh rate of the monitor is just 16Hz or so, which is fast enough for us (when we're rendering 100GB datasets, 16fps would be a godsend) but may make gamers think twice (or more).

    That said, it is beautiful to behold. It has a nice wide viewing angle, and is quite bright. They sent us the monitor with a PC running Windows and a gallery of images installed. The images looked very, very nice - you could barely see the pixels at all! But for some reason, even though the images were all 3840x2400, we still had to pan around them. Guess what?

    OK - a trip to Monitor Properties and we were seeing it at native resolution for real this time. It was almost like looking at a very clear picture, as a previous poster wondered. We had some images of the moon's surface that were better looking than any I've ever seen before, even in print. I'd post screenshots for you if I could. It was so nice that one of my colleagues suggested to the ViewSonic people that they ship the monitor with a magnifying glass. He wasn't kidding, either.

    About text: like people keep mentioning, it is awful. In Windows, which I find uses quite small text by default anyway, the text on the start bar was illegible without getting up close and peering. In Linux, setting an xterm's text size to Huge makes it legible if you squint. It really is a problem that OS developers need to address, because it's starting to bug me even with my 1600x1200 15" laptop screen. Are the physical dimensions of the monitor available to the OS (using EDID or something)?

    Well, must go. -matt

  • Hate to point this out, but my Sharp 16" 1280x1024 screen is 112 dpi across... and it only cost $650.
