Technology

ViewSonic shows 200 dpi display

prostoalex writes "At the Intel Developer Forum, ViewSonic introduced its 200 dpi display. The 22.2-inch 3840x2400 monitor will sell for around $8,000." Maybe there's hope for all those obsessive folks trying to run Quake 3 at insane resolutions. Provided they'd rather have a monitor than eight grand!
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by ip_vjl ( 410654 ) on Wednesday September 11, 2002 @12:10PM (#4238036) Homepage
    Not that anyone cares, but it should be 200ppi (pixels per inch)

    DPI (dots per inch) more accurately describes print devices where a number of print dots are needed to accurately describe a single pixel.

    For example, to show a single 50% black square pixel, you'd need a 2x2 array of alternating black and white dots (BWBW) - so if your image is 100 PPI, you need a print device of at least 200 DPI to show the same resolution. For a monitor this doesn't really apply, as each pixel of the display corresponds to a single pixel of image data. (Unless of course they were talking about the individual R, G, B elements - but the article seemed to indicate the contrary.)

    ---

    Just a pet peeve, as it's often hard to get people to understand that there ARE differences between DPI, PPI, and LPI in the print world.

    - vin
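The pixel-to-dot arithmetic above can be sketched in a few lines of Python (a hypothetical helper, not part of any real imaging library):

```python
def min_printer_dpi(image_ppi: float, dots_per_pixel_side: int) -> float:
    """Minimum printer resolution (DPI) needed to reproduce an image
    at image_ppi when each image pixel is rendered as an n x n array
    of printer dots, as in the 50% gray (BWBW) example above."""
    return image_ppi * dots_per_pixel_side

# A 100 PPI image whose pixels each need a 2x2 dot pattern requires
# a print device of at least 200 DPI.
print(min_printer_dpi(100, 2))  # → 200
```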

  • Re:Still not enough (Score:3, Informative)

    by sh4de ( 93527 ) on Wednesday September 11, 2002 @12:13PM (#4238052)

    You forget that printers and imagesetters don't render colour gradations the way monitors can. Any video card can feed the 200 DPI display with at least 8 bits per channel, effectively hiding the "low" resolution from your prying eyes.

    Inkjet printers do mix the primaries (CMYK) to produce different colours, but I'd be surprised if the number of gradient steps were anywhere near the 256 per primary that monitors enjoy.

    Imagesetters don't produce contone images at all. Each dot is either on or off. That's why you need 2400 DPI or more resolution to render a fine screen for high quality offset printing.
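The trade-off described above follows the standard halftone rule of thumb: gray levels = (DPI ÷ LPI)² + 1. A quick sketch:

```python
def gray_levels(printer_dpi: float, screen_lpi: float) -> int:
    """Gray levels a binary-dot device can render at a given halftone
    screen frequency: each halftone cell is (DPI / LPI) dots on a
    side, and a cell of n dots can have 0..n dots turned on."""
    dots_per_cell = (printer_dpi / screen_lpi) ** 2
    return int(dots_per_cell) + 1

# A 2400 DPI imagesetter driving a fine 150 LPI screen uses 16x16
# cells: 257 levels, enough to match the monitor's 256 per channel.
print(gray_levels(2400, 150))  # → 257
```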

  • by Gryffin ( 86893 ) on Wednesday September 11, 2002 @12:20PM (#4238126) Homepage
    how high can I set the resolution without having to squint to see the letters that I am typing. I can barely see the letters that I type at 1600x1200. I can imagine what 3840x2400 would look like.

    How high? That depends on whether or not OS developers get their sh*t together.

    Current mainstream operating systems - or more properly, windowing systems (Windows, Mac OS X, X11) - all tend to assume a screen resolution, or offer only limited capability to change it.

    • X Window System: for font scaling, allows you to choose from 75 dpi or 100 dpi. Woo whee.
    • Mac OS X: no capability to scale the display resolution at all, despite the fact that its Quartz rendering engine, with its PostScript heritage, should be able to handle the chore in its sleep.
    • Windows: while it allows the user to choose the DPI of her monitor, this seems to be applied to application fonts only; the fonts used in many user interface elements are not scaled, making many UI elements difficult to use at high resolutions.

    None of these systems has truly separated the "internal" measurement of graphic objects from their display size; all rely on an assumed point-to-pixel ratio. The cost of this level of abstraction, of course, would be performance, i.e. display speed.

    But it seems to me that modern display adapters should be more than capable of doing this. What's lacking are the APIs to make the graphics hardware do the math, and the OS support to enable the feature. I think Mac OS X already has most of the capability; let's see if they actually take the next step.
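A resolution-independent windowing system essentially needs one conversion everywhere it now assumes a fixed ratio. A sketch (hypothetical function, not any real OS API):

```python
def points_to_pixels(size_pt: float, display_dpi: float) -> float:
    """Convert a physical size in points (1 pt = 1/72 inch) into
    device pixels for a display of known DPI, instead of assuming a
    fixed ratio such as X11's 75/100 dpi or Windows' default."""
    return size_pt * display_dpi / 72.0

# A 12 pt font is 16 px on a typical 96 DPI desktop display, but
# about 33 px on this 200 DPI panel -- same physical size either way.
print(points_to_pixels(12, 96))   # → 16.0
print(points_to_pixels(12, 200))  # ≈ 33.3
```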

  • Re:How high? (Score:2, Informative)

    by guacamolefoo ( 577448 ) on Wednesday September 11, 2002 @12:20PM (#4238128) Homepage Journal
    > The resolution can go as high as 3840x2400.
    > That is insane. I think the question is no

    On one hand, there's the (apocryphal?) saying floating around regarding memory: "640K is more than enough memory for anyone," or something to that effect.

    On the other hand, I think you have a legitimate point. To some extent, I think the CPU battle is basically over for most people. For an office environment, who the hell needs more than 500 MHz? My secretary (does word processing, some light spreadsheet stuff) doesn't need to have her 300 MHz machine upgraded ever, probably.

    > longer how high can the resolution go. But on
    > the other hand, how high can I set the
    > resolution without having to squint to
    > see the letters that I am typing.

    I think another legitimate issue is whether a monitor should be replaced if it absolutely does not need to be. One issue of technology is "can X be done?" An often overlooked issue is "should we bother implementing X?"

    I suspect that fancier-pants monitors will become de rigueur on newer machines as manufacturing techniques improve and prices ultimately drop to consumer levels, but should you consider dumping an old monitor (and its many pounds of lead) into a landfill, or sending it to China for "recycling", when the marginal benefits of the new technology are minimal?

    By all means, let's keep doing research and development. Let's let the market decide what to do with the technology that companies develop. Let's not forget that technology should serve us and not the other way around. At least until the machines start to think, and then all bets are off.

    guac-foo
  • Re:Too small (Score:2, Informative)

    by tomkarlo ( 15606 ) on Wednesday September 11, 2002 @12:46PM (#4238377) Homepage
    You're confusing the problem of raster icons with problems associated with higher screen resolutions.

    Many operating systems are already using some form of vector icon or considering moving to one (KDE, Mac OS X)... it takes more compute time, but not a lot (you only have to compute the raster equivalent once for a given screen size).

    Once that happens, you'll be happy using the high res screen and setting the icon size to say, 1 inch, while others might choose a 0.5 inch icon.
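The scheme described above - rasterize a vector icon once per requested size, then reuse it - can be sketched like this (a hypothetical class, not any particular toolkit's API):

```python
class IconCache:
    """Rasterize vector icons on demand and cache the result, so the
    extra compute cost is paid once per (icon, pixel size) pair."""

    def __init__(self, rasterize):
        self._rasterize = rasterize  # e.g. an SVG renderer
        self._cache = {}

    def get(self, name: str, size_inches: float, display_dpi: float):
        size_px = round(size_inches * display_dpi)  # physical -> pixels
        key = (name, size_px)
        if key not in self._cache:
            self._cache[key] = self._rasterize(name, size_px)
        return self._cache[key]

# On the 200 DPI panel, a 1-inch icon rasterizes at 200x200 px and a
# 0.5-inch icon at 100x100 px -- the user picks a physical size, not
# a pixel count.
cache = IconCache(lambda name, px: f"{name}@{px}px")
print(cache.get("folder", 1.0, 200))  # → folder@200px
print(cache.get("folder", 0.5, 200))  # → folder@100px
```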
  • by 13Echo ( 209846 ) on Wednesday September 11, 2002 @12:50PM (#4238412) Homepage Journal
    That's nice, but from my own experience with 50+ inch HDTV units (we have one), they are cool for gaming but totally unsuitable for text. I am talking about a real DB-15 hookup to the television with some proper discrete timing tweaks. It is still very rough on the eyes.

    The fact is, HDTV units (all of them) are still not "HD" enough for use as a monitor. They work, but my parents' mid-90s Packard Bell monitor had a much crisper image than any projection HDTV that I have seen.

    A high quality projector will still get you a better image than any consumer grade HDTV.
  • DPI is NOT ppi (Score:3, Informative)

    by MadCow42 ( 243108 ) on Wednesday September 11, 2002 @12:59PM (#4238489) Homepage
    There's a difference... 200 pixels per inch is just that... a pixel can be any shade of any color.

    200 dots per inch can only really render about 25 pixels per inch (with full 24-bit color), because it takes an array of 8x8 "dots" OF EACH COLOR INK (on a printer) to represent the different shades of each color (strictly, 8x8 gives 65 shades; a full 256 would need a 16x16 cell).

    So, to equal 200 PPI resolution on an inkjet printer, you need somewhere around 1600 DPI (OK, there are some "tricks" that newer inkjets do to make it look higher with fewer dots, but that's beside the point).

    So, to answer your question: a 200 PPI monitor is much HIGHER resolution than a 1000 DPI printer.

    madCow.
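The arithmetic above reduces to dividing the dot resolution by the halftone cell width - a toy calculation, using the 8x8 cell from the post:

```python
def effective_ppi(printer_dpi: float, cell_width: int = 8) -> float:
    """Effective continuous-tone pixels per inch of a binary-dot
    printer that spends a cell_width x cell_width halftone array
    of dots on each image pixel."""
    return printer_dpi / cell_width

print(effective_ppi(200))   # → 25.0   (the post's 200 DPI example)
print(effective_ppi(1600))  # → 200.0  (needed to match a 200 PPI panel)
```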
  • by Anonymous Coward on Wednesday September 11, 2002 @02:24PM (#4239110)
    To obtain warranty service, you will be required to provide:
    - The original dated sales slip
    - Your name
    - Your address
    - The serial number of the product
    - A description of the problem.

    I had a monitor (~2 years old) that just started smoking one day. I called ViewSonic up and told them I didn't have the receipt. They said that was not a problem, because they would use the "date of manufacture". I also didn't have the box it came in, and they didn't require it.

    They took my credit card number (just in case I didn't send the broken one back) and drop shipped me a new monitor. They told me to ship the broken one back to them (with a shipping label they provided) sometime in the next 30 days.

    So to get a new monitor I was only out the shipping fee to have one sent to me. Not a bad deal, IMO. I liked the way I was treated that day, but it may depend on which CSR you get on the phone.

  • by Elazro ( 532810 ) <`mahall' `at' `uiuc.edu'> on Thursday September 12, 2002 @04:07AM (#4243247) Homepage
    ViewSonic sent a demo unit out to our group here at the NCSA. To answer a couple of questions I've seen in these posts, it has two dual-DVI inputs, so you can use two dual-output PCI cards or one quad-output card (the one we have is a GeForce4 Quadro, or something similar). It can be run in Linux, and the speed with OpenGL is just fine. Note that the refresh rate of the monitor is just 16 Hz or so, which is fast enough for us (when we're rendering 100 GB datasets, 16 fps would be a godsend) but may make gamers think twice (or more).

    That said, it is beautiful to behold. It has a nice wide viewing angle, and is quite bright. They sent us the monitor with a PC with Windows and a gallery of images installed. The images looked very, very nice - you could barely see the pixels at all! But for some reason, even though the images were all 3840x2400, we still had to pan around them. Guess what?

    OK - a trip to Monitor Properties and we were seeing it at native resolution for real this time. It was almost like looking at a very clear picture, as a previous poster wondered. We had some images of the moon's surface that were better looking than any I've ever seen before, even in print. I'd post screenshots for you if I could. It was so nice that one of my colleagues suggested to the ViewSonic people that they ship the monitor with a magnifying glass. He wasn't kidding, either.

    As for text - as people keep mentioning, it is awful. In Windows, which I find uses quite small text by default anyway, the text on the start bar was illegible without getting up close and peering. In Linux, setting an xterm's text size to Huge makes it legible if you squint. It really is a problem that OS developers need to address, because it's starting to bug me even with my 1600x1200 15" laptop screen. Are the physical dimensions of the monitor available to the OS (using EDIDs or something)?

    Well, must go. -matt
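To the closing question above: yes - EDID does expose the panel's physical size. In the 128-byte EDID 1.x base block, bytes 21 and 22 carry the maximum horizontal and vertical image size in centimetres, which is enough to compute DPI. A sketch, with a fabricated EDID (no real parsing library involved):

```python
def dpi_from_edid(edid: bytes, width_px: int, height_px: int):
    """Derive horizontal/vertical DPI from an EDID 1.x base block:
    bytes 21 and 22 hold the max image size in centimetres."""
    width_in = edid[21] / 2.54
    height_in = edid[22] / 2.54
    return width_px / width_in, height_px / height_in

# A fake EDID for a 3840x2400 panel roughly 48 x 30 cm (about 22.2"
# diagonal at 16:10): both axes come out near the advertised 200 DPI.
fake_edid = bytes(21) + bytes([48, 30]) + bytes(105)
h_dpi, v_dpi = dpi_from_edid(fake_edid, 3840, 2400)
print(round(h_dpi), round(v_dpi))  # → 203 203
```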
