ViewSonic shows 200 dpi display
prostoalex writes "At the Intel Developer Forum, ViewSonic introduced its 200 dpi display. The 22.2-inch 3840x2400 monitor will sell for around $8,000." Maybe there's hope for all those obsessive folks trying to run Quake 3 at insane resolutions. Provided they'd rather have a monitor than eight grand!
ebooks? (Score:1)
How high? (Score:1)
Re:How high? (Score:2, Funny)
What's the phone number of the patent office?
Re:How high? Depends on the OS (Score:5, Informative)
How high? That depends on whether or not OS developers get their sh*t together.
Current mainstream operating systems, or more properly windowing systems (Windows, Mac OS X, X11), all tend to assume a screen resolution, or offer only limited capability to change it.
None of these systems have truly separated the "internal" measurement of graphic objects from their display size; all rely on an assumed point-to-pixel ratio. The cost, of course, of this level of abstraction would be performance, i.e. display speed.
But it seems to me that modern display adapters should be more than capable of doing this. What's lacking are the APIs to make the graphics hardware do the math, and the OS support to enable this feature. I think Mac OS X already has most of the capability; let's see if they actually take the next step.
Re:How high? Depends on the OS (Score:2)
That's just for bitmapped fonts, since the X server knows the physical dimensions of the screen (and XFree86 seems to be the nicest implementation since it queries the monitor(s) via DDC and automatically computes the horizontal and vertical resolution). Just run xdpyinfo and look for the "dimensions" and "resolution" lines: you may be surprised.
If your display is in the 100dpi range, it makes more sense to scale bitmapped fonts using bitmaps designed for that resolution, instead of ones designed for 72dpi (and vice versa).
Of course, this is senseless for vector fonts, or if one specifies font sizes in pixels instead of tenths of a point.
Re:How high? Depends on the OS (Score:2)
I think the biggest problem with X11 is the font manager. It only seems to understand 75dpi or 100dpi; for any other resolution, I assume it either chooses the nearest or tries to scale from the nearest.
I have little experience with the "official" X11, but IMHO, XFree86 font handling is still playing catch-up with Windows and Mac. It only recently gained decent scalable fonts (TrueType), and they're still problematic; the concept of scaling these arbitrarily to match screen resolution seems a long way off.
Re:How high? Depends on the OS (Score:2)
With XF86, you can use the DisplaySize setting in XF86Config to specify the physical size of your display. X will then use that and your current screen size in pixels to compute the number of dots per inch.
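The computation is just unit conversion. A minimal sketch in Python - the DisplaySize millimetre values below are hypothetical numbers for a 22.2" 3840x2400 panel, not anything measured:

    # XF86Config carries the physical size in millimetres, e.g.:
    #   Section "Monitor"
    #       DisplaySize 478 299    # hypothetical values for a 22.2" panel
    #   EndSection
    MM_PER_INCH = 25.4
    width_mm, height_mm = 478, 299
    width_px, height_px = 3840, 2400

    dpi_x = width_px / (width_mm / MM_PER_INCH)
    dpi_y = height_px / (height_mm / MM_PER_INCH)
    print("%.0f x %.0f dpi" % (dpi_x, dpi_y))  # ~204 x 204 dpi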
Re:Amen. (Score:2)
Oh, I agree. And how.
Ideally, you'd have apps specify all graphics (including font glyphs) in real-world units (points, inches, mm, furlongs, parsecs, whatever), and the window manager / display manager would do the translation to screen pixels. Problem there is, with current OSs, you'd need to toss out all your graphics, type, and UI APIs and start from scratch. And get all your developers to do the same. Not bloody likely.
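The unit translation itself is one line of arithmetic; the hard part is making every API and every app do it. A hypothetical sketch:

    # Real-world units to device pixels, the conversion a resolution-
    # independent system would apply everywhere. 1 point = 1/72 inch;
    # the dpi value would be queried from the display, never assumed.
    def points_to_pixels(points, dpi):
        return points * dpi / 72.0

    print(points_to_pixels(12, 72))   # 12.0 px on a legacy 72 dpi screen
    print(points_to_pixels(12, 204))  # 34.0 px on a 204 dpi panel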
The alternative is to internally translate the idealized "pixel" of the current graphics APIs to actual screen pixels. That way, the application never notices the difference; everything it draws to the screen is adjusted to the display resolution. Problem there is, for most displays where the actual resolution is fairly close to the idealized resolution, scaling tends to introduce lots of very ugly artifacts: jagged edges, awkward kerning and letterspacing, "one pixel off" errors, etc.
Again, though, Mac OS X's Quartz rendering engine does a pretty remarkable job with this sort of scaling; try their Screen Zoom utility (System Prefs, under Universal Access) and see what I mean. That's one reason why I think Apple might be the first to make this sort of thing work. (Ironically, the same OS won't let you choose your UI fonts or sizes. Go figger.)
Re:Gender neutral (Score:2)
A more politically correct way of saying this would read: "it allows the user to choose the DPI of his or her monitor..." or even better "her [or] his monitor"
This is a common misconception. The English language is not strongly gendered, unlike some Romance or Germanic languages. For example, English articles do not differentiate between the gender, or lack thereof, of their objects. We say "a man" and "a woman," and "the man" and "the woman." As such, we have no gender-neutral third-person animate pronoun. We have "it," which is a third-person inanimate pronoun, but native speakers will virtually universally reject calling a person of unspecified gender "it."
In informal speech, evidence of the third person plural being used as a neuter pronoun goes back for centuries, but such convention has never been adopted for formal or written speech. It's acceptable to say, "It allows the user to choose the DPI of their monitor" informally, according to some authorities, but it is frowned upon in formal speech, and it is never acceptable in writing.
The only correct usage of the third-person pronoun when speaking of a person (as opposed to an animal or a thing) of unspecified gender is to use the third-person masculine forms: "he," "his," "him."
The original speaker was only correct to say, "It allows the user to choose the DPI of her monitor," if referring specifically and exclusively to female users. The feminine third-person pronouns cannot be used in neutral context, unlike the masculine.
Saying "his or her monitor" is redundant, and should be avoided. The masculine pronoun applies to persons of either gender, which includes women.
People who find this offensive always amuse me, because I see it quite the opposite. When you say "he" or "him," you could be referring to just anybody, man or woman. But when you say "she" or "her," you're talking about a woman, and only a woman. The clear implication is that women are special, and that they deserve a category of their own. To find it any other way strikes me as backwards.
No it can't (Score:2)
For icons, OS X uses a scheme where each power-of-two size from 16x16 to 128x128 has a dedicated bitmap. The one closest to the display size is selected and final tweaking is done with interpolation.
This approach works for OS X icons because per the Aqua Human Interface Guidelines they are photorealistic. The blurring caused by the interpolation doesn't affect them; for widgets it would be unacceptable. This is why the icons can be scaled from bitmaps but widgets would have to be drawn as vector art.
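The selection step itself is simple enough to sketch - the sizes and the nearest-wins rule below are assumptions taken from the description above, not Apple's actual code:

    # Pick the prerendered icon bitmap closest to the requested on-screen
    # size; interpolation then handles the final tweaking.
    ICON_SIZES = [16, 32, 64, 128]  # dedicated power-of-two bitmaps

    def pick_icon(requested_px):
        return min(ICON_SIZES, key=lambda s: abs(s - requested_px))

    print(pick_icon(40))   # -> 32
    print(pick_icon(100))  # -> 128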
My prediction is that instead of the utopia of totally boundless scaling, we will get the Palm resolution hack adapted to desktop systems. Just like in the 160x160 -> 320x320 migration of Palms, you can probably soon drive a 3840x2400 display with a "virtual" 1920x1200 desktop, with twice the letterform resolution, twice the widget resolution, and twice the OpenGL resolution. Bitmaps (on the web, etc) would just be pixel-doubled to appear the right size.
And then, with a special API, a program could tap into the physical resolution of the display and supply it with native pixels.
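The doubling pass is the trivial part; a sketch for a single-channel bitmap stored as row-major lists:

    # Nearest-neighbour pixel doubling: a 1920x1200 "virtual" frame fills
    # a 3840x2400 panel, each source pixel becoming a 2x2 block.
    def pixel_double(rows):
        out = []
        for row in rows:
            doubled = [px for px in row for _ in (0, 1)]
            out.append(doubled)
            out.append(list(doubled))
        return out

    for row in pixel_double([[1, 2], [3, 4]]):
        print(row)
    # [1, 1, 2, 2]
    # [1, 1, 2, 2]
    # [3, 3, 4, 4]
    # [3, 3, 4, 4]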
Re:How high? (Score:2, Informative)
> That is insane. I think the question is no
On one hand, there's the (apocryphal?) saying floating around regarding memory: "640k is more than enough memory for anyone," or something to that effect.
On the other hand, I think you have a legitimate point. To some extent, I think the CPU battle is basically over for most people. For an office environment, who the hell needs more than 500 MHz? My secretary (does word processing, some light spreadsheet stuff) probably doesn't ever need her 300 MHz machine upgraded.
> longer how high can the resolution go. But on
> the other hand, how high can I set the
> resolution without having to squint to
> see the letters that I am typing.
I think another legitimate issue is whether a monitor should be replaced if it absolutely does not need to be. One issue of technology is "can X be done?" An often-overlooked issue is "should we bother implementing X?"
I suspect that fancier-pants monitors will become de rigueur on newer machines as manufacturing techniques improve and prices ultimately make these consumer-priced models, but should you consider dumping an old monitor (and its many pounds of lead) into a landfill, or sending it to China for "recycling," when the marginal benefits of the new technology are minimal?
By all means, let's keep doing research and development. Let's let the market consider what to do with the technology that companies develop. Let's not forget that technology should serve us and not the other way around. At least until the machines start to think, and then all bets are off.
guac-foo
Re:How high? (Score:4, Insightful)
You're approaching it the wrong way. On a 72 dpi screen, a 12-point character can be represented by 144 pixels (I know, I'm deliberately omitting the effects of subpixel aliasing / anti-aliasing, hinting, and all those other tricks that modern display technologies use to boost perceived resolution, in order to make this easier to follow). On a 200 dpi screen, over 1100 pixels can be brought to bear on that very same character. This means the character can be rendered with much greater fidelity, so if it's rendered at the same height as on the 72 dpi screen it should be far more readable. Of course, your OS has to be smart enough to compensate for the much smaller pixels, but modern GUIs have this one figured out, for the most part.
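The arithmetic, for anyone who wants to check it (treating the character cell as a square is a simplification):

    # Pixels available for a 12-point character cell. 1 point = 1/72 inch.
    def pixels_per_cell(points, dpi):
        side = points * dpi / 72.0
        return side * side

    print(pixels_per_cell(12, 72))   # 144.0
    print(pixels_per_cell(12, 200))  # ~1111, i.e. "over 1100 pixels"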
Re:How high? (Score:3, Insightful)
Re:How high? (Score:2)
My laptop has a 133 dpi screen (1600x1200 pixels, 15" diagonal), and text looks much better than it does on a regular CRT. I'd love a 200dpi display, although I can wait until the prices come down some :)
Re:How high? (Score:2)
So it is in fact you who are wrong.
it's sane, 1600x1200 w/ current tech may not be (Score:4, Insightful)
This isn't insane, although running a display at a resolution you claim to hardly be able to read might be. The extra resolution gives more dots, so you end up with easier-to-view type. It's easy to demonstrate how this affects things: hold a piece of paper with small but clearly readable printed text next to text on your monitor. You'll likely find (if you can read the text on your monitor) that the printed text is both smaller and more readable. The reason is that the printed text has a greater dot density, helping you to read it despite its apparently small size. Most current monitors just don't have the dot density to match this, so once text shrinks beyond a certain point it's the compromise in pixel selection, not the actual small size, that makes the type hard to read. A higher density monitor does help in this area. Of course, if you try to make characters the same number of pixels on the new screen then your problem only gets worse, but you can have both more pixels and smaller text, which can result in a very readable display.
Then again, maybe you just need reading glasses.
Re: 3840x2400 is still not good enough... (Score:2)
Did you know that to produce a reasonable quality printed page, you need an image with at least 300dpi resolution?
Once LCD monitors reach these resolutions, reading off a monitor will be as easy and relaxing for the eyes as reading from a piece of paper.
Still not enough (Score:2)
Seriously. 200 DPI is still not enough.
Let's take a quick survey. All those of you who'd be happy with a 200 DPI printer, please raise your hand. Right--I thought so.
I'll say that displays have matured when they're at least 1,000 DPI--though most people can still tell the difference between 1,000 DPI and 2,000 DPI.
Yes, you can play games with AA. Yes, we need resolution-independent display mechanisms lest bitmapped graphics vanish. Folks, this has all been done before--with printers. When the display engineers catch up to the printer engineers (and, granted, their problems are much harder), those problems will also be solved.
Cheers,
b&
Re:Still not enough (Score:3, Informative)
You forget that printers and imagesetters don't render colour gradations the way monitors can. Any video card can feed the 200 DPI display with at least 8 bits per channel, effectively hiding the "low" resolution from your prying eyes.
Inkjet printers do mix the primaries (CMYK) to produce different colours, but I'd be surprised if the number of gradient steps were anywhere near the 256 per primary that monitors enjoy.
Imagesetters don't produce contone images at all. Each dot is either on or off. That's why you need 2400 DPI or more resolution to render a fine screen for high quality offset printing.
Re:Still not enough (Score:2)
Yeah, but you'll still need color proofs.
Re:Still not enough (Score:2)
Suggesting we need 1000 dpi monitors just doesn't make sense. Even 300 dpi would be better than common laser printer output. (Yes, some now print at higher resolutions, but even these are usually run at the basic 300 dpi setting because of the quality vs. speed tradeoff.) 200 dpi with all the possible gray levels for AA and sub-pixel font enhancement could give results that compare nicely with 300 dpi laser output.
Of course, you can complain that even 1000 dpi is not good enough for you. And then if you ever get 2000 dpi you can complain about how slow the screen updates are. I'm more concerned with seeing that $8000 price come down.
Re:Still not enough (Score:2)
Now to try to imagine what it would be like to wait that long for your screen to update.
Now, I know that the link between your video card and monitor is faster than the link to the printer, but the point is that as you increase resolution you significantly increase the amount of data, which significantly decreases your frame rates.
Extremely rough calculations:
200 dpi is 40,000 pixels per square inch
My 20" monitor has a 15.5" by 11.75" viewable space, or 182.125 square inches
If my system were 200 dpi it would be displaying a total of 7,285,000 pixels instead of the 1,310,720 (1280x1024) I have it displaying now, or about 5.56 times as much data.
Jumping to 1000 dpi as you suggest takes us to 1,000,000 pixels per square inch, which on a screen the size of my monitor would equal 182,125,000 pixels, or a factor of about 139 times as much data.
That's a heck of a lot of pixels to be calculating and transmitting and still maintaining a non-headache inducing flicker.
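Scripted, for anyone who wants to rerun the numbers:

    # Back-of-the-envelope pixel counts for a 15.5" x 11.75" viewable area.
    area_sq_in = 15.5 * 11.75     # 182.125 square inches
    current = 1280 * 1024         # 1,310,720 pixels

    for dpi in (200, 1000):
        pixels = dpi * dpi * area_sq_in
        print("%4d dpi: %11.0f pixels (%6.2fx current)" %
              (dpi, pixels, pixels / current))
    #  200 dpi:     7285000 pixels (  5.56x current)
    # 1000 dpi:   182125000 pixels (138.95x current)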
Re:Still not enough (Score:2)
VGA (arguably the most popular monitor connection standard), however, doesn't. The wonders of analog media continue.
Re:Still not enough (Score:2)
True, but your video card would still have to be able to handle 182,125,000 pixels regardless of what kind of monitor connection you have. I freely admit that I phrased my initial message very poorly (OK, downright stupidly), as there is no 'data' transfer between a video card and an analog monitor, but there is inside the PC between your software and your video card.
Re:Still not enough (Score:2)
Re:Another thing. (Score:2)
PostScript is a double-edged sword. On the positive side, it's making printer compatibility problems a thing of the past. On the negative side, it makes printers more expensive and slows printing times.
otoh... (Score:2)
The parent's point is well taken - at 200 dpi, the monitor is just at fax quality. (Albeit helped by the additional color information.)
DPI is NOT ppi (Score:3, Informative)
200 dots per inch can only really render about 25 pixels per inch (with 64 shades per primary) because it takes an array of 8x8 "dots" OF EACH COLOR INK (on a printer) to represent those 64 shades of each color.
So, to equal 200ppi resolution on an inkjet printer, you need somewhere around 1600dpi resolution (OK, there are some "tricks" that newer inkjets do to make it look higher with fewer dots, but that's beside the point).
So, to answer your question: a 200ppi monitor is much HIGHER resolution than a 1000dpi printer.
madCow.
Re:DPI is NOT ppi (Score:2)
Color is a property of area, not of a point. Just try seeing the exact shade of a tiny paint chip. Paint half a wall. Then try to exactly match the shade on the other half.
Resolution has to do with where or how many. A 1000 dpi printer can draw 500 lines in one inch. A 200 ppi monitor can draw 100 lines in one inch.
There are tricks that can be pulled on both sides, but translating between a 200ppi monitor and a 1000dpi printer loses in BOTH directions.
500 lines from 1000dpi? (Score:2)
I reckon the smallest that most printers can print dots is about 1/10mm, or about 1/250th of an inch. Therefore a 1000dpi printer is going to be able to do about 125 lines/inch. However I would imagine that the printer is better at doing shallow slopes and curves without aliasing, as it has better positional accuracy.
Re:Still not enough (Score:2)
However, a monitor can do both.
Not that this matters much: you seldom need feature sizes that small.
mm height text? (Score:2)
On a 200dpi device: pixels/mm = 200 dpi / 25.4 mm/inch = 7.87
7.87 pixels/mm - I would say that you really need 5 pixels to have legible text, plus a gap line before the next line of text. So on a 200dpi device, the best you're going to get is text a bit under 1mm high.
To get 0.5mm text, you'd need at least 300pixels/inch, bare minimum. For nice 8x8 character sets you'd need 400pixels/inch.
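The same figures, scripted (the 5-pixels-plus-a-gap legibility floor is the assumption above):

    # Smallest legible text height, assuming a glyph needs ~5 pixels
    # plus one pixel of leading (6 pixels per line of text).
    MM_PER_INCH = 25.4
    MIN_PIXELS = 6

    def min_text_mm(dpi):
        return MIN_PIXELS / (dpi / MM_PER_INCH)

    for dpi in (200, 300, 400):
        print("%d dpi: %.2f mm text" % (dpi, min_text_mm(dpi)))
    # 200 dpi: 0.76 mm text
    # 300 dpi: 0.51 mm text
    # 400 dpi: 0.38 mm text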
Reviewer. (Score:5, Funny)
I have decided, today, to become a professional monitor reviewer. Please send me one of these ASAP so I can get my new career started.
Thank you very much,
Undeg.
Re:Reviewer. (Score:3, Interesting)
I've heard cigarette manufacturers are a lot more giving than the beer companies, but I never tried it.
Re:That's fraud (Score:2)
who in their right mind would buy this (Score:1)
End of the bitmapped display (Score:2)
You probably don't want a full-blown PostScript renderer, but something along the lines of Display PostScript, or even QuickDraw, would probably reduce bandwidth incredibly.
I've often mused about perhaps sending proto-MPEG down the display connector as well, since scaling only needs to be done at the very last stage. However, this would be more complicated, as you'd need to be able to download new drivers into the monitor somehow. Perhaps you could write those drivers in PostScript... (loop to top of post)
I wonder (Score:5, Funny)
Re:I wonder (Score:2)
They do it to advertise stereo systems on the radio.
Evidently it actually works.
Re:I wonder (Score:2)
It has to be said... (Score:2)
reinvention (Score:2, Interesting)
were trying to sell 300dpi monitors to the desktop publishing set. No one bought them, and they died. There's not much point in them now, given the widespread use of anti-aliasing.
I wonder how useful this will be for CAD - won't the thin lines be too difficult to see?
Re:reinvention (Score:2)
9.2 million pixel monitor loan (Score:3, Funny)
Try 9.2 million pixels, for one thing.
To Loan Officer: Ah, yes, I would like to take out a loan? ... ...
Loan Officer: Good, what type of loan are you interested in?
To Loan Officer: A Monitor Loan.
Loan Officer: ...
To Loan Officer: It has 9.2 million pixels
Loan Officer: Ahh, I'll...be right back...
Re:9.2 million pixel monitor loan (Score:2)
IBM T221 is $8400 (Score:4, Interesting)
http://commerce.www.ibm.com/cgi-bin/ncommerce/P
Apparently, IBM has even higher resolutions (Score:2)
It's pretty obvious what they meant, but what they meant is not what they said. I emailed them a few months back, but it remains unchanged to this day. Here's the text:
Display size is determined by the diagonal measurement of the TFT display (i.e. 14.1"), while resolution is the degree of sharpness of a displayed image. Resolution is also expressed in a matrix of dots such as 1024 X 768 representing the number of pixels per square inch.
* ThinkPad X Series 12.1" TFT display with resolution up to 1024x768 dpi (dots per inch)
* ThinkPad T Series up to 14.1" with resolution up to 1400x1050 dpi
* ThinkPad R Series up to 14.1" with resolution up to 1024x768 dpi
* ThinkPad A Series up to 15" with resolution up to 1600x1400 dpi
Video Card (Score:3, Insightful)
I'd rather go for a regular 22" display with a really good projector or something, instead of paying $8000 for a super-resolution display. As mentioned in the article, this would be pretty good for 3D design stuff... although the mini-pixels would probably hurt your eyes when you're trying to click on one little line or dot.
Then again, I only have a 15" monitor that I run at 1024x768, maybe I'm just outdated.
One of these days, my video card will have more RAM than my computer, I just know it - phorm
Re:Video Card (Score:2)
Re:Video Card (Score:2)
1) You haven't heard of ClearType? Where exactly have you been living? It's a software technology that takes advantage of the design of LCDs to make text look sharper. http://grc.com/cleartype.htm
2) Ideally, it would be high-res AND multiple monitors. Personally, I'd prefer the high-res. Even with 133 dpi, ClearType makes text almost as nice to read as paper. Given that I stare at text for hours a day (coding) it is quite a luxury.
Incredible Waste of Cash (Score:2)
Problem solved.
Re:Incredible Waste of Cash (Score:3, Informative)
The fact is, HDTV units (all of them) are still not "HD" enough for use as a monitor. They work, but my parents' mid-'90s Packard Bell monitor had a much crisper image than any projection HDTV that I have seen.
A high quality projector will still get you a better image than any consumer grade HDTV.
Re:Incredible Waste of Cash (Score:2)
Re:Incredible Waste of Cash (Score:2)
Custom timings, custom resolutions, and HDTV. [digitalconnection.com]
Of course, it totally depends on the hardware you use.
Clarify - should be PPI (Score:5, Informative)
DPI (dots per inch) more accurately describes print devices, where a number of print dots are needed to represent a single pixel.
For example, to show a single 50% black square pixel, you'd need a 2x2 array of dots, half black and half white (BWBW) - so if your image is 100PPI, you need a print device of at least 200DPI to show the same resolution. For a monitor this doesn't really apply, as each pixel corresponds to a single pixel of image data. (Unless of course they were talking about the individual R G B elements - but the article seemed to indicate the contrary.)
---
Just a pet peeve, as it's often hard to get people to understand that there ARE differences between DPI, PPI, and LPI in the print world.
- vin
Re:Clarify - should be PPI (Score:2)
That may be true, but the standard way to describe monitors is dpi. Look at the box of any monitor at your favorite computer store and they'll tell you the dpi.
Re:Clarify - should be PPI (Score:2)
Wrongo. That's not even close to how single-color printing works. A 50% black square pixel will appear, on the printed page, as a round spot occupying about 50% of the area of a square cell. The process of rendering continuous tone images, like black-and-white photographs, as a pattern of cells filled with spots of varying sizes is called halftoning. The size of the cells you use, in cells per linear inch, is called the line screen. That's where we get the idea of lines-per-inch (or "LPI") from.
In order to draw those round dots, the printing device you use (typically an imagesetter that exposes photographic film or a metal printing plate) will use a laser to expose tiny spots in a pattern that forms a dot. The perfect imagesetter would be able to draw 256 different sizes of dots, to print what is effectively 257 levels of color, counting 0% and 100%. Obviously, to draw dots this way the resolution of your imagesetter has to be many times your line screen. A good guideline is 16 : 1. So printing a line screen of 150 lpi requires an imagesetter capable of resolving at least 2,400 spots to the linear inch. This isn't a problem for modern laser imagesetters, but it's a bit much for your average Lexmark.
Of course, it's important to realize that the idea of lines-per-inch only applies to tints or continuous tone images. For printing areas of solid ink, such as letters on a page, the imagesetter draws the shapes at full 2,400 (or whatever) spot-per-inch resolution. Film and printing plates actually have jagged edges on curved and angled edges, but because the imagesetter resolution is so high, you can only see them through a magnifying glass. And once ink hits paper, it bleeds just enough to smooth out all of those jaggies anyway.
Re:Clarify - should be PPI (Score:2)
My example was at the simplest level to show that printer dot != pixel, and that it takes many printer dots to show one pixel.
The process of halftoning is far more complex than my example - but was enough to show the point for this discussion.
Using the formula of:
levels = (printer_res / screen_rule)^2 + 1
means that a print device at 2400dpi running 150lpi will be able to reproduce 256 (257 actually) so that is good - BUT, the image you are trying to reproduce doesn't need to be 2400PPI - since a good deal of the printer dots are used just to represent a single shade.
rule of thumb - image_res = screen_rule * 2
So my whole point was that, in the case of a grayscale image to a 2400dpi printer (running 150lpi) - you only need to send image data at 300+PPI.
Which means your final printout would be:
300 PPI
2400 DPI
150 LPI
- all three would be true, hence why it is important to distinguish between DPI and PPI. Since if somebody asked, what DPI is that picture, the answer is 2400, but the resolution of the image is no better than 300 PPI.
Nothing you said was wrong, I think you just missed my point because I had used a very simple example in my first post.
- vin
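For the play-at-home crowd, both rules of thumb above in a few lines of Python:

    # Gray levels one halftone cell can reproduce, and the image resolution
    # actually worth sending to the printer.
    def gray_levels(printer_dpi, screen_lpi):
        return (printer_dpi / screen_lpi) ** 2 + 1

    def useful_image_ppi(screen_lpi):
        return screen_lpi * 2  # rule of thumb: image_res = screen_rule * 2

    print(gray_levels(2400, 150))  # 257.0 levels
    print(useful_image_ppi(150))   # 300 ppi; anything more is wasted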
Re:Clarify - should be PPI (Score:2)
Uh-oh. You're confusing (deliberately or otherwise) "pixel" with "sample."
In context of this discussion, pixels are physical objects, spots on a screen, transistors in the LCD. In a different context, pixels are numbers representing the color or luminance of a point in a raster, but that's a different context.
Here, it's more accurate to talk about "samples-per-inch," rather than pixels-per-inch. When you send data to an imagesetter for halftoning, you should have four image data samples for every halftone cell. In other words, your linear sample resolution should be twice your linear halftone resolution, which in turn is usually one-sixteenth the linear spot resolution of your imagesetter.
If you're trying to clarify things, you're not doing a very good job by using the ambiguous words "dot" and "pixel" over and over again.
Since if somebody asked, what DPI is that picture, the answer is 2400, but the resolution of the image is no better than 300 PPI.
Actually, once you halftone screen the image, you have completely thrown away the excess resolution that was originally captured in scanning. The resolution of a printed photograph is its line screen, full stop. If that line screen is 133 lpi, then the resolution of the image is 133 lpi, and no more.
Re:Clarify - should be PPI (Score:2)
The idea of LPI applies to halftoned images.
Bzzt. Wrong again, Mal. The idea of LPI has been applied to halftoned images. By the time the image (which was originally a tint or a contone) has been halftoned, its line screen is no longer relevant to anybody but people who correct newspapers with red pens.
Not "bleed," - "dot gain." "Bleed" is what one does outside of trim marks and if one doesn't know how to use a cork-backed ruler at the glass-topped light table. "Dot gain" is what happens when you print on paper stock that has the relative absorbency of a paper towel and the ink spreads out like a juice spill.
Hmm. Seems to me that when ink bleeds, dots gain. Wouldn't you say?
Re:Clarify - should be PPI (Score:2)
(Yeah, I know dots per inch is the established standard and it's not going to go away any time soon. But I still think metric is the way to go. Alternatively you could measure dots per millimetre - but something measured in 'per unit of length' is a more complex measurement than just a unit of length. Still, often there's the feeling that higher is better (e.g. CPU speeds are measured in Hz, not cycle time), so maybe dots per millimetre is it.)
Hmmmm (Score:1)
IBM has had for over a year (Score:2)
Too small (Score:3, Interesting)
This thing may find a place in CAD work, but the raw resolution will be utterly useless in normal day-to-day applications.
Re:Too small (Score:2, Informative)
Many operating systems are already using some form of vector icon or considering moving to it (KDE, Mac OS X)... it takes more compute time, but not a lot (you only have to compute the raster equivalent once for a given screen size).
Once that happens, you'll be happy using the high res screen and setting the icon size to say, 1 inch, while others might choose a 0.5 inch icon.
Re:Too small (Score:5, Insightful)
Re:Too small (Score:2)
Re:Too small (Score:2)
The problem isn't the monitor, the problem is our windowing systems, be they MS Windows, X-Windows, MacOS, or otherwise. 150 dpi was the effective upper limit on resolution for so long that people started treating it like it was carved in stone.
A good windowing system (and any software under it) should assume that a 1,500 dpi monitor might appear tomorrow. How will you make use of it? Don't just assume that I'll still want my text to be 30 pixels high. We should all be enjoying 150 dpi text on screen now, and looking forward to 300 dpi soon. Resolution improvements in printers from 150 to 300 to 600 to 1200 dpi were heralded as great improvements, but no one seems excited about nice crisp text on screen!
Most modern systems handle fonts pretty well, but the setting isn't obvious or easy enough. (I've known too many people who keep MS Windows in 640x480 "because the text is bigger" instead of increasing the resolution and the font size.) Less correctly handled are icons and buttons. Apple has made some improvements by requiring high-resolution bitmaps for their new task manager bar thing. As a result, high resolution displays get nicer looking icons. Some systems support vector-based icons that will scale to any size (Irix's default file manager comes to mind).
But if your windowing system shows you unusably small icons or other widgets on high resolution monitors, complain to your windowing system/operating system provider!
OS X the solution? (Score:2)
Even on a 200dpi screen, that means an icon would be good for 0.7" on the screen.
Perhaps that's also why OS X is going with wysiwyg screen fonts, with the assumption that higher resolution displays will mean better font fidelity without additional font tweaking?
Why Viewsonic Sucks G0at Ass (Score:3, Insightful)
To obtain warranty service, you will be required to provide:
The original dated sales slip
Your name
Your address
The serial number of the product
A description of the problem.
A dated sales slip? Even after 3 years? Come on! OK, fine, I can dig out an invoice. But they also want you to ship it back in the ORIGINAL box! Who has that after three years? This is ridiculous. They wouldn't take it since I didn't have the original box! My yearly IT budget is only around $150,000, but rest assured they won't see a dime. After that I started buying HP monitors only: one goes bad, I call, no runaround, they next-day ship a replacement and pay for the return shipping. Class act right there.
Re:Why Viewsonic Sucks G0at Ass (Score:2)
Have you ever shipped glassware by freight? No?
Maybe the people who make and ship thousands of monitors made out of brittle glass know how to pack them, and maybe they want to eliminate another variable. My Daytek monitors require you to use the original boxes and materials for warranty shipping, and I'm right with them on that. I don't want them broken by some drunken mover, nor do they want users claiming a monitor is broken because of improper shipping.
If you really had a budget of $150,000 USD, you could easily have the boxes unfolded and stored with the foam, right next to the room where you keep all those Windows licence papers.
Um, you can usually buy a box from them.. (Score:2)
If that didn't work, a couple more calls probably would have done the trick.
YMMV. I believe the company was viewsonic, but I might be wrong.
Video cards that can drive this? (Score:2)
Sweet! (Score:2)
Re:Sweet! (Score:2)
50pt print, btw, is 50/72nds of an inch, not 50 pixels.
Re:Sweet! (Score:2)
Re:Poorly-written apps (Score:2, Insightful)
They get rewritten for the PHB with the fancy new monitor.
Re:Poorly-written apps (Score:2)
Re:Not everything in winapi is points based (Score:2)
>>>>>>>>
You know, I really shouldn't have to quote your own post for you.
Limitations of the Eye (Score:2)
So at what PPI do we surpass the ability of the human eye to distinguish the individual pixels? I run my desktop at 1600 x 1200 and it's *very* tough to see the individual pixels. At what point does it become impossible?
Re:Limitations of the Eye (Score:3)
If they are NOT touching, then you are right. But since most images are made up of colors that touch each other, it's a very important phenomenon.
Also, find something that draws a black line diagonally at 10 degrees across a white background and tell me that 1600 x 1200 is enough. The entire reason there is so much attention paid to antialiasing in games, fonts, and graphics programs is precisely because there is nowhere near enough resolution on a monitor. 200 dpi is a step in the right direction, but it'll be at least 300 dpi before computer displays start approaching the comfort level of looking at a printed page.
Re:Limitations of the Eye (Score:2)
We face a similar issue with resolution. It's not whether we can see the individual dots, but what things look like in the aggregate because of the number of dots. Yes, there is a vanishing point, or at least an issue of diminishing returns, but we can continue to see improvements in overall quality when we can't actually focus on individual pixels anymore.
The visual threshold is not measured in PPI (Score:2)
He was asking what PPI = how many pixels _per_inch_.
That depends on how close the eye is to the object being viewed. Obviously, it takes only half the PPI to fool the eye for an object 100 cm away from the eye as for an object 50 cm away from the eye. The eye sees in radians, and the brain converts that to metres based on depth cues.
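A common rule of thumb (my addition, not the parent's) is that a good eye resolves about one arcminute, which lets you sketch the PPI needed at a given viewing distance:

    import math

    # PPI at which adjacent pixels fall within one arcminute, roughly the
    # resolving limit of a good eye (an approximation, not gospel).
    def ppi_needed(viewing_distance_inches):
        one_arcmin = math.radians(1.0 / 60.0)
        pixel_pitch = viewing_distance_inches * math.tan(one_arcmin)
        return 1.0 / pixel_pitch

    print(ppi_needed(24))  # ~143 ppi at a 24" desktop viewing distance
    print(ppi_needed(12))  # ~286 ppi for something held 12" away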
Great for no-glasses-required 3-D (lenticular) (Score:2)
Remember those cool little "flip cards" you got in Cracker-Jacks, where the image changed when you rotated the card? Well, that's lenticular imaging. This technology is also used for 3-d imagery because the image that you see depends on the angle at which you view the image. Because your eyes each see the same point on the screen from a slightly different angle, the screen shows each eye a different image (allowing proper 3-D).
Using this screen (200ppi) and a 40-line-per-inch lenticular screen, you could see 5 different images depending on the angle you are viewing from... not bad at all.
(BTW, I write "shareware" to produce lenticular images... http://www.lenticularshareware.com)
MadCow.
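The column interleaving involved is easy to sketch - this assumes equally sized frames and exactly one column per frame under each lenticule:

    # Interleave N source frames column-by-column for a lenticular print:
    # column 0 from frame 0, column 1 from frame 1, and so on, repeating.
    def interleave(frames):
        n, height, width = len(frames), len(frames[0]), len(frames[0][0])
        return [[frames[x % n][y][x] for x in range(width)]
                for y in range(height)]

    a = [list("AAAAAA")]
    b = [list("BBBBBB")]
    c = [list("CCCCCC")]
    print("".join(interleave([a, b, c])[0]))  # ABCABC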
Re:Great for no-glasses-required 3-D (lenticular) (Score:2)
I would suggest just interleaving two images -- or at most three. I've written about this before, but can't be bothered to fight with slashsearch to find it.
4.7Ghz - wow (Score:2)
They need to get a PIV at 4.77Ghz and an 8088 at 4.77Mhz side by side. It'd make a neat statement.
Gates's Law (Score:2)
They need to get a PIV at 4.77Ghz and an 8088 at 4.77Mhz side by side. It'd make a neat statement.
Windows XP would still take about as long to boot on the 4.77 GHz machine as MS-DOS 2 would on the 8088.
Gates's law: The time taken to perform simple operations in mass-market software, measured in microprocessor clock cycles, will increase in subsequent versions of a software product at a rate roughly proportional to the increase in clocks per second of newer microprocessors. Thus, given a lack of funding for increasing hardware speed and a requirement to "keep up with the Joneses" dictated by changing proprietary file formats, the speed of software halves every 18 months [tuxedo.org].
Size, not resolution (Score:2)
Great, Another Model I Can RMA To Hell (Score:2)
I have little faith left in ViewSonic monitors. I suffered through eight RMAs on an 817-series monitor in the course of two years. All except one failed the same way: they just went "poof". One monitor died within 8 hrs of my receiving it. The final straw was when they shipped a monitor that looked like something had broken loose inside the tube and bounced around, scarring up the back of the panel.
Each and every time I called, they professed ignorance and told me that there was no quality control problem with the 817. But they had an abundant supply of refurbished 817s. And I had to pay freight for each and every return. At 71 lbs, those babies weren't cheap to ship.
After about the third return, I tried to convince them to ship me a different model. They wouldn't.
Funny though, my 815, which sits on the same table, has been lit for going on 5 years now, with not a problem.
Think "other than quake" people, please... (Score:2)
Think brain surgery, high-res scans, super-accurate 3D models to further help reduce prototyping stages.
Maybe Duke Nukem Forever, though, 'cause this monitor will be around $500 by the time that game comes out.
I'd rather the monitor than 8 grand (Score:2)
Me, ME ME!!! I'd rather have that monitor than 8 grand.
Of course I'll bet that if I had 8 grand sitting in a bank account my tune would change, but since I'm unemployed that doesn't seem likely.
Update speed sucks (Score:2)
We oohed & aahed at the wonderful clarity of the picture. We marvelled at the sheer detail that 200dpi can give you. We were awed at the expansive range of the viewing angle. We were even impressed by the quantity of zeroes on the price tag. Then we saw the picture change.
It does not update fast. In fact, one system took nearly a third of a second to draw the next picture, and the other took closer to two-thirds of a second. Worse, they updated in quadrants or vertical strips, and the effect was quite jarring. This is not a monitor you could use for animation.
An AC posted elsewhere here that they can get up to 20 Hz updates. If so, that's a huge improvement over what I saw, but it still sucks big time for most usage.
Nice picture (Score:2)
If it weren't a 200 dpi monitor but only 100 dpi, I probably couldn't see the difference.
200 dpi PDA screens? (Score:2)
Heck, give me one of those and you can call me "four eyes".
We're testing one of these (Score:2, Informative)
That said, it is beautiful to behold. It has a nice wide viewing angle, and is quite bright. They sent us the monitor with a PC running Windows and a gallery of images installed. The images looked very, very nice - you could barely see the pixels at all! But for some reason, even though the images were all 3840x2400, we still had to pan around them. Guess what?
OK - a trip to Monitor Properties and we are seeing it at native resolution for real this time. It was almost like looking at a very clear picture, as a previous poster wondered. We had some images of the moon's surface that were better looking than any I've ever seen before, even in print. I'd post screenshots for you if I could. It was so nice that one of my colleagues suggested to the ViewSonic people that they ship the monitor with a magnifying glass. He wasn't kidding either.
About text: like people keep mentioning, it is awful. In Windows, which I find uses quite small text by default anyway, the text on the start bar was illegible without getting up close and peering. In Linux, setting an xterm's text size to Huge makes it legible if you squint. It really is a problem that OS developers need to address, because it's starting to bug me even with my 1600x1200 15" laptop screen. Are the physical dimensions of the monitor available to the OS (via EDID or something)?
Well, must go. -matt
My screen is over half way there. (Score:2)
Re:Need a 128mb card (Score:2)
Brand-new cards tend to be slowed a good deal by 1600x1200 with all the eye candy turned on in a new game. I think to run at that sort of resolution, you're really looking at waiting a generation or two of graphics cards (and probably CPUs) before something could really take advantage of it. You might be able to run an older game on it, though.
I really don't think the advantage of these monitors is in gaming. I think you'll be much better off getting one of these monitors to do graphic arts or to put a whole shitload of code up on the screen at once.
Time to break out the cluepons.... (Score:2)
Re:I just want an affordable 1600x1200 LCD! (Score:2)
USB and normal features included, and it pivots to do 1200x1600, which gets used less often than you'd think but is nice for gee-whiz shows.
--
T
Re:I just want an affordable 1600x1200 LCD! (Score:2)