Super LCD Screens: 200 PPI 263
crovira wrote to us with an article from the NYT (free reg. req'd.) Apparently, both Toshiba and IBM are dissatisfied with the current state of monitor development. To that end, they've created some wondrous toys like 16.5" LCD screens that display 200 PPI. They've run into a most curious problem, however: legacy software/drivers. Click thru to read more - and if IBM should want to send a couple of screens my way, I'd see what we can whip up around the office *grin*.
This is news??? (Score:1)
Re:Windows already is (I bet you love that) (Score:1)
Apple's present PR engine's mantra is that icons are represented up to 128x128, but by a bilinearly mipmapped graphic. I.e. there are also 64x64, 32x32, 16x16 (and possibly lower powers of 2) versions, and it chooses and scales the closest version of the icon size-wise. No vectors. You really need 64x64 icons in order for vector icons to work, in my experience. Scaling lower than that just gives a mess.
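Something like this toy selection logic, if I had to guess at it (names and behaviour assumed, not Apple's actual code):

```python
# Guess at the mipmapped-icon selection logic described above
# (names and behaviour assumed; not Apple's actual code).
AVAILABLE = [16, 32, 64, 128]  # the power-of-2 versions shipped with an icon

def pick_icon(requested):
    """Return (source size, scale factor) for a requested on-screen size."""
    best = min(AVAILABLE, key=lambda s: abs(s - requested))
    return best, requested / best

print(pick_icon(50))   # (64, 0.78125)  - shrink the 64x64 version
print(pick_icon(100))  # (128, 0.78125) - shrink the 128x128 version
```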
Re:Windows already is (I bet you love that) (Score:1)
Re:Windows already is (I bet you love that) (Score:1)
John
Re:Resolution independent GUIs (Score:1)
John
Re:But a cheap hack is available now! (Score:1)
First, the font system is pixel based. Second, all the other drawing primitives are pixel based. There are no given resolution-independent units to work in, and an extension won't help since not everybody will be using it. Get into the real world -- X is further behind than EVERYTHING else in the market today.
p.s. a target sized theme won't affect the applications, you just get a 'normal' sized window frame, and 'anything goes' inside it...
If you ask me, killing the dead horse is a small price to pay for 4000x3000.
John
Re:But a cheap hack is available now! (Score:1)
John
Re:But a cheap hack is available now! (Score:1)
John
PDF 1.3 is a subset of Postscript 3!!! (Score:1)
Basically, all the programming stuff is pre-executed, and the contents of each page, in terms of PostScript primitives, is canned at each showpage command (well, it's a little more, but not much).
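A toy model of that pre-execution, for the curious (my own illustration, nothing like Adobe's actual distilling code):

```python
# Toy model of "pre-execution": a PostScript-like program with a loop is
# run once, and only the flat list of drawing primitives emitted before
# each showpage is kept - roughly what a PDF page content stream is.
pages, current = [], []

def moveto(x, y): current.append(("m", x, y))
def lineto(x, y): current.append(("l", x, y))
def showpage():
    global current
    pages.append(current)
    current = []

# The "program": a loop, which PDF itself has no operators for...
for i in range(3):
    moveto(0, i * 100)
    lineto(500, i * 100)
    showpage()

# ...but the canned result is three flat, loop-free page descriptions.
for n, page in enumerate(pages, 1):
    print(f"page {n}: {page}")
```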
John
Re:Drivers are usually written by the manufacturer (Score:1)
John
Re:Coolness. Apple to the rescue? (Score:1)
John
Re:heh heh (Score:1)
John
Big problems with software... (Score:1)
Perhaps those gigantic 128x128 pixel icons in MacOS X don't seem so huge now. And with their Display PDF system, a properly designed icon should be scalable to any size you want.
And with the Mac's popularity in the imaging business, I wouldn't be surprised if the first major customer of these 200 dpi displays is Apple. Just imagine an iMac with a 17" display and a 200 dpi resolution.
Will bandwidth be a problem? (Score:1)
I see a weak spot though, and that's the bandwidth required to keep screen response reasonably interactive. Comparing a high end 100 dpi monitor to a 200 dpi monitor means that 4 times the memory bandwidth would be required to maintain the status quo. This requirement is pushed all the way up from the underlying hardware into the software technologies such as the X servers, or whatever it is that handles display primitives in Windows.
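The 4x figure, spelled out (my arithmetic, assuming the same physical screen area):

```python
# Doubling linear resolution quadruples the pixel count for the same
# screen area, and hence the memory traffic at a fixed frame rate.
def pixels_per_sq_inch(dpi):
    return dpi * dpi

print(pixels_per_sq_inch(200) / pixels_per_sq_inch(100))  # 4.0
```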
I don't follow the windowing developments under Linux very closely, but with the V3.x servers running on a TNT with 16 megs of memory, Linux graphics perform a lot slower than they did under Windows 98. Will V4.x achieve parity (or perhaps go beyond parity), and would it keep up with a 4x throughput increase?
Alternatively, you could argue that adaptive resolution would be a useful approach to cutting bandwidth. Text is usually black and white and suffers greatly at today's resolutions, while colour images are acceptable at today's resolutions. If the display could handle addressing multiple pixels for colour information to form a less fine resolution, the overall bandwidth could be greatly reduced.
Re:Not a legacy driver problem, per se.. (Score:1)
*imagining pine, vim, lynx, zicq in 200x100 rxvt's*
And think of the console modes with the 2.2 kernel framebuffer support... *doing math*
with an 8x16 console font, 2048x1536 comes out to... 256x96... even 1600x1200 is 200x75. Mmmmm.
Now THAT is where it's at.
--Ben
Neat Tech, but what's with the opening paragraph? (Score:1)
What? Unless I've been walking around in a trance, exactly the reverse is true. Even a cheap monitor has a better quality picture than a fairly expensive TV. Potential resolution, scan rate, dot pitch: all better on monitors than crappy NTSC TVs.
Re:PostScript does that (Score:1)
Apple now has display PDF in their upcoming MacOS X.
Correct you are, sir. In fact, during the Rhapsody days, Apple engineers amazed themselves by putting together a Display PostScript interface for their latest effort (soon christened MacOS X Server) in five working days. Yes, the NeXT software development infrastructure was and is that good.
IIRC, the only reason Apple got rid of Display PostScript was PDF's emerging dominance as a graphical layout standard. (Oh well. They had the spare week to code it anyway..)
Anecdotes like this make me hope that OS X Consumer will have some of the development tools available. Drool drool drool...
-----
GNUStep (Score:1)
Display PostScript (Score:1)
That was the whole impetus behind Display PostScript, pioneered by the NeXT computer / OS. It made for a very cool display / printer integration, because what you saw on screen was exactly what you'd get on paper. It took care of all the scaling issues for you. NeXT was very much ahead of its time in the late eighties.
Unfortunately, it was perhaps too computationally-expensive for the machines available at the time. You gotta remember that back in 1989, a 16MHz 68020 was considered a high-end microprocessor.
Display PostScript would be a lot more viable today, except for 3-D games. Most graphic accelerators now have way more power than they need to keep up with 2-D display tasks. And the CPU power is there too, even with a low-end, non-OC Celery.
RAM requirements 200dpi display are enormous (Score:1)
The RAM requirements of a 200DPI display are enormous.
I have 13", 17" and 21" monitors hooked up to my G3. Obviously, what I want/need is screen real-estate. I shudder at what a card for a decent sized (30"
And the acceleration needed to handle all those pixels per square inch... I'd need a water cooled side-car.
But Tomb Raider would really be worth playing.
Re:Windows already is (I bet you love that) (Score:1)
Gates always said: "Make it more Mac like"
Seriously, Display PostScript was a better solution since day one.
Microsoft suffers from IBM's old problem: "NIH"ilism. We'll see if it survives when the world switches to 64-bit architectures.
Re:Drivers are usually written by the manufacturer (Score:1)
--
" It's a ligne Maginot [maginot.org]-in-the-sky "
Re:Quake MIGHT not look good (Score:1)
Re:SP25 Maybe? (Score:1)
technology "stuck" since 1983 (Score:1)
monitors in 1980 at $30K. The resolution jumped to 1280 x 1024 by 1983 and essentially remains stuck there. The difference is the price has fallen to $500 and the application software has grown exponentially. (There have been a few specialty monitors at 1600 or 2048 if you add another zero to the price.) We were told that for electron gun technology you just can't form clear pixels faster than about 5 nanoseconds, hence the limit on those types of monitors. Multi-gun, multi-monitor technology has been around - either kludgy or expensive. The IBM approach is looking at non-gun monitors where there is still growth possibility.
Re:Problems with hi-res monitors (Score:1)
Re:What about E ink? (Score:1)
Re:IBM misses to boat -- again (Score:1)
The return of Sun's NEWS (Score:1)
Sounds like IBM and friends need to either fund conversion of open-source apps to X11-DPS, or form a new consortium to develop a new Open Source resolution-independent (or at least less-dependent) protocol and libraries.
Re:The more things change... (Score:1)
And this is off topic why? The original article stated that they were having difficulty with device drivers, etc. So I mention my own experiences, which would have been more successful if Linux and OpenGL had existed as they do now.
The whole point of my post is that if IBM and Toshiba are having problems, they can solve them faster by working with the community than without.
Re:Photolithography etching has got to be expensive (Score:1)
My favorite aircraft is definitely the A10
Re:You CAN scale fonts in windows (Score:1)
Unfortunately, several windows programs will notice the custom font setting and refuse to run (with a kind little dialog box which says, basically: "Hey weirdo, fix your font setting, you're being different, stop it.")...
So windows can handle some of this, in theory, but it's basically useless in real life. (gee, sounds like windows in general...)
Re:Windows already is (I bet you love that) (Score:1)
As you mouse over that task bar in Quartz, the bitmap icons grow and shrink. They don't just pop up larger and then pop down smaller; they execute a smooth grow/shrink based on a vector used to scale the bitmap. If you select an icon in the task bar it doesn't just open an associated window/app, it flows the information out from the width of the icon to the width of the app using two bezier curves to define the sides of the flow.
It's quite slick, as long as it doesn't suck CPU power like a ShopVac.
Re:Windows already is (I bet you love that) (Score:1)
Most Windows icons are 32x32 pixels with the tacit measurement assumption of 72 pixels/inch. That means that at most used resolutions your Windows icons/toolbar icons will nominally appear as 1/2" squares. If your monitor isn't based on 72 pixels/inch but 200ppi, then your 32 pixel icon will only take up a square area of about 1/6" on a side on your display. I don't know about you, but many times I find the current icons too small to impart any useful iconographic information. If those icons become 1/3 of their current size then there is probably no sense in having them.
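Checking that arithmetic (square icons and exact ppi values assumed):

```python
# Physical size of a 32x32 icon at two pixel densities.
def icon_inches(pixels, ppi):
    return pixels / ppi

print(icon_inches(32, 72))   # ~0.44" - roughly the half-inch icon of today
print(icon_inches(32, 200))  # 0.16"  - about 1/6", too small to be useful
```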
It will take a fundamental change in the way Windows handles icons in order to get them to display properly on such a dense resolution monitor, since Windows likes to keep the icons in the actual executables or DLLs...
Display PostScript and its successor, the new Quartz "display Acrobat", which can modify bitmaps with vectors, is the proper way to go to achieve *true* device independence for a GUI.
Re:Resolution independent GUIs (Score:1)
What was NextStep about? Didn't the NeXTstation write to the b/w monitor (actually four levels of grayscale) through PostScript? So presentations on screen were really WYSIWYG, as the same language was used to write to the screen as to the printer. And we know of laser printers having 300, 600 or 1200 dpi!
The technology exists! Don't reinvent the wheel.
ms
Type1 may be a better choice (Score:2)
TrueType tends to have better hinting for low resolution devices like a CRT screen, while Type1 gives better output on high resolution devices such as printers. With these high resolution LCD screens, Type1 may be a better choice.
Also, remember that while scrollbars, entry boxes and the like can already be any size you want, on a high-res display the 3D effects will get lost, as they'll be too small to see. Converting these to vectors will help. But bear in mind that not everything is suitable for conversion to vectors. Many Enlightenment and GTK themes, for example, would be all but impossible to make into vectors.
I suspect what we'll end up with is a halfway house where theme designers will have to make multiple versions for various resolutions, much like we have 75 and 100dpi bitmap fonts now.
Inflating Icons (Score:2)
I remember they did something similar with an earlier version of OS/2. It really pissed me off. I reconfigured an OS/2 box such that instead of the default 640x480 it was working in 1024x768, only to find that IBM in its infinite wisdom was now showing all the text at the same size as before, only with higher resolution.
Very clever, they no doubt thought, the 9 point text is still 9 point text - only smoother. Of course, what I actually wanted was more bloody text on the screen - and bugger the point size.
I'm pretty sure they played the same game with the icons too. Damn their eyes.
When I get my hands on one of these 200dpi LCDs I do not want one of IBM's inflating pixel drivers with it.
Regards, Ralph.
Vertical Horizontal refresh and LCD (Score:2)
First of all, with my limited experience doing low level graphics, I seem to remember only being able to write to the screen during the horizontal or vertical retrace of the monitor. Do these new LCDs emulate this to allow legacy software and the like to work?
Second, is there a new all-digital standard for connecting these to video cards? With no analog components this would seem to give better quality and allow me to have a longer monitor cord.
Re:Resolution independent GUIs (Score:2)
The display hardware is bitmapped anyway (I think it's harder to make a CRT that can steer the beam arbitrarily, as the old vector scopes did, than to make a raster CRT) - and intrinsically so if it's an LCD. There's going to be bitmapped graphics code somewhere at the bottom.
However, you can implement a resolution-independent layer above that, e.g. Display Postscript/NeWS or Display PDF.
Re:then get rid of icons (Score:2)
I am very happy that neither my work nor my home desktop have a display that involves steering an electron beam (Hint: what do the initials "LCD" in the title of the story to which these comments are all attached refer to?).
I am unconvinced that going back to vector CRT monitors would make things better.
Re:Not a legacy driver problem, per se.. (Score:2)
If you are having problems with 10pt fonts looking small then your system is broken. A point, by definition, is 1/72 inch high. This is regardless of resolution, so the system should be scaling the font's pixel representation based on the actual screen resolution.
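The scaling in question is just one line (1 point = 1/72 inch by definition):

```python
# Convert a font's point size to pixels using the true screen dpi.
def points_to_pixels(points, screen_dpi):
    return points * screen_dpi / 72.0

print(points_to_pixels(10, 75))   # ~10.4 px on a typical 75 dpi screen
print(points_to_pixels(10, 200))  # ~27.8 px on a 200 dpi panel - same physical size
```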
X, at least, gets this right. I am running my 21" screen at 2000x1500 and all the fonts are fine. Everything in Netscape, application menus, and the window manager are perfectly readable and about the physical size I would expect from the point sizes I have chosen.
-jwb
Re:But a cheap hack is available now! (Score:2)
Maybe if you're only using gtk/qt based apps, someone could throw together a theme to size up their widgets. Anyone know if the theme support is flexible enough to allow an easy scaling up of all apps that use these toolkits?
Re:Resolution independent GUIs (Score:2)
In Berlin, rendering the vector-based description of the GUI to actual pixels is done at a very late stage, thus allowing almost anything (CRT, LCD, sport arena jumbotrons(!)) to act as visual device - just rewrite the rendering and output modules & off you go.
With the varying resolutions on screens (PDA->CRT->this new LCD), this is the natural next step in GUI development.
Re:Quake MIGHT not look good (Score:2)
I've got a Sharp Ultralight PC with a superb LCD (TFT) screen. There is no ghosting. I've played right through Quake II on it, no problems (though I recommend MS's Dual Strike controller). My primary work screen is the LCD screen on this portable. I'm looking at it at least 8 hours a day. The definition, brightness and contrast are all very pleasant on the eyes. I have never suffered headaches.
The only "bad" thing I have noticed is that areas of heavy bands of BLACK and WHITE can cast a slightly noticable shadow above and below them on the screen. I think Fox's news site background graphic is an example. Probably something to do with an "array" of wiring (or something)...
Re:Not a legacy driver problem, per se.. (Score:2)
I can read xterm's "tiny" font with no problem on my SGI 1600 Flat Panel at 1600x1024 resolution. The same font is unreadable on a 21" CRT monitor at 1280x1024. Comparing resolutions between CRTs and flat panels is apples to oranges - the clarity of a flat panel is akin to getting glasses after your eyesight has gradually deteriorated: you simply can't believe how much better you can see, and how much more seldom those headaches kick in.
You are correct in noting that the resolution increase is such that all but the largest point (pixel) fonts we're used to today would become unreadable, even on a flat panel. But you (or others, anyway) would be surprised at just how clear and readable tiny print is on a flat panel, as opposed to a continuously scanning (and distorting) CRT.
The solution is, as others have said, a level of abstraction between the physical world "length" and "area" screen measurements and the digital world "size" in pixels, preferably with user definable mappings between the two for "arbitrary" virtual resolution ranging from "unreadable" to "large enough for all but the completely blind."
Maybe IBM could ship off a display in the direction of the XFree86 group...
Re:PostScript does that (Score:2)
Two Words (Score:2)
NextStep = OS X
OS X is Apple's GUI on top of NextStep. Remember when Apple bought NeXT a few years back? They didn't reinvent the wheel. They own it.
Legacy software & drivers ?? (Score:2)
OPEN THE HARDWARE SPECS, and sit back to watch the magic happen.
If not CRT; lasers? (Score:2)
If you had a fluorescent gas encased between two plastic sheets, and activated it via a combination of RGB lasers and priming lasers, you could conceivably get infinite resolution displays, assuming the motors and laser pulses could accommodate the speed, and that the data bus could feed the device fast enough.
And there is no reason for the lasers not to fire parallel to the surface of the display; just use two intersecting lasers to 'activate' the gas.
Just a random thought
-AS
While I take your response on faith; (Score:2)
Can Windows support WMF images instead of BMPs and ICOs in the DLLs that generate all the desktop/widget icons?
-AS
Can Windows display WMF images instead of icons? (Score:2)
Though it obviously doesn't help that every Windows program today has BMPs and ICOs embedded within DLLs, which aren't in any sort of realistically scalable format.
-AS
Coolness. Apple to the rescue? (Score:2)
I'd imagine an Apple handheld using a 3x4 inch 200dpi display (that's 800x600! wow) would be awesome. Or a 19" LCD for their pro series desktop, when they can scale to that size... 3000x2200 here! Or imagine nextgen gameboys with monocle displays; a 1x1" display would easily match today's gameboy resolution of 160x144 pixels. Or a PDA with a monocle display!
I wonder how Apple is going to try to capitalize on their ability to display 128x128 pixel icons and their Quartz Display PDF capabilities? Right off the bat the icons would be larger than 5/8", which is about right, if on the small side. I'm not sure it was explicitly mentioned, but Toshiba are only shipping 4" and 6" screens; weren't IBM's limited to that size too, by omission of detail?
Wow. Cool.
-AS
Re:PostScript does that (Score:2)
NeXT had display postscript over 10 years ago.
Apple now has display PDF in their upcoming MacOS X.
So your prayers are answered, and they have been answered by the man called Jobs. Fear.
Anyway, it may be that Apple can support this right off the bat, with their vector based display system, Quartz, and 128pix icons, etc. All of a sudden, it seems everything is going Apple's way. Hmmm.
-AS
Windows maxiconsize = 64x64? (Score:2)
Of course, that's actually still too small, and it doesn't scale very intelligently, but it's a step in the direction. I'm betting Apple will be able to leverage this new display technology and wow everyone =)
-AS
Would Display PDF help? (Score:2)
Re:Windows already is (I bet you love that) (Score:2)
Re:Huh? (Score:2)
Re:Windows already is (I bet you love that) (Score:2)
Re:Drivers are usually written by the manufacturer (Score:2)
Hmm.. better example: if you go into Adobe Photoshop and use the text tool to write "Hi Mom" and then resize the image (without flattening the layers) to four times the pixel dimensions, which makes the image larger, you don't lose any image quality. It's not just zooming in; it's using the algorithms that define the curvature and lines of the letters (or vector graphics) to re-calculate the shape down to more specific pixels. It isn't "zooming in" in the traditional sense.
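A crude way to see the difference (my own toy, nothing to do with Photoshop's internals): re-sampling the abstract shape at each target size keeps edges one pixel wide, whereas scaling the pixels would give you 4x4 blocks.

```python
# Re-rasterizing a vector shape at each target resolution, instead of
# scaling already-rasterized pixels. (Toy illustration only.)
def rasterize_line(x0, y0, x1, y1, scale, size):
    """Sample an abstract line (unit coordinates) onto a scale*size grid."""
    n = scale * size
    grid = [['.'] * n for _ in range(n)]
    steps = n * 2  # oversample so no gaps appear
    for i in range(steps + 1):
        t = i / steps
        x = int((x0 + (x1 - x0) * t) * (n - 1))
        y = int((y0 + (y1 - y0) * t) * (n - 1))
        grid[y][x] = '#'
    return grid

for scale in (1, 4):  # 1x, then "four times the pixel dimensions"
    for row in rasterize_line(0.0, 0.0, 1.0, 0.5, scale, 8):
        print(''.join(row))
    print()
```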
Re:resolution/crack rocks (Score:2)
This is because your brain does something similar: images are picked up in pixels (a limited number of light sensitive cells) and combined into an image. The individual pixels aren't important, but the overall image is. So your brain doesn't process the individual cells. It rather combines the signals from clusters of cells, with some cells physically located together participating in different signals. Several of these signal channels (nerves) interact together.
So, the quality of screens depends on your definition of quality. Do you think it's better if it's more realistic, or better if you can put smaller elements on it, whether you can discern them or not? I think it's not a set criterion, but rather dictated by purpose and circumstances.
----------------------------------------------
Then scale the bitmaps! (Score:2)
heh heh (Score:2)
2200x1100 pixels with antialiasing.
Smoooth like silk.
resolution/crack rocks (Score:2)
Not largely relevant to the rest of the article, but...
For picture quality, however, many computer screens are put to shame by the cheapest discount store portable TV.
Uuuuh.... what? Someone needs to lay off the crack rock. The average computer monitor is like
Anthony
Re:Windows maxiconsize = 64x64? (Score:2)
Of course, that's actually still too small, and it doesn't scale very intelligently, but it's a step in the direction. I'm betting Apple will be able to leverage this new display technology and wow everyone =)
Well, actually, Windows supports icons up to 2048x2048 in size (according to VC++), and in 32 bit color - and amazingly, it does this wonderful thing where it automatically bilinearly interpolates them to the right size if the icons are too small. It'll also quite happily give you 72x72 icons for most things if you set it to do that - though it looks kind of weird. Using the Appearance tab, you can set everything so that it's a factor of X bigger than now, including font sizes (though for existing apps, a "large font" mode will be needed - and anyone who doesn't use StretchBlt in their apps to size things for the screen, and who doesn't use font metrics rather than direct pixel metrics, deserves to be shot).
It'll handle it quite happily. And, as I used to work there, I'll tell you right now that they were considering this and have plans in place to handle this when I was there (Oct 1998 - June 1999). But sorry, I can't tell you the details - you'll have to take it on faith.
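In the spirit of that advice, the general shape of dpi-safe layout code (hypothetical values, not anything from Redmond):

```python
# Generic sketch of sizing UI elements from font metrics instead of
# hard-coded pixels, as the parent recommends (hypothetical values,
# not actual Windows code).
def toolbar_height(font_height_px, padding_ratio=0.5):
    """Derive widget size from the font: survives any dpi change."""
    return int(font_height_px * (1 + padding_ratio))

# At 96 dpi a 10pt font is ~13 px; at 200 dpi the same font is ~28 px.
for dpi in (96, 200):
    font_px = round(10 * dpi / 72)
    print(dpi, "dpi ->", toolbar_height(font_px), "px toolbar")
```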
Custom made for OSS (Score:2)
At I.B.M., Dr. Wisnieff's laboratory rewrote the underlying code of Windows NT to inflate its display pixel counts. Lotus, a unit of I.B.M., also created a high-resolution-compatible version of its word processor.
With the open availability of specs, OSS programmers should be able to step in and provide a solution without something as drastic as a rewrite of NT. (The fact that an IBM lab has access to the NT source strikes me as odd for a whole other reason. Wonder if they can fix the rest of it while they are at it.)
It also should serve as a heads-up for all app developers, as something that will become an issue in the future. These monitors are going to be designed for the medium-end desktop, and Linux has the ability to be ready for when these things arrive, without having to re-write everything.
Drivers are usually written by the manufacturer (Score:2)
Given this, why should this cause a holdup?
More info (Score:2)
They (IBM) also have an interesting bio up on Robert L. Wisnieff, Manager, Advanced Display Technology Laboratory [ibm.com]. Interesting if you can ignore the market spin at least :).
Noel
RootPrompt.org -- Nothing but Unix [rootprompt.org]
Re:Problems with hi-res monitors (Score:2)
Dodgy Maths Mate...
17" by 80ppi is 1360 pixels wide, so you are using a screen roughly 12 pixels high? 14" by 80ppi is 1120, the resolution is 1360x1120 (or to be more useful, 1600x1200) which is 1,523,200 (1,920,000) pixels on the screen. Not 16320.
At 200ppi (3200x2400) there would be 7,680,000 pixels on the screen (not 40,800).
The simple solution to icons is to make them big (256x256) and truecolour, and then scale them down to fit in a certain area of the screen. Funnily enough though, at 200ppi, your biggest icon would be 1.25" on a side. I can live with 0.75" on a side icons, but any smaller... no way.
X will survive, but themes won't scale, and icons will have to be redone. I think that someone out there should start working on 256x256 icons for every program they can think of, 64x64 is just too small.
~~
Re:Problems with hi-res monitors (Score:2)
Following up to my own post...
256x256 truecolour icons would take up 256KB of memory. My desktop currently has around 40 different icons on it... that would be 10MB of my memory, and I know that with several windows open and some applications running, etc., that could easily go up to 160 icons, which is 40MB of my memory.
I think that even for the most complex of icons a vector representation would result in something smaller than 256KB! Looking at the Word icon, hmmm, that could be done in around 1KB (a rectangle, a 'W', some lines of text and a stamp). Yep, vector icons are the way to go, not bitmap.
Silly me! MacOS X could be outdated with its great technology - they won't be scaling their icons down, they will have to scale them up to be legible, even at 128x128! :-)
~~
Re:Not a legacy driver problem, per se.. (Score:2)
Probably why Apple has gone to display PDF in its next operating system then?
They have obviously seen this thing coming along - in a couple of years most laptops will have min 1600x1200 screens, probably 2048x1536 even, and do you want to hunt for those icons on that screen (shhh, don't even think of 200x100 character terminals).
This calls for a scalable desktop - one where things are specified in DPI and not pixels. Apple has done it. Windows hasn't, and won't until people start moaning (my taskbar is 1/8" high, I can't read the text...) but what about X?
~~
Re:Entirely new system (Score:2)
Re:Legacy software & drivers ?? (Score:2)
That doesn't even help you much. They can't sell stuff cheap enough for people to afford unless they can go for the big markets. That means Windows. And I don't see people fixing Windows to do pixel-quadrupling-of-all-apps-that-don't-know-better.
Re:Entirely new system (Score:2)
Indeed, video card technology needs to move forward a bit before this is useful. At these resolutions, just outputting at 60 fps, 24bpp eats video memory bandwidth on the order of gigabytes per second, before any rendering traffic!
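Rough numbers, using the 3200x2400 figure from another post here (scanout only; rendering reads and writes multiply it):

```python
# Scanout-only bandwidth for a hypothetical 3200x2400 panel at 24bpp, 60Hz.
w, h, bytes_per_pixel, fps = 3200, 2400, 3, 60
print(w * h * bytes_per_pixel * fps / 1e9, "GB/s")  # ~1.38 GB/s before any drawing
```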
Re:resolution/crack rocks (Score:2)
However, you're right about the blurring effect smoothing video on a TV. If you try playing back VHS on a monitor it looks far worse than on a TV, but only because the monitor shows up the imperfections of VHS.
HH
Re:Vertical Horizontal refresh and LCD (Score:2)
Unfortunately, you don't specify this in the config file (a grievous oversight IMHO).
You spec it on the command line, thus:
startx -- -dpi 75
or within the startx file itself:
serverargs="-dpi 75"
Re:Vertical Horizontal refresh and LCD (Score:2)
Careful, you're dating yourself there. Modern systems use a technique called double-buffering, where the screen shows one frame of video memory while the system writes to a second. You then flip on the vertical retrace interrupt. (Actually, the old Atari 800, Commodore 64, Atari ST, and Amiga all could do this; it was just the PC that was limited to one video frame at any decent resolution.)
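A minimal model of that flip (stand-in names for the hardware bits):

```python
# Minimal model of double-buffering: draw into the back buffer, flip
# only during vertical retrace. "wait_for_vblank" and "scanout" are
# stand-ins for the real hardware.
import itertools

buffers = [bytearray(640 * 480), bytearray(640 * 480)]
front = 0  # buffer the screen is currently showing

def wait_for_vblank(): pass  # stand-in for the retrace interrupt
def scanout(buf): pass       # stand-in for the display hardware

for frame in itertools.count():
    back = 1 - front
    buffers[back][:] = bytes(len(buffers[back]))  # draw next frame off-screen
    wait_for_vblank()            # flip only during vertical retrace...
    front = back                 # ...so no tearing is ever visible
    scanout(buffers[front])
    if frame == 2: break         # demo: run a few frames and stop
```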
The other limiting factor is memory bandwidth: If you have to pump "N" pixels a second out, that consumes much of the available bandwidth to the video RAM, leaving less for the CPU. However, modern video cards use very wide memory (128 bits or more) to allow more pixels per memory access, thus allowing more bandwidth to the main processor. So this too is less of a limiting factor.
However, I have experienced the very problem described in the article: I run 1600x1200 on my system at home, and certain foolish operating systems *cough*windows*cough* make teeny little icons. However, certain other systems *cough*xwindows*cough* allow me to tell the system the dots per inch, and most things work out well.
Re:24 bits! (Score:2)
While an analogue signal can theoretically represent an infinite number of colours, in the real world noise gets in the way. So an analogue signal can only represent a finite number of usable, distinguishable colours.
However, this is not what is meant by colour gamut. Every graphics signal will use some colour space to describe colours: modern monitors use RGB, meaning that the colour of a pixel is described as a triple of red, green and blue intensities, all of which must be positive. TVs use (for historical reasons) a stranger colour space of luminance plus two chrominance components (YIQ, in NTSC's case). Printers use CMYK, etc etc.
The colour gamut is the set of colours that it is possible to express in a colour space. Because our vision system is a bit interesting, and because the world is never perfect, colour spaces that are physically realisable can never display every shade of every colour. Some have better gamuts than others, meaning they can display more colours than others.
So the previous poster was saying that the colour space used to display NTSC television happens to suck when trying to describe red, because while it is analogue, some shades of red are just outside its range.
Monitors are better, but actually you can't make a device using three primary colours that can equal the real world. Even with a monitor, bright shades of cyan, say, will be displayed as mixtures of blue and green. A mixture is inherently less saturated than a pure colour. You can always get a brighter, more saturated cyan from a cyan-coloured laser, for instance, or a painter's pigment. This will be true for every colour that is a mixture of primaries.
Colour spaces have been defined, like the CIE space, that can describe all perceivable colours. They use some mathematical trickery to define "super-saturated" primaries, in other words red, green and blue that are brighter and more pure than physically possible. So you can't make a monitor to display it.
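For the geometrically inclined, a chromaticity is inside a three-primary gamut exactly when it falls inside the triangle of the primaries - a sketch using the standard monitor (sRGB) primaries:

```python
# A three-primary gamut, reduced to code: a chromaticity (x, y) is
# displayable only if it lies inside the triangle of the primaries.
# (Sketch using the sRGB primaries; real gamut mapping is 3D, not 2D.)
R, G, B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)  # CIE xy chromaticities

def cross(o, a, b):
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def in_gamut(p):
    """True if p is inside (or on) the R-G-B triangle: same-sign cross products."""
    signs = [cross(R, G, p), cross(G, B, p), cross(B, R, p)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

print(in_gamut((0.31, 0.33)))  # True  - whitish colours are easy
print(in_gamut((0.08, 0.85)))  # False - a laser-pure green lies outside
```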
Re:Entirely new system (Score:2)
Multiple Alpha processors (screw that SMP stuff. Three-way!), a G400 and two of these babies? Complete with MD, DVD-R and a *nix kernel with all of the 'legacy' support ripped out sound good? We'll hand roll it all in assembler for performance!
But a cheap hack is available now! (Score:2)
Doing it wouldn't be hard. Just assume an 800x600 grid, regardless of screen size, and specify coords based on it. You wouldn't break legacy applications, because they'd see an 800x600 screen.
X is even easier! Toss a 'target sized theme' on BlackBox or Enlightenment. Or, better yet, write a userland application that dynamically resets the WM fonts, terminal fonts, and icons based on resolution. You'd have to restart the WM at every resolution change to reread the altered settings, but that is a small price to pay for 4000x3000!!
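A sketch of such a userland helper (assumes a freetype-enabled xterm; the resource names would vary per app and WM):

```python
# Sketch of the userland helper described above: compute an effective dpi
# from the mode and the physical screen width, and feed scaled font sizes
# to xrdb. (Resource names are per-app; adapt for your WM/terminal.)
import subprocess

def effective_dpi(pixels_wide, inches_wide):
    return pixels_wide / inches_wide

def scaled(points_at_75dpi, dpi):
    return round(points_at_75dpi * dpi / 75)

dpi = effective_dpi(4000, 20)  # e.g. a 20"-wide 200 dpi panel
resources = f"XTerm*faceSize: {scaled(9, dpi)}\n"
subprocess.run(["xrdb", "-merge", "-"], input=resources, text=True)
```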
Re:A little overboard! (Score:2)
After looking at the interesting stuff Transmeta has patented on round-robin streaming to arbitrated processors, I'd like to see one in action. And what processor would shine the best in an application unconcerned with x86 support? Well, the Sun Ultra or Compaq Alpha, and I just have an Alpha preference.
It was a silly response, granted, but it was a silly idea I was responding to..
Re:Entirely new system (Score:2)
2. The G400 represents the bleeding edge for 2d cards. It still may suck in terms of upper limit resolution, but it's probably the best we've got. (And it has the odd snafu with anything above 1880x1440, although the stated limit is much higher)
I don't think it's video card technology that will need to improve: It's memory speed! S/D/RDRAM isn't keeping pace with the processor wars. Processor speed almost doubled in the last year, while memory only got 20-30% faster. Don't get me started on the fact I've seen 8 year old SCSI setups toast UDMA/66.
Re:Not a legacy driver problem, per se.. (Score:2)
Re:Not a legacy driver problem, per se.. (Score:2)
Re:Entirely new system (Score:2)
would be quite a feat, but just think about the raw power you could get from new chips and equipment that didn't have to be compatible with old chips/code/peripherals.
Well, unfortunately most people don't want to buy totally new things. I certainly don't. I would love to get the maximum life out of anything, using it until it is completely broken and only then buying something new. I guess some people never really care.
Re:Huh? (Score:2)
Well, that would work really well until you got massive concussions and skull fractures because your flat panel TV fell onto your cranium.
Re:Graphics Cards!!!!!!!! (Score:2)
of these babies might have 64 meg of VRAM. Egad. Without the D/A conversion, things like antialiasing work differently too, so they will have some really low level impacts on the design of drivers. You probably won't be at all happy if you just plug one of these babies into your box.
Well, if they make one that fits into an ISA slot I will be a happy person.
Re:Can Windows display WMF images instead of icons (Score:2)
PostScript does that (Score:2)
Re:Not a legacy driver problem, per se.. (Score:2)
Re:Drivers are usually written by the manufacturer (Score:2)
You may be thinking that the driver could just automatically resize icons and bitmaps when drawing them, but that poses a problem for the rest of the display. If a program wants to draw a 202x202 box and then put a 200x200 bitmap inside it, it will be a problem if the driver decides on its own to inflate the bitmap to, say, 400x400. Suddenly it isn't fitting into its rectangle anymore. And if the driver inflates everything, scaling all coordinates to some more viable size, then you aren't really using the display's full resolution -- the rest of the system thinks the display is really 96 ppi or something, and the driver is inflating and interpolating (which takes time to do well, and doesn't look as good as a true 200 ppi image).
MS Windows is already capable of resizing icons to whatever size you like (in Win98 and Win2K, at least) -- play around with the icon size control in the Appearance tab of the Display control panel to see the effect. But this doesn't seem to affect all icons -- only those on the desktop, and presumably those of any other program that goes out of its way to look up the desired size in the registry and rescale all its icons. And it doesn't help with non-icon images. And the resulting images are kind of ugly.
So I really don't think that it answers the issues to say that the display manufacturer should deal with this problem at the driver level. The real issue is that everyone is used to dealing with computer displays in the range of 70-100 ppi, and the farther you go beyond that, the more problems you're going to have. Ultimately I think we need to get completely away from the notion of measuring things in pixels, or defining stored images as arrays of pixels. Once you do that, the problem more or less goes away.
I love NYT reporters (Score:2)
Funny, I think the resolution of broadcast quality NTSC video on a 30 inch television sucks ass compared to a finely tuned, 1600x1200 21" computer monitor. Modern monitors are far superior to consumer televisions for the simple reason that you sit a lot closer to them. That, and the fact that broadcast standards (ie NTSC, PAL, etc) are a bitch to make high quality.
Compatibility Mode (Score:2)
I mean really, most displays will at least display standard VGA... That way if you don't have a proper driver, you can go to the lowest common denominator.
Of course if you are running something such as XFree or Windows, there will probably be a driver. After all, if you have drivers for those two, you've got better than 90% of everything supported.
A small extrapolation (Score:2)
Photolithography etching has got to be expensive (Score:2)
This is going to keep the costs way up there for at least several years. If they are targeting the 30M+ Windows customers they are going to need to get the price to come way down. Pretty cool though. I've seen some of those displays they have in the newer A-10 jets; they are very sharp and surprisingly tough, and the overall resolution was sorta like looking at a magazine photo.
SP25 Maybe? (Score:2)
Maybe it's that the fleet of developers are busy working on SP2??
Re:Not a legacy driver problem, per se.. (Score:3)
Nope, it's the assumptions that programmers make about screen resolution. The solution might lie in using 'third generation 2d engines' as discussed in a story on Slashdot one or two weeks ago. Apple's MacOS X will use such an engine. Basically it's vector-based graphics they are using. This means that the GUI is resolution independent and that anything you display on it will scale nicely from an ancient 14 inch monitor to a modern laser printer.
It wouldn't surprise me if these screens will be available for Mac OS X first.
I don't think you've read the article (Score:3)
Not a legacy driver problem, per se.. (Score:3)
Quake ought to look good on that sucker..
2D Rendering/Acceleration (Score:4)
I don't really understand how everything under the hood works, but it appears to me that when a game needs to display 2D graphics, sprites, or entire 2D UIs, it makes calls to the 3D libraries, loading these graphics as textures that the accelerator can then scale, rotate, and all that other fun stuff. Load up Quake 3 and look at the main menu. The text is the same size no matter what resolution you're at, and unless you're in some absurdly high resolution it doesn't pixelate; the graphics card automatically anti-aliases the pixels.
Display Postscript, Display PDF, and other forthcoming display technologies rely a lot on software and still treat the computer display as a raster medium. These will be interim solutions, but if the proper hooks aren't put into these software components for hardware acceleration, things like Display PDF will overrun the processor at very very high resolutions.
How about this? Let's treat the video memory area in the graphics card as a canvas and draw vector graphics in it. These will naturally include bitmaps with physical size information, and even a legacy "raster canvas" mode to accommodate legacy applications. Then all that has to happen is the graphics card does the rasterizing to whatever absurd resolution you need and pipes it to the display (a toy sketch follows the list below).
A few things this implementation has the ability to do:
1) Support everything NOW. In legacy modes, you'd get anti-aliased scaling, much like when I change my resolution on this LCD monitor in front of me to 640x480. Sure, you wouldn't get the benefit of 200 ppi, but at least you would have a crisper screen to look at (one of the things that annoys me most about my flat panel is that you can see the individual pixels and the grid in between them).
2) Take a lot of the graphics burden off of the CPU -- the performance increase from the modifications made to DirectX 7.0 to accommodate the GeForce, and the "GPU" idea in general, really begs for the same kind of GPU to migrate back to 2D, where it will be needed to support the massive resolutions of the future. Gradients, scrolling, and moving entire solid windows and widgets around are some of the most CPU intensive operations that can be done (check your meter while running E with all the bells and whistles on!). These are things that should be done by the video card anyway. Let's see some people start putting hooks in their software and OS's to allow this! Stop thinking in pixels already!
3) Support legacy apps on new software/hardware in the future. Load a raster-drawn app into a texture buffer and a virtual "screen" that has pixel dimensions, then scale those pixels on the canvas appropriately.
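Here's a toy version of the vector canvas (all names invented; a sketch of the idea, not a driver spec):

```python
# Toy version of the "vector canvas" idea above: the display list stores
# shapes in physical units (inches), and rasterization to pixels happens
# last, at whatever dpi the output device reports. (All names invented.)
canvas = [("line", 0.0, 0.0, 2.0, 1.0),                   # coordinates in inches
          ("bitmap", 0.5, 0.5, 1.0, "legacy_app_frame")]  # legacy raster, placed by size

def rasterize(display_list, dpi):
    for item in display_list:
        if item[0] == "line":
            _, x0, y0, x1, y1 = item
            print(f"line from ({x0*dpi:.0f},{y0*dpi:.0f}) "
                  f"to ({x1*dpi:.0f},{y1*dpi:.0f}) px")
        elif item[0] == "bitmap":
            _, x, y, w, name = item
            print(f"scale+blit '{name}' at ({x*dpi:.0f},{y*dpi:.0f}), {w*dpi:.0f} px wide")

rasterize(canvas, 100)  # same scene on today's panel...
rasterize(canvas, 200)  # ...and on the 200 dpi panel, with no app changes
```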
So anyway, I guess we'll see now how this idea will hold up.
~GoRK
Resolution independent GUIs (Score:4)
For example, SGI has long provided scalable icons for its IRIX desktop file manager. Apple's new Quartz graphics API and Aqua GUI look like they may have the start of this flexibility. OpenGL may be a good base to work from as well.
As someone who spends way too much time reading text on computer monitors, I look forward to any improvements in their readability!
Tim