Graphics Software

Super LCD Screens: 200 PPI

crovira wrote to us with an article from the NYT (free reg. req'd.) Apparently, both Toshiba and IBM are dissatisfied with the current state of monitor development. To that end, they've created some wondrous toys like 16.5" LCD screens that display 200 PPI. They've run into a most curious problem, however: legacy software/drivers. Click thru to read more - and if IBM should want to send a couple screens my way, I'd see what we can whip up around the office *grin*.
  • by Anonymous Coward
    I read about this more than a month ago. I thought the link came from slashdot, too. What's up with this?
  • No, it can't modify bitmaps with vectors. Think about it for a second, that is a type of Optical Recognition.

    Apple's present PR engine's mantra is that icons are represented up to 128x128, but by a bilinearly mipmapped graphic, i.e. there is also a 64x64, 32x32, 16x16 (and possibly lower powers of 2) version, and it chooses and scales an icon based on these numbers and the closest version of the icon size-wise. No vectors. You really need 64x64 icons in order for vector icons to work, in my experience. Scaling lower than that just gives a mess.
  • Well, there is an option to use Large Icons. That might help. Display Settings, then the Plus! tab. Of course the icons are just scaled up, so the quality isn't improved, but it is more readable.
  • What you don't get is the ability to say 'My display has a 14.5" visible diagonal, 4:3 aspect ratio, and 1280x1024 pixels' and expect things to be resized accordingly. This really should be here now, given PnP monitors.
    John
  • It's EASY to make a CRT with an arbitrarily targeted beam -- think CRO. It's basically impossible to do a colour one. It's also rather difficult to make the image persist (you have to keep redrawing an arbitrary list of vectors). (IIRC, there used to be vector displays, until raster ones took over.)
    John
  • X11 is impossible to make resolution independent.

    First, the font system is pixel based. Second, all the other drawing primitives are pixel based. There are no given resolution-independent units to work in, and an extension won't help since not everybody will be using it. Get into the real world -- X is further behind than EVERYTHING else in the market today.

    p.s. a target-sized theme won't affect the applications, you just get a 'normal' sized window frame, and 'anything goes' inside it...

    If you ask me, killing the dead horse is a small price to pay for 4000x3000.
    John
  • When the target display architecture can't scale up the fonts properly even if asked to by the toolkit. Nor can it scale the line widths effectively. (If you take a look at Xlib, you'll notice that everything is done in terms of pixels, and that's what everything turns to in the end. And even with a theme, how do you synchronise the settings across distant hosts??)
    John
  • What you need is a float based 'internal positioning metric' with a fast grid fitting algorithm to position the lines.
    John
  • PDF 1.3 is a subset of Postscript 3!!!

    Basically, all the programming stuff is pre-executed, and the contents of each page, in terms of PostScript primitives, is canned at each showpage command (well, it's a little more, but not much)
    John
  • NeXT didn't do it, Sun didn't do it, Apple didn't do it -- Adobe formed to pioneer the display-independent display architecture; its use in printing came later.
    John
  • Blurry like a TV
    John
  • It's interesting that one of the big problems is with software. Those tiny 16x16 pixel icons all but disappear when you have a 200dpi monitor.

    Perhaps those gigantic 128x128 pixel icons in MacOS X don't seem so huge now. And with their Display PDF system, a properly designed icon should be scalable to any size you want.

    And with the Mac's popularity in the imaging business, I wouldn't be surprised if the first major customer of these 200 dpi displays is Apple. Just imagine an iMac with a 17" display and a 200 dpi resolution.

  • I'd like to see monitors with high DPI; it'd make it easier for me to avoid sending things to the printer down the hall. Resolution in the 200 dpi range might be enough to enable this, though.

    I see a weak spot, though, and that's the bandwidth required to keep screen response reasonably interactive. Comparing a high-end 100 dpi monitor to a 200 dpi monitor, four times the memory bandwidth would be required to maintain the status quo. This requirement is pushed all the way up from the underlying hardware into the software technologies such as the X servers, or whatever it is that handles display primitives in Windows.

    I don't follow the windowing developments under Linux very closely, but with the V3.X servers running on a TNT with 16 megs of memory, Linux graphics are a lot slower than they were under Windows 98. Will V4.X achieve parity (or perhaps go beyond it), and would it keep up with a 4X throughput increase?

    Alternatively, you could argue that adaptive resolution would be a useful approach to cutting bandwidth. Text is usually black and white and suffers greatly at today's resolutions, while colour images are acceptable at today's resolutions. If the display could handle addressing multiple pixels for colour information to form a less fine resolution, the overall bandwidth could be greatly reduced.
  • 200x100 character terminals... oh, yes.
    *imagining pine, vim, lynx, zicq in 200x100 rxvt's*

    And think of the console modes with the 2.2 kernel framebuffer support... *doing math*
    with an 8x16 console font, 2048x1536 comes out to... 256x96... even 1600x1200 is 200x75. Mmmmm.
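
    The arithmetic, as a quick sketch (an 8x16 console font assumed, as above):

    def console_size(width, height, cell_w=8, cell_h=16):
        # Columns and rows are just the framebuffer dimensions divided by the glyph cell.
        return width // cell_w, height // cell_h

    print(console_size(2048, 1536))   # (256, 96)
    print(console_size(1600, 1200))   # (200, 75)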

    Now THAT is where it's at.

    --Ben

  • For picture quality, however, many computer screens are put to shame by the cheapest discount store portable TV.

    What? Unless I've been walking around in a trance, exactly the reverse is true. Even a cheap monitor has a better quality picture than a fairly expensive TV. Potential resolution, scan rate, dot pitch, all better on monitors than crappy NTSC TVs.

  • NeXT had display postscript over 10 years ago.
    Apple now has display PDF in their upcoming MacOS X.

    Correct you are, sir. In fact, during the Rhapsody days, Apple engineers amazed themselves by putting together a Display PostScript interface for their latest effort (soon christened MacOS X Server) in five working days. Yes, the NeXT software development infrastructure was and is that good.

    IIRC, the only reason why Apple got rid of Display PostScript was due to PDF's emerging dominance as a graphical layout standard. (Oh well. They had the spare week to code it anyway..)

    Anecdotes like this make me hope that OS X Consumer will have some of the development tools available. Drool drool drool...
    -----

  • Suddenly, the DPS support in GNUstep seems more important.
  • That was the whole impetus behind Display PostScript, pioneered by the NeXT computer / OS. It made for a very cool display / printer integration, because what you saw on screen was exactly what you'd get on paper. It took care of all the scaling issues for you. NeXT was very much ahead of its time in the late eighties.

    Unfortunately, it was perhaps too computationally-expensive for the machines available at the time. You gotta remember that back in 1989, a 16MHz 68020 was considered a high-end microprocessor.

    Display PostScript would be a lot more viable today, except for 3-D games. Most graphic accelerators now have way more power than they need to keep up with 2-D display tasks. And the CPU power is there too, even with a low-end, non-OC Celery.

  • I just had a second thought. Don't laugh, it happens...

    The RAM requirements of a 200DPI display are enormous.

    I have 13", 17" and 21" monitors hooked up to my G3. Obviously, what I want/need is screen real-estate. I shudder at what a card for a decent sized (30" :-) flat-screen 200dpi monitor would cost me!

    And the acceleration needed to handle all those pixels per square inch... I'd need a water cooled side-car.

    But Tomb Raider would really be worth playing. :-)
  • Of course it does.

    Gates always said: "Make it more Mac like"

    Seriously, Display PostScript was a better solution since day one.

    Microsoft suffers from IBM's old problem. "NIH"ilism. We'll see if it survives it when the world switches to 64-bit architectures.

  • Ultimately I think we need to get completely away from the notion of measuring things in pixels, or defining stored images as arrays of pixels. Once you do that, the problem more or less goes away.
    This is a job for... DISPLAY POSTSCRIPT!!!
    --
    " It's a ligne Maginot [maginot.org]-in-the-sky "
  • The Toshiba one will help this. The article states that their method of increasing resolution results in refresh rates that are 100 times faster.
  • It's a "2" because the poster of the comment has enough karma to get an automatic +1 bonus every time he posts. Oh, and I thought it was actually pretty funny.
  • Our SciVi group started using 640 x 480 24-bit monitors in 1980 at $30K. The resolution jumped to 1280 x 1024 by 1983 and has essentially remained stuck there. The difference is the price has fallen to $500 and the application software has grown exponentially. (There have been a few specialty monitors at 1600 or 2048 if you add another zero to the price.) We were told that for electron gun technology you just can't form clear pixels faster than about 5 nanoseconds, hence the limit on those types of monitors. Multi-gun, multi-monitor technology has been around -- either kludgy or expensive. The IBM approach is looking at non-gun monitors where there is still growth possibility.
  • IRIX seems to do this already; all of its icons are in vector format and can be scaled dynamically with a little roller bar on the side of all the windows.
  • Last I checked, E Ink's writing speed is *slow*. Not really suitable for frame rates. Dense LCDs have been around for at least 6 years; Xerox PARC had a set of gorgeous 200dpi units, b/w and color, around 1993. They tried to figure out how to manufacture them at a profit and failed (even with government subsidy). (I also recall that they were seriously power hungry!)
  • You forget one thing. More pixels = more electricity (most likely) - a big challenge in portables is not sucking too much power. I don't want to have to replace/recharge the batteries on my Palm every 4 hours. Likewise, I don't need that kind of resolution on my notebook at the moment, and I'd rather have longer runtimes than more pixels.
  • Anyone remember Sun Microsystems' NeWS? (Network Extensible Window System; I found a very old page referencing it [catalog.com]; a web search might turn up more). It was an early contemporary of X Windows which was based on PostScript. You could even upload code to the NeWS server (although god knows what sort of security risks that posed; nobody was worried back then).

    Sounds like IBM and friends need to either fund conversion of open-source apps to X11-DPS, or form a new consortium to develop a new Open Source resolution-independent (or at least less-dependent) protocol and libraries.

  • [Early moderation: off topic :-( ]

    And this is off topic why? The original article stated that they were having difficulty with device drivers, etc. So I mention my own experiences, which would have been more successful if Linux and OpenGL had existed as they do now.

    The whole point of my post is that if IBM and Toshiba are having problems, they can solve them faster by working with the community than without.

  • After all these years, the flying bathtub is still the best overall anti-tank weapon (well, anti ground-based armored vehicle) there is. I mean, how many planes do you know of can still fly with half the wing blown off? They played a major part in Desert Storm several years ago, and whenever the next little/big war comes along, I'm sure they will then too.. stealths may be nice for buildings and all, but it really helps when you can visually identify your target and make sure you're blowing up the enemy and not a friendly. (The first American/NATO/allied/wtf casualty in Desert Storm was by one of our own Apaches being off course and blowing away one of our own APCs! They did ask for confirmation of course, but HQ thought they were on course.. and they couldn't just fly over and see, like the Warthog might risk..)

    My favorite aircraft is definitely the A10 :)
  • Hmm, I used to do this a few years ago when I ran 95... I had a 12" monitor running at 1024x768...

    Unfortunately, several windows programs will notice the custom font setting and refuse to run (with a kind little dialog box which says, basically: "Hey weirdo, fix your font setting, you're being different, stop it.")...

    So windows can handle some of this, in theory, but it's basically useless in real life. (gee, sounds like windows in general...)
  • Go read the article about Quartz on Ars Technica. One thing they mention is that the bitmap icons can in fact be modified by applying vectors to them. I'm not talking about optical recognition or even tracing (a la Adobe Streamline). I mean vectors as in "a vector in space".

    As you mouse over that task bar in Quartz, the bitmap icons grow and shrink. They don't just pop up larger and then pop down smaller; they execute a smooth grow/shrink based on a vector used to scale the bitmap. If you select an icon in the task bar it doesn't just open an associated window/app, it flows the information out from the width of the icon to the width of the app using two Bezier curves to define the sides of the flow.

    It's quite slick, as long as it doesn't suck CPU power like a ShopVac.

  • The display surface for Windows might be resolution independent but the icons are most certainly not.

    Most Windows icons are 32x32 pixels with the tacit measurement assumption of 72 pixels/inch. That means that at most used resolutions your Windows icons/toolbar icons will nominally appear as 1/2" squares. If your monitor isn't based on 72 pixels/inch but 200ppi, then your 32-pixel icon will only take up a square area of about 1/6" on a side on your display. I don't know about you, but many times I find the current icons too small to impart any useful iconographic information. If those icons become 1/3 of their current size then there is probably no sense in having them.

    It will take a fundamental change in the way Windows handles icons in order to get them to display properly on such a dense-resolution monitor.

    Since Windows likes to keep the icons in the actual executables or .dlls, you can't just scale up a directory of images. You'd have to find all of the icons on the system (a Herculean task), extract them all, scale them all up by a factor of 3 and then reinsert them into the myriad .exe and .dll files they came from.

    Display PostScript and its successor, the new Quartz "display Acrobat", which can modify bitmaps with vectors, is the proper way to go to achieve *true* device independence for a GUI.
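
    To put numbers on it, a rough sketch (square pixels assumed):

    # Physical size of an icon: pixel count divided by pixel density.
    def icon_inches(side_px, ppi):
        return side_px / float(ppi)

    print(icon_inches(32, 72))    # ~0.44" -- roughly the familiar half-inch icon
    print(icon_inches(32, 200))   # 0.16"  -- about 1/6" on a side, too small to parse
    print(icon_inches(96, 200))   # 0.48"  -- a 3x rescale restores the familiar size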

  • I fully agree...

    ...but wait, doesn't the technology already exist?
    What was NextStep about? Didn't the NeXTstation write to the b/w monitor (actually four levels of grayscale) through PostScript? So presentations on screen were really WYSIWYG, as the same language was used to write to the screen as to the printer. And we know of laser printers having 300, 600 or 1200 dpi!

    The technology exists! Don't reinvent the wheel.

    :-)
    ms

  • let go of everything bitmap, and implement the GUI in vector and True Type Fonts.

    True Type tends to have better hinting for low resolution devices like a CRT screen, while Type1 gives better output on high resolution devices such as printers. With these high resolution LCD screens, Type1 may be a better choice.

    Also, remember that while scrollbars, entry boxes and the like can already be any size you want them, on a high-res display the 3D effects will get lost as they'll be too small to see, for example. Converting these to vectors will help. But bear in mind that not everything is suitable for conversion to vectors. Many Enlightenment and GTK themes, for example, would be all but impossible to make into vectors.

    I suspect what we'll end up with is a halfway house where theme designers will have to make multiple versions for various resolutions, much like we have 75 and 100dpi bitmap fonts now.

  • At I.B.M., Dr. Wisnieff's laboratory rewrote the underlying code of Windows NT to inflate its display pixel counts.

    I remember they did something similar with an earlier version of OS/2. It really pissed me off. I reconfigured an OS/2 box such that instead of the default 640x480 it was working at 1024x768, only to find that IBM in its infinite wisdom was now showing all the text at the same size as before, only with higher resolution.

    Very clever, they no doubt thought, the 9 point text is still 9 point text - only smoother. Of course, what I actually wanted was more bloody text on the screen - and bugger the point size.

    I'm pretty sure they played the same game with the icons too. Damn their eyes.

    When I get my hands on one of these 200dpi LCDs I do not want one of IBM's inflating pixel drivers with it.

    Regards, Ralph.

  • Two questions

    First of all, with my limited experience doing low-level graphics, I seem to remember only being able to write to the screen during the horizontal or vertical refresh of the monitor. Do these new LCDs emulate this to allow legacy software, etc.?

    Second, is there a new all-digital standard for connecting these to video cards? With no analog components this would seem to give better quality and allow me to have a longer monitor cord.
  • That way there would be no large chunks of bitmap graphics in the os

    The display hardware is bitmapped anyway (I think it's harder to make a CRT that can steer the beam arbitrarily, as the old vector scopes did, than to make a raster CRT) - and intrinsically so if it's an LCD. There's going to be bitmapped graphics code somewhere at the bottom.

    However, you can implement a resolution-independent layer above that, e.g. Display Postscript/NeWS or Display PDF.

  • You know, someone ought to go back and do research on high-performance vector monitors...

    I am very happy that neither my work nor my home desktop have a display that involves steering an electron beam (Hint: what do the initials "LCD" in the title of the story to which these comments are all attached refer to?).

    I am unconvinced that going back to vector CRT monitors would make things better.

  • If you are having problems with 10pt fonts looking small then your system is broken. A point, by definition, is 1/72 inch high. This is regardless of resolution, so the system should be scaling the font's pixel representation based on the actual screen resolution.

    X, at least, gets this right. I am running my 21" screen at 2000x1500 and all the fonts are fine. Everything in Netscape, application menus, and the window manager are perfectly readable and about the physical size I would expect from the point sizes I have chosen.
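
    The conversion is simple enough; roughly (a sketch, not the actual server code):

    # A point is 1/72 inch, so the pixel height of a glyph follows from the display's dpi.
    def font_pixels(point_size, dpi):
        return int(round(point_size * dpi / 72.0))

    print(font_pixels(10, 75))    # 10 px on a ~75 dpi CRT
    print(font_pixels(10, 100))   # 14 px
    print(font_pixels(10, 200))   # 28 px -- same physical size on a 200 ppi panel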

    -jwb

  • That's great for window manager settings, but the tough part is applications. The icons under Netscape 4.x would become hideously small, as would text on a lot of these sites that hard code font size (heck, it even gets hard to read on "normal" monitors when you jack up the resolution).

    Maybe if you're only using gtk/qt based apps, someone could throw together a theme to size up their widgets. Anyone know if the theme support is flexible enough to allow an easy scaling up of all apps that use these toolkits?

  • Ah, an opportunity to plug the Berlin windowing system! How sweet!
    In Berlin, rendering the vector-based description of the GUI to actual pixels is done at a very late stage, thus allowing almost anything (CRT, LCD, sport arena jumbotrons(!)) to act as visual device - just rewrite the rendering and output modules & off you go.
    With the varying resolutions on screens (PDA->CRT->this new LCD), this is the natural next step in GUI development.
  • LCD screens have too much "ghosting" when things move around.
    I'm in a blunt mood, so I'll say: that's crap.

    I've got a Sharp Ultralight PC with a superb LCD (TFT) screen. There is no ghosting. I've played right through Quake II on it, no problems (though I recommend MS's Dual Strike controller). My primary work screen is the LCD screen on this portable. I'm looking at it at least 8 hours a day. The definition, brightness and contrast are all very pleasant on the eyes. I have never suffered headaches.

    The only "bad" thing I have noticed is that areas of heavy bands of BLACK and WHITE can cast a slightly noticable shadow above and below them on the screen. I think Fox's news site background graphic is an example. Probably something to do with an "array" of wiring (or something)...

  • These folks have upped the resolution to 3200x2400. That's nearly impossible to read even on a 24 inch SGI monitor

    I can read xterm's "tiny" font with no problem on my SGI 1600 Flat Panel at 1600x1024 resolution. The same font is unreadable on a 21" CRT monitor at 1280x1024. Comparing resolutions between CRTs and plasmas is apples to oranges - the clarity of a plasma panel is akin to getting glasses after your eyesight has gradually deteriorated: you simply can't believe how much better you can see, and how much more seldom those headaches kick in.

    You are correct in noting that the resolution increase is such that all but the largest point (pixel) fonts we're used to today would become unreadable, even on a flat panel. But you (or others, anyway) would be surprised at just how clear and readable tiny print is on a flat panel, as opposed to a continuously scanning (and distorting) CRT.

    The solution is, as others have said, a level of abstraction between the physical world "length" and "area" screen measurements and the digital world "size" in pixels, preferably with user definable mappings between the two for "arbitrary" virtual resolution ranging from "unreadable" to "large enough for all but the completely blind."

    Maybe IBM could ship off a display in the direction of the XFree86 group...
  • IIRC, the only reason why Apple got rid of Display PostScript was due to PDF's emerging dominance as a graphical layout standard.
    IIRC, Apple didn't get rid of Display PostScript for OS X. PDF is a superset of PostScript, and therefore Display PDF is simply an extension of Display PostScript minus royalties to Adobe.
  • What was NextStep about?

    NextStep = OS X

    OS X is Apple's GUI on top of NextStep. Remember when Apple bought NEXT a few years back. They didn't reinvent the wheel. They own it.
  • Ok corporate, listen up..

    OPEN THE HARDWARE SPECS, and sit back to watch the magic happen.
  • Maybe I'm constrained by my understanding of technology...

    If you had a fluorescent gas encased between two plastic sheets, and activated it via a combination of RGB lasers and priming lasers, you could conceivably get infinite-resolution displays, assuming the motors and laser pulses could accommodate the speed, and that the data bus could feed the device fast enough.

    And there is no reason for the lasers not to fire parallel to the surface of the display; just use two intersecting lasers to 'activate' the gas.

    Just a random thought

    -AS
  • Can you answer another question, hypothetically?

    Can windows support WMF images instead of BMPs and ICOs in the dlls that generate all the desktop/widget icons?


    -AS
  • I see what you're talking about, but it would require that Windows itself support .WMF images as an alternative format to the .BMP and .ICO for desktop icons and widgets; I've never seen any mention that Windows has that kind of functionality, though I've never looked for it either. If there is support, at the OS level, for WMF icon files and data, then Windows would be, as you suggest, capable of going straight to vector-ish scalable displays.

    Though it obviously doesn't help that every Windows program today has BMPs and ICOs embedded within DLLs, which aren't in any sort of realistically scalable format.



    -AS
  • It's not so much a legacy driver problem, so much as a lack of foresight on the OS people's side! NeXT has had a solution for this problem for over a decade, and it looks like MacOS X will also be able to overcome this hurdle very easily; Display Postscript and Quartz, the PDF version of the above. I guess you could call this legacy software...

    I'd imagine an Apple handheld using a 3x4 inch 200dpi display (that's 800x600! wow) would be awesome. Or a 19" LCD for their pro series desktop, when they can scale to that size... 3000x2200 here! Or imagine next-gen Game Boys with monocle displays; a 1x1" display would easily match today's Game Boy resolution of 160x144 pixels. Or a PDA with a monocle display!

    I wonder how Apple is going to try to capitalize on their ability to display 128x128 pixel icons and their Quartz Display PDF capabilities? Right off the bat the icons would be larger than 5/8 of an inch, which is about right, if on the small side. I'm not sure it was explicitly mentioned, but Toshiba are only shipping 4" and 6" screens; were IBM's not limited to that size, by omission of detail?

    Wow. Cool.

    -AS
  • Yes it would be cool.
    NeXT had display postscript over 10 years ago.
    Apple now has display PDF in their upcoming MacOS X.

    So your prayers are answered, and they have been done so by the man called Jobs. Fear.

    Anyway, it may be that Apple can support this right off the bat, with their vector based display system, Quartz, and 128pix icons, etc. All of a sudden, it seems everything is going Apple's way. Hmmm.


    -AS
  • Doesn't the Large Icons setting in Plus! allow for support of 64x64 icons?

    Of course, that's actually still too small, and it doesn't scale very intelligently, but it's a step in the right direction. I'm betting Apple will be able to leverage this new display technology and wow everyone =)


    -AS
  • Just a question - if the problem is icons sized in pixels, would systems that use something like Display PDF help, since (presumably) they would know about point sizes, and thus render big enough to see? If so, Apple might be interested...
  • It's had that since Windows 95 at least.
  • by AJWM ( 19027 )
    Two other factors in why video looks "better" on a TV than on a monitor: TVs have a slightly different gamma than monitors, so video images tend to look duller on computer monitors unless this is compensated for; and the biggy: TV's sloppy analog signal processing effectively gives you antialiasing for free, smoothing out image imperfections that are glaringly obvious on a computer monitor. (Mind, the difference in bandwidths of the chroma and luma signals can give you weird color-crawl effects in closely striped areas on a TV screen.)
  • Windows resolution independent? You could write an app resolution-independently, but that's not all. Just setting "Use large fonts" is not going to do a lot of good at 3200x2400.
  • By making icons and text vector graphics (or something along the same lines), you could make it entirely scalable. That is, "rendered" real-time, although I'm sure that term isn't quite applicable.

    Hmm.. a better example: If you go into Adobe Photoshop and use the text tool to write "Hi Mom" and then resize the image (without flattening the layers) to four times the pixel dimensions, which makes the image larger, you don't lose any image quality. It's not just zooming in; it's using the algorithms that define the curvature and lines of the letters (or vector graphics) to re-calculate the shape down to more specific pixels. It isn't "zooming in" in the traditional sense.

  • That's exactly why picture quality on a television is better than on a computer screen. Because TV screens are so crude, they 'blend' together pixels, creating a smoother picture. Ordinary television has a resolution of about 600x400, in computer terms, yet it looks more realistic.

    This is because your brain does something similar: images are picked up in pixels (a limited number of light-sensitive cells) and combined into an image. The individual pixels aren't important, but the overall image is. So your brain doesn't process the individual cells. It rather combines the signals from clusters of cells, with some cells physically located together participating in different signals. Several of these signal channels (nerves) interact together.

    So, the quality of screens depends on your definition of quality. Do you think it's better if it's more realistic, or better if you can put smaller elements on it, whether you can discern them or not? I think it's not a set criterion, but rather dictated by purpose and circumstances.

    ----------------------------------------------
  • Scaling the graphics shouldn't be that difficult. But those sites that use pixel-specified tables may be in trouble.
  • 1280x1024 pixels in graphics.

    2,200x1100 pixels in antialiasing.

    Smoooth like silk.


  • Not largely relevant to the rest of the article, but...

    For picture quality, however, many computer screens are put to shame by the cheapest discount store portable TV.

    Uuuuh.... what? Someone needs to lay off the crack rock. The average computer monitor is like .27dp these days. Nice ones are .25 or lower. Television screens would be whole numbers on this scale...

    Anthony
  • Doesn't the Large Icons setting in Plus! allow for support of 64x64 icons?

    Of course, that's actually still too small, and it doesn't scale very intelligently, but it's a step in the right direction. I'm betting Apple will be able to leverage this new display technology and wow everyone =)


    Well, actually, Windows supports icons up to 2048x2048 in size (according to VC++), and in 32-bit color - and amazingly, it does this wonderful thing where it automatically bilinearly interpolates them to the right size if the icons are too small. It'll also quite happily give you 72x72 icons for most things if you set it to do that - though it looks kind of weird. Using the Appearance tab, you can set everything so that it's a factor of X bigger than now, including font sizes (though for existing apps, a "large font" mode will be needed - and anyone who doesn't use StretchBlt in their apps to size things for the screen, and who doesn't use font metrics rather than direct pixel metrics, deserves to be shot).

    It'll handle it quite happily. And, as I used to work there, I'll tell you right now that they were considering this and have plans in place to handle this when I was there (Oct 1998 - June 1999). But sorry, I can't tell you the details - you'll have to take it on faith.
  • Seems like this is the ideal position for the OSS community to jump in.

    At I.B.M., Dr. Wisnieff's laboratory rewrote the underlying code of Windows NT to inflate its display pixel counts. Lotus, a unit of I.B.M., also created a high-resolution-compatible version of its word processor.

    With the open availability of specs, OSS programmers should be able to step in and provide a solution without something as drastic as a rewrite of NT. (The fact that an IBM lab has access to the NT source strikes me as odd for a whole other reason. Wonder if they can fix the rest of it while they're at it :-)

    It also should serve as a heads-up for all app developers, as something that will become an issue in the future. These monitors are going to be designed for the medium end desktop, and Linux has the ability to be ready for when these things arrive, without having to re-write everything.

  • In the Windows world, and increasingly the Unix world, drivers are written by the manufacturer.

    Given this, why should this cause a holdup?

  • The LCD pages have a news article [lcdpages.com] up about the IBM 200 PPI Display [ibm.com]. They call it "active matrix liquid crystal display (AMLCD)"

    They (IBM) also have an interesting bio up on Robert L. Wisnieff Manager, Advanced Display Technology Laboratory [ibm.com]. Interesting if you can ignore the market spin at least :).

    Noel

    RootPrompt.org -- Nothing but Unix [rootprompt.org]

  • Dodgy maths, mate...

    17" by 80ppi is 1360 pixels wide, so you are using a screen roughly 12 pixels high? 14" by 80ppi is 1120, the resolution is 1360x1120 (or to be more useful, 1600x1200) which is 1,523,200 (1,920,000) pixels on the screen. Not 16320.

    At 200ppi (3200x2400) there would be 7,680,000 pixels on the screen (not 40,800).

    The simple solution to icons is to make them big (256x256) and truecolour, and then scale them down to fit in a certain area of the screen. Funnily enough though, at 200ppi, your biggest icon would be 1.25" on a side. I can live with 0.75" on a side icons, but any smaller... no way.

    X will survive, but themes won't scale, and icons will have to be redone. I think that someone out there should start working on 256x256 icons for every program they can think of, 64x64 is just too small.
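
    In code, the arithmetic above (a sketch; width and height in inches):

    # Pixels across, down, and in total for a panel of given physical size and density.
    def panel_pixels(width_in, height_in, ppi):
        w, h = int(width_in * ppi), int(height_in * ppi)
        return w, h, w * h

    print(panel_pixels(17, 14, 80))    # (1360, 1120, 1523200) -- the figures above
    print(panel_pixels(16, 12, 200))   # (3200, 2400, 7680000) -- the 200 ppi case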

    ~~

  • Following up to my own post...

    256x256 truecolour icons would take up 256KB of memory each. My desktop currently has around 40 different icons on it.. that would be 10MB of my memory, and I know that with several windows open and some applications running etc. that could easily go up to 160 icons, which is 40MB of my memory.

    I think that even for the most complex of icons a vector representation would result in a smaller icon than 256KB! Looking at the Word icon, hmmm, that could be done in around 1KB (a rectangle, a 'W', some lines of text and a stamp). Yep, vector icons are the way to go, not bitmap.

    Silly me! MacOS X could be outdated with its great technology - they won't be scaling their icons down, they will have to scale them up to be legible, even at 128x128! :-)
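
    Back-of-the-envelope, in code (a sketch; 32 bits per pixel assumed for truecolour):

    # Memory cost of square truecolour bitmap icons.
    def icon_bytes(side_px, count=1, bytes_per_px=4):
        return side_px * side_px * bytes_per_px * count

    print(icon_bytes(256) // 1024)        # 256 KB for one 256x256 icon
    print(icon_bytes(256, 40) // 2**20)   # 10 MB for the 40 icons on my desktop
    print(icon_bytes(256, 160) // 2**20)  # 40 MB once windows and apps pile up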

    ~~

  • Probably why Apple has gone to display PDF in its next operating system then?

    They have obviously seen this thing coming along - in a couple of years most laptops will have min 1600x1200 screens, probably 2048x1536 even, and do you want to hunt for those icons on that screen (shhh, don't even think of 200x100 character terminals).

    This calls for a scalable desktop - one where things are specified in DPI and not pixels. Apple has done it. Windows hasn't, and won't until people start moaning (my taskbar is 1/8" high, I can't read the text...) but what about X?

    ~~

  • I agree that the G400 is up near the top of current 2D video technology, which only strengthens the original point about these 200dpi 16" LCD devices - the world's not ready for them!


  • OPEN THE HARDWARE SPECS, and sit back to watch the magic happen.

    That doesn't even help you much. They can't sell stuff cheap enough for people to afford unless they can go for the big markets. That means Windows. And I don't see people fixing Windows to do pixel-quadrupling-of-all-apps-that-don't-know-better.. unless those people work for MS or Toshiba or IBM.
  • Multiple Alpha processors,(screw that SMP stuff. Three-way!) a G400 and two of these babies
    1. SMP does not mean two processors. It just means that the multiple processors do not take on master and slave roles (symmetric). Three Alphas could be symmetric. (I dunno if there are any power-of-2 constraints on current bus architectures.)
    2. The G400 only goes up to what, 2048x1536 or something? That's not enough for a 200dpi 16" monitor at 2560x1920 pixels. Also, the second output of the G400 only goes as high as 1024x768.

    Indeed, video card technology needs to move forward a bit before this is useful. At these resolutions, you're using about 7 Gbit/s (nearly a gigabyte per second) of video memory bandwidth just to scan the frame out at 60 fps, 24bpp!
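
    The scan-out arithmetic, as a sketch (refresh only; drawing traffic comes on top of this):

    # Bandwidth needed just to read the frame out of video memory every refresh.
    def scanout_mb_per_sec(width, height, bits_per_px=24, fps=60):
        return width * height * bits_per_px * fps / 8 / 2**20

    print(scanout_mb_per_sec(1280, 1024))   # ~225 MB/s -- a typical desktop today
    print(scanout_mb_per_sec(2560, 1920))   # ~844 MB/s -- a 16" 200 dpi panel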
  • I work in digital satellite TV in the UK. I'm sitting here with a 17" monitor (1280x1024@75Hz) and a brand new Sony Trinitron portable TV (approx 15"). Whilst TVs look OK at normal viewing distance (several feet away), up close the TV is abysmal. It's fuzzy and flickers really badly. If I display the same image on both displays, the one on the monitor looks infinitely better. The flicker on the TV is so bad that I have to switch it off when not using it.

    However you're right about the blurring effect smoothing video on a TV. If you try playing back VHS on a monitor it looks far worse than on a TV, but only because the monitor shows up the imperfections of VHS.

    HH

  • Hey.. maybe I missed something in XF86Setup.. Where do I specify DPI? I'm running 16x12 (@85Hz) on a 19".

    Unfortunately, you don't specify this in the config file (a grievous oversight IMHO).


    You spec it on the command line, thus:

    startx -- -dpi 75

    or within the startx file itself:

    serverargs="-dpi 75"

  • I seem to remember only being able to write to the screen during the horizontal or vertical refresh of the monitor


    Careful, you're dating yourself there. Modern systems use a technique called double-buffering, where the screen shows one frame of video memory, and the system writes to a second. You then flip on the vertical retrace interrupt. (Actually, the old Atari 800, Commodore 64, Atari ST, and Amiga all could do this; it was just the PC that was limited to one video frame (at any decent resolution).)


    The other limiting factor is memory bandwidth: If you have to pump "N" pixels a second out, that consumes much of the available bandwidth to the video RAM, leaving less for the CPU. However, modern video cards use very wide memory (128 bits or more) to allow more pixels per memory access, thus allowing more bandwidth to the main processor. So this too is less of a limiting factor.


    However, I have experienced the very problem described in the article: I run 1600x1200 on my system at home, and certain foolish operating systems *cough*windows*cough* make teeny little icons. However, certain other systems *cough*xwindows*cough* allow me to tell the system the dots per inch, and most things work out well.
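
    The double-buffering idea, as a toy sketch (names hypothetical, not any real API):

    # Draw into the hidden back buffer while the front buffer is scanned out,
    # then swap the two on the vertical retrace so no half-drawn frame is visible.
    class DoubleBuffer:
        def __init__(self, width, height):
            self.front = bytearray(width * height * 3)   # currently displayed
            self.back = bytearray(width * height * 3)    # currently being drawn

        def draw(self, render_fn):
            render_fn(self.back)      # all rendering targets the hidden buffer

        def flip(self):               # call this from the vertical retrace interrupt
            self.front, self.back = self.back, self.front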

  • I guess I'm missing something here. A 24-bit graphics card can produce 2^24 different colors. An analog signal can express an infinite number of colors. What particular number can you represent in 24 bits that I can't represent in analog? IE, given the same tube, what color is your graphics card going to produce that my analog signal can't?

    While an analogue signal can theoretically represent an infinite number of colours, in the real world noise gets in the way. So an analogue signal can only represent a finite number of usable, distinguishable colours.

    However, this is not what is meant by colour gamut. Every graphics signal will use some colour space to describe colours: modern monitors use RGB, meaning that the colour of a pixel is described as a triple of red, green and blue intensities, all of which must be positive. TVs use (for historical reasons) a stranger colour space of chroma, luminance and - er - something else. Printers use CMYK, etc etc.

    The colour gamut is the set of colours that it is possible to express in a colour space. Because our vision system is a bit interesting, and because the world is never perfect, colour spaces that are physically realisable can never display every shade of every colour. Some have better gamuts than others, meaning they can display more colours than others.

    So the previous poster was saying that the colour space used to display NTSC television happens to suck when trying to describe red, because while it is analogue, some shades of red are just outside its range.

    Monitors are better, but actually you can't make a device using three primary colours that can equal the real world. Even with a monitor, bright shades of cyan, say, will be displayed as mixtures of blue and green. A mixture is inherently less saturated than a pure colour. Now you can always get a brighter, more saturated cyan from a cyan-coloured laser, for instance, or a painter's pigment. This will be true for every colour that is a mixture of primaries.

    Colour spaces have been defined, like the CIE space, that can describe all perceivable colours. They use some mathematical trickery to define "super-saturated" primaries, in other words red, green and blue that are brighter and more pure than physically possible. So you can't make a monitor to display it.
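
    For the curious, the usual conversion from linear monitor RGB to CIE XYZ looks like this (a sketch; the matrix is the standard sRGB/D65 one, and the inputs are assumed to be linear, not gamma-encoded):

    # A colour is in the monitor's gamut only if its best-fit R, G, B all land in [0, 1];
    # the super-saturated cyans and reds discussed above need out-of-range components.
    def srgb_linear_to_xyz(r, g, b):
        x = 0.4124 * r + 0.3576 * g + 0.1805 * b
        y = 0.2126 * r + 0.7152 * g + 0.0722 * b
        z = 0.0193 * r + 0.1192 * g + 0.9505 * b
        return x, y, z

    print(srgb_linear_to_xyz(1.0, 0.0, 0.0))   # the reddest red this colour space can show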

  • I propose

    Multiple Alpha processors (screw that SMP stuff. Three-way!), a G400 and two of these babies? Complete with MD, DVD-R and a *nix kernel with all of the 'legacy' support ripped out sound good? We'll hand-roll it all in assembler for performance!
  • Windows tried, at least.. I remember discussion of using an alternate representative unit in Windows 95.. They ended up using a modified, braindamaged 'TWIPS' system again..

    Doing it wouldn't be hard. Just assume an 800x600 grid, regardless of screen size, and specify coords based on it. You wouldn't break legacy applications, because they'd see an 800x600 screen.

    X is even easier! Toss a 'target sized theme' on BlackBox or Enlightenment. Or, better yet, write a userland application that dynamically resets the WM fonts, terminal fonts, and icons based on resolution. You'd have to restart the WM every resolution change to reread the altered settings, but that is a small price to pay for 4000x3000!!
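
    A sketch of that virtual-grid mapping (numbers as suggested above; nothing Windows-specific):

    # Legacy apps keep addressing an 800x600 surface; the window system scales
    # those coordinates to whatever the physical panel really is.
    VIRTUAL_W, VIRTUAL_H = 800, 600

    def to_physical(x, y, phys_w, phys_h):
        return x * phys_w // VIRTUAL_W, y * phys_h // VIRTUAL_H

    print(to_physical(400, 300, 3200, 2400))   # (1600, 1200) -- the centre stays the centre
    print(to_physical(799, 599, 3200, 2400))   # (3196, 2396)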
  • The topic wasn't what was needed to look good, but complete cutting edge redesign

    After looking at the interesting stuff Transmeta has patented on round-robin streaming to arbitrated processors, I'd like to see one in action. And what processor would shine the best in an application unconcerned with x86 support? Well, the Sun Ultra or Compaq Alpha, and I just have an Alpha preference.

    It was a silly response, granted, but it was a silly idea I was responding to..
  • 1. As far as I know, all of the existing multiprocessor systems rely on the processor^2 scheme. (Well, the 'Amiga' PPC/68K asymmetric scheme comes to mind, but that was a trade-off.) The only scheme I've heard of that can use an odd # of processors is probably going to be wasted on the Crusoe II. (Transmeta has a cool arbitrated 'sharing' patent. It's what I was thinking of!)
    2. The G400 represents the bleeding edge for 2D cards. It still may suck in terms of upper-limit resolution, but it's probably the best we've got. (And it has the odd snafu with anything above 1880x1440, although the stated limit is much higher.)

    I don't think it's video card technology that will need to improve: it's memory speed! S/D/RDRAM isn't keeping pace with the processor wars. Processor speed almost doubled in the last year, while memory only got 20-30% faster. Don't get me started on the fact I've seen 8-year-old SCSI setups toast UDMA/66.
  • I had Windows in mind when I said that.. I run my display at work at 1600x1200, and have you tried reading the title bars at the standard '8' font size? Not kind to the eyes from more than a few feet away. Same with Word. Unless you're print previewing, the font size is never adjusted to screen resolution.. Courier 6 point is acceptable as printed but illegible on screen!!

  • The newer LCD/plasma displays are stunning! I've owned an IBM for a few years now with what was the best display offered on any 'light' laptop, and the displays on some of the new top-of-the-line Dell and Compaq offerings are so crisp and so deep I said 'Wow' out loud at the display on the Inspiron 700.. I would take that display over my IBM, Sony and NEC 21's...
  • Make them for an entirely new system, designed from the ground up with the newest technologies. I think that it is about time that entirely new systems not based on the old x86 systems succeeded in the mainstream. I realize that it would be quite a feat, but just think about the raw power you could get from new chips and equipment that didn't have to be compatible with old chips/code/peripherals.


    Well, unfortunately most people don't want to buy totally new things. I certainly don't. I would love to get the maximum life out of anything until it is completely broken, and only then buy something new. I guess some people never really care.
  • Well, those people in the commercial moved it around the house pretty easily...

    Well, that would work really well until you get massive concussions and skull fractures because your flat panel TV fell onto your cranium.
  • Many, if not most of these are _digital_, while the old monitor in front of you is analog -- there is a D/A conversion in the graphics card or in the monitor box. Plus there are a whole lot more pixels. A maxed-out graphics card for one of these babies might have 64 meg of VRAM. Egad. Without the D/A conversion, things like antialiasing work differently too, so they will have some really low-level impacts on the design of drivers. You probably won't be at all happy if you just plug one of these babies into your box.


    Well if they make one that fits into an ISA slot I will be a happy person.
  • I'm sure there can be some compatibility mode for older applications while everyone starts putting vector image resources into their Win32 modules.
  • PostScript has a lot of features in that respect. And since this display would require a video card with an ungodly amount of RAM and it'd have to push an ungodly number of pixels, why not build a fast PostScript interpreter right into the Display? That could be cool...
  • That's not true. Your computer has no idea what the size of your monitor is. 1600x1200 looks entirely different on a 17" monitor compared with a 21" monitor. There is no way for it to raise the size of 10pt text and have it look good on both monitors.
  • Because, as the article said, some objects on the display are sized in pixels, and shrink to ridiculously small sizes on a 200-ppi display. A 32x32 icon would be about 0.16 inches, too small for most people to make it out. Similar problems would occur with images in web pages. (I already have trouble, even on a 1024x768 laptop, reading text embedded in some web page images.)

    You may be thinking that the driver could just automatically resize icons and bitmaps when drawing them, but that poses a problem for the rest of the display. If a program wants to draw a 202x202 box and then put a 200x200 bitmap inside it, it will be a problem if the driver decides on its own to inflate the bitmap to, say, 400x400. Suddenly it isn't fitting into its rectangle anymore. And if the driver inflates everything, scaling all coordinates to some more viable size, then you aren't really using the display's full resolution -- the rest of the system thinks the display is really 96 ppi or something, and the driver is inflating and interpolating (which takes time to do well, and doesn't look as good as a true 200 ppi image).

    MS Windows is already capable of resizing icons to whatever size you like (in Win98 and Win2K, at least) -- play around with the icon size control in the Appearance tab of the Display control panel to see the effect. But this doesn't seem to affect all icons -- only those on the desktop, and presumably those of any other program that goes out of its way to look up the desired size in the registry and rescale all its icons. And it doesn't help with non-icon images. And the resulting images are kind of ugly.

    So I really don't think that it answers the issues to say that the display manufacturer should deal with this problem at the driver level. The real issue is that everyone is used to dealing with computer displays in the range of 70-100 ppi, and the farther you go beyond that, the more problems you're going to have. Ultimately I think we need to get completely away from the notion of measuring things in pixels, or defining stored images as arrays of pixels. Once you do that, the problem more or less goes away.

  • Multimedia Web sites and DVD drives are increasingly turning the computer into a new version of the television set. For picture quality, however, many computer screens are put to shame by the cheapest discount store portable TV.

    Funny, I think the resolution of broadcast quality NTSC video on a 30 inch television sucks ass compared to a finely tuned, 1600x1200 21" computer monitor. Modern monitors are far superior to consumer televisions for the simple reason that you sit a lot closer to them. That, and the fact that broadcast standards (ie NTSC, PAL, etc) are a bitch to make high quality.

  • It seems like that could have a couple of compatibility modes.

    I mean really, most displays will at least display standard VGA... That way if you don't have a proper driver, you can go to the lowest common denominator.

    Of course if you are running something such as XFree or Windows, there will probably be a driver. After all, if you have drivers for those two, you've got better than 90% of everything supported.

  • This technology will significantly increase the potential feeling of reality derived from a virtual environment. The computer screen will seem like a window into another world, and computer art will be able to ascend to grand new heights. However, computer games addiction rates might also increase. As all computing technology improves, people will lose all capacity to handle the world outside of a computer. They will be reduced to formless, drooling blobs, their eyes never budging an inch from the images on their screens. Or did I just describe you?

  • Much of their initial work focused on the screen's wiring, which is, in fact, etched onto the surface of a piece of glass. One problem with scaling up from current screens is that their wires are created using molybdenum and tungsten, metals that are relatively poor conductors of electricity. Those two metals were originally chosen because they worked well in the wire etching process. After some work, the I.B.M. lab eventually found a way to create faster wiring by developing a new photolithography etching process that substitutes aluminum and copper.

    This is going to keep the costs way up there for at least several years. If they are targeting the 30M+ Windows customers, they are going to need to get the price to come way down. Pretty cool though; I've seen some of those displays they have in the newer A-10 jets, they are very sharp and surprisingly tough, and the overall resolution was sorta like looking at a magazine photo.

  • (A Microsoft spokeswoman said the company had no definite timetable for introducing a solution in Windows.)

    Maybe it's that the fleet of developers are busy working on SP2??

  • by jilles ( 20976 ) on Thursday February 10, 2000 @05:32AM (#1289156) Homepage
    "The problem isn't in the display driver!"

    Nope, it's the assumptions that programmers make about screen resolution. The solution might lie in using 'third generation 2D engines' as discussed in a story on Slashdot one or two weeks ago. Apple's MacOS X will use such an engine. Basically it's vector-based graphics they are using. This means that the GUI is resolution independent and that anything you display on it will scale nicely from an ancient 14 inch monitor to a modern laser printer.

    It wouldn't surprise me if these screens will be available for Mac OS X first.
  • by Frac ( 27516 ) on Thursday February 10, 2000 @06:13AM (#1289157)
    The issue at hand is not that "corporate" is not willing to open up the specs. The real problem is the fundamental way most OSes realize their GUI on the display. They were designed with the "traditional" monitors in mind. With new monitors from IBM/Toshiba that support double the resolution in the same amount of space, everything - icons, menu bars, EVERYTHING - looks half as small on the monitor. What we really need is to let go of everything bitmap, and implement the GUI in vector and True Type Fonts.
  • by technos ( 73414 ) on Thursday February 10, 2000 @05:19AM (#1289158) Homepage Journal
    The problem isn't in the display driver! A good 16" LCD at 80ppi will pull 1280x1024. These folks have upped the resolution to 3200x2400. That's nearly impossible to read even on a 24 inch SGI monitor; everything is minuscule! Imagine trying to read that image, complete with tiny 10pt fonts and 80x80 icons, condensed into 16.5 inches! That's the problem!

    Quake ought to look good on that sucker..
  • by GoRK ( 10018 ) on Thursday February 10, 2000 @09:45AM (#1289159) Homepage Journal
    OK. Here's another take. Many newer games using 3D APIs have to be written with resolution independence. For instance, on an old 3dfx Voodoo, you're not going to be able to push 30fps in the newest games unless you run at 640x480 or maybe 800x600. But on your brand spanking new GeForce 256 DDR you can do 1280x1024 at like 60 fps.

    I don't really understand how everything under the hood works, but it appears to me that when a game needs to display 2D graphics, sprites, or entire 2D UIs, it makes calls to the 3D libraries, loading these graphics as textures that the accelerator can then scale, rotate, and all that other fun stuff. Load up Quake 3 and look at the main menu. The text is the same size no matter what resolution you're at, and unless you're in some absurdly high resolution it doesn't pixelate, and the graphics card automatically anti-aliases the pixels.

    Display Postscript, Display PDF, and other forthcoming display technologies rely a lot on software and still treat the computer display as a raster medium. These will be interim solutions, but if the proper hooks aren't put into these software components for hardware acceleration, things like Display PDF will overrun the processor at very very high resolutions.

    How about this? Let's treat the video memory area in the graphics card as a canvas and draw vector graphics in it. These will naturally include bitmaps with physical size information, and even a legacy "raster canvas" mode to accommodate legacy applications. Then all that has to happen is the graphics card does the rasterizing to whatever absurd resolution you need and pipes it to the display. (A rough sketch of the idea follows after the list below.)

    A few things this implementation has the ability to do:

    1) Support everything NOW. In legacy modes, you'd get anti-aliased scaling, much like when I change the resolution on this LCD monitor in front of me to 640x480. Sure, you wouldn't get the benefit of 200 ppi, but at least you would have a crisper screen to look at (one of the things that annoys me most about my flat panel is that you can see the individual pixels and the grid in between them).

    2) Take a lot of the graphics burden off of the CPU -- the performance increase from the modifications made to DirectX 7.0 to accommodate the GeForce and a "GPU" in general really begs for the same kind of GPU idea to migrate back to 2D, where it will be needed to support the massive resolutions of the future. Gradients, scrolling, and moving entire solid windows and widgets around are some of the most CPU-intensive operations that can be done (check your meter while running E with all the bells and whistles on!). These are things that should be done by the video card anyway. Let's see some people start putting hooks in their software and OSes to allow this! Stop thinking in pixels already!

    3) Support legacy apps on new software/hardware in the future. Load a raster-drawn app into a texture buffer and a virtual "screen" that has pixel dimensions, then scale those pixels on the canvas appropriately.
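
    A rough sketch of the idea (hypothetical structure, not a real driver interface): primitives carry physical sizes and are rasterized only once the target density is known.

    # Rasterize a tiny display list whose coordinates are in inches.
    def rasterize(display_list, ppi):
        ops = []
        for op, *args in display_list:
            if op == "line":                         # inches -> device pixels
                x0, y0, x1, y1 = (round(v * ppi) for v in args)
                ops.append(("line", x0, y0, x1, y1))
            elif op == "bitmap":                     # legacy raster content gets scaled
                img, width_in = args
                ops.append(("blit", img, round(width_in * ppi)))
        return ops

    scene = [("line", 0.0, 0.0, 1.0, 0.0), ("bitmap", "legacy_icon.bmp", 0.5)]
    print(rasterize(scene, 80))    # 80 px line, 40 px icon on today's panel
    print(rasterize(scene, 200))   # 200 px line, 100 px icon -- same physical size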

    So anyway, I guess we'll see now how this idea will hold up.

    ~GoRK
  • by tdanner ( 31731 ) on Thursday February 10, 2000 @05:25AM (#1289160) Homepage
    The time has definitely come for GUI architects to begin the changeover to a resolution independent architecture. When 72 dpi are available, it makes sense to plan things down to the pixel. When 200+ dpi are available, the user should be presented with a completely scalable GUI.

    For example, SGI has long provided scalable icons for its IRIX desktop file manager. Apple's new Quartz graphics API and Aqua GUI look like they may have the start of this flexibility. OpenGL may be a good base to work from as well.

    As someone who spends way too much time reading text on computer monitors, I look forward to any improvements in their readability!

    Tim
