Graphics Software

On the Subject of OpenGL 2.0

zendal writes "The danger with pixel shaders and vertex shaders is that there is no standard for programmability of graphics hardware. A schism has formed within DirectX between the competing demands of GPU makers Nvidia and ATI. Noted analyst Jon Peddie gives THG an exclusive first look at a White Paper on how OpenGL 2.0 is trying to bring stability and open standards to programmable graphics and GPUs."

  • by syzxys ( 557810 ) on Friday February 22, 2002 @10:42AM (#3051101)

    ...has always been that driver support is buggy. nVidia is notoriously bad at this; their DirectX drivers are quite stable, but their OpenGL driver blue screens left and right (especially with a lot of detail in the scene graph). I always wondered why they even bothered to include OpenGL support in their drivers, although I suppose with such a major standard they have pretty much no choice.

    Now, with OpenGL 2.0, if they have to support three different APIs, isn't driver quality going to suffer even more? Oh well, ATI has been getting a lot better recently; I guess we can always switch to them. :-)

    ---
    Crash Windows XP with just a simple printf! [zappadoodle.com]
  • by Jonathan Blocksom ( 139314 ) on Friday February 22, 2002 @10:49AM (#3051135) Homepage
    I don't think so. The 2.0 proposal was brought up at the September 2001 OpenGL ARB meeting -- about five months ago. And the OpenGL 2.0 White Paper has been around since at least November. While this stuff is important, there's nothing new about it. (Good thing, too; good standards take time.)
  • by syzxys ( 557810 ) on Friday February 22, 2002 @11:56AM (#3051564)

    "Blue Screens" are caused by a fault in the Kernel or something writing to memory it's not meant to be writing to.

    This is almost correct. Blue screens are caused by a fault in *kernel mode* (Ring 0 on Intel architecture), which is not equivalent to "in the kernel." WDM drivers [amazon.com] (like the nVidia graphics drivers), as well as all NT drivers [amazon.com] and in fact the entire USER and GDI subsystem [amazon.com] (since NT4), all run in kernel mode. None of these components are technically the kernel. Btw, wild pointer writes are a kind of "fault in kernel mode."

    Assuming normal user processes can only write to their own memory space, then it is a fault of the kernel.

    No argument there. See also this page [zappadoodle.com]. But as I already pointed out, the nVidia driver runs in kernel mode, not user mode, so this argument is not relevant.

    Sure, Open GL might be buggy,

    OpenGL can't be buggy, it's just a specification. nVidia's implementation is buggy, like I said. This is especially apparent considering that the blue screen errors have the name of nVidia's kernel mode driver in them.

    but it's your Windows kernel that's causing the blue screen.

    Again, confusing "the Kernel" with "kernel mode." Hey, I hate Windows as much as the next guy, but that's no reason to post incorrect technical information about it and hope nobody will realize you're blowing smoke out your ass. Next time, do a little more research [amazon.com] first.

    ---
    Windows 2000/XP stable? safe? secure? 5 lines of simple C code say otherwise! [zappadoodle.com]
  • by Namarrgon ( 105036 ) on Friday February 22, 2002 @01:03PM (#3052212) Homepage
    I've read so many comments on the high quality of nVidia's OpenGL drivers over the years - from people I tend to believe, like John Carmack & Brian Hook. Things like "it just works", "best in the industry", "better than any other [consumer?] vendor's", etc.

    What exactly leads you to say otherwise? Presumably personal experience, rather than just a desire to trash nVidia, but compared to what? Given that 3D game luminaries have repeatedly stated they prefer nVidia's OpenGL drivers to those from ATI or (shudder) Matrox, that really only leaves the few remaining "professional space" vendors (sgi, 3DLabs), and I can't imagine they're universally perfect either.

    Perhaps your perspective needs widening? Or perhaps you're running into the same bug over & over and have not bothered to notify nVidia about it? (or perhaps they just think it too isolated a case to get a high priority)

  • by syzxys ( 557810 ) on Friday February 22, 2002 @02:27PM (#3053015)

    That's true, but I already admitted I was wrong about that completely irrelevant (to the original post) detail in this reply [slashdot.org] to this helpful comment [slashdot.org]. Thanks for pointing it out again though, it *was* stupid of me to post that and then bitch about someone else doing the exact same thing later in the thread.

    ---
    Windows 2000/XP stable? safe? secure? 5 lines of simple C code say otherwise! [zappadoodle.com]
  • by spitzak ( 4019 ) on Friday February 22, 2002 @04:12PM (#3053785) Homepage
    I think you misunderstand shader languages.

    They are used extensively in film graphics. All the other major renderers, not just RenderMan, have shader languages: vMantra, Maya, LightWave, etc.

    Shaders do not "replace" texture maps. One of the most-used functions in a shader is to look up a given uv coordinate in a texture map and use the resulting color to control the shader (see the first sketch at the end of this comment). In fact most of the shaders we write involve manipulating texture maps, which were (as you said) painted by hand. We can do much more interesting things with textures than just using them to color the surface!

    I agree about the pack/unpack mess. I think all useful image formats could be described by these items: number of bits per sample (limited to powers of 2), number of samples per pixel, delta between each pixel (so pixels can be further apart than the number of samples, or you can trivially mirror the image with negative numbers), and delta between each line (which allows a "window" to be cut out of a larger image, allows flipping upside-down, and allows 90-degree rotations by adjusting both deltas). There is no need to describe what the samples are; that can be determined from the sample count and the function you are calling, and we can insist on RGBA order for normal images. (A rough sketch of such a descriptor follows below.)
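
    To make the texture point concrete, here is a minimal C sketch of the "texture controls the shader" idea. The types and names are illustrative only, not any renderer's actual API: a hand-painted map is sampled at (u, v) and its color modulates a procedural stripe term, rather than being replaced by it.

        #include <math.h>

        typedef struct { float r, g, b; } Color;

        typedef struct {
            int width, height;
            const Color *texels;    /* width*height texels, row-major */
        } Texture;

        /* Nearest-neighbour lookup; u and v are assumed to lie in [0,1). */
        static Color tex_lookup(const Texture *t, float u, float v)
        {
            int x = (int)(u * (float)t->width)  % t->width;
            int y = (int)(v * (float)t->height) % t->height;
            return t->texels[y * t->width + x];
        }

        /* The painted map does not just color the surface: its channels are
         * modulated by a procedural stripe term, i.e. the texture is used to
         * control the shading rather than being replaced by it. */
        static Color shade(const Texture *paint, float u, float v)
        {
            Color c = tex_lookup(paint, u, v);
            float stripes = 0.5f + 0.5f * sinf(40.0f * u);
            Color out = { c.r * stripes, c.g * stripes, c.b * stripes };
            return out;
        }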
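
    And a rough C sketch of such a format descriptor, assuming byte-sized samples and using invented field names (this is not an existing OpenGL structure). The two deltas are byte strides, so windows, mirrors, flips and 90-degree rotations are just different descriptors over the same memory.

        #include <stdint.h>

        typedef struct {
            uint8_t *base;          /* address of the first sample of the first pixel */
            int bits_per_sample;    /* power of two: 8, 16, 32, ...                    */
            int samples_per_pixel;  /* e.g. 4 for RGBA (order fixed by convention)     */
            long pixel_delta;       /* bytes from one pixel to the next within a row   */
            long line_delta;        /* bytes from one row to the next                  */
        } ImageView;

        /* Address of sample s of pixel (x, y); negative deltas mirror or flip. */
        static uint8_t *sample_ptr(const ImageView *v, int x, int y, int s)
        {
            return v->base
                 + (long)y * v->line_delta
                 + (long)x * v->pixel_delta
                 + (long)s * (v->bits_per_sample / 8);
        }

        /* A vertically flipped view of the same pixels: point base at the last
         * row and negate line_delta; nothing is copied. */
        static ImageView flip_vertical(ImageView v, int height)
        {
            v.base += (long)(height - 1) * v.line_delta;
            v.line_delta = -v.line_delta;
            return v;
        }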

  • by himi ( 29186 ) on Saturday February 23, 2002 @02:56AM (#3056264) Homepage
    They don't sell gaming boards, they sell professional boards - they're competing with NVidia's Quadro cards, not with the GeForce cards.

    3DLabs are actually (I believe) the best selling make of professional graphics cards - they're not a wannabe by any stretch of the imagination.

    himi
