Graphics Software

On the Subject of OpenGL 2.0

zendal writes "The danger with pixel shaders and vertex shaders is that there is no standard for programmability of graphics hardware. A schism has formed within DirectX between the competing demands of GPU makers Nvidia and ATI. Noted analyst Jon Peddie gives THG an exclusive first look at a White Paper on how OpenGL 2.0 is trying to bring stability and open standards to programmable graphics and GPUs."
  • by 4im ( 181450 ) on Friday February 22, 2002 @10:40AM (#3051092)

    A most interesting point is right at the end of the article:

    One of the key points stressed by the ARB is that the "open" needs to go back into OpenGL. The group has pledged that all ideas submitted for OpenGL, if adopted, are then open for use and not licensable as IP.

    So, they won't pull a "Rambus" here... hopefully.

  • by TechnoVooDooDaddy ( 470187 ) on Friday February 22, 2002 @10:45AM (#3051127) Homepage
    sounds really great, but I don't see it happening... nVidia, ATI, Voodoo, whoever will always wanna do the next cool great thing, and that's why the extensions are available...

    And we all know MS wants DirectX to rule them all. OpenGL works, and is an open standard by definition. The extensions in there certainly make life interesting, but you pretty much know what you're getting into when you try NV_texture_rectangle or NV_texture_shader. (Hint: the NV stands for NVidia.) Sure, you can find out in DirectX if the hardware supports XYZ before you call it, but I find the naming convention of OpenGL a bit more coder-friendly. It's readily obvious when you're trying something that's not supported across the specification.

  • by Aknaton ( 528294 ) on Friday February 22, 2002 @10:51AM (#3051147)
    >how OpenGL 2.0 is trying to bring
    >stability and open standards to
    >programmable graphics and GPUs

    The problem is that there are quite a few companies out there that do not want open standards, because closed ones give them a competitive edge over other companies, end users, and organizations, even if open standards would actually help them as well. (And you know who you are.)
  • by boltar ( 263391 ) on Friday February 22, 2002 @10:59AM (#3051189)
    X11R6 is old enough now, so how about an X12 that has OpenGL (and lots of other improvements) built in? Xlib would then have it incorporated, and it would be much faster than the current approach of building it on top of Xlib and extensions.
  • by EnglishTim ( 9662 ) on Friday February 22, 2002 @11:03AM (#3051207)
    I think this is a cunning move by 3Dlabs: their business is being threatened by nVidia and their Quadro range, so it'll be interesting to see how unbiased they can manage to be when generating the spec.

    Otherwise, I think it's a good idea. It'd be nice to see OpenGL keeping up with (or even outshining) DirectX...
  • by bribecka ( 176328 ) on Friday February 22, 2002 @11:18AM (#3051298) Homepage
    I can't speak much to DirectX, but you can also query what extensions the hardware (or at the very least the ICD) supports in OpenGL. But even if you don't, an unsupported op is treated like a no-op anyway, so even if you call an NV_ extension on an ATI board, it shouldn't do any harm (or am I incorrect?).

    Regardless, the extensibility of OGL is a double-edged sword. It does make features board-specific, which is not the point of OGL. On the other hand, allowing these extensions drives the future of the standard: it lets everyone throw what they've got out in public and see what sticks.

    Personally, I'm glad that OGL has huge gaps between standards updates, unlike DX. After all, it *is* a standard, and should be relatively static. Each new version of the standard should be absolutely, positively a good standard. Anything missing can be used as an extension in the meantime and added to the next version 4-6 years down the road. This is the strongest point of OpenGL vs DirectX: there is a controlling body of many companies, rather than one main controller (MS). Bringing together the experience of companies like MS, SGI, NVIDIA, ATI, etc. not only makes the standard better but adds a level of comfort for the users of the standard.

    Basically, my point is that OGL standards *should* take a long time to finalize. Everyone seems to forget that standards should be able to last a long time. OGL is now on v1.2/1.3 after an evolution of around 10 years, while DX is nearing DX9 since, what, 1995 or so?
  • by jaavaaguru ( 261551 ) on Friday February 22, 2002 @11:19AM (#3051302) Homepage
    "Blue Screens" are caused by a fault in the kernel or by something writing to memory it's not meant to be writing to. Assuming normal user processes can only write to their own memory space, it is a fault of the kernel. Sure, OpenGL might be buggy, but it's your Windows kernel that's causing the blue screen. A remotely stable kernel would just complain that the game or whatever was using OpenGL did something nasty, then continue with its work as usual. If an OpenGL app on Linux crashes (yes, it can happen), it could mess up a display driver (take NVidia's ones before the 23.13 GLX one, for example) and cause X to restart. Notice that it doesn't cause a kernel panic.

    always wondered why they even bothered to include OpenGL support in their drivers

    If I were a video card driver writer, I'd be concentrating on making the existing driver more stable instead of contemplating why we even bother producing it. Doesn't the PS2 (PlayStation 2) use OpenGL? Macs use OpenGL. What do SGI machines use? DirectX may be the most popular among Windows games programmers, but it's not the only industry standard worth following. If software developers for PCs had concentrated purely on DirectX, we would never have had some of the amazing games that originated in the console market, since consoles never had DirectX support to start with.
  • by CDWert ( 450988 ) on Friday February 22, 2002 @11:21AM (#3051315) Homepage
    IMHO, it's time to SCRAP OpenGL and start over.

    Modern hardware support is critical, interfacing with what seems to be a diminishing number of compatible drivers is critical, and keeping the spec out of MS control is critical.

    There are MANY options for a ground-up rewrite of OpenGL supporting CURRENT hardware; working with the hardware vendors directly is the key.

    I understand this is not an undertaking to be taken lightly. I have been working on options and looking for cross-platform alternatives for a couple of months now; there are several promising alternatives, which I hope to present shortly.

    OpenGL's time is over, it seems: MS is working with vendors explicitly to limit their support, there remain major differences of opinion in its developer base, and schisms seem to be forming in its goals.
  • by FrostyWheaton ( 263146 ) on Friday February 22, 2002 @12:01PM (#3051615) Homepage
    OpenGL 2.0 seems to be in the right place at the right time. If the 2.0 standard can sanely standardize the programming of pixel and vertex shaders, OpenGL might be given the boost it needs to start overtaking Direct3D's hold on the gaming market. And once development moves to OpenGL, it's one step closer to widespread releases on (insert favorite OpenGL friendly operating system here) and a loosening of the MS stranglehold on computer games.

    On the other hand, if this OpenGL extension craziness continues on as it has been, the project might collapse into a tangled and unsalvageable mess. The OpenGL standards people have one shot at doing this right, and whether or not they pull it off will determine their long term success or failure.
  • Absolutely (Score:3, Insightful)

    by epepke ( 462220 ) on Friday February 22, 2002 @12:10PM (#3051702)

    It's especially problematic for graphics standards. GKS, which was hideously based on ideas from pen plotters, still dominated much of the 1980s. OpenGL has been pretty good, but it's stuck with some 1980s ideas. For example, the strict ordering of primitives makes sense in a world of bit blasters where double-buffering and Z-buffering are expensive, but it makes little sense on modern hardware, and worse, it makes it impossible to implement some of the better modern techniques (such as hierarchical global scanline algorithms). Some day, we're going to have systems that cost less than a million dollars that can do real-time ray-tracing, radiosity, and other solutions of the Rendering Equation.

  • by Alsee ( 515537 ) on Friday February 22, 2002 @12:20PM (#3051784) Homepage
    all ideas submitted for OpenGL, if adopted, are then open for use and not licensable

    And the next sentence is an important explanation:

    what we're seeing is a recognition ...that graphics... is pretty much the whole point, and a commitment to that is a commitment to success in graphics.

    W3C take note! The same goes for internet standards. If it can't be used by everyone, then it's not a standard, it's proprietary. Anyone who wants to make/use something proprietary is free to do so, but standards bodies should never impose them.

  • Extensions (Score:4, Insightful)

    by Error27 ( 100234 ) on Friday February 22, 2002 @01:53PM (#3052715) Homepage Journal
    The "Specification Overview" PDF from the 3Dlabs white paper page is pretty interesting. It has a list of over 250 OpenGL extensions and what happens to them in OpenGL 2.

    Basically they all disappear.

    Some have already become part of the standard. Some are added to the standard in OpenGL 2. Some just disappear altogether.

    But the large majority of them are not needed anymore when you have programmability, memory management, OpenGL objects, etc.

    To me that means that OpenGL 2 is way more flexible. Flexible enough that we won't need as many extensions in the future.

    And that's pretty cool.

    (BTW: Brian Paul is a member of the ARB. He wrote on the Mesa list that he hasn't been following the OpenGL 2 process very closely, but that he expected they would probably want him to write a free implementation.)
  • Re:No. (Score:2, Insightful)

    by otri_dev ( 561295 ) on Saturday February 23, 2002 @06:18AM (#3056576)

    Sorry, I disagree about the mess with OpenGL.

    I think the OpenGL API is exceptionally clean and generally stable. Yes, there are core code paths in OpenGL that are not accelerated by every card, nor every driver. However, I find the general implementation far better for developing real 3D applications. 99% of the time I am not developing for other hardware; I'm working to get a concept rolling at 15fps on whatever I'm developing on. It's the same concept John Carmack uses to develop his next-generation game engines: just push the boundary till it crawls, because the timeframe it takes to refine the product into a shippable solution will unfailingly see a shift in feature support. This means you make it WORK at 15fps or even slower, then, over the next year, design code paths that optimize for existing hardware.

    Both Direct3D and OpenGL require 'special' code paths to handle the tons of strange hardware out there; there is no escaping that fact. Therefore the various application code paths must be designed to switch over to alternative paths, primarily for optimization purposes.

    Otis, you know from DemoGL how hard it is to handle all the tons of alternative hardware out there through OpenGL, but have you done the same with Direct3D? I find it far more difficult to write support code around the CAPS structures in Direct3D, because I end up dealing with the problem of supporting various BITS of Direct3D which may kill my code dead, instead of reliably knowing OpenGL's core code path always works, albeit slowly. Another interesting consideration: once your application is expected to work correctly by even a hundred users because it looks cool, expect the next driver iteration to support the code path better. ATI has been highly responsive to my requests in the past, and so has nVidia.

    Extensions are there mainly to optimize code paths for specific purposes. The ability to transfer frame buffer data back into texture data allows nearly everything to be accomplished in multiple passes, even if extremely slowly. Only some very specialized pixel operations absolutely require cube mapping, generalized shaders, or bump support. However, they are generally cosmetic, and should be treated as dress-up accelerated features. I.e., grab a Toy Story DVD and watch the 'behind the scenes' stuff; it's amazing to see how the visual process works. Animate simple blocky characters, apply lighting and moderate texture detail, drop in finer meshes, dress up the scene, and polish. That's how game engine programming works too.
