
XGL Development Opens Up

An anonymous reader writes "David Reveman has made the latest XGL source code available for download. This comes a few weeks after development of the project was criticized for being done 'behind closed doors'. There have been huge changes to XGL, the most significant being a restructuring of the code that allows XGL's GLX support to function on drivers other than the proprietary Nvidia one. Xcompmgr can currently be run under XGL with full acceleration, provided that the proprietary ATI or Nvidia drivers are used. An OpenGL-based compositing manager, 'Compiz', is currently in the works, with a release expected in February. David intends to get the code into freedesktop CVS as soon as possible, after which the code should eventually merge with Xorg."
  • by jZnat ( 793348 ) on Tuesday January 03, 2006 @02:40AM (#14382632) Homepage Journal
    No free (gratis) software should be proprietary; that's just a general rule! If you're giving your software away free of charge, people generally would like to contribute back, whether it be donations, patches, QA, etc. With a closed-source model, you're blocking off a useful stream of free bugfixes! If your software is useful in the corporate world, it's also likely that some companies will contribute back if they tinker around with it enough.
    • by heatdeath ( 217147 ) on Tuesday January 03, 2006 @02:46AM (#14382650)
      Well, there are certain circumstances where it makes sense - if you're not sure where you want the project to go, but you want to give people the benefit of your code right now. If you open-source it, you're pretty much condemning any potential you had for making money off the code. Of course, there are those on this forum who would claim that's wrong... but it's still a valid reason to keep your source proprietary.
      • How do you explain how TrollTech [trolltech.com] makes money with a GPL'd program (Qt and its official frameworks)? Or how CodeWeavers [codeweavers.com] makes money off of CrossOver Office when WINE [winehq.com] is Free in both ways? Or how RedHat [redhat.com] makes money off of providing a Linux distro + support when there is Fedora Core, their fully Free distro of RedHat?

        Old business models die hard, and the new methods are proving to be a success. Even Novell, IBM, Apple, Sun, and others are benefiting financially from Free software.
        • by Kjella ( 173770 ) on Tuesday January 03, 2006 @04:23AM (#14382919) Homepage
          How do you explain how TrollTech makes money with a GPL'd program (Qt and its official frameworks)? Or how CodeWeavers makes money off of CrossOver Office when WINE is Free in both ways? Or how RedHat makes money off of providing a Linux distro + support when there is Fedora Core, their fully Free distro of RedHat?

          1. Commercial versions for closed source, "free sample"
          2. Need for constant upgrades to make new software work
          3. Need for constant upgrades to make new software work
          4. Repackaging the works of others, "free sample" of RHEL

          There are countless applications where you'd barely be able to scrape together a living if they were OSS. Seriously, how many of the OSS applications on your computer have you bought support for? I can tell you mine is a big fat zero. Particularly if you're competing against a good user community that provides support for free. For a more typical project you may get the odd PayPal donation, but I sure wouldn't want to rely on that for a living...
          • by Bert64 ( 520050 ) <bert AT slashdot DOT firenzee DOT com> on Tuesday January 03, 2006 @05:13AM (#14383080) Homepage
            Most sales come from corporate users, and corporate users are also more likely to buy support.. Joe enduser is not likely to buy support, and is also likely to copy any software his friends have..
            Selling software to end consumers is a lot of hassle, and far less profitable than selling to corporate users, so these companies don't sell to end users, they give it to them for free.
            • Buying support pays for the support staff. What pays for the software developers? If the purchase of support is enough to fund your developers, chances are you have very few developers, or a lot working for free out of their parents' pockets. Which is pretty much the case for OSS projects big and small. It doesn't scale at all and results in pretty dodgy development. Development (proper development, not quick hacks done by Joe in his basement) is extremely slow and poor like this - it will never be abl
              • Well, what redhat/ibm/etc make from support seems to fund a lot of their developers.. They maintain, improve and bugfix the software that they support, and if you want extra features you can pay for them.. Because the development model is distributed, companies benefit from work done by each other, so each company needs fewer developers..
                You seem to assume that a single company has to develop an entire product themselves, which is the case for proprietary software.. In a distributed model like this, improve
          • I'm not going to make myself popular with this, but... the OSS community needs to learn to view and promote their work from a market perspective.

            The majority of OSS projects are mainly running on pride, and that won't get you anywhere. Now, don't get me wrong! Being proud of your work is absolutely essential. It's essential for getting the job done, for being able to implement your ideas, for, well, being motivated.

            But pride is proven to become a killer the instant you're going to sell your product - co
          • "There are countless applications where you'd barely be able to scrape together a living if it were OSS."

            Exactly. Welcome to the competitive marketplace, which is finally returning to software after a long dry spell of monopoly. The OSS folks will be unable to make a living on these "countless" apps, and so will closed-source vendors; arbitrage will drive the price users are willing to pay toward zero. App vendors in niche markets had better get used to the idea, because it's already starting to happen.

      • " Well, there are certain circumstances where it makes sense..."

        There may be certain circumstances where it makes sense, but the example you give is not one.

        Releasing your free program as Free Software does not preclude you from making money on it now or in the future.

        For one example of where keeping the source closed while giving away the binaries "made sense", think of what MS did to Netscape with the release of a free IE.

        To come closer to your example, if you think you may want to go the non-Free, non-fre
    • There can be licensing issues which prevent the release of code, depending on the legacy of the code itself. That seems to come up with games a lot...
  • How will this impact the development of XEGL?
    • How will this impact the development of XEGL?

      It won't. Xegl has been dead ever since Jon Smirl quit working on it. Development will begin again (it seems) when EXA is ready for primetime. So.... years.

  • huh? (Score:5, Interesting)

    by bcrowell ( 177657 ) on Tuesday January 03, 2006 @02:53AM (#14382675) Homepage
    Can anyone translate this into English? What is XGL and why should we care?
    • Re:huh? (Score:4, Informative)

      by MichaelSmith ( 789609 ) on Tuesday January 03, 2006 @02:57AM (#14382690) Homepage Journal
      What is XGL and why should we care?

      I googled around for it. It seems to be an OpenGL-based X server. I know of a large HMI development project running at the moment which may wind up deploying on Windows (as opposed to Linux) due to the faster OpenGL implementation under Windows.

      Something like this could tip the balance.

      • Re:huh? (Score:4, Informative)

        by vandan ( 151516 ) on Tuesday January 03, 2006 @03:44AM (#14382837) Homepage
        No.

        If you want a faster OpenGL implementation, then you want to optimise Mesa and the individual video card drivers.

        XGL is an X server that runs on OpenGL. It won't make your OpenGL drivers faster - it's simply an OpenGL client ... i.e. an application that runs on an OpenGL system.
        • Arguably, though, a greater reliance on OpenGL for desktop applications could lead to more and better OpenGL driver implementations. It's a knock-on effect at best though, yes.

      • XGL is... (Score:5, Informative)

        by CarpetShark ( 865376 ) on Tuesday January 03, 2006 @03:47AM (#14382851)

        The point of XGL is to take the 3D engine in most graphics cards and use it as the basis for X's acceleration.

        Before, the 2D acceleration engine was used, but 2D has fallen behind in terms of performance, and 3D can do everything 2D can do, plus more. XGL uses OpenGL to render bitmaps, as well as to render video, composite alpha-channeled windows, rotate and deform windows, etc. I think font antialiasing will benefit, via a (potentially) faster XRender implementation. I gather it's also integrated with glitz already, which means that vector graphics like SVG and scalable icons, buttons, widget themes, etc. will also be done via OpenGL.

        The one remaining gap (that I know of) is hardware support. The Novell guy releasing this (sorry, I forget his name right now) seems to say that it works with relatively minor changes on Free Software DRI drivers. I know that was always an intention, at least. So, hopefully, we'll see more drivers trying to support DRI as a base level of driver compliance, rather than as an afterthought. The X desktop will be faster, smoother, and more featureful... so long as desktop developers don't go overboard and expect everyone to have next-generation 3D engine performance just to run a word processor ;)

        All in all, a very good thing :)

        • desktop developers don't go overboard and expect everyone to have next-generation 3D engine performance just to run a word processor

          I recently built a new office system for my wife to use. I bought a new video card, the cheapest AGP card I could find. It is called an HIS Excalibur 3D graphics card, with the slogan "Power up, Gamers!", which I know is rubbish. It is a cheap, generic card.

          Interestingly this card has two video outputs. I wonder if I can get Xinerama working on it...

        • The greatest use of accelerated 3d graphics would be the independence of the GUI display from the physical screen resolution without loss of detail (resolution permitting, of course). But the X protocol is pixel-based, and therefore OpenGL is almost useless. Windows can be treated like textures, but GUIs would be much better if they were vector-based.
          • Agreed. I tried to implement something like this back in my Amiga days. Truly resolution-independent graphics are well overdue.

            However, I'd also like to see the 3D engine (and other specialised chips like audio DSPs etc.) becoming more like standard system resources, used as and when possible, for whatever they can be used for. This idea of having specialised chips that just sit there unless something needs to be drawn, while the CPU simulates a weather system, is a bit wasteful.

            • Indeed. I was an Amiga user too, so I can understand your rationale. Every computer needs specialized chips for the most difficult tasks that are repeated inside a program... array math is one of those tasks, and the Pentium-class CPUs are supposed to have it, but they come nowhere near the performance of those chips inside GPUs.
      • Do they trust that Windows will continue to have good OpenGL support? Maybe it won't [blogspot.com].
    • Re:huh? (Score:3, Informative)

      It lets the system offload graphical operations onto the GPU, like Quartz Extreme on MacOS. Things like transparency get a lot easier to do.
    • Re:huh? (Score:5, Informative)

      by coolGuyZak ( 844482 ) on Tuesday January 03, 2006 @03:02AM (#14382703)
      XGL is a hardware-accelerated X server, which uses the 3D pipeline to render graphics. We should care because it will allow future developers to create all manner of different effects (vector graphic scaling & rendering, interesting window effects, pixel-shaded effects (bump-mapped buttons), etc).

      If you are in the Windows or Mac worlds, there's not much of a reason to get excited... OS X already does this, and Vista will as well. But for those of us in the *nix world who want eye-candy, it's quite A Good Thing (tm).

      • by JoeBuck ( 7947 ) on Tuesday January 03, 2006 @03:39AM (#14382814) Homepage
        .. is that future video cards might well be 3D-only, and the old 2D interfaces that X relies on won't be available. You'll have cards designed pretty tightly around the OpenGL spec and related specs, and if you don't have a way to do X with such a beast, forget using the card with Linux.
        • .. is that future video cards might well be 3D-only, and the old 2D interfaces that X relies on won't be available.

          ...and x86 will be replaced by RISC-instructions. Perhaps internally on the chip 2D will be nothing more than translation to 3D instructions, but I'm willing to bet they'll be available for decades. I imagine the circuitry would make up <<1% of the transistors.
    • Re:huh? (Score:5, Informative)

      by ArbitraryConstant ( 763964 ) on Tuesday January 03, 2006 @03:05AM (#14382711) Homepage
      I should add that OpenGL support is already available, but XGL allows the X server to transparently use it in a way that's compatible with existing applications.
    • Re:huh? (Score:2, Informative)

      by Anonymous Coward
      XGL is the missing piece of the puzzle which will allow the Linux desktop to have fancy effects to equal or exceed OS X and Windows Vista.

      Here is a video demo of the types of effects which become possible with XGL. [gnome.org] Note the translucent video playing while being warped and composited with the background window, and simultaneously being displayed live in miniature in the workspace pager to the right.

      OS X's "genie" effect, Windows Vista's "frosted" window effect, *real* transparent terminal windows, cra

      • Hey! I ran Emacs in an Xterm in twm on my 486/50 and I liked it!
        Oh yes, I forgot...

        You insensitive clod!
    • Re: huh? (Score:2, Interesting)

      by O_D_Evans ( 763044 )
      This was posted a while ago on /., can't find the original link. I did save the video demo from that article and have posted it here:

      http://media.putfile.com/xgl_wanking [putfile.com]
    • Can anyone translate this into English? What is XGL and why should we care?

      Back in the early 90's I used a 3D library from Sun called xgl. It was Sun's attempt to compete with SGI's GL. Quite nice too. I wonder why it was never released as software libre.

  • Unfree (Score:4, Insightful)

    by Brandybuck ( 704397 ) on Tuesday January 03, 2006 @02:54AM (#14382678) Homepage Journal
    Xcompmgr can currently be run under XGL with full acceleration provided that the proprietary ATI or Nvidia drivers are used.

    What good is Open Source if it's inextricably tied to proprietary software? Where do I send my money to get someone to write a Free Software video driver?
    • Re:Unfree (Score:5, Insightful)

      by Ruie ( 30480 ) on Tuesday January 03, 2006 @03:14AM (#14382742) Homepage
      Two things you can do (in no particular order):
      • Ask (politely) ATI to provide 3d specs
      • Work on DRI [sf.net] project (r300 driver for example)
    • Re:Unfree (Score:5, Insightful)

      by Anonymous Coward on Tuesday January 03, 2006 @03:15AM (#14382746)
      > Where do I send my money to get someone to write a Free Software video driver?

      I don't know, and I wish there was one too, but:

      I think people generally misunderstand the sheer amount of work put into those proprietary graphics drivers. It's not something where you can throw a few bucks at some garage coders and turn out the same thing. These are done by large teams of highly paid developers (I think 100 developers is the right order of magnitude, plus or minus), working for years. It takes *serious* amounts of money to fund that sort of development staff, and it's not something you and me and a few other like-minded folks are going to be able to fund.

      Can you get *some* working graphics driver for a lot less money? Of course. But you can't get what the proprietary drivers do, in terms of performance and functionality, on the cheap.

      Just tryin' to inject some reality into the picture here :D

      • Re:Unfree (Score:3, Interesting)

        by nathanh ( 1214 )

        I think people generally misunderstand the sheer amount of work put into those proprietary graphics drivers. It's not something where you can throw a few bucks at some garage coders and turn out the same thing. These are done by large teams of highly paid developers (I think 100 developers is the right order of magnitude, plus or minus), working for years. It takes *serious* amounts of money to fund that sort of development staff, and it's not something you and me and a few other like-minded folks are goi

      • Re:Unfree (Score:5, Informative)

        by vandan ( 151516 ) on Tuesday January 03, 2006 @03:52AM (#14382860) Homepage
        I disagree.

        The opensource R100, R200 & R300 drivers were written by the DRI developers. ATI provided some incomplete and contradictory documentation for the R100 & R200 to some select developers who had to sign an NDA. All the coding has been done by DRI developers. The R300 has been completely reverse-engineered.

        Now, check out all 3 drivers. They not only work, but they work incredibly well. In fact they are faster and more stable than ATI's drivers, except in some key areas ... usually areas where more documentation is required.

        The simple fact is that the very thing you're saying is impossible - open-source development of top-quality drivers - has already happened. Not only that, but despite your suggestion that they're not up to it, R300 developers continue to reverse-engineer and code for the current and upcoming cards from ATI. Pretty neat, eh? Check out the list of apps the R300 can run - you'll be surprised.
        • Re:Unfree (Score:4, Informative)

          by poofyhairguy82 ( 635386 ) on Tuesday January 03, 2006 @04:24AM (#14382924) Journal
          Now, check out all 3 drivers. They not only work, but they work incredibly well. In fact they are faster and more stable than ATI's drivers, except in some key areas ... usually areas where more documentation is required.

          Really? Everything I read tells me that the crappy closed ATI drivers are still faster when it comes to 3D [thinkwiki.org] than the open source drivers.

          I mean... it's cool that at least one set of cards with decent 3D hardware has open drivers, but those drivers are not for gamers to use. It's for me to use to get EXA.

          • Don't bother buying an ATI card if you plan on using their driver for any GL work. Your box will lock up hard. Constantly. You won't be able to play UT2004 for more than 30 minutes on a good day.

            After 2 years of fighting with the crappy ATI drivers and my Radeon 9000, I finally gave up and bought an Nvidia. It's been three months since the purchase and my system has not locked up once. It is rock solid. I put the Radeon 9000 in the kids' computer, which runs Windows XP, and it's much more stable there so

        • Re:Unfree (Score:2, Interesting)

          by msormune ( 808119 )
          Oh, just put a lid on it. The open source R300 drivers are so far behind when compared to the official ATI drivers that it's hard to mention them in the same sentence with a straight face. The R300 driver is still pretty much a hack, and is very slow compared to the binary drivers. It doesn't support the RENDER extension, and OpenGL support is still also lacking. People are also complaining about lockups in a regular desktop situation, and newer hardware support is also in the works (hopefully).
          • People are also complaining about lockups in a regular desktop situation...

            Oh... in that case, they've achieved parity with ATI's proprietary driver. I got sick of the problem with nVidia's proprietary drivers causing X to hang and eat 99+% of CPU (there's a two-year-old thread on their forum about the problem), bit the bullet, and got an ATI card sufficiently recent that ATI deigns to provide a proprietary Linux driver for it--and a couple of days ago woke up to find the same problem my wife's computer had with
      • Re:Unfree (Score:4, Funny)

        by BillyBlaze ( 746775 ) <tomfelker@gmail.com> on Tuesday January 03, 2006 @04:04AM (#14382887)
        I have an idea - nVidia could subsidize the tremendous task of writing open source video drivers with some sort of side business, like, oh I don't know, manufacturing video hardware.
      • Re:Unfree (Score:4, Insightful)

        by Lussarn ( 105276 ) on Tuesday January 03, 2006 @04:58AM (#14383039)
        It takes *serious* amounts of money to fund that sort of development staff, and it's not something you and me and a few other likeminded folks are going to be able to fund.

        You are talking about a different issue. When he said he wanted a free software driver, he did not say the developers working on it shouldn't be paid. Nvidia and ATI can throw 1000 paid developers at the problem for all I care and still develop a Free software driver.

        Nvidia would still sell the hardware even if the driver were Free software. What good is the driver without the hardware?

        Now, you would maybe say Nvidia can't open source the driver because they don't own all of it. I say bullshit. If there is a will, there is a way. The will just isn't there today, but the future might change that.
      • Re:Unfree (Score:2, Insightful)

        by BobFunk ( 620191 )
        I don't know, and I wish there was one too, but: I think people generally misunderstand the sheer amount of work put into those professional operating systems. It's not something where you can throw a few bucks at some garage coders and turn out the same thing. These are done by large teams of highly payed developers (I think 100 developers is the right order of magnitude, plus or minus), working for years. It takes *serious* amounts of money to fund that sort of development staff, and it's not something y
    • Re:Unfree (Score:3, Informative)

      by PitaBred ( 632671 )
      How much money do you have? Because you're gonna have to buy the IP of a LOT of companies in order to open source their stuff. Lots of proprietary stuff in the chips and the drivers, from what I hear.
      And FYI, the older Radeons (up to 8500 I believe?) have a decent open source driver.
      • Lots of proprietary stuff in the chips and the drivers, from what I hear.

        Which isn't "need-to-know". To write drivers you need specs on what the chip can do, not how. If you have the documentation for the Linux API and the documentation for the chip, you can find your own way from A to B. Have you looked at what developers are able to achieve just by reverse engineering? If they could stop wasting time on that and get straight down to coding up optimized algorithms, OSS drivers would take a huge leap ahea
      • Re:Unfree (Score:4, Insightful)

        by Jah-Wren Ryel ( 80510 ) on Tuesday January 03, 2006 @04:52AM (#14383008)
        How much money do you have? Because you're gonna have to buy the IP of a LOT of companies in order to open source their stuff. Lots of proprietary stuff in the chips and the drivers, from what I hear.

        That's the same tired old line we've been hearing since the days before XFree86, when it was just X386. And you know what? It's all bullshit.

        All the cards throughout the years that vendors have kept proprietary eventually received 3rd-party open-source drivers, and you don't hear a word about those 3rd parties being sued or otherwise harassed for violating anyone's IP. All it took was time and effort for people to reverse engineer the proprietary MS Windows drivers.
    • Re:Unfree (Score:5, Informative)

      by poofyhairguy82 ( 635386 ) on Tuesday January 03, 2006 @03:36AM (#14382809) Journal
      Where do I send my money to get someone to write a Free Software video driver?

      You don't. Nvidia and ATI couldn't care less about your money - the only reason they made drivers for Linux in the first place was to sell high-end cards to render 3D scenes.

      If you really want to support open drivers, buy an ATI 9250 and help test EXA and Xgl on there. That is the best card we have with open drivers, and it seems like it will be on top for a LONG time.

      • Re:Unfree (Score:3, Informative)

        by c0l0 ( 826165 )
        Actually, the best (as in "fastest") card with free hardware GLX support among the R2X0-series of Radeon chips is the "original" Radeon 8500, with a 275MHz clock on GPU and RAM. The Radeon 9250 is just a stripped-down (in terms of pixel pipelines and so on) version of the original R250.
         
        However, XOrg 7.x comes with a driver called "r300", supporting glx on more recent hardware, ranging from the Radeon 9500-series up to the X850, I believe. There's just no way to utilize the cards' shaders, yet.
    • Go out and buy a voodoo 5. Unfortunately they're getting older (and rarer) every year.
    • There are lots of free video drivers. X.org ships with quite a few. There are free drivers for NVIDIA as well. The reasons why XGL only works properly with those proprietary drivers could be numerous, but that doesn't mean that free drivers don't exist.
    • I don't get it. I have a Matrox G550 with a 100% Open Source driver. It lets me play Quake3 at 50+ fps. And somehow this card is unable to draw a few rectangles for a desktop? Something is not right here.
      • The way XGL works is that it represents window buffers as textures. This requires the OpenGL driver to support the pbuffers extension, which (IIRC) none of the open source drivers do.
  • by Phariom ( 941580 ) on Tuesday January 03, 2006 @02:56AM (#14382683)
    "Compared to the xserver module code in freedesktop CVS a lot have changed. The new code contains an uncounted number of bug fixes, some major restructuring and a few additional features."

    ...if they didn't count them?
  • Not opened up (Score:3, Interesting)

    by Anonymous Coward on Tuesday January 03, 2006 @02:56AM (#14382686)
    To say the development of XGL opened up is to assume it was closed before, which is absolutely untrue.

    Dave made major changes to XGL (as you can read in his post), and it's simply not possible to merge the code back while in the middle of a transition like that. On top of that, the X.org tree was pretty much frozen to allow the transition to modular X and the release of 7.0.

    The "Novell closed XGL" conspiracy came from people with their own personal agenda against Novell (and Ximian).
    • Well, they DID close the development. Was the work done in public? No. Could people outside Novell watch as things progressed? No. Could people outside Novell test XGL? No. Could people outside Novell submit patches? No.

      In my book that means that the development was closed. No one except the company developing the software had access to it.

      Dave made major changes to XGL (as you can read in his post), and it's simply not possible to merge the code back while in the middle of a transition like that. On to

  • Does it matter? (Score:4, Interesting)

    by Stalyn ( 662 ) on Tuesday January 03, 2006 @03:50AM (#14382856) Homepage Journal
    Does the source being open during development matter? Look how much David Reveman got done by himself "behind closed doors". Really, what matters is that the source is available to the public upon release. Before that it doesn't really matter. The truth is the majority of the Xorg community doesn't believe in an OpenGL-accelerated desktop. Look at the mailing list. The only people who do are a small group of coders who most likely do not have the time to actually achieve something worth using.

    However, if a company like Novell did pick up the project and paid developers to work on it full time, but the source was closed until release... well, tough luck. In reality, the only reason David released the code now was to get it into the Xorg tree. That way they can continue to "code-drop" to a tree that can be used by everyone, instead of kdrive, which is for developers.

    Also, the Xorg developers seem to be concerned with Xegl, which David isn't even working on. I don't care either way. Just get it done.
    • Does the source being open during development matter? Look how much David Reveman got done by himself "behind closed doors". Really, what matters is that the source is available to the public upon release.

      Perhaps I'm just being a cynical bastard, but even if he doesn't accept a single patch I prefer the code to be publicly available. There have been at least a few incidents where OSS developers went quiet for a lot longer than a normal release, then pulled a bait-and-switch and closed the source, l
    • Re:Does it matter? (Score:5, Informative)

      by Anonymous Coward on Tuesday January 03, 2006 @06:04AM (#14383213)
      Two things:

      1. XGL wasn't developed in-house for Novell.

      XGL was started by independent free software programmers. Novell hired some of them and then took the development 'in-house'.

      It started off open, Novell paid some of them to concentrate on it on the 'inside', and now they are opening it up again.

      2. You don't understand the relationship between Xegl and XGL.

      XGL is a _toy_, it's a preview. It's a beginning. It's forming the basis for future X servers, but it is not actually useful itself.

      XGL requires another X server to run on. Similar to Xnest.

      Xegl is based on XGL (again, started and worked on originally without any Novell involvement); it is a standalone X server that will actually be used.

      You see, one is useless without the other. XGL is worthless outside of development. Xegl is worthless without the basis.

      Xegl is called Xegl because it takes OpenGL and adds the EGL extensions to it. These extensions were originally designed for embedded work, but can be used with a full-fledged OpenGL system like Linux has. What they do is allow OpenGL programs to send signals to change screen resolution and things like that. These extensions 'fill out' the OpenGL API so that you can use it effectively as the basis of a standalone system.

      Originally, Linux's OpenGL stuff was like this. With the original Mesa solo, you didn't use X to run 3d accelerated applications. Things like GLX (open sourced from SGI) were used to 'mix' or manage OpenGL applications on an X server.

      There still are some gaps though.. Indirect rendering isn't very hot, for instance. That is, when you run an application remotely (X Windows is a networking protocol after all, like HTTP or whatnot), you can't get OpenGL acceleration working on it.

      This, combined with other advances such as 'Modular X', 'X Damage', 'X composite', and 'XGL'/'xegl', is helping to move the X system from 1980's-era technology (where it is now) to 2010's technology (where it will be in a couple more years).

      Hopefully it will not only let you display your desktop applications on your laptop or handheld (which it can do now), but also let you easily transfer applications between devices while they are running, and from display to display. All with nice acceleration with complex window managers. Oh, and don't forget vector-based graphics (which we will have with the next releases of Gnome and KDE), which will themselves be OpenGL accelerated in a year or two.

      EXA will help this a bit.

      As X servers switch over to EXA for the time being and applications utilize its acceleration more and more.. this EXA stuff translates surprisingly well to OpenGL.

      Also, it will have the added benefit of moving X off of the hardware.

      Right now, with X.org's X server, you have all this extra crap it has to do with hardware drivers and such. By moving to pure OpenGL, each OS can handle the driver stack itself. You can have a Linux framebuffer with Mesa-based DRI drivers, proprietary drivers, or software Mesa on NetBSD, some sort of weird embedded stuff, or Windows' OpenGL stuff.. It doesn't matter. Let the OS manage the hardware itself and run X Windows on OpenGL, just like any other OpenGL application.

      Right now we have Framebuffer, DRI, VGAcon, EXA and such that all have to fight over the hardware at the same time.

      That's 4 independent drivers from multiple independent vendors.. some from DRI, some from the Linux kernel, some from X.org, that all have to use the _same_piece_of_fucking_hardware.

      Think about this:
      1 piece of hardware, 4 drivers.

      How many devices do you expect to function properly like that?

      With an OpenGL-based X server, you have only one driver that can do everything. It can even do the console if you want.. (although I don't expect Linux to drop vgacon as long as video cards support legacy VGA mode)

      Also, if you're a disappointed programmer wanting to work on X, then I suggest you look strongly at XCB.
      • XGL was started by Dave Reveman. Actually, all XGL code, including Xegl, starts with "Copyright 2004 David Reveman". Also, it might have been true that XGL was a 'toy' in August, but a lot has changed. However, it is true you still need an X server to run XGL on top of; this might be changing.

        However, I do agree Xegl is probably the better solution, but the project has had little work done on it since Jon Smirl left.
      • by Hard_Code ( 49548 )
        "So don't worry about that."

        That's sort of hard in this alphabet soup of acronyms for myriad projects and libraries.

        I really really hope, and hope somebody can confirm this, that at the end of the day there is a STRONG inclination to:

        * develop a SINGLE (SINGLE! (SINGLE!! (i mean it))) X server binary which can render either through hardware acceleration OR software, determined dynamically at startup (through configuration or auto-detection), as well as the slew of other acronyms. A separate
  • Yay for Complaining! (Score:3, Interesting)

    by poofyhairguy82 ( 635386 ) on Tuesday January 03, 2006 @04:13AM (#14382904) Journal
    I am glad to see that development opened a little. Not because I think it will speed the process up, but because I want to compile and use the newest Xgl! I am an Eye Candy addict.

    Hopefully this WILL make development more transparent. Xgl is needed for the future Linux desktop and I am glad Novell decided to play ball with everyone.

    Of course, Xgl is still YEARS away from being shipped as the default on the desktop of a major distro. But we have to start somewhere, and people like me need the new eye candy fix!

    • Hopefully this WILL make development more transparent.

      From what I hear, not only will it be more transparent, but it will have smooth, alpha-blended shadows.

      Sorry, couldn't resist. :-)
