
Xgl Developer Calls it Quits

nosoupforyou writes "Jon Smirl, one of the two main developers of Xgl and Xegl (a version of X layered on top of OpenGL, rendering directly to the Linux framebuffer, similar to Apple's Quartz Extreme), is calling it quits. Citing two years of effort without pay, a shortage of interest from other developers, and no hope of a release for more than a year, Jon is moving on."
  • Told you so! (Score:1, Interesting)

    by Anonymous Coward
    Xgl has been heavily backed by Novell/Red Hat and the gnome fanboys (well, by 'backed' I mean hyped as if it was going to save gnome's arse), but the real work in X is being paid for by that favorite FOSS company that gives so much but everyone loves to hate: Trolltech.

    Here you have it, folks. It is now time for the gnome fanboys to jump off the Xgl bandwagon. And thank Trolltech for coming up with EXA, which will allow Composite to work with today's existing stuff. Of course, with Trolltech employing
    • So far all I've seen Trolltech do is hire on someone to work on X. Where exactly are they going? Wouldn't working on Xgl, something in development for 2 years, something that BOTH desktops could support and enjoy, be in the best interest of everyone?
      • Re:Told you so! (Score:1, Informative)

        by Anonymous Coward
        Read why the Xgl guy is leaving. He says it is because of EXA and because there is more demand for EXA.

        Now, guess who created EXA. That's right: a Trolltech employee they hired to work on X.
    • Re:Told you so! (Score:5, Informative)

      by civilizedINTENSITY ( 45686 ) on Thursday August 11, 2005 @10:26PM (#13300274)
      I think you might be comparing apples and oranges, no? EXA stands for eyecandy X architecture, which is "based on KAA (KDrive acceleration architecture) it's designed to be an alternative to the currently used XAA (XFree86 acceleration architecture) with better acceleration of XRender which is used by composite managers for desktop eyecandy effects."

      XGL is "an X-Server layered on top of OpenGL."

      "The way things are heading is completely drop support for 2d acceleration and build a userspace X server that runs completely on a extended (currently EGL) OpenGL api. That way any OS that has any support for OpenGL, even if it's just thru a ported Mesa software rendering library, can run the X server."
      • Does anyone else see this post as being what is fundamentally wrong with the video architecture X has pushed us to adopt?

        I've been trying to learn to code around X for years, and through my bouts with it I've learned one thing: X is the most confusing, hacked together monster of a piece of software that I have ever seen.

        I'm not saying X doesn't work. I'm saying X needs to be replaced with something much more on the side of elegance, but since there is no momentum behind replacing it, it won't ever be. Ins
  • Considering the dearth of posts on Slashdot, I'd say no one cares that he quit.
  • This sucks (Score:4, Insightful)

    by Punboy ( 737239 ) on Thursday August 11, 2005 @10:01PM (#13300172) Homepage
    I was really looking forward to the completion of this project. This is what we all need to accomplish the goal of bringing Linux to the desktop. We need to be able to make what we're calling, over at Plasma, a "designer desktop" that everyone will love and enjoy.

    I'm surprised that Trolltech hasn't looked into this and started contributing. They recently hired someone specifically to work on enhancing X and bringing its eye candy and performance capabilities up to the point where it can compete with the likes of Mac OS X without slowing down horribly.

    Trolltech, save us! :-p
    • I'm surprised that Trolltech hasn't looked into this and started contributing. They recently hired someone specifically to work on enhancing X and bringing its eye candy and performance capabilities up to the point where it can compete with the likes of Mac OS X without slowing down horribly.


      Maybe it's because they aren't that big a company. I'm surprised that one of the larger companies interested in Linux hasn't hired someone for this.
  • Good (Score:2, Insightful)

    by keesh ( 202812 )
    Hopefully now more effort will be made towards core functionality and better drivers rather than wasting time on annoying eye candy CPU drain. Mod me troll if you will, but I for one would rather have time devoted to proper usable drivers for modern graphics cards than some silly extras that eat all my CPU and RAM but contribute nothing towards functionality or productivity.
    • Re:Good (Score:5, Informative)

      by bug1 ( 96678 ) on Thursday August 11, 2005 @10:44PM (#13300354)
      And if you were financially contributing to the developers in question, they might give a crap about what you think they should be doing.

      If you haven't noticed, open source software doesn't exist just to give you what you want.
      • That's what I love about Slashdot. An entire site devoted to expressing, sharing, and discussing people's opinions. And then someone expresses a slightly different opinion and gets flamed for it. Lovely.

        • What's wrong with flaming someone for having an opinion that disrespects the developers that work on the software they (presumably) use, at no cost?
    • Re:Good (Score:5, Insightful)

      by trevorcor ( 177535 ) on Thursday August 11, 2005 @11:23PM (#13300580)
      A good deal of the point of XGL, once you get past the hype-machine eye-candy business, is that it means only *one* driver for every video card -- a DRI one. Eliminating the dichotomy between the 2D X drivers and the 3D DRI drivers will only improve driver support for both 2D and 3D -- the effort needed to support both will be halved, and 3D support will be required to support 2D.

      There's even been talk about porting the Linux kernel framebuffer drivers to the DRI interface (possibly in userspace, run from initramfs).

      Have you ever tried to get an X server, accelerated 3D, and a framebuffer console to get along on the same machine? It's ugly.

      Add in multi-monitor support, and you can't even do it. So it's not possible to have all of accelerated 3D, multiple monitors and a console on platforms like the Macintosh that don't have a text mode in hardware.

      Jon Smirl was also doing work on DRI/DRM, an area of the Linux video "architecture" that's much in need of love. I'm really sad to hear that he's giving all his video work up -- I was really looking forward to the day the whole Linux video mess got cleaned up for good.
      • Have you ever tried to get an X server, accelerated 3D, and a framebuffer console to get along on the same machine? It's ugly.
        Too true. And people wonder why there's slow adoption of Linux for non-power users.
      • Have you ever tried to get an X server, accelerated 3D, and a framebuffer console to get along on the same machine? It's ugly.

        Maybe it's just me, but my ThinkPad T40 has all of the above, and it was pretty painless to get working. If using Gentoo, just make sure you load radeonfb and not vesafb. 3D acceleration was a zero configuration matter, as the Radeon 7500 is already supported natively by X.org.

        My Desktop machine also has all of the mentioned features using a GeForce 3. The framebuffer console "j

        • Yep, radeonfb and friends get it right because BenH sweats bits to make it so -- otherwise open-source 3D wouldn't be possible on any recent Macintosh (it's not possible on the nVidia-chipped Macs in any case).

          The amount of voodoo involved in getting three video drivers to get along on all the different versions and revisions of the Radeon is insane. I keep tabs on linux-ppc-devel, the LKML, and from time to time dri-devel. It seems he's tweaking one component or another every week.

          It shouldn't be that di
    • Bad (Score:3, Insightful)

      by 2nd Post! ( 213333 )
      You don't know anything, do you?

      1) Modern video cards are 3d accelerated
      2) 3d is a generic superset of 2d
      3) GPUs are nearly (if not clearly) more powerful than CPUs
      4) More graphics == more information
      5) More information == more productivity

      Assertion: 3d accelerated UIs reduce CPU usage (because more or all user feedback is handled by the GPU instead of the CPU; points 1, 2, 3, and 4), and provide improved usability (points 4 and 5).

      The loss of this effort also has negative consequences: Driver development is sta
      • by renoX ( 11677 )
        You forgot to include the downside of using OpenGL for X: for it to be efficient, you need GPU 3D acceleration, which currently means using proprietary drivers from NVidia or ATI.

        No wonder developers are not so interested in helping XGL.
        • That's still the developer's loss, because both Apple and Microsoft are using "proprietary" drivers from NVIDIA and ATI.
          • "Developers Loss"? Say what?

            I used to manage at ATI. And *I* don't use closed source
            drivers. So, I don't have 3D
            acceleration. So sad. I do
            not consider it a loss; it's
            the price of a stable system

            I will not use a proprietary
            driver from anyone. The only
            "proprietary" part may be
            firmware load for the card.

          • by renoX ( 11677 )
            Developer loss? Which developers?
            Certainly the Linux kernel developers would be unhappy if they only received tainted bug reports (or very happy, as those reports probably go to /dev/null very fast).
            So at least those developers are quite happy if users do not taint their kernel!
      • 5) More information == more productivity

        I was right with you on the rest of them, but this is so clearly wrong in the vast majority of cases it's not even funny.

        More information == more time wasted in decision making == less productivity

        • There's a fairly clear distinction between data and information.

          More information == more productivity
          More data == more work

          You need to process data before it can become information. A raw video feed is data. Filtered, highlighted, and selectively cut video is information (since the act of filtering, highlighting, and cutting tunes the data towards specific requirements).

          So if more GPU power can be brought to bear on raw data to provide real information, you get more productivity, because you are actually reducing the d
          • I don't buy it.

            More information just means you have more filtering to do before the proper course of action can be decided upon. Some jobs may require more information in order to be performed properly, but in no way do I see that as a correlation between information and productivity. If any correlation exists between the two, I am fairly certain that it is inverse in nature.

            The problem, I suppose, is that, despite your assertion, there is no clear delineation of the point at which data becomes information,
            • I understand your point. There is a point of diminishing returns where information ceases to be useful (any information above and beyond what is necessary to avoid hitting another car is useless, for example; but that same data provided beforehand, to avoid the unfortunate situation in the first place, is information).

              I used Expose as a specific example because it increases the amount of information, not data, available to you in a fairly common situation: window overload. On my workstation, at work, I ca
    • Saying "mod me troll if you will" should not, of itself, make you immune to being modded a troll.

      Something to think about.

    • Re:Good (Score:2, Interesting)

      by ironfroggy ( 262096 )
      Actually, layering X over OpenGL would result in less CPU drain, because the rendering would be hardware accelerated by your graphics card instead of all being processed by the CPU. Get your facts right, and stop being stubborn and refusing to look into the facts first! That sort of mentality is the problem with the FOSS community!
  • by rarrar ( 671411 ) on Thursday August 11, 2005 @11:28PM (#13300622)
    Forgive my appeal to authority but,

    Nat Friedman: "Xgl opens up a whole world of hardware acceleration, fancy animations, separating hardware resolution from software resolution, and more"

    To those moaning about the lack of better video drivers, from Wikipedia: "Structuring all rendering on top of Opengl should simplify modern video driver development and not have the separation of 2D and 3D acceleration." That means vendors would have an easier time giving you your "better drivers".

    And of course OS X and Longhorn have already gone this route, placing FOSS behind the times.

    And finally, you can have both an improved current X and Xegl. Witness the recent EXA buzz (a replacement X acceleration architecture): current X is getting a boost already, and Xegl doesn't slow this down in any way; however, EXA apparently is slowing Xegl.
  • I find it a shame that these projects, which seem to be the only things that could save us from X11 (along with its 16-bit code), are going downhill from neglect.

    It's all complicated code that's hard to contribute to, mixed with slow progress that loses fans. I've been looking at y-windows [y-windows.org] ever since it appeared on Slashdot a couple of years ago, but it seems to have died down almost to the point of giving up.

    It'd be nice if groups like Trolltech or Red Hat could fund these groups, to help one o
    • Xegl isn't dead; there's still Dave Reveman, who IIRC is employed by Novell. I think the shame is that most people agree that Xgl in some form is the future of X. This simply means the future is NOT now, and won't be any time soon. However, there are advances being made, EXA being the one that's currently stealing Xegl's steam.
    • by jd ( 1658 )
      ...lots of good ideas have bitten the dust. Berlin (a GUI built on OpenGL and CORBA) was a good idea, but is no more. KGI has bitten the dust repeatedly, but again seems to be more "what could have been" than anything. GGI is making progress, but agonizingly slowly.

      Part of the problem is resources. There just isn't enough being spent by graphics folks on, well, graphics. SGI had some amazing graphics. Once. OpenGL was revolutionary. Once. Then they kinda lost their way and are now on the verge of extinction - iro

  • by reg ( 5428 )

    I know how it feels to have people neglect your work... But it's a pity that he's throwing in the towel at this point.

    Cairo, the main consumer of Xgl/Xegl, is just nearing version 1.0 and will be used by the new releases of Gtk/Gnome. Also, once Gecko 1.8 is out the door, the plan is to move the entire Gecko GFX architecture to Cairo; it already uses it for SVG rendering. So some of the big boys are coming to the party!
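    (For anyone who hasn't played with it, the appeal of Cairo here is that its drawing code is backend-agnostic: the same calls can target an Xlib window today and a GL-backed surface under Xgl/Xegl later. Below is a toy sketch against the plain image backend -- the function names are the stock cairo C API, but the sketch itself is just an illustration, nothing Xgl-specific.)

      /* Toy illustration of Cairo's backend-agnostic drawing: swap the image
       * surface for an Xlib or (under Xgl/Xegl) a GL-backed surface and the
       * drawing calls below stay exactly the same. */
      #include <cairo.h>

      int main(void)
      {
          cairo_surface_t *surface =
              cairo_image_surface_create(CAIRO_FORMAT_ARGB32, 256, 256);
          cairo_t *cr = cairo_create(surface);

          /* Antialiased, alpha-blended vector fill -- exactly the kind of
           * work XRender/EXA or an OpenGL backend is meant to accelerate. */
          cairo_set_source_rgba(cr, 0.1, 0.4, 0.8, 0.8);
          cairo_arc(cr, 128.0, 128.0, 100.0, 0.0, 2 * 3.14159265);
          cairo_fill(cr);

          cairo_surface_write_to_png(surface, "circle.png");

          cairo_destroy(cr);
          cairo_surface_destroy(surface);
          return 0;
      }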

    Hopefully things will pick up and he'll return to it soon...

    Regards,
    -Jeremy

  • Why is everybody bemoaning the demise of Xgl?

    The main point, to me, is the reason nobody's interested any more: X11 is getting better and, with recent extensions such as EXA and all that composite stuff, has caught up in terms of eyecandiness. The niche for the project no longer exists as Xorg-X11 proper is starting to fill it. And that's a good thing.
    • Yeah, but X feels so slow, unresponsive, and sluggish that it's a pain! I don't know if this can ever be fixed. We were told that with soft real-time scheduling added to the Linux kernel, X responsiveness would be greatly improved, but I admit I have seen strictly no improvement resulting from this.
      • Try running "nice" on it. Something like
        su - -c 'nice -n -10 su - user -c "startx"'
        or something along those lines will make it feel a lot faster.
      • Ignoring the fact that "feels" is a terribly non-quantitative term, my X11 desktop on my 4-year-old Athlon 1.33GHz "feels" much faster to me than WinXP on my 4-month-old P4 1.7GHz laptop. To make matters worse, the video card on my X11 desktop is a 6-year-old nvidia card, while the laptop has a pretty new radeon 7500. And yet, the X11 box still "feels" faster.

        Go figure.
    • Ever want to mix OpenGL and Composite on the same display? Not going to happen outside of Xgl.
  • Judging by the replies in the actual thread he posted on, and the low number of /. replies...who cares?
